<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Abu Alhaija, Hassan</style></author><author><style face="normal" font="default" size="100%">Sellent, Anita</style></author><author><style face="normal" font="default" size="100%">Kondermann, Daniel</style></author><author><style face="normal" font="default" size="100%">Rother, Carsten</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">GraphFlow – 6D large displacement scene flow via graph matching</style></title><secondary-title><style face="normal" font="default" size="100%">Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)</style></secondary-title></titles><dates><year><style face="normal" font="default" size="100%">2015</style></year></dates><volume><style face="normal" font="default" size="100%">9358</style></volume><pages><style face="normal" font="default" size="100%">285–296</style></pages><isbn><style face="normal" font="default" size="100%">9783319249469</style></isbn><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">We present an approach for computing dense scene flow from two large displacement RGB-D images. When dealing with large displacements, the crucial step is to estimate the overall motion correctly. While state-of-the-art approaches focus on RGB information to establish guiding correspondences, we explore the power of depth edges. To achieve this, we present a new graph matching technique that brings sparse depth edges into correspondence. An additional contribution is the formulation of a continuous-label energy which is used to densify the sparse graph matching output. We present results on challenging Kinect images, for which we outperform state-of-the-art techniques.</style></abstract></record></records></xml>