Information-Driven Direct RGB-D Odometry - CVPR 2020 (oral)

Results for the paper:
 
Alejandro Fontán, Javier Civera, Rudolph Triebel
Information-Driven Direct RGB-D Odometry
CVPR 2020 (oral)
 
Paper draft:
https://vision.in.tum.de/_media/spezial/bib/fontan20information.pdf
 
Project web:
https://rmc.dlr.de/rm/en/staff/alejandro.fontanvillacampa/IDNav
 
Abstract:
This paper presents an information-theoretic approach to point selection for direct RGB-D odometry. The aim is to select only the most informative measurements, reducing the size of the optimization problem with minimal impact on accuracy. It is common practice in visual odometry/SLAM to track several hundred points, achieving real-time performance on high-end desktop PCs. Reducing their computational footprint will facilitate the implementation of odometry and SLAM on low-end platforms such as small robots and AR/VR glasses. Our experimental results show that our novel information-based selection criteria allow us to reduce the number of tracked points by an order of magnitude (down to only 24 of them), achieving accuracy similar to the state of the art (and sometimes outperforming it) while reducing the computational demand by 10x.
Duration: 00:06:49
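The selection idea described in the abstract, keeping only the points that contribute the most information to the camera-pose estimate, can be sketched as a greedy log-determinant (D-optimality) selection over per-point pose Jacobians. This is only an illustrative sketch, not the paper's actual criterion: the function `greedy_select`, the 1x6 Jacobian shape (one photometric residual per point, 6-DoF pose), and the random toy data are all assumptions made for the example.

```python
import numpy as np

def greedy_select(jacobians, k, eps=1e-6):
    """Greedily pick k points whose pose Jacobians maximize the
    log-determinant of the accumulated 6x6 Fisher information matrix.
    Illustrative sketch only; the paper's criterion may differ."""
    info = eps * np.eye(6)          # regularize so logdet is defined
    selected = []
    remaining = set(range(len(jacobians)))
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in remaining:
            J = jacobians[i]        # shape (1, 6): one residual per point
            # information after hypothetically adding point i
            gain = np.linalg.slogdet(info + J.T @ J)[1]
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        remaining.remove(best)
        info += jacobians[best].T @ jacobians[best]
    return selected

# toy usage: 200 random candidate points, keep only the 24 most informative
rng = np.random.default_rng(0)
jacs = [rng.normal(size=(1, 6)) for _ in range(200)]
chosen = greedy_select(jacs, 24)
print(len(chosen))
```

The greedy log-det objective is submodular, so this simple loop comes with a constant-factor approximation guarantee relative to the best subset of the same size, which is why greedy selection is a common stand-in for otherwise combinatorial point-selection problems.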