PhD Thesis

Event-based simultaneous localization and mapping



  • Started: 12/09/2018
  • Thesis project presented: 10/10/2019


In order to exploit the advantages of event cameras and apply them to mobile robotics, our research will concentrate on solving the localization and mapping problem using the asynchronous information these cameras provide. To this end, we will accurately track the six degrees of freedom (6DoF) of the camera motion, targeting high-speed applications, and recover the 3D scene structure. The aim is to develop an integrated event-based SLAM system that runs two parallel threads, tracking and mapping, in real time under highly dynamic motion and challenging lighting conditions. This approach distinguishes itself from conventional camera algorithms through its independence from image intensity values and the asynchronous nature of events; hence, novel algorithms for depth estimation, spatiotemporal feature extraction, feature aggregation, and tracking need to be studied. The fast response of the event camera allows measurement rates in the MHz range; thus, depending on the solution implemented, the estimation, computation, and overall tracking and mapping results can be performed with negligible delay.
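The two-thread tracking/mapping architecture described above can be sketched minimally as follows. This is an illustrative skeleton, not the thesis implementation: the `Event` fields follow the standard DVS event format (pixel coordinates, timestamp, polarity), while `run_slam` and the per-event update steps inside `tracker` and `mapper` are hypothetical placeholders for the actual pose-estimation and depth-estimation algorithms.

```python
import threading
import queue
from dataclasses import dataclass

@dataclass
class Event:
    x: int        # pixel column
    y: int        # pixel row
    t: float      # timestamp in seconds (microsecond resolution)
    polarity: int # +1 brightness increase, -1 decrease

def run_slam(events):
    """Feed an asynchronous event stream to parallel tracking and mapping threads."""
    track_q, map_q = queue.Queue(), queue.Queue()
    poses, landmarks = [], []

    def tracker():
        while True:
            ev = track_q.get()
            if ev is None:  # sentinel: stream finished
                break
            # placeholder: a real tracker would update the 6DoF pose per event
            poses.append((ev.t, (ev.x, ev.y)))

    def mapper():
        while True:
            ev = map_q.get()
            if ev is None:
                break
            # placeholder: a real mapper would triangulate 3D structure
            landmarks.append((ev.x, ev.y))

    threads = [threading.Thread(target=tracker), threading.Thread(target=mapper)]
    for th in threads:
        th.start()
    for ev in events:          # events arrive asynchronously; both threads consume
        track_q.put(ev)
        map_q.put(ev)
    track_q.put(None)
    map_q.put(None)
    for th in threads:
        th.join()
    return poses, landmarks
```

The queues decouple event arrival from processing, mirroring how tracking and mapping can run at different rates without stalling the sensor stream.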

This work falls within the scope of the following projects:

  • EB-SLAM: Event-based simultaneous localization and mapping (web)