PhD Thesis

Event-based SLAM

Information

  • Started: 12/09/2018
  • Finished: 17/02/2023

Description

Event-based cameras are novel, bio-inspired sensors that exhibit a high dynamic range and extremely low latency. Their sensing principle differs from that of conventional cameras: instead of frames, these sensors generate asynchronous streams of events, each triggered when the logarithmic brightness at a pixel changes beyond a contrast threshold. Each event is timestamped with microsecond resolution and transmitted as soon as it fires. Event cameras therefore convey non-redundant information and capture fast motion, making them suitable for highly dynamic applications under challenging lighting, without motion blur, over-, or underexposure. This operational principle requires new algorithms that cope with the absence of image intensity and leverage the sensor's fast asynchronous response.
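As an illustration, here is a minimal sketch of the idealized event-generation model described above: a pixel fires one event per crossing of a contrast threshold in log-intensity. The Event fields, the frame-pair simulation, and the threshold value C are illustrative assumptions, not the thesis's implementation; a real event camera fires asynchronously in hardware rather than between frames.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp (microsecond resolution in real hardware)
    polarity: int  # +1 for brightness increase, -1 for decrease

def events_between_frames(prev_log, curr_log, t, C=0.2):
    """Idealized model: pixel (x, y) fires one event per crossing of the
    contrast threshold C in log-intensity since the previous reading.
    prev_log and curr_log are 2D lists of log-intensity values."""
    events = []
    for y, row in enumerate(curr_log):
        for x, L in enumerate(row):
            delta = L - prev_log[y][x]
            crossings = int(abs(delta) // C)   # number of threshold crossings
            events.extend(Event(x, y, t, 1 if delta > 0 else -1)
                          for _ in range(crossings))
    return events

# Toy usage: a single pixel brightening by 0.45 in log-intensity fires
# two positive events with threshold C = 0.2.
print(events_between_frames([[0.0]], [[0.45]], t=1e-6))
```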
This thesis explores new event-based monocular solutions to the simultaneous localization and mapping (SLAM) problem in human-made scenarios. Since event cameras naturally respond to edges, line features are used throughout the thesis. A high-speed camera pose tracking approach is proposed that exploits the fast response of an event camera to accurately estimate its 6-degrees-of-freedom pose. Lines are efficiently matched to events with a proposed fast data-association mechanism. In the tracking formulation, event-per-event and window-of-events representations are analyzed to identify their advantages and their feasibility for real-time implementation without introducing speed bottlenecks. Additionally, several estimation variants are implemented with respect to the measurement and motion models.
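To make event-to-line data association concrete, here is a minimal sketch under simple assumptions: each incoming event is matched to the nearest projected map line if its perpendicular distance falls within a pixel gate. The homogeneous line representation, the 3-pixel gate, and the function names are hypothetical; the thesis describes its actual mechanism only as a fast data association.

```python
import math

def point_line_distance(px, py, line):
    """Perpendicular distance from pixel (px, py) to a 2D line given in
    homogeneous form (a, b, c), i.e. a*x + b*y + c = 0."""
    a, b, c = line
    return abs(a * px + b * py + c) / math.hypot(a, b)

def associate_event(event_xy, projected_lines, gate_px=3.0):
    """Match one event to the closest projected line; return None when every
    candidate lies outside the gating distance (in pixels)."""
    px, py = event_xy
    best_id, best_d = None, gate_px
    for line_id, line in projected_lines.items():
        d = point_line_distance(px, py, line)
        if d < best_d:
            best_id, best_d = line_id, d
    return best_id

# Toy usage: a vertical image line x = 10 has homogeneous form (1, 0, -10).
print(associate_event((11.0, 5.0), {"L0": (1.0, 0.0, -10.0)}))  # -> "L0"
```

In an event-per-event formulation, each successful association would immediately feed a pose update; in a window-of-events formulation, associations are accumulated over a short interval and used jointly.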
This thesis also analyzes sensor-fusion methodologies to combine events with inertial data. Inertial measurements arrive at a much lower rate than events; the objective is therefore to avoid conventional inertial integration and instead let the inertial data cooperate with the events to correct the state parameters. Sensor fusion yields accurate estimates, but it may slow the pipeline and constrain the motion dynamics and event-data rate to the capabilities of the inertial sensor.
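One simple way to preserve the event rate during fusion is to process both streams strictly in time order, updating the state for every event and applying an inertial correction only when an IMU sample arrives. The sketch below shows only this stream-merging skeleton under assumed stream formats; the handler steps are placeholders, not the thesis's estimator.

```python
import heapq

def fuse_streams(event_stream, imu_stream):
    """Merge a high-rate event stream with a lower-rate IMU stream into one
    time-ordered sequence. Each input is an iterable of (timestamp, data)
    tuples sorted by timestamp; events keep their native rate."""
    events = ((t, "event", d) for t, d in event_stream)
    imu = ((t, "imu", d) for t, d in imu_stream)
    yield from heapq.merge(events, imu, key=lambda item: item[0])

# Hypothetical driver: state updates at event rate, corrections at IMU rate.
for t, kind, data in fuse_streams([(1e-6, "e0"), (2e-6, "e1")],
                                  [(1.5e-6, "imu0")]):
    if kind == "event":
        pass  # update the pose estimate with this event measurement
    else:
        pass  # correct state parameters with this inertial sample
```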
Finally, the proposed mapping approach leverages the natural edge highlighting of events to recover and optimize straight lines in human-made scenarios. For some tasks, line-based reconstructed maps are more expressive than point clouds, since line features provide a notion of connectivity, boundaries, and neighborhood. Mapping and tracking run in parallel threads: as the camera moves, newly detected lines are added to the map while existing ones are updated with incoming observations. All tracking and sensor-fusion variants tested in this thesis are compatible with the mapping thread and can be combined freely as best suits the experimental conditions.
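The parallel tracking-and-mapping organization can be sketched as two threads sharing a lock-guarded line map: the tracker reads a snapshot at event rate while the mapper inserts new lines and overwrites old ones with refined estimates. The class, queue, and landmark names here are illustrative assumptions, not taken from the thesis.

```python
import threading, queue

class LineMap:
    """Thread-safe store of line landmarks: the tracking thread reads it at
    event rate while the mapping thread inserts and refines lines."""
    def __init__(self):
        self._lines, self._lock = {}, threading.Lock()

    def snapshot(self):
        with self._lock:
            return dict(self._lines)

    def upsert(self, line_id, params):
        with self._lock:
            self._lines[line_id] = params

def mapper(observations, line_map):
    """Consume line observations until a None sentinel arrives, adding new
    lines to the shared map and overwriting old ones with refined estimates."""
    while True:
        item = observations.get()
        if item is None:
            break
        line_id, params = item
        line_map.upsert(line_id, params)

line_map = LineMap()
observations = queue.Queue()
worker = threading.Thread(target=mapper, args=(observations, line_map))
worker.start()
# The tracker would call line_map.snapshot() per event; here we only feed
# the mapper one toy 3D segment (two endpoints) and shut it down.
observations.put(("wall_edge_01", ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))))
observations.put(None)
worker.join()
print(line_map.snapshot())
```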
The main contributions of this work are threefold: (1) event-based camera pose tracking aimed at high dynamics and challenging lighting conditions; (2) fusion of event data and inertial measurements with the primary objective of preserving the event camera's fast response; and (3) a real-time event-based tracking and mapping system with line features for human-made environments. It is shown that monocular event-line SLAM can operate in challenging scenarios with high accuracy and real-time performance where conventional frame-based methods cannot.
These contributions were validated through extensive experimentation with real data in different scenarios and with different degrees of motion aggressiveness, including appropriate comparisons against state-of-the-art SLAM approaches.

This work falls within the scope of the following projects:

  • MdM: Unit of Excellence María de Maeztu (web)
  • EB-SLAM: Event-based simultaneous localization and mapping (web)
  • EBCON: Motion estimation and control with event cameras (web)