Research Project

EBCON: Motion estimation and control with event cameras


National Project


Project Description

Project PID2020-119244GB-I00 funded by MCIN/AEI/10.13039/501100011033

Agile motion control for high-dynamics robotics applications relies on two key elements: a fast, reliable system that accurately computes motion estimates, and an equally fast, reliable mechanism that computes and executes motion control commands.
Event cameras are bio-inspired silicon retinas whose pixels independently detect changes of luminance and produce an asynchronous feed of the pixel coordinates where change occurred, called events. Since no frame or image is produced, events can be detected and transmitted with latencies on the order of microseconds. In this project, we will investigate methods for fast, robust and accurate estimation of quantities of interest related to dynamics, from camera motion to environment structure, while also exploring ways to use these fast estimators to produce commands for fast motion control. The objective is to provide tools for a complete perception-action pipeline suitable for the most demanding high-dynamics robotic platforms such as humanoids, legged robots, aerial vehicles and aerial manipulators.
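As a rough illustration of the event-stream representation described above (the field names are ours, not a project API): each event carries pixel coordinates, a timestamp, and a polarity indicating whether brightness increased or decreased at that pixel.

```python
# Hypothetical sketch of an event-camera output stream; field names are
# illustrative, not taken from any particular camera driver or SDK.
from dataclasses import dataclass

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t_us: int      # timestamp in microseconds
    polarity: int  # +1 brightness increase, -1 brightness decrease

# A stream is just a time-ordered sequence of such events; no frames exist,
# so consumers can react to each event as it arrives.
stream = [Event(120, 64, 1000, +1), Event(121, 64, 1012, -1)]
latest = max(stream, key=lambda e: e.t_us)
print(latest.t_us)  # prints 1012
```

Because each event is timestamped individually at microsecond resolution, estimators can update state per event rather than per frame, which is what enables the kilohertz-rate motion estimation mentioned below.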
Regarding perception, EBCON is the second project we undertake at IRI devoted to event-based vision. In the first project, EBSLAM, our efforts focused on recovering a good estimate of camera motion while building a map of the environment. Today we are capable of estimating camera motion at a 10 kHz rate, for motion bursts with accelerations of 20 g, and with precision comparable to that of an OptiTrack system. In EBCON we will continue our work on SLAM for event cameras, aiming to build the map automatically and in real time. Second, we will use artificial neural networks to learn discriminative features from events. Moreover, we will also investigate the use of spiking neural networks to compute salient features from events, compute optical flow, and recover camera motion.
In biology, most information processing happens through spike-based representations: spikes encode sensory data, spikes perform computation, and spikes transmit actuator commands. The event-based paradigm therefore seems applicable not only to perception and inference, but also to control. In EBCON we will explore the design of reactive controllers requiring fast reaction times, such as obstacle avoidance for UAVs. However, modern high-dynamics robots such as humanoids or aerial manipulators must execute carefully planned maneuvers and cannot rely on reactive control alone. We need to step up to deliberative control techniques such as nonlinear model predictive control (nMPC). In this project, we want to explore techniques that, starting from state-of-the-art nMPC methods, allow us to match the dynamics observed by our event-based estimators.
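To make the MPC idea concrete, here is a minimal receding-horizon sketch on a toy 1D double integrator. This is purely illustrative of the general technique, not the project's controller: EBCON targets nonlinear MPC on full robot models, and the dynamics, costs, and horizon below are hypothetical placeholders.

```python
# Minimal receding-horizon control sketch (toy double integrator).
# At each step we optimize a control sequence over a short horizon,
# apply only the first control, and re-plan from the new state.
import numpy as np
from scipy.optimize import minimize

DT, H = 0.05, 20  # integration step [s] and horizon length (both arbitrary)

def rollout(x0, u):
    """Simulate [position, velocity] under a sequence of accelerations u."""
    x, traj = np.array(x0, dtype=float), []
    for a in u:
        x = x + DT * np.array([x[1], a])  # Euler step of double integrator
        traj.append(x.copy())
    return np.array(traj)

def cost(u, x0, target):
    """Quadratic tracking cost plus a small control-effort penalty."""
    traj = rollout(x0, u)
    return np.sum((traj[:, 0] - target) ** 2) + 1e-2 * np.sum(u ** 2)

def mpc_step(x0, target, u_init):
    """One MPC iteration: optimize the horizon, return the first control."""
    res = minimize(cost, u_init, args=(x0, target), method="L-BFGS-B")
    return res.x[0], np.roll(res.x, -1)  # warm-start the next iteration

# Closed loop: drive the state from rest at 0 toward position 1.
x, u_warm = np.array([0.0, 0.0]), np.zeros(H)
for _ in range(40):
    a, u_warm = mpc_step(x, 1.0, u_warm)
    x = x + DT * np.array([x[1], a])
print(round(x[0], 2))  # final position, close to the target of 1
```

The re-planning loop is the key structural feature: the faster the state estimate arrives (e.g. from an event-based estimator), the sooner each re-plan can react, which is why fast perception and deliberative control are pursued together in this project.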
In the course of EBCON, results will be demonstrated on a number of robotic prototypes of increasing complexity. Motion estimation will be developed first for a hand-held camera moving freely in 3D, and then for a UAV, also moving freely in 3D but additionally incorporating its kinematic model. Likewise, the control algorithms will first be developed in simulation, then for a simple 2-DoF arm, for UAV control, for the constrained motion of IRI's Ivo dual-arm mobile manipulator, and finally for the complex case of the LAAS humanoid robot TALOS with 32 DoFs. Two robotics companies have shown interest in the project objectives and are awaiting its outcome. When the time comes, we will seek to transfer the project results to their platforms.

Project Publications

Journal Publications

  • J. Solà, J. Vallvé, J. Casals, J. Deray, M. Fourmy, D. Atchuthan, A. Corominas Murtra and J. Andrade-Cetto. WOLF: A modular estimation framework for robotics based on factor graphs. IEEE Robotics and Automation Letters, 7(2): 4710-4717, 2022.
