Research line

Mobile Robotics and Intelligent Systems

The research activities of the MOBILE ROBOTICS line aim to endow mobile robots and ubiquitous computing devices with the skills necessary to aid humans in everyday life activities. These skills range from purely perceptual tasks, such as tracking, recognition or situation awareness, to motion skills, such as localization, mapping, autonomous navigation, path planning or exploration.

Head of line: Alberto Sanfeliu Cortés


Tech. transfer

Our activity finds applications in several fields through collaboration with our technological partners.

Research projects

We carry out projects from national and international research programmes.
→ More about our research projects


Urban service robotics

The group focuses on the design and development of service mobile robots for human assistance and human robot interaction. This includes research on novel hardware and software solutions to urban robotic services such as surveillance, exploration, cleaning, transportation, human tracking, human assistance and human guiding.

Social robotics

The group's work on social robotics emphasizes human-robot interaction and collaboration, developing new techniques to predict and learn human behaviors, to support human-robot task collaboration, and to generate empathic robot behaviors using all types of sensors, computer vision techniques and cognitive systems technologies.
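A classical tool for this kind of human-motion prediction is the Social Force Model, which the line has applied to real-time, low-computational-cost trajectory prediction: a pedestrian is attracted toward a goal and repelled by nearby agents. The following is a minimal, illustrative one-pedestrian sketch, not the group's implementation; all parameter values (tau, A, B, the desired speed) are hypothetical.

```python
import math

def sfm_step(pos, vel, goal, others, dt=0.1,
             v_des=1.3, tau=0.5, A=2.0, B=0.3):
    """Advance one pedestrian by one Social Force Model time step."""
    # Attractive force: relax toward the desired velocity to the goal.
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(gx, gy) or 1e-9
    fx = (v_des * gx / dist - vel[0]) / tau
    fy = (v_des * gy / dist - vel[1]) / tau
    # Repulsive forces: exponential push away from each other agent.
    for ox, oy in others:
        rx, ry = pos[0] - ox, pos[1] - oy
        d = math.hypot(rx, ry) or 1e-9
        mag = A * math.exp(-d / B)
        fx += mag * rx / d
        fy += mag * ry / d
    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# Predict a short trajectory toward a goal, avoiding one bystander.
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(5):
    pos, vel = sfm_step(pos, vel, goal=(10.0, 0.0), others=[(1.0, 0.5)])
print(pos)
```

Iterating such steps yields a predicted trajectory; richer variants add obstacle forces and per-agent parameters learned from data.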

Robot localization and robot navigation

This research area tackles the creation of robust single-robot and cooperative localization solutions for indoor and outdoor settings, using multiple sensor modalities such as GPS, computer vision, laser range finders, INS sensors and raw odometry. The area also develops methods and algorithms for autonomous robot navigation and robot formations, and applies these methods to a variety of indoor and outdoor mobile robot platforms.
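The core idea behind multi-sensor localization can be illustrated with a one-dimensional Kalman filter that fuses dead-reckoned odometry (prediction) with occasional absolute fixes such as GPS (correction). This is only an illustrative sketch, not any specific system of the group; all noise variances and readings are hypothetical.

```python
def kalman_step(x, P, u, z, Q=0.5, R=2.0):
    """One predict/update cycle along a single axis.

    x, P : position estimate (m) and its variance
    u    : odometry displacement since the last step (m)
    z    : absolute position fix (m), or None if unavailable
    Q, R : process (odometry) and measurement (GPS) noise variances
    """
    # Predict: dead-reckon with odometry; uncertainty grows.
    x, P = x + u, P + Q
    # Update: blend in the absolute fix when one is available.
    if z is not None:
        K = P / (P + R)          # Kalman gain
        x = x + K * (z - x)      # correct toward the measurement
        P = (1 - K) * P          # uncertainty shrinks
    return x, P

# Two odometry-only steps, then a step with a GPS fix at 2.7 m.
x, P = 0.0, 1.0
for u, z in [(1.0, None), (1.0, None), (1.0, 2.7)]:
    x, P = kalman_step(x, P, u, z)
print(round(x, 3), round(P, 3))
```

Full systems replace this scalar state with a pose (or full trajectory) and fuse many heterogeneous sensors, but the predict/correct structure is the same.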

SLAM and robot exploration

We develop solutions for indoor and outdoor simultaneous localization and mapping (SLAM) based on Bayesian estimation over computer vision and three-dimensional range data. The research includes the development of new filtering and smoothing algorithms that bound the size of maps using information-theoretic measures, as well as the design and construction of novel sensors for outdoor mapping. This research area also studies methods for autonomous robotic exploration.
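The Bayesian-estimation core of mapping can be illustrated with the standard log-odds occupancy-grid update, in which repeated noisy range observations of a map cell are fused into an occupancy probability. This sketch is illustrative only; the inverse sensor model probabilities (0.7/0.3) are hypothetical.

```python
import math

# Inverse sensor model in log-odds form: how much one "hit" or "miss"
# observation shifts our belief that a cell is occupied.
L_OCC, L_FREE = math.log(0.7 / 0.3), math.log(0.3 / 0.7)

def update_cell(l, hit):
    """Bayesian fusion of one observation: add its log-odds."""
    return l + (L_OCC if hit else L_FREE)

def probability(l):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

l = 0.0                          # prior: p = 0.5, unknown cell
for hit in [True, True, False, True]:
    l = update_cell(l, hit)
print(round(probability(l), 3))
```

Working in log-odds makes each fusion step a single addition per cell, which is why the representation scales to large grids; information-theoretic map bounding then decides which observations or cells are worth keeping.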

Tracking in computer vision

We study the development of robust algorithms for the detection and tracking of human activities in indoor and outdoor areas, with applications to service robotics, surveillance, and human-robot interaction. This includes the development of fixed/moving single camera tracking algorithms as well as detection and tracking methods over large camera sensor networks.
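A recurring sub-problem in such trackers, especially over camera networks, is data association: deciding which new detection corresponds to which existing track. A minimal, purely illustrative sketch is greedy nearest-neighbor matching with a distance gate; real systems use motion models and stronger assignment methods, and all coordinates below are made up.

```python
def associate(tracks, detections, gate=2.0):
    """Greedily match tracks to detections; return {track_id: det_index}.

    tracks     : {track_id: (x, y)} last known positions
    detections : [(x, y), ...] detections in the current frame
    gate       : maximum allowed displacement for a match (same units)
    """
    matches, used = {}, set()
    for tid, (tx, ty) in tracks.items():
        best, best_d = None, gate
        for j, (dx, dy) in enumerate(detections):
            d = ((tx - dx) ** 2 + (ty - dy) ** 2) ** 0.5
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:      # unmatched tracks coast or terminate
            matches[tid] = best
            used.add(best)
    return matches

tracks = {"person_a": (0.0, 0.0), "person_b": (5.0, 5.0)}
detections = [(5.4, 4.8), (0.3, -0.2), (9.0, 9.0)]
print(associate(tracks, detections))
```

Detections left unmatched (here the third one) typically spawn new tracks, while gated-out tracks are predicted forward until they are re-detected or dropped.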

Object recognition

The group also performs research on object detection and object recognition in computer vision. Current research is heavily based on boosting and other machine learning methodologies that make extensive use of multiple view geometry. We also study the development of unique feature and scene descriptors, invariant to changes in illumination, cast shadows, or deformations.
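The boosting idea behind such detectors can be shown in a toy form: AdaBoost combines many weak "stump" classifiers, reweighting the training samples after each round so later stumps focus on earlier mistakes. This is a generic textbook sketch, not the group's detector; the one-dimensional data, thresholds and number of rounds are all illustrative.

```python
import math

# Toy 1-D binary classification data (feature value -> label).
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [-1, -1, -1, 1, 1, 1]

def stump(theta, s):
    """Weak learner: sign(s) if x > theta, else -sign(s)."""
    return lambda x: s if x > theta else -s

def train(rounds=3):
    n = len(X)
    w = [1.0 / n] * n                        # sample weights
    ensemble = []
    for _ in range(rounds):
        # Pick the stump with the lowest weighted error.
        best = None
        for theta in [0.5, 1.5, 2.5, 3.5, 4.5]:
            for s in (1, -1):
                h = stump(theta, s)
                err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
                if best is None or err < best[0]:
                    best = (err, h)
        err, h = best
        err = max(err, 1e-10)                # avoid log(0) on perfect stumps
        alpha = 0.5 * math.log((1 - err) / err)
        # Reweight: boost the misclassified samples.
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all weak learners."""
    return 1 if sum(a * h(x) for a, h in ensemble) > 0 else -1

model = train()
print([predict(model, x) for x in [1.0, 4.0]])
```

In detection pipelines the stumps operate on image features rather than a scalar, and cascades of boosted classifiers reject easy negatives early for speed.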

These are the latest research projects of the Mobile Robotics and Intelligent Systems research line:

These are the most recent publications (2023) of the Mobile Robotics and Intelligent Systems research line:

  • A.M. Puig-Pey, J.L. Zamora, B. Amante, J. Moreno, A. Garrell Zulueta, A. Grau, Y. Bolea, A. Santamaria-Navarro and A. Sanfeliu. Human acceptance in the human-robot interaction scenario for last-mile goods delivery, 2023 IEEE International Conference on Advanced Robotics and Its Social Impacts, 2023, Berlin, Germany, pp. 33-39.

  • Y. Tian and J. Andrade-Cetto. Egomotion from event-based SNN optical flow, 2023 ACM International Conference on Neuromorphic Systems, 2023, Santa Fe, NM, USA, pp. 8:1-8.

  • E. Repiso, A. Garrell Zulueta and A. Sanfeliu. Real-life experiment metrics for evaluating human-robot collaborative navigation tasks, 32nd IEEE International Symposium on Robot and Human Interactive Communication, 2023, Busan, Korea, pp. 660-667.

  • C. Lemardelé, A. Baldó, A. Aniculaesei, A. Rausch, M. Conill, L. Everding, T. Vietor, T. Hegerhorst, R. Henze, L. Mátyus, L. Pagès, V. Roca, A. Sanfeliu, A. Santamaria-Navarro and I. Tóháti. The LogiSmile Project - Piloting Autonomous Vehicles for Last-Mile Logistics in European cities. Transportation Research Procedia, 71: 180-187, 2023.

  • M. Dalmasso, J.E. Domínguez, I.J. Torres, P. Jiménez, A. Garrell Zulueta and A. Sanfeliu. Shared task representation for human–robot collaborative navigation: The collaborative search case. International Journal of Social Robotics: 1-27, 2023, to appear.

  • G. Coll, I.J. Torres, A. Grau, E. Guerra and A. Sanfeliu. Accurate detection and depth estimation of table grapes and peduncles for robot harvesting, combining monocular depth estimation and CNN methods. Computers and Electronics in Agriculture: 108362, 2023.

  • O. Gil and A. Sanfeliu. Human motion trajectory prediction using the Social Force Model for real-time and low computational cost applications, 6th Iberian Robotics Conference, 2023, Portugal, pp. 1-12, Springer, to appear.

  • J.E. Domínguez and A. Sanfeliu. Inference vs. explicitness. Do we really need the perfect predictor? The human-robot collaborative object transportation case, 32nd IEEE International Symposium on Robot and Human Interactive Communication, 2023, Busan, Korea, pp. 1866-1871.

  • J.E. Domínguez and A. Sanfeliu. Improving human-robot interaction effectiveness in human-robot collaborative object transportation using force prediction, 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2023, Detroit, MI, USA, pp. 7839-7845.

  • A. Dhamanaskar, M. Dimiccoli, E. Corona, A. Pumarola and F. Moreno-Noguer. Enhancing egocentric 3D pose estimation with third person views. Pattern Recognition, 138(109358), 2023.

  • P. Vial, N. Palomeras, J. Solà and M. Carreras. Underwater Pose SLAM using GMM scan matching for a mechanical profiling sonar. Journal of Field Robotics, 2023, to appear.

  • J.L. Crowley, J. Coutaz, J. Grosinger, J. Vazquez, C. Angulo, A. Sanfeliu, L. Iocchi and A.G. Cohn. A hierarchical framework for collaborative Artificial Intelligence. IEEE Pervasive Computing, 22(1): 9-18, 2023.

  • C. Debeunne, J. Vallvé, A. Torres and D. Vivet. Fast bi-monocular visual odometry using factor graph sparsification, 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2023, Detroit, MI, USA, pp. 10716-10722.

  • J. Laplaza, R. Romero, A. Sanfeliu and A. Garrell Zulueta. Body gesture recognition to control a social mobile robot, 2023 ACM/IEEE International Conference on Human-Robot Interaction, 2023, Stockholm, in Companion of the HRI'23, pp. 456-460.

  • E. Repiso, A. Garrell Zulueta and A. Sanfeliu. Adaptive social planner to accompany people in real-life dynamic environments. International Journal of Social Robotics, 2023, to appear.

  • J.E. Domínguez, N.A. Rodríguez and A. Sanfeliu. Perception-intention-action cycle as a human acceptable way for improving human-robot collaborative tasks, 2023 ACM/IEEE International Conference on Human-Robot Interaction, 2023, Stockholm, in Companion of the HRI'23, pp. 567-571.

  • W.O. Chamorro, J. Solà and J. Andrade-Cetto. Event-IMU fusion strategies for faster-than-IMU estimation throughput, 4th CVPR International Workshop on Event Vision, 2023, Vancouver, pp. 3975-3982.


Mobile Robotics Laboratory

The Mobile Robotics Laboratory is an experimental area primarily devoted to hands-on research with mobile robot devices. The lab includes three Pioneer platforms, two Segway-based service robots for urban robotics research, a four-wheel rough-terrain outdoor mobile robot, a six-legged LAURON-III walking robot, and a wide array of sensors and cameras.


Barcelona Robot Laboratory

The Barcelona Robot Lab encompasses an outdoor pedestrian area of 10,000 m² and is equipped with 21 fixed cameras, a set of heterogeneous robots, full coverage with Wi-Fi and Mica sensor devices, and partial GPS coverage. The area has moderate vegetation and intense cast shadows, making it particularly challenging for computer vision algorithms.


Researchers

PhD Students

Master Students

TFG Students

Support Staff