Research Project

CANOPIES: A Collaborative Paradigm for Human Workers and Multi-Robot Teams in Precision Agriculture Systems

Type

European Project

Start Date

01/01/2021

End Date

31/12/2024

Project Code

H2020-ICT-2020-2-101016906


Staff

Project Description

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101016906

In CANOPIES, our goal is to develop a novel collaborative human-robot paradigm that addresses the challenges of Human-Robot Interaction and Human-Robot Collaboration in the unstructured, highly dynamic outdoor environment of permanent-crop farming (Agri-Food area). Our approach will be demonstrated through an integrated system composed of farming robots and logistics robots, with real-world validation of two economically relevant agronomic operations in a table-grape vineyard: harvesting and pruning.

CANOPIES represents the first attempt to introduce a collaborative paradigm into precision agriculture for permanent crops, in which farmworkers can work efficiently together with teams of robots to perform agronomic interventions, such as harvesting or pruning, in table-grape vineyards. Both operations require complex processes of perception, communication, shared planning, prediction of human intentions, interaction and action. Moreover, both must be performed under real-life conditions: with changing illumination and cast shadows, and in changing agronomic situations where vine branches or grapes can make it difficult to harvest or prune safely, given the robot's physical proximity to the human, all while operating in real time.

The CANOPIES ambition will be achieved by introducing: i) novel human-robot interaction methodologies for enhanced safety and coexistence; ii) novel human-robot collaboration methodologies for increased system adaptability and intuitive usability; and iii) novel multi-robot coordination methodologies for improved scalability. The CANOPIES impact will contribute to filling the current gap in the development of fully autonomous robotic solutions for permanent crops by introducing a novel concept of farming robots, in which effective interaction with human workers is leveraged to mitigate the greater complexity of permanent crops compared with field crops.

Website: http://www.canopies-project.eu/

Project Publications

Journal Publications

  • W.O. Chamorro, J. Solà and J. Andrade-Cetto. Event-based line SLAM in real-time. IEEE Robotics and Automation Letters, 7(3): 8146-8153, 2022.

  • M. Peral, A. Sanfeliu and A. Garrell Zulueta. Efficient hand gesture recognition for human-robot interaction. IEEE Robotics and Automation Letters, 7(4): 10272-10279, 2022.

  • O. Gil, A. Garrell Zulueta and A. Sanfeliu. Social robot navigation tasks: Combining machine learning techniques and Social Force Model. Sensors, 21(21): 7087, 2021.


Conference Publications

  • J. Laplaza, N.A. Rodriguez, J.E. Domínguez, F. Herrero, S. Hernández, A. López, A. Sanfeliu and A. Garrell Zulueta. IVO Robot: A new social robot for Human-Robot collaboration. 2022 ACM/IEEE International Conference on Human-Robot Interaction, 2022, Sapporo, Japan, pp. 860-864.

  • J. Laplaza, A. Garrell Zulueta, F. Moreno-Noguer and A. Sanfeliu. Context and intention for 3D human motion prediction: Experimentation and user study in handover tasks. 31st IEEE International Symposium on Robot and Human Interactive Communication, 2022, Napoli, Italy, pp. 630-635.

  • J. Laplaza, A. Pumarola, F. Moreno-Noguer and A. Sanfeliu. Attention deep learning based model for predicting the 3D human body pose using the robot human handover phases. 30th IEEE International Symposium on Robot and Human Interactive Communication, 2021, Vancouver, Canada, pp. 161-166.
