Publication
Unsupervised trajectory segmentation and gesture recognition through curvature analysis and the Levenshtein distance
Conference Article
Conference
Iberian Robotics Conference (ROBOT)
Edition
7th
Pages
1-8
Doc link
https://doi.org/10.1109/ROBOT61475.2024.10796932
Authors
G. Tapia, A. Colomé and C. Torras
Abstract
This work discusses segmentation and gesture recognition in human-driven robotic trajectories, a technique with applications in several sectors, such as robot-assisted minimally invasive surgery (RMIS) training. By decomposing entire movements (gestures) into smaller actions (sub-gestures), gesture recognition can be addressed accurately. This paper extends a bottom-up approach used in surgical gesture segmentation and incorporates natural language processing (NLP) techniques, matching sub-gestures with letters and treating gestures as words. We evaluated our algorithm on two different datasets with trajectories in 2D and 3D. This NLP-inspired model obtains an average F1-score of 94.25% in the segmentation task, an accuracy of 87.05% in the learning stage, and an overall accuracy of 88.79% in the fully automated execution. These results indicate that the method effectively identifies and interprets new surgical gestures autonomously, without the need for human intervention.
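The paper's code is not part of this entry; purely as an illustrative sketch of the ideas the abstract names, the snippet below segments a 2D trajectory at curvature peaks, encodes each sub-gesture as a letter, and matches the resulting "word" against a small dictionary with the Levenshtein distance. All function names, the curvature threshold, and the direction-based letter encoding are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (not the authors' code): curvature-based segmentation of a 2D
# trajectory, letter encoding of sub-gestures, and gesture recognition by
# Levenshtein distance to a dictionary of known gesture "words".
import numpy as np


def curvature(points: np.ndarray) -> np.ndarray:
    """Discrete curvature of a 2D trajectory given as an (N, 2) array."""
    dx, dy = np.gradient(points[:, 0]), np.gradient(points[:, 1])
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    denom = (dx ** 2 + dy ** 2) ** 1.5 + 1e-12
    return np.abs(dx * ddy - dy * ddx) / denom


def segment_at_curvature_peaks(points: np.ndarray, thresh: float = 0.5):
    """Split the trajectory at local curvature maxima above a threshold."""
    k = curvature(points)
    cuts = [0] + [i for i in range(1, len(k) - 1)
                  if k[i] > thresh and k[i] >= k[i - 1] and k[i] >= k[i + 1]] + [len(points)]
    return [points[a:b] for a, b in zip(cuts[:-1], cuts[1:]) if b - a > 2]


def encode_segment(seg: np.ndarray) -> str:
    """Map a sub-gesture to a letter by its dominant direction (8 bins)."""
    d = seg[-1] - seg[0]
    angle = np.arctan2(d[1], d[0])                 # in [-pi, pi]
    return "abcdefgh"[int(round(angle / (np.pi / 4))) % 8]


def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]


def recognise(points: np.ndarray, dictionary: dict) -> str:
    """Return the known gesture whose word is closest to the observed one."""
    word = "".join(encode_segment(s) for s in segment_at_curvature_peaks(points))
    return min(dictionary, key=lambda g: levenshtein(word, dictionary[g]))


if __name__ == "__main__":
    # Toy example: an L-shaped stroke (right, then up) against two made-up gestures.
    traj = np.vstack([np.column_stack([np.linspace(0, 1, 50), np.zeros(50)]),
                      np.column_stack([np.ones(49), np.linspace(0, 1, 50)[1:]])])
    known = {"L-shape": "ac", "straight": "a"}   # words use the encoding above
    print(recognise(traj, known))                # -> "L-shape"
```

Treating each sub-gesture as a letter makes the recognition step robust to over- or under-segmentation, since the Levenshtein distance tolerates insertions, deletions, and substitutions in the observed word.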
Categories
robot kinematics, robot programming, robots.
Scientific reference
G. Tapia, A. Colomé and C. Torras. Unsupervised trajectory segmentation and gesture recognition through curvature analysis and the Levenshtein distance, 7th Iberian Robotics Conference, 2024, Madrid, Spain, pp. 1-8.