Master Thesis

3D human pose estimation from egocentric videos of social interactions


Student/s

Supervisor/s

Information

  • Started: 13/11/2023

Description

Understanding social interactions from a first-person perspective has compelling applications in Assistive Robotics and Augmented/Virtual Reality. A crucial cue for understanding social interactions is the body pose of the interacting people, which plays a central role in nonverbal communication. However, estimating the body pose of the camera wearer from first-person (egocentric) videos is challenging, since the camera wearer is largely out of view of a typical wearable camera.
This project aims to address this challenge by leveraging inter-person interaction dynamics and 3D scene context. The method will be validated on the recently introduced EgoBody dataset: https://sanweiliti.github.io/egobody/egobody.html.
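As a rough illustration of a possible starting point, the sketch below (in PyTorch, the framework suggested in this listing) regresses the camera wearer's 3D joints from the interactee's estimated pose and a precomputed 3D scene feature. All module names, dimensions, and the joint count are hypothetical and are not taken from the EgoBody toolkit or any published method.

```python
# Hypothetical sketch: regress the camera wearer's 3D joints from the
# interactee's pose and a coarse scene feature. Names and dimensions are
# illustrative only and are not part of the EgoBody toolkit.
import torch
import torch.nn as nn

NUM_JOINTS = 25  # assumed joint count; EgoBody provides SMPL-X bodies in practice


class WearerPoseRegressor(nn.Module):
    """Predicts the camera wearer's 3D joints from interaction cues."""

    def __init__(self, scene_dim=256, hidden_dim=512):
        super().__init__()
        # Encode the interactee's 3D joints (the person visible in the egocentric view).
        self.interactee_enc = nn.Sequential(
            nn.Linear(NUM_JOINTS * 3, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Encode a precomputed 3D scene context feature (e.g. from a scene encoder).
        self.scene_enc = nn.Sequential(
            nn.Linear(scene_dim, hidden_dim), nn.ReLU(),
        )
        # Fuse both cues and regress the wearer's joints.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, NUM_JOINTS * 3),
        )

    def forward(self, interactee_joints, scene_feat):
        # interactee_joints: (B, NUM_JOINTS, 3); scene_feat: (B, scene_dim)
        x = self.interactee_enc(interactee_joints.flatten(1))
        s = self.scene_enc(scene_feat)
        out = self.head(torch.cat([x, s], dim=1))
        return out.view(-1, NUM_JOINTS, 3)


if __name__ == "__main__":
    model = WearerPoseRegressor()
    joints = torch.randn(4, NUM_JOINTS, 3)   # dummy interactee poses
    scene = torch.randn(4, 256)              # dummy scene features
    pred = model(joints, scene)
    # A typical training objective would be a per-joint L2 loss against the
    # ground-truth wearer joints provided by the dataset.
    loss = nn.functional.mse_loss(pred, torch.zeros_like(pred))
    print(pred.shape, loss.item())
```

In the actual thesis, the dummy inputs would be replaced by interactee poses and scene features extracted from EgoBody recordings, and a temporal model could replace the simple per-frame regressor to exploit interaction dynamics over time.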
The student is expected to have excellent programming skills and familiarity with deep learning frameworks (preferably PyTorch). They will work in close collaboration with our expert team, which has extensive experience in both 3D pose estimation and social interaction analysis from egocentric videos.