Publication

MultiPhys: Multi-person physics-aware 3D motion estimation

Conference Article

Conference

IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)

Edition

2024

Pages

2331-2340

Doc link

http://dx.doi.org/10.1109/CVPR52733.2024.00226

Abstract

We introduce MultiPhys, a method designed for recovering multi-person motion from monocular videos. Our focus lies in capturing coherent spatial placement between pairs of individuals across varying degrees of engagement. MultiPhys, being physically aware, exhibits robustness to jittering and occlusions, and effectively eliminates penetration issues between the two individuals. We devise a pipeline in which the motion estimated by a kinematic-based method is fed into a physics simulator in an autoregressive manner. We introduce distinct components that enable our model to harness the simulator's properties without compromising the accuracy of the kinematic estimates. This results in final motion estimates that are both kinematically coherent and physically compliant. Extensive evaluations on three challenging datasets characterized by substantial inter-person interaction show that our method significantly reduces errors associated with penetration and foot skating, while performing competitively with the state-of-the-art on motion accuracy and smoothness. Results and code can be found on our project page.
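
To make the autoregressive design concrete, the following is a minimal Python sketch of the kind of loop the abstract describes, in which per-frame kinematic estimates are handed to a physics simulator that refines them step by step. All names (kinematic_estimator, physics_simulator, sim_state) are illustrative placeholders under assumption, not the authors' actual code or API.

# Minimal sketch (assumptions noted above): feed kinematic pose estimates
# into a physics simulator autoregressively, so each physically corrected
# state conditions the next step.
def multiphys_style_pipeline(video_frames, kinematic_estimator, physics_simulator):
    refined_poses = []
    sim_state = physics_simulator.reset()  # hypothetical: initial simulator state
    for frame in video_frames:
        # Kinematic estimate for the pair of people in this frame (e.g. SMPL parameters).
        kin_pose = kinematic_estimator(frame)
        # Physics step: track the kinematic target from the current simulated state,
        # enforcing contacts and suppressing inter-person penetration.
        sim_state = physics_simulator.step(sim_state, target=kin_pose)
        # Keep the physically compliant pose; it also conditions the next iteration.
        refined_poses.append(sim_state.pose)
    return refined_poses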

Categories

computer vision, pattern recognition.

Author keywords

3D human motion, motion estimation, SMPL

Scientific reference

N. Ugrinovic, B. Pan, G. Pavlakos, D. Paschalidou, B. Shen, J. Sanchez, F. Moreno-Noguer and L. Guibas. MultiPhys: Multi-person physics-aware 3D motion estimation, 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, USA, pp. 2331-2340.