Efficient Online Multi-Person 2D Pose Tracking with Recurrent Spatio-Temporal Affinity Fields
In this work, we developed the world's fastest multi-person articulated pose tracker. Its runtime is invariant to the number of people in the scene, and its accuracy is invariant to the frame rate of the input camera. Our system runs at 30 FPS.
We achieve this by formulating the problem of pose detection in video as a recurrent structure that encodes both spatial and temporal connections across limbs.
We also propose a novel temporal topology, cross-linked across limbs, that can consistently handle body motions over a wide range of magnitudes.
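As a rough illustration of the recurrent formulation only (this is not the paper's network; the tensor shapes, weights, and update rule below are invented for the sketch), each frame's affinity fields can be predicted from the current frame's features together with the previous frame's fields, so the temporal connections are carried forward frame to frame:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the real model operates on CNN feature maps).
H, W = 8, 8   # spatial grid
C_FEAT = 4    # per-frame image feature channels
C_TAF = 2     # temporal affinity field channels

# Hypothetical fixed weights standing in for learned convolutions.
W_feat = rng.standard_normal((C_TAF, C_FEAT)) * 0.1
W_prev = rng.standard_normal((C_TAF, C_TAF)) * 0.1

def predict_taf(feat, prev_taf):
    """One recurrent step: current-frame features plus previous-frame
    affinity fields produce the current-frame affinity fields."""
    # 1x1 'convolution' expressed as an einsum over channels at every pixel.
    cur = np.einsum('oc,chw->ohw', W_feat, feat)
    rec = np.einsum('ot,thw->ohw', W_prev, prev_taf)
    return np.tanh(cur + rec)

# Run over a short clip: fields start at zero for the first frame and are
# then fed back recurrently, which is why per-frame cost does not grow
# with the number of people being tracked.
taf = np.zeros((C_TAF, H, W))
for t in range(5):
    feat = rng.standard_normal((C_FEAT, H, W))  # stand-in for frame features
    taf = predict_taf(feat, taf)

print(taf.shape)  # (2, 8, 8)
```

The point of the sketch is only the data flow: the previous frame's fields enter as a recurrent input, rather than re-solving association from scratch at every frame.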
This work has been accepted to CVPR 2019 as an oral paper, and the code will be released as part of OpenPose soon.
Our paper can be found here: Link