Model Driven Segmentation of Articulating Humans in Laplacian Eigenspace

Citation

A. Sundaresan and R. Chellappa, “Model Driven Segmentation of Articulating Humans in Laplacian Eigenspace,” in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 30, no. 10, pp. 1771-1785, Oct. 2008, doi: 10.1109/TPAMI.2007.70823.

Abstract

We propose a general approach that uses Laplacian Eigenmaps and a graphical model of the human body to segment 3D voxel data of humans into distinct articulated chains. In the bottom-up stage, the voxels are mapped into a Laplacian Eigenspace (LE) of up to six dimensions constructed from the voxel neighborhood graph. We show that the LE is effective at mapping voxels on long articulated chains to nodes on smooth 1D curves that can be easily discriminated, and we prove these properties using representative graphs. We fit 1D splines to the voxels belonging to the different articulated chains, such as the limbs, head, and trunk, and determine the boundaries between chains from the spline-fitting error. A top-down probabilistic approach then registers the segmented chains, exploiting their mutual connectivity and individual properties. This approach lets us handle complex poses, such as those in which the limbs form loops. We use the segmentation results to automatically estimate human body models. Although our experiments use human subjects, the method is fairly general and can be applied to voxel-based segmentation of any articulated object composed of long chains. We present results on real and synthetic data that illustrate the usefulness of the approach.
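
As a rough illustration of the bottom-up stage, the sketch below (Python with NumPy/SciPy; not the authors' implementation) builds a voxel neighborhood graph and embeds each voxel using eigenvectors of the graph Laplacian. The function name, the choice of 6-connectivity, and the eigensolver settings are assumptions made for illustration, not details taken from the paper.

    import numpy as np
    from scipy.sparse import coo_matrix
    from scipy.sparse.csgraph import laplacian
    from scipy.sparse.linalg import eigsh

    def laplacian_eigenspace(voxels, dim=6):
        """Map occupied voxels to a dim-dimensional Laplacian Eigenspace.

        voxels: (N, 3) integer array of occupied grid coordinates.
        Returns an (N, dim) array of eigenspace coordinates.
        """
        index = {tuple(v): i for i, v in enumerate(voxels)}
        rows, cols = [], []
        # 6-connected neighborhood graph (an assumption; the paper's
        # construction may use a different connectivity).
        offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                   (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        for i, v in enumerate(voxels):
            for dx, dy, dz in offsets:
                j = index.get((v[0] + dx, v[1] + dy, v[2] + dz))
                if j is not None:
                    rows.append(i)
                    cols.append(j)
        n = len(voxels)
        adj = coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
        lap = laplacian(adj.tocsr())
        # Eigenvectors for the smallest Laplacian eigenvalues; the constant
        # eigenvector (eigenvalue 0) is discarded. Assumes the neighborhood
        # graph is connected, i.e. the body forms a single voxel component.
        vals, vecs = eigsh(lap, k=dim + 1, which='SM')
        order = np.argsort(vals)
        return vecs[:, order][:, 1:]

    # Example: embed the occupied voxels of a boolean 3D grid `volume`.
    # voxels = np.argwhere(volume)
    # coords = laplacian_eigenspace(voxels, dim=6)

In such an embedding, voxels along each long articulated chain trace out a smooth 1D curve, which is what allows the subsequent step to fit a spline per chain and place boundaries where the spline-fitting error rises.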

