Conformation-based Parameterization of Scanned Face Data
A variety of methodologies for 3D shape capture are now available,
ranging from laser range scanning (e.g., CyberWare, VIVID) and stereo
photogrammetry (e.g., EyeTronics, 3QTech) to structured light projection.
With these devices and techniques, one can accurately capture and
digitize the complete surfaces of a large number of human faces.
For some applications, such as face shape analysis, we are more
concerned with applying geometry processing algorithms to a
group of models than to a single one. However, given a group of
scanned shapes, even simple operations, such as computing their
average shape or the norm of their differences, are not trivial. The
main cause of this difficulty is that each scan generally uses a
different sampling pattern to describe the geometry. Geometry
processing algorithms involving
multiple models require a consistent parameterization and a
common sampling pattern. A dense set of correspondences between
the individual instances of the collection of scanned shapes needs
to be established. Moreover, the scanned data typically consists
of hundreds of thousands of 3D points. Such a dense data-set
cannot be easily used for animation. Although mesh decimation
results in a usable approximation of the scanned data, there is
not enough control over the connectivity of the mesh. This can
lead to artifacts in the animation due to misalignment of the
triangle edges.
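Once the scans have been conformed to a common sampling pattern, such operations reduce to per-vertex arithmetic. The following minimal Python sketch is illustrative only (array and function names are ours); it assumes the conformed meshes share one connectivity and are stored as arrays of identical shape:

    import numpy as np

    # shapes: (N, n_vertices, 3) array of conformed face meshes that share
    # one connectivity, so vertex i corresponds across all scans.
    def average_shape(shapes):
        return shapes.mean(axis=0)

    def difference_norm(shape_a, shape_b):
        # L2 norm of the per-vertex displacement between two conformed meshes.
        return np.linalg.norm(shape_a - shape_b)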
In this project, we present a mesh conformation method for
establishing a point-to-point correspondence among scanned 3D face
shapes. Our method fits a deformable generic mesh with a predefined
configuration onto detailed human face range scans in a
global-to-local fashion. The conformation first maps the generic mesh
globally onto the scanned 3D shape by applying a Radial Basis
Function (RBF)-based volume warping driven by semi-automatically
specified pairs of corresponding feature points.
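The sketch below illustrates the kind of RBF interpolant involved: it solves for RBF weights plus an affine term from the feature point pairs and then warps every generic-mesh vertex. It is a simplified stand-in, assuming a triharmonic kernel r^3; the basis function and solver used in the actual system may differ, and all names are illustrative.

    import numpy as np

    def rbf_warp(src_feats, dst_feats, points, kernel=lambda r: r**3):
        """Warp `points` with an RBF interpolant mapping src_feats -> dst_feats.

        src_feats, dst_feats : (k, 3) corresponding feature points
        points               : (n, 3) generic-mesh vertices to be warped
        kernel               : radial basis function of the distance r
        """
        k = src_feats.shape[0]
        # Pairwise distances between source feature points.
        d = np.linalg.norm(src_feats[:, None, :] - src_feats[None, :, :], axis=-1)
        # System matrix: RBF block plus an affine (1, x, y, z) polynomial part.
        P = np.hstack([np.ones((k, 1)), src_feats])           # (k, 4)
        A = np.zeros((k + 4, k + 4))
        A[:k, :k] = kernel(d)
        A[:k, k:] = P
        A[k:, :k] = P.T
        b = np.zeros((k + 4, 3))
        b[:k] = dst_feats
        coeffs = np.linalg.solve(A, b)                        # (k+4, 3)
        w, affine = coeffs[:k], coeffs[k:]
        # Evaluate the interpolant at the query points.
        dq = np.linalg.norm(points[:, None, :] - src_feats[None, :, :], axis=-1)
        Pq = np.hstack([np.ones((len(points), 1)), points])
        return kernel(dq) @ w + Pq @ affine

    # Example use: warped = rbf_warp(generic_feature_pts, scan_feature_pts, generic_vertices)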
To generate more correspondences for a good match and to ease the
task of manually specifying them, we develop an automatic procedure
that refines the feature point sets.
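As a rough illustration of what such a refinement step might look like (the actual criteria used in the paper may differ), one can pair additional vertices of the globally warped generic mesh with their closest points on the scan and keep only pairs within a distance threshold; all names and parameters below are illustrative:

    import numpy as np
    from scipy.spatial import cKDTree

    def refine_feature_pairs(warped_verts, scan_points, candidate_ids,
                             max_dist=2.0):
        """Hypothetical refinement: pair candidate vertices of the globally
        warped generic mesh with their closest scan points, rejecting pairs
        farther apart than `max_dist` (in scan units)."""
        tree = cKDTree(scan_points)
        dists, nearest = tree.query(warped_verts[candidate_ids])
        keep = dists < max_dist
        src = warped_verts[candidate_ids][keep]      # new source features
        dst = scan_points[nearest[keep]]             # matching scan points
        return src, dst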
After global warping, a local deformation is carried out to fit every
vertex of the generic mesh to the scanned surface. We formulate the
local deformation as an optimization problem whose energy function
combines two measures: the proximity of the transformed vertices to
the scanned shape and the smoothness of the deformation over the
surface.
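A toy version of this local fitting is sketched below: it alternates a pull toward each vertex's closest scan point (the proximity measure) with uniform Laplacian smoothing (the smoothness measure). This is only a simple stand-in for the energy minimization described above, not the formulation or solver used in the paper; parameter names are illustrative, and we assume every vertex belongs to at least one triangle.

    import numpy as np
    from scipy.spatial import cKDTree

    def fit_locally(verts, faces, scan_points, lam=0.5, n_iters=20, step=0.3):
        """Illustrative local fitting: iteratively blend a proximity pull
        toward the scan with Laplacian smoothing, a simple stand-in for
        minimizing E = E_proximity + lam * E_smoothness."""
        # Build one-ring neighbor lists from the triangle faces.
        nbrs = [set() for _ in range(len(verts))]
        for a, b, c in faces:
            nbrs[a].update((b, c)); nbrs[b].update((a, c)); nbrs[c].update((a, b))
        tree = cKDTree(scan_points)
        v = verts.copy()
        for _ in range(n_iters):
            # Proximity term: closest scan point for every vertex.
            _, nearest = tree.query(v)
            target = scan_points[nearest]
            # Smoothness term: uniform Laplacian (average of one-ring neighbors).
            lap = np.array([v[list(nbrs[i])].mean(axis=0) for i in range(len(v))])
            # Gradient-descent-like blend of the two pulls.
            v += step * ((1.0 - lam) * (target - v) + lam * (lap - v))
        return v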
We demonstrate the reconstruction and parameterization of 186 human
face scans in a large database. With our method, we explore a variety
of applications, including texture transfer, morphing, and
statistical shape analysis. We have implemented our method in the
context of modeling human faces; however, given a suitable generic
model, the general techniques we have developed could be applied to
other objects.
The three-step mesh conformation method:
Global shape warping.
Automatic refinement of the feature point sets.
Local shape deformation.
More examples.
Papers:
Yu Zhang, Shuhong Xu, Yi Su, Chiet Sing Chong and Terence Hung. "A mesh conformation method for face shape reconstruction from range scans". Advances in Computational Sciences and Technology, 1(1):1-20, July 2007.