Full-head Texture Synthesis for Human Head Cloning
With a significant increase in the quality and availability of 3D capture methods, a common approach to creating face models of real humans uses laser range scanners to accurately acquire both the face geometry and its texture. One limitation of the scanner technology, however, is that the complete head geometry cannot be easily captured, since dark hair absorbs the laser radiation. The top and back of the head are generally not digitized unless the hair is artificially colored white or the subject wears a light-colored cap, which in turn destroys the hair texture. In most cases, only the frontal face can be properly textured, and no automatic mechanism is provided to generate a full-head texture from the single acquired frontal-face image for realistic rendering of a "cloned" head.
We present a technique to efficiently generate a parameterized full-head texture for modeling heads with a high degree of realism. We start with a generic head model of known topology and deform it to fit the face scan of a particular human subject using a volume morphing approach. The facial texture associated with the scanned geometry is then transferred to the original, undeformed generic mesh. We automatically construct a parameterization of the 3D head mesh over a 2D texture domain, which gives immediate correspondence between all the scanned textures via a single prototype layout. After performing a vertex-to-image binding for the vertices of the head mesh, we generate a cylindrical full-head texture from the parameterized texture of the face area. We also address the creation of individual textures for the ears. Apart from an initial feature-point selection for the texturing, our method works automatically, without any user interaction.

Our main contribution is a technique that uses a frontal-face image of the scanned data to generate a full-head texture for photorealistic rendering and morphing with minimal manual intervention. This includes new algorithms to automatically parameterize the textures of a set of unregistered face scans and establish their mapping correspondence, to robustly produce an individual full-head skin texture, and to efficiently create ear textures from a single input image.
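The abstract describes the pipeline without implementation detail. Purely as an illustration of what the vertex-to-image binding step might look like, the following minimal sketch projects each head-mesh vertex into the frontal image and keeps only vertices that roughly face the camera and land inside the image; the pinhole model and all names are assumptions for the example, not the paper's formulation.

```python
import numpy as np

def bind_vertices_to_image(vertices, normals, K, img_w, img_h):
    """Bind head-mesh vertices to the frontal face image.

    Assumes a pinhole camera at the origin looking down +z with
    3x3 intrinsics K. A vertex is bound only if it roughly faces
    the camera and projects inside the image; a real implementation
    would also need an occlusion test (e.g. a z-buffer).
    """
    facing = normals @ np.array([0.0, 0.0, 1.0]) < 0.0
    proj = vertices @ K.T                        # homogeneous pixel coords
    uv = proj[:, :2] / proj[:, 2:3]
    inside = ((uv[:, 0] >= 0) & (uv[:, 0] < img_w) &
              (uv[:, 1] >= 0) & (uv[:, 1] < img_h))
    # Unbound vertices must take their color from the synthesized
    # full-head texture rather than from the photograph.
    return uv, facing & inside
```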
Adaptation of a generic model to scanned data.
Left to right: source model after global warping; final mesh geometry after local deformation; texture-mapped deformed source model.
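As a rough illustration of the global warping stage, a radial basis function (RBF) volume morph driven by a handful of feature-point correspondences could look like the following; the r^3 kernel and all names are assumptions for this sketch, not necessarily the paper's exact formulation.

```python
import numpy as np

def rbf_warp(src_pts, dst_pts, vertices):
    """Warp 'vertices' so that the feature points src_pts map onto
    dst_pts, using an r^3 radial kernel plus an affine term (a
    common choice for volume morphing from sparse correspondences).
    """
    n = len(src_pts)
    # Kernel matrix between the source feature points.
    K = np.linalg.norm(src_pts[:, None] - src_pts[None, :], axis=-1) ** 3
    # Affine part [x y z 1] so rotation/scale/translation are reproduced exactly.
    P = np.hstack([src_pts, np.ones((n, 1))])
    A = np.zeros((n + 4, n + 4))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.zeros((n + 4, 3))
    b[:n] = dst_pts
    coeffs = np.linalg.solve(A, b)
    w, affine = coeffs[:n], coeffs[n:]
    # Evaluate the warp at every mesh vertex.
    d = np.linalg.norm(vertices[:, None] - src_pts[None, :], axis=-1)
    return d ** 3 @ w + np.hstack([vertices, np.ones((len(vertices), 1))]) @ affine
```

A subsequent local deformation step would then snap the globally warped mesh onto the scan surface, e.g. by moving each vertex to its closest point on the scan.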
Mesh parameterization and synthesis of a full-head skin texture.
Left to right: texture transferred to the undeformed source model; cylindrical face texture image; color-coded texture binding of the texture mesh; synthesized cylindrical full-head texture.
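To make the cylindrical layout concrete, here is a minimal sketch that maps head vertices to (u, v) coordinates on a cylinder around the vertical axis; the axis choice and the normalization are assumptions for the example.

```python
import numpy as np

def cylindrical_uv(vertices, axis_origin):
    """Map 3D head vertices to (u, v) on a cylinder around the
    vertical (y) axis through axis_origin: u is the angle around
    the head, normalized to [0, 1); v is the normalized height.
    """
    p = vertices - axis_origin
    u = (np.arctan2(p[:, 0], p[:, 2]) + np.pi) / (2.0 * np.pi)
    v = (p[:, 1] - p[:, 1].min()) / (p[:, 1].max() - p[:, 1].min())
    return np.stack([u, v], axis=1)
```

The full-head texture is then synthesized in this (u, v) domain; pixels outside the bound face region have to be filled in, for instance by extrapolating skin color outward from the face boundary.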
Texturing ears.
Left to right: feature points specified in the scanned image and the 3D source model; reference ear meshes with smooth shading and texture mapping after global alignment; final appearance after local adaptation.
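For the global alignment of a reference ear mesh to the subject's feature points, a standard least-squares similarity (Procrustes) fit is one plausible choice; this is a generic sketch, not the paper's exact procedure, and all names are illustrative.

```python
import numpy as np

def similarity_fit(src_pts, dst_pts):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) mapping src_pts onto dst_pts (Umeyama-style).
    Apply as: s * (R @ p) + t.
    """
    mu_s, mu_d = src_pts.mean(0), dst_pts.mean(0)
    xs, xd = src_pts - mu_s, dst_pts - mu_d
    U, S, Vt = np.linalg.svd(xd.T @ xs)          # cross-covariance SVD
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(U @ Vt))     # guard against reflection
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / (xs ** 2).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

Local adaptation would then refine the aligned ear mesh against the scanned geometry before its texture is composited into the full-head map.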
Full-head textured models and morphing of textured heads.
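One payoff of the shared prototype parameterization is that morphing between textured heads reduces to straightforward interpolation, since all heads share one topology and one texture layout; a minimal sketch under that assumption:

```python
import numpy as np

def morph_heads(verts_a, verts_b, tex_a, tex_b, t):
    """Blend two heads that share topology and texture layout:
    interpolate vertex positions and texel colors with weight t.
    """
    verts = (1.0 - t) * verts_a + t * verts_b
    tex = (1.0 - t) * tex_a.astype(float) + t * tex_b.astype(float)
    return verts, tex.astype(tex_a.dtype)
```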
Papers:
Yu Zhang. "An efficient texture generation technique for human head cloning and morphing". Proc. International Conference on Computer Graphics Theory and Applications (GRAPP 2006), pp. 267-278, Setubal, Portugal, Feb. 2006.
Copyright 2005-2013, Yu
Zhang. This material may not be published, modified or otherwise
redistributed in whole or part without prior approval.