Feature points based facial animation retargeting
Figure 1: The bottom row shows the virtual face animated by retargeting expressions from the source face (top row).

Abstract

We present a method for transferring facial animation in real time. The source animation may be an existing 3D animation or 2D data provided by a video tracker or a motion capture system. Based on two sets of feature points manually selected on the source and target faces (the only manual work required), an RBF network is trained to provide a geometric transformation between the two faces. At each frame, the RBF transformation is applied to the new feature point positions of the source face, yielding new positions for the target feature points that follow the expression of the source face while respecting the morphology of the target face. Based on their displacements over time, we deform the target mesh on the GPU with the linear blend skinning (LBS) method. To make our approach attractive to novice users, we propose a procedural technique to automatically rig the target face by generating the vertex weights for the skinning deformation. To summarize, our method provides interactive expression transfer with minimal human intervention during setup and accepts various kinds of animation sources.
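To illustrate the geometric mapping step, the sketch below fits a radial basis function interpolant from source to target feature points and applies it to per-frame source positions. It is a minimal approximation of the setup described above, assuming SciPy's RBFInterpolator; the kernel choice, file names, and SciPy dependency are our assumptions, not part of the paper.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Neutral-pose feature points, manually selected once on each face.
# Shapes: (n_points, 3); the row-wise correspondence is the only
# manual work the method requires. File names are hypothetical.
src_neutral = np.loadtxt("source_feature_points.txt")
tgt_neutral = np.loadtxt("target_feature_points.txt")

# Train the RBF network: a geometric transformation from source-face
# space to target-face space (kernel choice is an assumption here).
rbf = RBFInterpolator(src_neutral, tgt_neutral, kernel="thin_plate_spline")

def retarget_frame(src_points: np.ndarray) -> np.ndarray:
    """Map the current source feature points (n_points, 3) to new
    target feature point positions that follow the source expression
    while respecting the target morphology."""
    return rbf(src_points)
```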
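The per-vertex deformation the abstract refers to is standard linear blend skinning; a minimal CPU sketch follows (the paper runs this step on the GPU, and the transform and weight layout shown here are illustrative assumptions).

```python
import numpy as np

def linear_blend_skinning(vertices, weights, transforms):
    """Deform a mesh with LBS: each vertex is a weighted blend of the
    vertex transformed by each influence.

    vertices:   (n_verts, 3) rest-pose positions
    weights:    (n_verts, n_bones) skinning weights, rows summing to 1
                (generated automatically by the procedural rig)
    transforms: (n_bones, 4, 4) per-influence transforms derived from
                the retargeted feature point displacements (assumption)
    """
    # Homogeneous rest-pose coordinates: (n_verts, 4)
    v_h = np.concatenate([vertices, np.ones((len(vertices), 1))], axis=1)
    # Transform every vertex by every influence: (n_bones, n_verts, 4)
    per_bone = np.einsum("bij,vj->bvi", transforms, v_h)
    # Blend with the skinning weights, then drop the homogeneous w
    blended = np.einsum("vb,bvi->vi", weights, per_bone)
    return blended[:, :3]
```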