Expression-robust 3D face recognition via weighted sparse representation of multi-scale and multi-component local normal patterns
Abstract
In differential geometry, the surface normal, a first-order surface differential quantity, determines the orientation of a surface at each point and carries informative local shape information. To fully exploit this information for 3D face identification, this paper proposes a novel, highly discriminative facial shape descriptor, namely Multi-Scale and Multi-Component Local Normal Patterns (MSMC-LNP). Given a registered facial range image, the three components of the normal vectors are first estimated, yielding three normal images. Each normal image is then locally encoded into binary Local Normal Patterns (LNP) at different scales. To exploit the spatial information of facial shape, each normal image is divided into several patches, and their LNP histograms are computed and concatenated according to the facial configuration. Finally, each facial surface is represented by a set of LNP histograms that captures both global and local cues. Moreover, to make the proposed solution robust to variations in facial expression, whether subtle, prototypical, or exaggerated, we propose to learn the weight of each local patch for a given encoding scale and normal component. Based on the learned weights and the weighted LNP histograms, we formulate a Weighted Sparse Representation-based Classifier (W-SRC). In contrast to the overwhelming majority of 3D FR algorithms, which are benchmarked only on the FRGC v2.0 dataset, we carried out extensive experiments on the FRGC v2.0, Bosphorus, BU-3DFE, and 3D-TEC databases, thereby covering 3D face data captured under different scenarios with various sensors and presenting, in particular, different challenges with respect to facial expressions. The experimental results show that the proposed approach consistently achieves competitive rank-one identification rates across these heterogeneous datasets, demonstrating its effectiveness and generalization ability.
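To make the descriptor pipeline in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it estimates the three normal components from a range image by finite-difference gradients, applies an LBP-style binary encoding to each normal image at several sampling radii (the multi-scale step), and concatenates per-patch histograms to preserve spatial layout. All function names, the 8x8 patch grid, and the radii are illustrative assumptions; the patch weighting and the W-SRC stage are omitted.

```python
import numpy as np

def normal_components(range_image):
    """Estimate the three components (nx, ny, nz) of the unit surface normal
    at every pixel of a registered facial range image, using finite-difference
    gradients as a first-order approximation (assumed, not the paper's exact scheme)."""
    gy, gx = np.gradient(range_image.astype(np.float64))
    nz = np.ones_like(gx)
    norm = np.sqrt(gx**2 + gy**2 + nz**2)
    return -gx / norm, -gy / norm, nz / norm  # three "normal images"

def lnp_codes(normal_image, radius=1, neighbors=8):
    """LBP-style binary encoding of one normal image at a single scale:
    each pixel is compared with `neighbors` points on a circle of the given
    radius, producing an integer code in [0, 2**neighbors)."""
    codes = np.zeros(normal_image.shape, dtype=np.int32)
    angles = 2.0 * np.pi * np.arange(neighbors) / neighbors
    for k, a in enumerate(angles):
        dy = int(round(radius * np.sin(a)))
        dx = int(round(radius * np.cos(a)))
        # shifted[y, x] holds the neighbor value at (y + dy, x + dx)
        shifted = np.roll(normal_image, (-dy, -dx), axis=(0, 1))
        codes |= (shifted >= normal_image).astype(np.int32) << k
    return codes

def patch_histograms(codes, grid=(8, 8), bins=256):
    """Divide the code image into a grid of patches and concatenate the
    per-patch histograms so the descriptor keeps the facial configuration."""
    h, w = codes.shape
    ph, pw = h // grid[0], w // grid[1]
    hists = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            patch = codes[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw]
            hist, _ = np.histogram(patch, bins=bins, range=(0, bins))
            hists.append(hist / max(hist.sum(), 1))
    return np.concatenate(hists)

def msmc_lnp_descriptor(range_image, radii=(1, 2, 3)):
    """Stack the histograms over all three normal components (multi-component)
    and all sampling radii (multi-scale) into a single descriptor vector."""
    feats = []
    for comp in normal_components(range_image):
        for r in radii:
            feats.append(patch_histograms(lnp_codes(comp, radius=r)))
    return np.concatenate(feats)
```

In a matching stage along the lines described above, each gallery and probe face would be reduced to such a vector, with the learned per-patch weights applied to the corresponding histogram segments before sparse coding in the W-SRC.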