GANonymization is a novel face anonymization framework that preserves facial expressions.


In recent years, the exponential growth in the availability of personal data and the rapid advancement of technology have amplified concerns about privacy and security. As a result, data anonymization has become increasingly important: it plays a crucial role in protecting people’s privacy and preventing the accidental disclosure of sensitive information.

When sharing and analyzing data, it is common practice to apply anonymization techniques such as generalization, suppression, randomization, and perturbation to preserve user privacy. These techniques have drawbacks, however. Generalization can reduce accuracy and discard information, suppression can produce incomplete datasets, randomization can leave the door open to re-identification attacks, and perturbation can add noise that degrades data quality. When applying these strategies, it is critical to strike a balance between privacy and data utility in order to mitigate their drawbacks effectively.
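To make the trade-offs concrete, here is a minimal sketch of three of these techniques applied to a single tabular record. The record, its field names, and the noise scale are illustrative placeholders, not taken from any real dataset:

```python
import random

# Hypothetical record; field names are illustrative only.
record = {"name": "Alice", "age": 34, "zip": "86159", "income": 52000}

def generalize_zip(zip_code, keep=3):
    """Generalization: coarsen a ZIP code to its first `keep` digits."""
    return zip_code[:keep] + "*" * (len(zip_code) - keep)

def suppress(record, fields):
    """Suppression: drop directly identifying fields entirely."""
    return {k: v for k, v in record.items() if k not in fields}

def perturb(value, scale=1000, rng=None):
    """Perturbation: add bounded noise, trading accuracy for privacy."""
    rng = rng or random.Random(0)
    return value + rng.uniform(-scale, scale)

anonymized = suppress(record, {"name"})
anonymized["zip"] = generalize_zip(anonymized["zip"])
anonymized["income"] = round(perturb(anonymized["income"]))
print(anonymized)
```

Each step illustrates its drawback directly: the suppressed name is unrecoverable, the generalized ZIP loses geographic precision, and the perturbed income is only approximately correct.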

Acquiring and sharing sensitive facial data is particularly difficult, especially when making datasets publicly available. At the same time, facial data offers promising opportunities for tasks such as emotion recognition. To address this tension, a research team from Germany proposed a novel face anonymization approach geared toward emotion recognition.

The authors introduce GANonymization, a novel face anonymization framework that preserves facial expressions. The framework utilizes a generative adversarial network (GAN) to synthesize an anonymized version of a face based on a high-level representation.

The GANonymization framework consists of four components: face extraction, face segmentation, facial landmark extraction, and re-synthesis. In the face extraction step, the RetinaFace framework detects and extracts the visible faces, which are then aligned and resized to meet the GAN’s input requirements. Face segmentation removes the background so the pipeline operates solely on the face. Facial landmarks are extracted with a MediaPipe Face Mesh model, providing an abstract representation of the facial shape, and are projected onto a 2D image. Finally, a pix2pix GAN architecture performs the re-synthesis, trained on landmark/image pairs derived from the CelebA dataset. The GAN generates realistic face images from the landmark representations, preserving facial expressions while discarding identity-revealing traits.
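The landmark projection step, the bridge between landmark extraction and pix2pix re-synthesis, can be sketched as follows. The landmark coordinates below are synthetic placeholders; in the actual pipeline they would come from a MediaPipe Face Mesh model (468 points per face), and the canvas size would match the GAN’s input resolution:

```python
import numpy as np

def landmarks_to_image(landmarks, size=512):
    """Project normalized (x, y) landmarks in [0, 1] onto a black canvas.

    The resulting image serves as the conditional input to a pix2pix-style
    GAN, which re-synthesizes a face from the landmark layout alone.
    """
    canvas = np.zeros((size, size), dtype=np.uint8)
    for x, y in landmarks:
        px = min(int(x * size), size - 1)
        py = min(int(y * size), size - 1)
        canvas[py, px] = 255  # draw each landmark as a white pixel
    return canvas

# Synthetic placeholder landmarks (eyes, nose tip, mouth center).
fake_landmarks = [(0.3, 0.4), (0.7, 0.4), (0.5, 0.6), (0.5, 0.8)]
img = landmarks_to_image(fake_landmarks)
print(img.shape, int(img.sum()))
```

Because only the landmark layout reaches the generator, identity-specific texture such as skin detail or hair never enters the re-synthesis stage, which is what makes the anonymization structural rather than cosmetic.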

To evaluate the effectiveness of the proposed approach, the research team conducted a comprehensive experimental investigation covering anonymization performance, preservation of emotional expressions, and the impact of training an emotion recognition model on anonymized data. Anonymization performance was compared against DeepPrivacy2 on the WIDER dataset, while the preservation of emotional expressions was assessed on the AffectNet, CK+, and FACES datasets. GANonymization outperformed DeepPrivacy2 in preserving emotional expressions across all three datasets, in both inference and training scenarios. These results contribute to the understanding and advancement of face anonymization techniques, particularly for retaining emotional information while ensuring privacy protection.

In conclusion, this article presented GANonymization, a novel face anonymization framework that utilizes a generative adversarial network (GAN) to preserve facial expressions while removing identifying traits. A comprehensive experimental investigation demonstrated the approach’s effectiveness in both anonymization performance and preservation of emotional expressions, where it outperformed the comparison method, DeepPrivacy2. These findings advance face anonymization techniques and highlight the potential for maintaining emotional information while ensuring privacy protection.