An Efficient Attribute-Preserving Framework for Face Swapping

Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2024, Vol. 26, pp. 6554-6565
Main Authors: Wang, Tianyi; Li, Zian; Liu, Ruixia; Wang, Yinglong; Nie, Liqiang
Format: Article
Language: English
Description
Summary: By leveraging deep neural networks, recent face swapping techniques have performed admirably in generating faces that maintain consistent identities. Nevertheless, while these methods accurately transfer source identities, they often struggle to preserve important attributes (such as head poses, expressions, and gaze directions) in the target faces. As a consequence, current research in this domain has not achieved satisfactory performance. In this article, we propose an efficient attribute-preserving framework for face swapping, AP-Swap for short. Our approach incorporates two innovative modules designed specifically to preserve critical facial attributes. First, we propose a global residual attribute-preserving encoder (GRAPE), which adaptively extracts globally complete attribute features from target faces. Second, in addition to the regular network streams for the source and target facial images, we introduce a network stream that takes the facial landmarks of the target faces into account. This additional stream enables our landmark-guided feature entanglement module (LFEM), which efficiently preserves fine-grained facial attributes by conducting a landmark-based attribute-preserving (LBAP) operation. Through extensive quantitative and qualitative experiments, we demonstrate the superiority of AP-Swap over other state-of-the-art methods in terms of facial attribute preservation and model efficiency, along with satisfactory identity-consistency performance.
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2024.3354573
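
Note: the abstract describes a three-stream design (a source-identity stream, a target-attribute stream, and an additional target-landmark stream feeding the attribute-preserving modules) but gives no implementation details. Below is a minimal, hypothetical PyTorch sketch of such a three-stream generator for orientation only; the class and layer names (ThreeStreamSwapper, ConvBlock, the encoder/decoder layout) are illustrative assumptions and do not reproduce the authors' GRAPE or LFEM modules.

    # Hypothetical sketch of a three-stream face-swapping generator:
    # a source-identity stream, a target-attribute stream, and a
    # landmark stream that guides attribute preservation.
    import torch
    import torch.nn as nn

    class ConvBlock(nn.Module):
        """Conv + BatchNorm + ReLU; stride 2 downsamples by default."""
        def __init__(self, in_ch, out_ch, stride=2):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1),
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
        def forward(self, x):
            return self.body(x)

    class ThreeStreamSwapper(nn.Module):
        def __init__(self, feat_dim=256):
            super().__init__()
            # Target-attribute stream (pose, expression, gaze cues).
            self.attr_enc = nn.Sequential(ConvBlock(3, 64), ConvBlock(64, 128), ConvBlock(128, feat_dim))
            # Landmark stream (e.g. a heatmap rendered from target facial landmarks).
            self.lmk_enc = nn.Sequential(ConvBlock(1, 64), ConvBlock(64, 128), ConvBlock(128, feat_dim))
            # Source-identity stream; in practice this would usually be a
            # frozen face-recognition embedding rather than a small CNN.
            self.id_enc = nn.Sequential(ConvBlock(3, 64), ConvBlock(64, 128), ConvBlock(128, feat_dim),
                                        nn.AdaptiveAvgPool2d(1))
            # Fuse the three streams and decode back to an image.
            self.fuse = nn.Conv2d(feat_dim * 3, feat_dim, 1)
            self.decoder = nn.Sequential(
                nn.Upsample(scale_factor=2), ConvBlock(feat_dim, 128, stride=1),
                nn.Upsample(scale_factor=2), ConvBlock(128, 64, stride=1),
                nn.Upsample(scale_factor=2), nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
            )

        def forward(self, source, target, target_landmarks):
            attr = self.attr_enc(target)            # target attribute features
            lmk = self.lmk_enc(target_landmarks)    # landmark-guided features
            ident = self.id_enc(source)             # global identity vector, shape (N, C, 1, 1)
            ident = ident.expand(-1, -1, attr.shape[2], attr.shape[3])
            fused = self.fuse(torch.cat([attr, lmk, ident], dim=1))
            return self.decoder(fused)

    # Usage with dummy 128x128 inputs and a single-channel landmark heatmap:
    if __name__ == "__main__":
        src = torch.randn(1, 3, 128, 128)
        tgt = torch.randn(1, 3, 128, 128)
        lmk = torch.randn(1, 1, 128, 128)
        out = ThreeStreamSwapper()(src, tgt, lmk)
        print(out.shape)  # torch.Size([1, 3, 128, 128])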