Unified unsupervised and semi-supervised domain adaptation network for cross-scenario face anti-spoofing
Published in: Pattern Recognition, 2021-07, Vol. 115, p. 107888, Article 107888
Main Authors: , , ,
Format: Article
Language: English
Summary:
•We address the cross-scenario face anti-spoofing problem via domain adaptation.
•We consider both unsupervised and semi-supervised settings.
•Different distribution alignment operations are conducted for better generalization.
•Our method is competitive with state-of-the-art methods in face anti-spoofing.
Due to environmental differences, many face anti-spoofing methods fail to generalize to unseen scenarios. In light of this, we propose a unified unsupervised and semi-supervised domain adaptation network (USDAN) for cross-scenario face anti-spoofing, aiming to minimize the distribution discrepancy between the source and target domains. Specifically, two modules, i.e., a marginal distribution alignment (MDA) module and a conditional distribution alignment (CDA) module, are designed to seek a domain-invariant feature space via adversarial learning and to make features of the same class compact, respectively. By adding or removing the CDA module, the network can easily be switched between the semi-supervised and unsupervised settings, in which sense our method is called "unified". Moreover, an adaptive cross-entropy loss and normalization techniques are further incorporated to improve generalization. Extensive experiments show that the proposed USDAN outperforms state-of-the-art methods on several public datasets.
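The two alignment ideas in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the gradient-reversal trick (from DANN) is one common way to realize adversarial marginal alignment, and the class-mean compactness term stands in for conditional alignment; both functions and all parameter names here are illustrative assumptions.

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; flips and scales the gradient in the
    backward pass. A standard way to train a feature extractor adversarially
    against a domain discriminator (MDA-style marginal alignment sketch;
    the paper's exact MDA module may differ)."""
    def __init__(self, lam=1.0):
        self.lam = lam  # trade-off weight for the adversarial signal

    def forward(self, x):
        return x  # features pass through unchanged

    def backward(self, grad_output):
        # Reverse the domain-discriminator gradient so the feature
        # extractor learns domain-indistinguishable features.
        return -self.lam * grad_output

def class_compactness(features, labels):
    """Toy CDA-style term: mean squared distance of each feature to its
    class mean. Minimizing it pulls same-class features together."""
    loss = 0.0
    for c in np.unique(labels):
        fc = features[labels == c]
        loss += np.sum((fc - fc.mean(axis=0)) ** 2)
    return loss / len(features)

# Tiny demonstration of both pieces.
grl = GradientReversal(lam=0.5)
x = np.array([[1.0, 2.0]])
assert np.allclose(grl.forward(x), x)           # forward is identity
assert np.allclose(grl.backward(x), -0.5 * x)   # gradient reversed and scaled

feats = np.array([[0.0, 0.0], [2.0, 0.0], [5.0, 5.0]])
labels = np.array([0, 0, 1])  # two live samples, one spoof, say
print(class_compactness(feats, labels))  # -> 0.666... (only class 0 is spread out)
```

In an unsupervised setting the compactness term would be dropped (no target labels), matching the abstract's point that removing the CDA module switches the network between the two settings.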
ISSN: 0031-3203, 1873-5142