Learning Cognitive Features as Complementary for Facial Expression Recognition
Published in: International Journal of Intelligent Systems, 2024-01, Vol. 2024 (1)
Main Authors:
Format: Article
Language: English
Summary: Facial expression recognition (FER) has a wide range of applications, including interactive gaming, healthcare, security, and human–computer interaction systems. Despite the impressive performance of deep learning–based FER, it remains challenging in real-world scenarios due to uncontrolled factors such as varying lighting conditions, face occlusion, and pose variations. In contrast, humans categorize objects from a cognitive standpoint based on both their inherent characteristics and the surrounding environment, using concepts such as cognitive relativity. Modeling these cognitive relativity laws to learn cognitive features as feature augmentation may improve the performance of deep learning models for FER. We therefore propose a cognitive feature learning framework that learns cognitive features as complementary information for FER, consisting of a Relative Transformation module (AFRT) and a Graph Convolutional Network module (AFGCN). AFRT explicitly creates cognitive relative features that reflect the positional relationships between samples based on human cognitive relativity, and AFGCN implicitly learns the interaction features between expressions as feature augmentation to improve FER classification performance. Extensive experiments on three public datasets show the universality and effectiveness of the proposed method.
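The abstract's two ideas can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the use of pairwise distances as "relative features", and the similarity-based adjacency are all assumptions made for illustration, following the general pattern of a relative transformation followed by one graph-convolution step:

```python
import numpy as np

# Hypothetical sketch of a relative transformation: map each sample's
# feature vector to its distances from all other samples in the batch,
# so the new representation encodes position relationships between
# samples rather than absolute feature values.
def relative_transform(feats):
    # feats: (n, d) batch of feature vectors
    diff = feats[:, None, :] - feats[None, :, :]      # (n, n, d) pairwise differences
    return np.linalg.norm(diff, axis=-1)              # (n, n) pairwise distances

# Hypothetical one-layer graph convolution: propagate features over a
# symmetrically normalized adjacency built from sample similarity.
def gcn_layer(feats, adj, weight):
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    norm_adj = d_inv_sqrt @ adj @ d_inv_sqrt          # D^{-1/2} A D^{-1/2}
    return np.maximum(norm_adj @ feats @ weight, 0)   # ReLU activation

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                           # 4 samples, 8-dim features
rel = relative_transform(x)                           # relative features (distances)
adj = np.exp(-rel) + np.eye(4)                        # similarity graph with self-loops
w = rng.normal(size=(8, 8))
out = gcn_layer(x, adj, w)                            # interaction-augmented features
print(out.shape)                                      # (4, 8)
```

In the paper's framework these two steps would supply complementary features alongside a standard FER backbone; here they are shown standalone only to make the data flow concrete.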
ISSN: 0884-8173, 1098-111X
DOI: 10.1155/2024/7321175