
Selective Mutual Learning: An Efficient Approach for Single Channel Speech Separation

Bibliographic Details
Main Authors: Tan, Ha Minh, Vu, Duc-Quang, Lee, Chung-Ting, Li, Yung-Hui, Wang, Jia-Ching
Format: Conference Proceeding
Language: English
Description
Summary: Mutual learning, an idea related to knowledge distillation, involves a group of untrained lightweight networks that learn simultaneously and share knowledge to perform a task together during training. In this paper, we propose a novel mutual learning approach, namely selective mutual learning. It is a simple yet effective approach for boosting the performance of networks for speech separation. The selective mutual learning method uses two networks that, like a pair of friends, learn from and share knowledge with each other. In particular, the high-confidence predictions of one network are used to guide the other, while the low-confidence predictions are ignored. This removes poor predictions from the knowledge-sharing process. The experimental results show that our proposed selective mutual learning method significantly improves separation performance compared to existing training strategies, including independent training, knowledge distillation, and mutual learning, with the same network architecture.
ISSN: 2379-190X
DOI: 10.1109/ICASSP43922.2022.9746022
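Illustration (not from the record or the paper itself): the selective mutual learning idea described in the summary can be sketched roughly in PyTorch. The sketch below assumes that each network's confidence is measured by its own SI-SNR loss against the reference signals and that a fixed threshold `tau` separates high- from low-confidence predictions; the weight `alpha`, the threshold, and the omission of permutation-invariant training are assumptions made for brevity, not details taken from the paper.

```python
import torch

def si_snr_loss(est, ref, eps=1e-8):
    """Negative scale-invariant SNR, averaged over sources.

    est, ref: tensors of shape (batch, sources, time).
    Returns a per-utterance loss of shape (batch,); lower is better.
    """
    ref = ref - ref.mean(dim=-1, keepdim=True)
    est = est - est.mean(dim=-1, keepdim=True)
    proj = (torch.sum(est * ref, dim=-1, keepdim=True) /
            (torch.sum(ref * ref, dim=-1, keepdim=True) + eps)) * ref
    noise = est - proj
    snr = 10 * torch.log10((proj.pow(2).sum(-1) + eps) /
                           (noise.pow(2).sum(-1) + eps))
    return -snr.mean(dim=-1)

def selective_mutual_losses(net_a, net_b, mixture, targets, alpha=0.5, tau=0.0):
    """One step of a selective-mutual-learning style scheme (hypothetical sketch).

    Each network is trained on its own separation loss; in addition, the peer's
    estimate is used as a soft target, but only for utterances where the peer's
    own loss falls below `tau` (i.e. the peer is confident). `alpha` and `tau`
    are illustrative hyper-parameters, not values from the paper.
    """
    est_a = net_a(mixture)                 # (batch, sources, time)
    est_b = net_b(mixture)

    loss_a = si_snr_loss(est_a, targets)   # per-utterance supervised losses
    loss_b = si_snr_loss(est_b, targets)

    # Masks that keep only the peer's high-confidence (low-loss) predictions.
    conf_a = (loss_a < tau).float().detach()
    conf_b = (loss_b < tau).float().detach()

    # Mutual terms: pull each network toward the peer's confident estimates only;
    # low-confidence peer predictions are simply dropped from the average.
    mut_a = (conf_b * si_snr_loss(est_a, est_b.detach())).sum() / conf_b.sum().clamp(min=1)
    mut_b = (conf_a * si_snr_loss(est_b, est_a.detach())).sum() / conf_a.sum().clamp(min=1)

    total_a = loss_a.mean() + alpha * mut_a
    total_b = loss_b.mean() + alpha * mut_b
    return total_a, total_b
```

In a full training loop, `total_a` and `total_b` would each be back-propagated through that network's own optimizer, so every network learns from its supervised loss plus only the confident estimates shared by its peer.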