Publications

An Information Fusion Approach to Learning With Instance-Dependent Label Noise

Abstract

Instance-dependent label noise (IDN) is pervasive in real-world datasets and often misleads the training of deep neural networks. The noise transition matrix (i.e., the probability that clean labels flip into noisy labels) is widely used to characterize label noise and yields classifiers that are statistically consistent with respect to the underlying distribution from which the data is drawn. However, most instances are long-tailed, i.e., each instance appears only a limited number of times, which creates a gap between the underlying distribution and the empirical distribution and degrades the model. To mitigate this distribution mismatch, we propose a posterior transition matrix that models label noise posteriorly given the limited observed noisy labels, achieving classifiers that are statistically consistent for both the underlying and the empirical distribution. Note that even when instances are corrupted by the same noise transition matrix, intrinsic randomness produces different noisy labels and thus calls for different corrections. Motivated by this observation, we propose an Information Fusion (IF) approach that fine-tunes the noise transition matrix based on the estimated posterior transition matrix. Specifically, we use the noisy labels and the model's predicted probabilities to estimate the posterior transition matrix, and then correct the noise transition matrix in forward propagation. Empirical evaluations on synthetic and real-world datasets demonstrate that our method outperforms state-of-the-art approaches and achieves more stable training when learning from instance-dependent label noise.
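
The abstract outlines a pipeline: estimate a posterior transition matrix from the noisy labels and the model's predicted probabilities, then use it to correct the noise transition matrix inside a forward-corrected loss. The PyTorch sketch below illustrates one plausible reading of that pipeline; the Bayes-rule posterior estimator, the convex fusion rule with weight `alpha`, and the function names `posterior_transition` and `fused_forward_loss` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def posterior_transition(T, p_clean):
    """Posterior over clean classes given each possible noisy label.

    T:       (C, C) noise transition matrix, T[i, j] = P(noisy = j | clean = i)
    p_clean: (B, C) model-predicted clean-class probabilities
    Returns: (B, C, C) tensor with post[b, i, j] = P(clean = i | noisy = j, x_b),
             obtained by Bayes' rule (an assumed reading of the paper's estimator).
    """
    joint = p_clean.unsqueeze(2) * T.unsqueeze(0)              # p[b, i] * T[i, j]
    return joint / joint.sum(dim=1, keepdim=True).clamp_min(1e-12)

def fused_forward_loss(logits, noisy_labels, T, alpha=0.5):
    """Forward-corrected cross-entropy using a fused transition matrix.

    The convex blend with weight `alpha` is a hypothetical fusion rule, used
    here only to illustrate correcting T with the estimated posterior.
    """
    p_clean = F.softmax(logits, dim=1)                         # (B, C)
    post = posterior_transition(T, p_clean)                    # (B, C, C)
    T_fused = alpha * T.unsqueeze(0) + (1.0 - alpha) * post    # (B, C, C)
    T_fused = T_fused / T_fused.sum(dim=2, keepdim=True)       # re-normalize rows
    # Forward correction: predicted noisy-label distribution
    # p_noisy[b, j] = sum_i p_clean[b, i] * T_fused[b, i, j].
    p_noisy = torch.bmm(p_clean.unsqueeze(1), T_fused).squeeze(1)
    return F.nll_loss(p_noisy.clamp_min(1e-12).log(), noisy_labels)
```

During training, a call such as `fused_forward_loss(model(x), y_noisy, T)` would stand in for the standard cross-entropy term, with `T` either given or estimated by a separate procedure.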

Authors: Li Li, Rui Chen, Soo-Hyun Choi, Xia Hu

Published: International Conference on Learning Representations (ICLR)

Date: Apr 25, 2022