
MixNN: Combating Noisy Labels in Deep Learning by Mixing with Nearest Neighbors

Abstract

Noisy labels are ubiquitous in real-world datasets, especially those collected from web sources. Training deep neural networks on noisy datasets is challenging, as the networks have been shown to overfit the noisy labels during training, resulting in performance degradation. When trained on noisy datasets, deep neural networks have been observed to fit the clean samples during an "early learning" phase before eventually memorizing the mislabeled samples. We further explore the representation distributions in the early learning stage and find that the representations of similar samples from the same class congregate regardless of their noisy labels. Inspired by these findings, we propose MixNN, a novel framework for mitigating the influence of noisy labels. In contrast with existing methods, which identify and eliminate mislabeled samples, we modify the mislabeled samples by mixing them with their nearest neighbors through a weighted sum, where the weights are computed with a mixture model fit to the per-sample loss distribution. To enhance performance under extreme label noise, we propose a strategy that estimates soft targets by gradually correcting the noisy labels. We demonstrate that the estimated targets yield a more accurate approximation of the ground-truth labels and higher-quality learned representations, with more separated and clearly bounded clusters. Extensive experiments on two benchmarks and two challenging real-world datasets demonstrate that our approach outperforms existing state-of-the-art methods.
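For readers skimming the abstract, the sketch below illustrates, in Python, one plausible reading of the two core steps: per-sample weights from a two-component Gaussian mixture over training losses (the abstract says only "a mixture model"; the Gaussian choice is an assumption, in the spirit of related work), and a weighted-sum mix of each sample and its soft target with its nearest neighbor in representation space. All function and variable names here are illustrative, not the authors' implementation.

```python
# Illustrative sketch only, not the authors' code.
import numpy as np
import torch
import torch.nn.functional as F
from sklearn.mixture import GaussianMixture

def clean_probabilities(losses: np.ndarray) -> torch.Tensor:
    """Fit a 2-component GMM to per-sample losses; the component with the
    smaller mean is treated as 'clean', and its posterior becomes the weight w."""
    gmm = GaussianMixture(n_components=2).fit(losses.reshape(-1, 1))
    clean = int(gmm.means_.argmin())
    return torch.from_numpy(
        gmm.predict_proba(losses.reshape(-1, 1))[:, clean]
    ).float()

def mix_with_neighbors(x, y_soft, feats, w):
    """Mix each sample (and its soft target) with its nearest neighbor,
    found by cosine similarity between learned representations."""
    f = F.normalize(feats, dim=1)
    sim = f @ f.T
    sim.fill_diagonal_(-1.0)          # exclude self as a neighbor
    nn_idx = sim.argmax(dim=1)        # nearest neighbor per sample
    wx = w.view(-1, 1, 1, 1)          # broadcast over image dimensions
    x_mixed = wx * x + (1 - wx) * x[nn_idx]
    wy = w.view(-1, 1)
    y_mixed = wy * y_soft + (1 - wy) * y_soft[nn_idx]
    return x_mixed, y_mixed

def correct_targets(y_soft, preds, w):
    """Gradually move targets of likely-mislabeled samples toward the
    model's predictions, approximating the soft-target correction step."""
    wy = w.view(-1, 1)
    return wy * y_soft + (1 - wy) * preds
```

In this reading, a high weight w (a likely-clean sample) leaves the sample and its label largely intact, while a low weight pulls the sample toward its nearest neighbor and its target toward the model's prediction.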

Authors

Lu Y; He W

Pagination

pp. 847-856

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

December 18, 2021

DOI

10.1109/bigdata52589.2021.9671816

Name of conference

2021 IEEE International Conference on Big Data (Big Data)