Robust learning method by reweighting examples with negative learning
Boshi ZOU, Ming YANG, Chenchen ZONG, Mingkun XIE, Shengjun HUANG
Journal of Computer Applications    2024, 44 (5): 1479-1484.   DOI: 10.11772/j.issn.1001-9081.2023050880

Noisy label learning methods can effectively use data containing noisy labels to train models and significantly reduce the labeling cost of large-scale datasets. Most existing noisy label learning methods assume that the classes in the dataset are balanced, but data in many real-world scenarios tend to carry noisy labels and follow a long-tailed distribution at the same time, making it difficult for existing methods to select clean examples from noisy examples in the tail classes according to training loss or confidence. To solve the noisy long-tailed learning problem, a method of ReWeighting examples with Negative Learning (NLRW) was proposed, in which examples were reweighted adaptively on the basis of negative learning. Specifically, at each training epoch, the weights of examples were calculated according to the output distributions of the model over head classes and tail classes, so that the weights of clean examples were close to one while the weights of noisy examples were close to zero. To ensure accurate estimation of the weights, negative learning and cross-entropy loss were combined to train the model with a weighted loss function. Experimental results on the CIFAR-10 and CIFAR-100 datasets with various imbalance rates and noise rates show that, compared with TBSS (Two stage Bi-dimensional Sample Selection), the best baseline for noisy long-tailed classification, the NLRW method improves the average accuracy by 4.79% and 3.46%, respectively.
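The weighted loss the abstract describes — cross-entropy for examples whose weights are near one (likely clean) and negative learning on a complementary label for examples whose weights are near zero (likely noisy) — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, the complementary labels are drawn uniformly at random, and the paper's actual weight estimation from head/tail output distributions is assumed to be supplied externally as `weights`.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with the usual max-subtraction for stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def weighted_nl_ce_loss(logits, labels, weights, num_classes, rng):
    """Sketch of a weighted combination of cross-entropy and negative learning.

    weights[i] ~ 1 means example i looks clean (trust its label, use CE);
    weights[i] ~ 0 means it looks noisy (use negative learning instead).
    """
    p = softmax(logits)
    n = len(labels)
    # Positive learning: standard cross-entropy on the given label.
    ce = -np.log(p[np.arange(n), labels] + 1e-12)
    # Negative learning: draw a complementary label != the given label
    # and push down the predicted probability of that class.
    comp = (labels + rng.integers(1, num_classes, size=n)) % num_classes
    nl = -np.log(1.0 - p[np.arange(n), comp] + 1e-12)
    # Blend per example by the estimated cleanliness weight.
    return np.mean(weights * ce + (1.0 - weights) * nl)
```

With all weights set to one this reduces to plain cross-entropy; with all weights set to zero it reduces to pure negative learning, which only uses the weaker "not this class" supervision and is therefore more robust when the given label is unreliable.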
