

Poster

Joint Asymmetric Loss for Learning with Noisy Labels

Jialiang Wang · Xianming Liu · Xiong Zhou · Gangfeng Hu · Deming Zhai · Junjun Jiang · Xiangyang Ji


Abstract:

Learning with noisy labels is an important and challenging task for training accurate deep neural networks. To mitigate label noise, prior studies have proposed various robust loss functions, particularly symmetric losses. Nevertheless, symmetric losses usually suffer from underfitting due to the overly strict symmetric condition. To address this problem, the Active Passive Loss (APL) jointly optimizes an active and a passive loss to mutually enhance the overall fitting ability. Within APL, symmetric losses have been successfully extended, yielding advanced robust loss functions. Despite these advancements, emerging theoretical analyses indicate that asymmetric loss functions, a new class of robust loss functions, possess superior properties compared to symmetric losses. However, existing asymmetric losses are not compatible with advanced optimization frameworks such as APL, which limits their practical potential and applicability. Motivated by this theoretical gap and the promising properties of asymmetric losses, we extend the asymmetric loss function to the more complex passive loss scenario and propose the Asymmetric Mean Square Error (AMSE), a novel asymmetric loss function. We rigorously establish the necessary and sufficient condition under which AMSE satisfies the asymmetric condition. By substituting the traditional symmetric passive loss in APL with our proposed AMSE, we introduce a novel robust loss framework termed Joint Asymmetric Loss (JAL). Extensive experiments demonstrate the effectiveness of our method in mitigating label noise.
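
To make the joint-loss structure concrete, below is a minimal PyTorch sketch in the spirit of the APL/JAL framework. It assumes the standard APL form (total loss = alpha * active loss + beta * passive loss), uses normalized cross-entropy (NCE) as a representative active loss, and substitutes a plain mean-square error between predicted probabilities and the one-hot label as a stand-in for the passive term, since the abstract does not give the exact definition of AMSE. The class name JointAsymmetricLoss and the weights alpha and beta are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

class JointAsymmetricLoss(torch.nn.Module):
    """Sketch of an APL-style joint loss: a weighted sum of an active term
    (normalized cross-entropy) and a passive, MSE-based term standing in
    for the paper's AMSE. Hypothetical form, not the authors' definition."""

    def __init__(self, num_classes: int, alpha: float = 1.0, beta: float = 1.0):
        super().__init__()
        self.num_classes = num_classes
        self.alpha = alpha  # weight of the active term
        self.beta = beta    # weight of the passive term

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=1)
        probs = log_probs.exp()
        one_hot = F.one_hot(targets, self.num_classes).float()

        # Active term: normalized cross-entropy, NCE = CE / (-sum_k log p_k).
        ce = -(one_hot * log_probs).sum(dim=1)
        nce = ce / (-log_probs.sum(dim=1))

        # Passive term: squared error between predicted probabilities and the
        # one-hot label, a placeholder for the paper's asymmetric AMSE.
        mse = ((probs - one_hot) ** 2).sum(dim=1)

        return (self.alpha * nce + self.beta * mse).mean()

# Example usage with random logits and labels:
# loss_fn = JointAsymmetricLoss(num_classes=10)
# loss = loss_fn(torch.randn(32, 10), torch.randint(0, 10, (32,)))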
