

Poster

EntAugment: Entropy-Driven Adaptive Data Augmentation Framework for Image Classification

Suorong Yang · Furao Shen · Jian Zhao

Strong Double Blind: This paper was not made available on public preprint services during the review process.
Tue 1 Oct 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

Data augmentation (DA) has been widely used to improve the generalization of deep neural networks. Despite its effectiveness, most existing DA methods apply augmentation operations with random magnitudes to each sample, which can introduce noise, induce distribution shifts, and elevate the risk of overfitting. In this paper, we propose EntAugment, a tuning-free and adaptive DA framework. Unlike previous work, EntAugment dynamically assesses and adjusts the augmentation magnitude for each sample during training, leveraging insights into both the inherent complexity of training samples and the evolving state of the deep model. Specifically, in EntAugment, the magnitudes are determined by the information entropy of the probability distribution obtained by applying the softmax function to the model's output. Additionally, to further enhance the efficacy of EntAugment, we extend it with a novel entropy regularization term, EntLoss, for better generalization. Theoretical analysis demonstrates that EntLoss, compared to the traditional cross-entropy loss, achieves closer alignment between the model distribution and the underlying dataset distribution. EntAugment and EntLoss can be used separately or jointly. We conduct extensive experiments across multiple image classification tasks and network architectures, with thorough comparisons against existing DA methods. The proposed methods outperform others without introducing auxiliary models or noticeable extra computational cost, highlighting both their effectiveness and efficiency. Code will be made available soon.
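As a rough illustration of the mechanism described in the abstract, the sketch below computes a per-sample magnitude from the entropy of the softmax output and pairs the cross-entropy loss with an entropy term. The normalization by the log of the class count, the direction of the entropy-to-magnitude mapping, and the weighting coefficient lam are assumptions made here for illustration, not the paper's exact formulation.

import math

import torch
import torch.nn.functional as F


def entropy_magnitude(logits: torch.Tensor) -> torch.Tensor:
    """Per-sample augmentation magnitude in [0, 1] derived from prediction entropy.

    One plausible reading of "entropy-driven adaptive magnitudes": confident
    (low-entropy) predictions map to larger magnitudes, uncertain ones to
    smaller magnitudes. The paper's exact mapping may differ.
    """
    probs = F.softmax(logits, dim=-1)
    log_probs = F.log_softmax(logits, dim=-1)
    entropy = -(probs * log_probs).sum(dim=-1)        # Shannon entropy per sample
    entropy = entropy / math.log(logits.size(-1))     # normalize to [0, 1] by log(num_classes)
    return 1.0 - entropy                              # assumed mapping, not the paper's


def entropy_regularized_loss(logits: torch.Tensor, targets: torch.Tensor,
                             lam: float = 0.1) -> torch.Tensor:
    """Cross-entropy plus an entropy term, a hypothetical stand-in for EntLoss.

    `lam` is an assumed weighting hyperparameter; the paper's EntLoss
    formulation and weighting are not reproduced here.
    """
    ce = F.cross_entropy(logits, targets)
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
    return ce + lam * entropy

In a training loop, entropy_magnitude would be evaluated on the current model's outputs for each mini-batch and the resulting values passed to the augmentation operations as their strength parameters, so that magnitudes adapt as training progresses; this usage pattern is likewise an assumption based on the abstract.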
