Abstract
In this paper, we propose Epsilon Consistent Mixup (ϵmu). ϵmu is a data-based structural regularization technique that combines Mixup's linear interpolation with consistency regularization in the Mixup direction, by compelling a simple adaptive tradeoff between the two. This learnable combination of consistency and interpolation induces a more flexible structure on the evolution of the response across the feature space and is shown to improve semi-supervised classification accuracy on the SVHN and CIFAR10 benchmark datasets, yielding the largest gains in the most challenging low label-availability scenarios. Empirical studies comparing ϵmu and Mixup are presented and provide insight into the mechanisms behind ϵmu's effectiveness. In particular, ϵmu is found to produce more accurate synthetic labels and more confident predictions than Mixup.
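The abstract describes a loss that blends Mixup's interpolated targets with a consistency target along the Mixup direction, weighted by a learnable tradeoff. Below is a minimal PyTorch sketch of what such a combination could look like; the function name `emu_step`, the sigmoid-parameterized weight `w_logit`, and the cross-entropy form of the consistency term are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch (not the authors' code): one training step combining
# Mixup interpolation with a consistency term in the Mixup direction,
# traded off by a learnable weight in (0, 1).
import torch
import torch.nn.functional as F

def emu_step(model, x, y_onehot, w_logit, alpha=1.0):
    """x: input batch, y_onehot: one-hot labels,
    w_logit: learnable scalar controlling the interpolation-consistency tradeoff."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1 - lam) * x[perm]          # Mixup interpolation of inputs

    logits_mix = model(x_mix)
    with torch.no_grad():
        p_a = F.softmax(model(x), dim=1)           # predictions at the endpoints
        p_b = F.softmax(model(x[perm]), dim=1)

    # Mixup target (interpolated labels) vs. consistency target (interpolated predictions)
    y_interp = lam * y_onehot + (1 - lam) * y_onehot[perm]
    p_interp = lam * p_a + (1 - lam) * p_b

    log_p_mix = F.log_softmax(logits_mix, dim=1)
    loss_mixup = -(y_interp * log_p_mix).sum(dim=1).mean()
    loss_consistency = -(p_interp * log_p_mix).sum(dim=1).mean()

    w = torch.sigmoid(w_logit)                     # adaptive tradeoff in (0, 1)
    return w * loss_mixup + (1 - w) * loss_consistency
```

In this sketch, `w_logit` would be registered as an `nn.Parameter` and passed to the optimizer alongside the model's parameters, so the tradeoff between interpolation and consistency is learned during training rather than fixed by hand.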
| Original language | English (US) |
|---|---|
| Article number | e425 |
| Journal | Stat |
| Volume | 11 |
| Issue number | 1 |
| DOIs | |
| State | Published - Dec 2022 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty