Adversarial Mixing Policy for Relaxing Locally Linear Constraints in Mixup

Guang Liu, Yuzhao Mao, Huang Hailong, Gao Weiguo, Li Xuan


Abstract
Mixup is a recent regularizer for deep classification networks. By training a neural network on convex combinations of pairs of examples and their labels, it imposes locally linear constraints on the model’s input space. However, such strict linear constraints often lead to under-fitting, which degrades the effect of regularization. Notably, this issue becomes more serious when resources are extremely limited. To address these issues, we propose the Adversarial Mixing Policy (AMP), organized in a “min-max-rand” formulation, to relax the locally linear constraints in Mixup. Specifically, AMP adds a small adversarial perturbation to the mixing coefficients rather than to the examples. Thus, slight non-linearity is injected between the synthetic examples and synthetic labels. By training on these data, the deep networks are further regularized and thus achieve a lower predictive error rate. Experiments on five text classification benchmarks and five backbone models empirically show that our method reduces the error rate over Mixup variants by a significant margin (up to 31.3%), especially in low-resource conditions (up to 17.5%).
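The abstract's core idea can be sketched in a few lines. The snippet below is an illustrative sketch only, not the authors' implementation: standard Mixup uses one mixing coefficient for both inputs and labels, while an AMP-style variant perturbs the coefficient used on the input side, injecting slight non-linearity between the synthetic example and its synthetic label. In the paper the perturbation is found adversarially (the "max" step of the min-max-rand formulation); here a random perturbation of hypothetical magnitude `eps` stands in for that step to keep the sketch self-contained, and the function names `mixup` and `amp_mixup` are our own.

```python
import numpy as np

def mixup(x1, x2, y1, y2, lam):
    """Standard Mixup: the same coefficient mixes inputs and labels."""
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y

def amp_mixup(x1, x2, y1, y2, rng, eps=0.1):
    """AMP-style sketch (illustrative, not the paper's algorithm).

    The "rand" step samples a mixing coefficient; a small perturbation
    is then applied only to the input-side coefficient, so the synthetic
    example is no longer exactly the convex point implied by its label.
    The real method chooses the perturbation adversarially via gradients;
    a random delta stands in for that here.
    """
    lam = rng.beta(1.0, 1.0)                        # "rand": sample lambda
    delta = eps * rng.uniform(-1.0, 1.0)            # stand-in for the adversarial step
    lam_x = float(np.clip(lam + delta, 0.0, 1.0))   # perturbed coefficient for inputs
    x = lam_x * x1 + (1.0 - lam_x) * x2             # inputs use the perturbed coefficient
    y = lam * y1 + (1.0 - lam) * y2                 # labels keep the unperturbed coefficient
    return x, y
```

Keeping the label mix at the unperturbed coefficient is what relaxes the locally linear constraint: the model must map a slightly off-line input to the on-line label.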
Anthology ID:
2021.emnlp-main.238
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2998–3008
URL:
https://aclanthology.org/2021.emnlp-main.238
DOI:
10.18653/v1/2021.emnlp-main.238
Cite (ACL):
Guang Liu, Yuzhao Mao, Huang Hailong, Gao Weiguo, and Li Xuan. 2021. Adversarial Mixing Policy for Relaxing Locally Linear Constraints in Mixup. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2998–3008, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Adversarial Mixing Policy for Relaxing Locally Linear Constraints in Mixup (Liu et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.238.pdf
Video:
https://aclanthology.org/2021.emnlp-main.238.mp4
Code:
pai-smallisallyourneed/mixup-amp
Data:
SST