When in Doubt: Improving Classification Performance with Alternating Normalization
Author: | Jia, Menglin; Reiter, Austin; Lim, Ser-Nam; Artzi, Yoav; Cardie, Claire |
---|---|
Publication Year: | 2021 |
Subject: | |
Document Type: | Working Paper |
Description: | We introduce Classification with Alternating Normalization (CAN), a non-parametric post-processing step for classification. CAN improves classification accuracy for challenging examples by re-adjusting their predicted class probability distributions using the predicted class distributions of high-confidence validation examples. CAN is easily applicable to any probabilistic classifier, with minimal computational overhead. We analyze the properties of CAN using simulated experiments, and empirically demonstrate its effectiveness across a diverse set of classification tasks. (A hedged sketch of this re-adjustment step follows the record.) Comment: Findings of EMNLP 2021 |
Database: | arXiv |
External Link: | |
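
The abstract describes the re-adjustment only at a high level, so below is a minimal, hypothetical sketch of what a CAN-style step could look like, assuming a Sinkhorn-style scheme that alternates column and row normalization over a matrix built from the high-confidence predicted distributions plus the one uncertain prediction. The function name, the `alpha` sharpening exponent, the iteration count, and the use of a class prior in the column step are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def can_adjust(uncertain_dist, confident_dists, class_prior=None, n_iters=3, alpha=1.0):
    """Hypothetical CAN-style re-adjustment via alternating normalization (sketch only)."""
    b = np.asarray(uncertain_dist, dtype=float)[None, :]   # (1, d) low-confidence row
    A = np.asarray(confident_dists, dtype=float)           # (n, d) high-confidence rows
    L = np.vstack([A, b])                                   # stack into one matrix
    d = L.shape[1]
    q = np.full(d, 1.0 / d) if class_prior is None else np.asarray(class_prior, dtype=float)

    for _ in range(n_iters):
        S = L ** alpha                                       # optional sharpening (assumed)
        # Column step: rescale columns so per-class mass follows the prior q.
        S = S / np.clip(S.sum(axis=0, keepdims=True), 1e-12, None) * q[None, :]
        # Row step: renormalize each row back into a probability distribution.
        L = S / np.clip(S.sum(axis=1, keepdims=True), 1e-12, None)

    # The last row is the re-adjusted distribution for the uncertain example.
    return L[-1]

# Illustrative usage with synthetic distributions.
rng = np.random.default_rng(0)
confident = rng.dirichlet(np.full(3, 0.2), size=50)   # peaked "high-confidence" rows
uncertain = np.array([0.40, 0.35, 0.25])              # flat, low-confidence prediction
print(can_adjust(uncertain, confident))               # still sums to 1, but re-shaped
```

Under these assumptions, the high-confidence rows anchor the per-class column masses toward the class prior, so after a few alternating rounds the last row, the uncertain example, is nudged toward classes that are under-represented among the confident predictions; the exact normalization schedule in the published method may differ.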