Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
12-2023
Abstract
Foundation models like CLIP allow zero-shot transfer to various tasks without additional training data. Yet, zero-shot performance is less competitive than that of a fully supervised model. Thus, fine-tuning and ensembling are commonly adopted to better fit downstream tasks. However, we argue that such prior work has overlooked the inherent biases in foundation models. Because of their highly imbalanced Web-scale training sets, these foundation models are inevitably skewed toward frequent semantics, so the subsequent fine-tuning or ensembling remains biased. In this study, we systematically examine these biases and demonstrate the efficacy of our proposed Generalized Logit Adjustment (GLA) method. Note that bias estimation in foundation models is challenging, as most pre-training data cannot be explicitly accessed as in traditional long-tailed classification tasks. To this end, GLA adopts an optimization-based bias estimation approach for debiasing foundation models. As our work resolves a fundamental flaw in pre-training, the proposed GLA demonstrates significant improvements across a diverse range of tasks: a 1.5 pp accuracy gain on ImageNet, a large average improvement (1.4-4.6 pp) on 11 few-shot datasets, and a 2.4 pp gain on long-tailed classification.
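To make the abstract's mechanism concrete, below is a minimal sketch, not the authors' released implementation, of the two ingredients it describes: estimating a per-class label-bias vector from zero-shot logits by gradient-based optimization, and combining the debiased zero-shot logits with a fine-tuned model's logits. The specific balancing objective (driving the average debiased prediction toward uniform), the function names estimate_label_bias and gla_ensemble, and the simple logit-sum ensemble are illustrative assumptions rather than details taken from the paper.

import torch

def estimate_label_bias(zs_logits, num_steps=500, lr=0.1):
    """Estimate a per-class bias vector b so that, after subtracting b,
    the zero-shot model's average predicted class distribution is uniform.
    Illustrative objective only; the paper derives its own estimator."""
    num_classes = zs_logits.shape[1]
    b = torch.zeros(num_classes, requires_grad=True)
    opt = torch.optim.SGD([b], lr=lr)
    uniform = torch.full((num_classes,), 1.0 / num_classes)
    for _ in range(num_steps):
        opt.zero_grad()
        # Average predicted class distribution after removing the bias.
        avg_pred = torch.softmax(zs_logits - b, dim=1).mean(dim=0)
        # KL divergence between the average prediction and uniform.
        loss = torch.nn.functional.kl_div(avg_pred.log(), uniform, reduction="sum")
        loss.backward()
        opt.step()
    return b.detach()

def gla_ensemble(ft_logits, zs_logits, bias):
    """Combine fine-tuned and debiased zero-shot logits (simple sum)."""
    return ft_logits + (zs_logits - bias)

# Toy usage with random tensors standing in for real model outputs.
zs = torch.randn(256, 10)   # zero-shot CLIP logits
ft = torch.randn(256, 10)   # fine-tuned model logits
bias = estimate_label_bias(zs)
preds = gla_ensemble(ft, zs, bias).argmax(dim=1)

The point of the sketch is the abstract's key claim: the bias is a property of the zero-shot (pre-trained) model's outputs, so it can be estimated and removed without access to the Web-scale pre-training data itself.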
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering
Publication
Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS 2023), New Orleans, December 10-16
First Page
1
Last Page
18
Publisher
NeurIPS
City or Country
New Orleans
Citation
ZHU, Beier; TANG, Kaihua; SUN, Qianru; and ZHANG, Hanwang.
Generalized logit adjustment: Calibrating fine-tuned models by removing label bias in foundation models. (2023). Proceedings of the 37th Conference on Neural Information Processing Systems (NeurIPS 2023), New Orleans, December 10-16. 1-18.
Available at: https://ink.library.smu.edu.sg/sis_research/8473
Copyright Owner and License
Authors
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.