Improving Robustness and Efficiency in Active Learning with Contrastive Loss. (arXiv:2109.06873v1 [cs.LG])

This paper introduces supervised contrastive active learning (SCAL), which
leverages the contrastive loss for active learning in a supervised setting. We
propose efficient query strategies that select unbiased, informative data
samples with diverse feature representations. We demonstrate that our proposed
method reduces sampling bias and achieves state-of-the-art accuracy and model
calibration in an active learning setup, with query computation 11x faster than
CoreSet and 26x faster than Bayesian active learning by disagreement. Our
method yields well-calibrated models even on imbalanced datasets. We also
evaluate robustness to dataset shift and out-of-distribution data in the active
learning setup and demonstrate that SCAL outperforms high-performing,
compute-intensive methods by a considerable margin (on average, 8.9% higher
AUROC for out-of-distribution detection and 7.2% lower ECE under dataset
shift).
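
The abstract does not spell out the query strategy, but a rough intuition is to pick unlabeled samples whose contrastive feature embeddings are poorly covered by the labeled pool. The sketch below is an illustrative approximation of that idea, not the paper's actual SCAL algorithm: it scores unlabeled points by cosine similarity to class prototypes computed from labeled embeddings and queries the least-covered ones. All function names, the prototype scoring rule, and the stand-in data are assumptions for illustration.

```python
# Illustrative sketch of a feature-diversity query step, assuming embeddings
# come from an encoder trained with a supervised contrastive loss.
import numpy as np

def class_prototypes(embeddings: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Compute one L2-normalized mean embedding per class from the labeled pool."""
    protos = []
    for c in np.unique(labels):
        mean = embeddings[labels == c].mean(axis=0)
        protos.append(mean / (np.linalg.norm(mean) + 1e-12))
    return np.stack(protos)

def query_diverse_samples(unlabeled_emb: np.ndarray,
                          prototypes: np.ndarray,
                          budget: int) -> np.ndarray:
    """Select `budget` unlabeled samples least similar to any class prototype,
    i.e. samples under-represented in the current labeled feature space."""
    emb = unlabeled_emb / (np.linalg.norm(unlabeled_emb, axis=1, keepdims=True) + 1e-12)
    # Cosine similarity to the nearest prototype; low similarity => more novel.
    sim_to_nearest = (emb @ prototypes.T).max(axis=1)
    return np.argsort(sim_to_nearest)[:budget]

# Usage with random stand-in embeddings (128-d, 10 classes):
rng = np.random.default_rng(0)
labeled_emb = rng.normal(size=(100, 128))
labeled_y = rng.integers(0, 10, size=100)
unlabeled_emb = rng.normal(size=(1000, 128))
protos = class_prototypes(labeled_emb, labeled_y)
query_idx = query_diverse_samples(unlabeled_emb, protos, budget=32)
```

Because the score is a single matrix product against a handful of prototypes, a selection rule of this kind is cheap per query round, which is consistent with the speedups over CoreSet and BALD reported in the abstract.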

Source: https://arxiv.org/abs/2109.06873

