An Adaptive Batch Normalization in Deep Learning. (arXiv:2211.02050v1 [cs.LG])

Batch Normalization (BN) is a way to accelerate and stabilize training in
deep convolutional neural networks. However, BN is applied uniformly
throughout the network, even though some training data may not require it.
In this research work, we propose a threshold-based adaptive BN approach that
separates the data that requires BN from the data that does not. Experimental
evaluation on MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100 demonstrates that
the proposed approach outperforms traditional BN, mostly at small batch sizes.
It also reduces the occurrence of internal variable transformation, increasing
network stability.
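The abstract does not specify the gating criterion, so the following is only a minimal sketch of what a threshold-based adaptive BN could look like. The choice of per-feature batch variance as the gating signal, the `threshold` parameter, and the function name `adaptive_batch_norm` are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def adaptive_batch_norm(x, threshold=1.0, eps=1e-5):
    """Sketch of a threshold-based adaptive BN (hypothetical criterion).

    Normalizes only those features whose batch variance exceeds
    `threshold`; low-variance features pass through unchanged,
    skipping the normalization they may not require.
    """
    mean = x.mean(axis=0)            # per-feature batch mean
    var = x.var(axis=0)              # per-feature batch variance
    # Hypothetical gating rule: normalize high-variance features only.
    gate = var > threshold
    x_norm = (x - mean) / np.sqrt(var + eps)
    return np.where(gate, x_norm, x)
```

Under this sketch, a feature column with large batch variance is rescaled to roughly zero mean and unit variance, while a nearly constant column is left untouched.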
