A Survey of Deep Learning: From Activations to Transformers. (arXiv:2302.00722v1 [cs.LG])

Deep learning has made tremendous progress over the last decade. A key success
factor is the large number of architectures, layers, objectives, and
optimization techniques that have emerged in recent years. These include
myriad variants of attention, normalization, skip connections, transformers,
and self-supervised learning schemes, to name a few. We provide a
comprehensive overview of the most important recent works in these areas for
readers who already have a basic understanding of deep learning. We hope that
a holistic, unified treatment of these influential works helps researchers
form new connections between diverse areas of deep learning.

Source: https://arxiv.org/abs/2302.00722
