Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group. (arXiv:2107.06898v1 [hep-th])

We investigate the analogy between the renormalization group (RG) and deep
neural networks, wherein subsequent layers of neurons are analogous to
successive steps along the RG. In particular, we quantify the flow of
information by explicitly computing the relative entropy or Kullback-Leibler
divergence in both the one- and two-dimensional Ising models under decimation
RG, as well as in a feedforward neural network as a function of depth. We
observe qualitatively identical behavior characterized by a monotonic
increase toward a parameter-dependent asymptotic value. On the quantum field theory
side, the monotonic increase confirms the connection between the relative
entropy and the c-theorem. For the neural networks, the asymptotic behavior may
have implications for various information maximization methods in machine
learning, as well as for disentangling compactness and generalizability.
Furthermore, while both the two-dimensional Ising model and the random neural
networks we consider exhibit non-trivial critical points, the relative entropy
appears insensitive to the phase structure of either system. In this sense,
more refined probes are required in order to fully elucidate the flow of
information in these models.
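To illustrate the kind of computation described in the abstract, here is a minimal sketch of tracking relative entropy along the decimation RG of the one-dimensional Ising model. It is not the paper's actual setup: the chain length, periodic boundary conditions, the initial coupling, and the choice to compare Boltzmann distributions on a fixed small lattice at the initial and flowed couplings are all illustrative assumptions. Only the decimation recursion tanh(K') = tanh(K)^2 and the Kullback-Leibler definition are standard.

```python
import itertools
import numpy as np

def boltzmann(K, N):
    """Exact Boltzmann distribution of a 1D Ising chain of N spins
    with periodic boundaries and H = -K * sum_i s_i s_{i+1}
    (the inverse temperature is absorbed into K)."""
    states = list(itertools.product([-1, 1], repeat=N))
    energies = np.array([-K * sum(s[i] * s[(i + 1) % N] for i in range(N))
                         for s in states])
    weights = np.exp(-energies)
    return weights / weights.sum()

def decimate(K):
    """One decimation RG step for the zero-field 1D Ising model:
    summing out every other spin gives tanh(K') = tanh(K)**2."""
    return np.arctanh(np.tanh(K) ** 2)

def kl(p, q):
    """Relative entropy D(p || q) in nats."""
    return np.sum(p * np.log(p / q))

# Illustrative parameters (assumptions, not taken from the paper):
K0, N = 1.0, 8
p0 = boltzmann(K0, N)

# KL divergence between the initial distribution and the distribution at the
# flowed coupling, as a function of RG step.
K = K0
for step in range(1, 7):
    K = decimate(K)
    print(f"step {step}: K = {K:.4f}, D(p0 || pK) = {kl(p0, boltzmann(K, N)):.4f}")
```

In this toy version the coupling flows to the trivial fixed point K = 0, and the relative entropy grows monotonically toward the finite value D(p0 || uniform), mirroring the parameter-dependent asymptote described above.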

Source: https://arxiv.org/abs/2107.06898
