Nonparametric Topological Layers in Neural Networks. (arXiv:2111.14829v1 [cs.LG])

Various topological techniques and tools have been applied to neural networks
in terms of network complexity, explainability, and performance. One
fundamental assumption of this line of research is the existence of a global
(Euclidean) coordinate system upon which the topological layer is constructed.
Despite promising results, such a topologization method has yet to be
widely adopted because the parametrization of a topologization layer takes a
considerable amount of time and, more importantly, lacks a theoretical
foundation, without which the neural network achieves only suboptimal
performance. This paper proposes a learnable topological layer for neural
networks that does not require a Euclidean space; instead, the proposed
construction requires nothing more than a general metric space equipped with
an inner product, i.e., a Hilbert space. Accordingly, the parametrization of
the proposed topological layer is free of user-specified hyperparameters,
which eliminates the costly parametrization stage and the corresponding risk
of suboptimal networks.
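
The abstract does not spell out the construction itself, so the following is only a minimal, hypothetical sketch of what a hyperparameter-free topological layer can look like, not the paper's actual method. It computes 0-dimensional persistence of a point cloud: for a Vietoris-Rips filtration, the H0 death times are exactly the edge weights of a minimum spanning tree, which can be extracted with Prim's algorithm directly from the pairwise-distance matrix. The distance used is the one induced by the ambient inner product, so nothing beyond Hilbert-space structure is assumed; the class name H0PersistenceLayer and the output width k are illustrative choices, not names from the paper.

```python
import torch
import torch.nn as nn


class H0PersistenceLayer(nn.Module):
    """Illustrative sketch of a nonparametric topological layer.

    Computes the 0-dimensional persistence (H0 death times) of a point
    cloud via a minimum spanning tree of its pairwise-distance matrix and
    returns the k largest death times as a differentiable feature vector.
    The layer has no trainable weights; k only fixes the output width.
    Assumes at least two input points.
    """

    def __init__(self, k: int = 8):
        super().__init__()
        self.k = k  # number of persistence values kept (output width)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n, d) point cloud; the metric below is induced by the
        # inner product of the ambient Hilbert space.
        dist = torch.cdist(x, x)  # (n, n) pairwise distances
        n = x.shape[0]
        in_tree = torch.zeros(n, dtype=torch.bool, device=x.device)
        in_tree[0] = True
        # min_dist[j]: current distance from vertex j to the growing tree.
        min_dist = dist[0].clone()
        deaths = []
        for _ in range(n - 1):
            # Mask vertices already in the tree, then attach the nearest one.
            masked = torch.where(
                in_tree, torch.full_like(min_dist, float("inf")), min_dist
            )
            j = torch.argmin(masked)
            deaths.append(masked[j])  # H0 death time = MST edge weight
            in_tree[j] = True
            min_dist = torch.minimum(min_dist, dist[j])
        deaths = torch.stack(deaths)
        # Vectorize the barcode: k largest death times, zero-padded.
        topk = torch.topk(deaths, min(self.k, deaths.numel())).values
        if topk.numel() < self.k:
            topk = torch.cat([topk, topk.new_zeros(self.k - topk.numel())])
        return topk
```

Because the selected death times are entries of the distance matrix, gradients flow back into x through the chosen edges, e.g. features = H0PersistenceLayer(k=8)(x) for an (n, d) tensor x; no hyperparameter search is needed beyond choosing the output width.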

Source: https://arxiv.org/abs/2111.14829
