Authors: Akiyoshi Sannai, Masaaki Imaizumi
ArXiv: 1910.06552
Abstract URL: https://arxiv.org/abs/1910.06552v2
A large number of group invariant (or equivariant) networks have succeeded in handling invariant data such as point clouds and graphs. However, generalization theory for such networks has not been well developed, because essential ingredients of generalization theory, such as network size and margin distributions, are not well suited to explaining invariance and equivariance. In this paper, we develop a generalization error bound for invariant and equivariant deep neural networks. To describe the effect of these properties on generalization, we introduce a quotient feature space, which measures the effect of the group action for invariance or equivariance. Our main theorem proves that the volume of the quotient feature space largely improves the main term of the developed bound. We apply our result to specific invariant and equivariant networks, such as DeepSets (Zaheer et al. (2017)), and show that their generalization bound is drastically improved by a factor of $\sqrt{n!}$, where $n$ is the number of permuting coordinates of the data. Moreover, we discuss the representation power of invariant DNNs and show that they can achieve an optimal approximation rate. This paper is the first study to provide a general and tight generalization bound for a broad class of group invariant and equivariant deep neural networks.
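To make the permutation-invariance setting concrete, the following is a minimal NumPy sketch (not from the paper) of a DeepSets-style function $f(x) = \rho\left(\sum_i \phi(x_i)\right)$: the sum pooling over set elements makes the output invariant to any permutation of the $n$ coordinates, which is the symmetry exploited by the $\sqrt{n!}$ improvement. The weight matrices and dimensions here are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights for a tiny DeepSets-style network:
# phi embeds each set element independently; rho reads out the pooled sum.
W_phi = rng.normal(size=(3, 8))   # per-element embedding weights
W_rho = rng.normal(size=(8, 1))   # readout weights after pooling

def deepsets(x):
    """Permutation-invariant function f(x) = rho(sum_i phi(x_i)).

    x has shape (n, 3): a set of n elements, each a 3-dim feature vector.
    Summing over elements makes the output invariant to any
    permutation of the n coordinates.
    """
    h = np.tanh(x @ W_phi)        # phi applied element-wise, shape (n, 8)
    pooled = h.sum(axis=0)        # permutation-invariant sum pooling
    return float(pooled @ W_rho)  # rho readout

x = rng.normal(size=(5, 3))       # a set of n = 5 elements
perm = rng.permutation(5)
# The output is unchanged under any permutation of the set elements.
assert np.isclose(deepsets(x), deepsets(x[perm]))
```

Because every ordering of the input maps to the same output, the effective hypothesis class acts on a quotient of the feature space rather than on all $n!$ orderings, which is the intuition behind the improved bound.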