How Jellyfish Characterise Alternating Group Equivariant Neural Networks
Abstract
We provide a full characterisation of all of the possible alternating group ($A_n$) equivariant neural networks whose layers are some tensor power of $\mathbb{R}^{n}$. In particular, we find a basis of matrices for the learnable, linear, $A_n$-equivariant layer functions between such tensor power spaces in the standard basis of $\mathbb{R}^{n}$. We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
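To make the equivariance condition concrete, the sketch below (not the paper's construction; all names and the choice of layer are illustrative assumptions) numerically checks that a linear layer $W : \mathbb{R}^{n} \to \mathbb{R}^{n}$ commutes with every even permutation matrix, i.e. $W P_\sigma = P_\sigma W$ for all $\sigma \in A_n$. The layer $aI + bJ$ (with $J$ the all-ones matrix) is a standard example that commutes with every permutation matrix and is therefore $A_n$-equivariant.

```python
# Minimal sketch: verify A_n-equivariance of a candidate linear layer on R^n.
# This is an illustration of the equivariance condition only, not the basis
# construction from the paper.
import itertools
import numpy as np

def permutation_matrix(perm):
    """Return the n x n matrix P with P[i, perm[i]] = 1."""
    n = len(perm)
    P = np.zeros((n, n))
    P[np.arange(n), perm] = 1.0
    return P

def is_even(perm):
    """A permutation lies in A_n iff its sign is +1 (even number of transpositions)."""
    perm = list(perm)
    sign, visited = 1, [False] * len(perm)
    for i in range(len(perm)):
        if not visited[i]:
            j, cycle_len = i, 0
            while not visited[j]:
                visited[j] = True
                j = perm[j]
                cycle_len += 1
            if cycle_len % 2 == 0:  # a cycle of even length flips the sign
                sign = -sign
    return sign == 1

n = 4
a, b = 1.5, -0.3
W = a * np.eye(n) + b * np.ones((n, n))  # candidate equivariant layer a*I + b*J

# Check W P = P W for every even permutation of {0, ..., n-1}.
for perm in itertools.permutations(range(n)):
    if is_even(perm):
        P = permutation_matrix(perm)
        assert np.allclose(W @ P, P @ W)
print("W commutes with every element of A_n acting on R^n")
```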