Publications

Efficient equivariant learning using algebraic invariant theory

Under review, 2024 (pre-print coming soon to arXiv)

Abstract: We exploit algebraic invariant theory to provide a natural structure to equivariant learning algorithms. In particular, this avoids repeated averaging over group orbits, which is a common inefficiency in existing equivariant learning implementations. Invariant theory provides a flexible and universal theoretical framework for equivariant learning without the need for architectures tailored to specific groups. We provide a Python package to calculate algebraic generators for equivariant functions, which underpin practical implementations of this framework, and demonstrate the efficiency of our approach in the context of equivariant neural fields.
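To make the core idea concrete, here is a minimal illustrative sketch (not the paper's package, whose API is not described here): it contrasts explicit averaging over a group orbit with parameterising an invariant function through algebraic generators of the invariant ring. The example uses the symmetric group S_n acting on R^n by permutations, with power sums as generators; the function and variable names (`group_average`, `invariant_model`, `power_sums`) are hypothetical.

```python
# Toy contrast: orbit averaging vs. generator-based invariant parameterisation.
# Assumes the symmetric group S_n acting on R^n by permuting coordinates.
import itertools
import numpy as np


def group_average(f, x):
    """Symmetrise f over all permutations of x; cost grows as n!."""
    perms = itertools.permutations(range(len(x)))
    return np.mean([f(x[list(p)]) for p in perms])


def power_sums(x, k_max):
    """First k_max power-sum invariants p_k = sum_i x_i**k.
    For k_max = n these generate the ring of S_n-invariant polynomials."""
    return np.array([np.sum(x ** k) for k in range(1, k_max + 1)])


def invariant_model(x, weights):
    """Any function of the generators is S_n-invariant by construction,
    so no averaging over the n! group elements is needed."""
    feats = power_sums(x, k_max=len(weights))
    return float(weights @ feats)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=5)
    w = rng.normal(size=5)

    # Invariance check: permuting the input leaves the output unchanged.
    x_perm = x[rng.permutation(len(x))]
    print(invariant_model(x, w), invariant_model(x_perm, w))

    # Orbit averaging achieves the same invariance, but at factorial cost.
    f = lambda v: float(w @ v ** 2)  # an arbitrary non-invariant function
    print(group_average(f, x), group_average(f, x_perm))
```

The generator-based model evaluates n fixed invariant features once per input, whereas the averaged model must enumerate all n! permutations; this is the inefficiency the abstract refers to, sketched here for one specific group rather than in the general setting treated by the paper.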