Approximating Lipschitz continuous functions with GroupSort neural networks

Ugo Tanielian · Gérard Biau

Keywords: [ Deep Learning ] [ Theory ]

[ Abstract ]
Thu 15 Apr 7:30 a.m. PDT — 9:30 a.m. PDT


Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants. Motivated by these observations, we study the recently introduced GroupSort neural networks, with constraints on the weights, and take a theoretical step towards a better understanding of their expressive power. In particular, we show how these networks can represent any Lipschitz continuous piecewise linear function. We also prove that they are well-suited for approximating Lipschitz continuous functions and exhibit upper bounds on both their depth and size. To conclude, the efficiency of GroupSort networks compared with more standard ReLU networks is illustrated in a set of synthetic experiments.
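The GroupSort activation at the heart of these networks partitions the pre-activation vector into contiguous groups of a fixed size and sorts each group. A minimal NumPy sketch of this activation is shown below; the function name, grouping convention (contiguous groups, ascending sort), and default group size of 2 are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def groupsort(x, group_size=2):
    """Sketch of the GroupSort activation: split the last axis into
    contiguous groups of `group_size` units and sort each group in
    ascending order. With group_size=2 this reduces to the MaxMin
    (OPLU) activation. Grouping convention is an assumption here."""
    x = np.asarray(x, dtype=float)
    if x.shape[-1] % group_size != 0:
        raise ValueError("last axis must be divisible by group_size")
    shape = x.shape
    # Sort within each group, then restore the original shape.
    grouped = x.reshape(-1, group_size)
    return np.sort(grouped, axis=-1).reshape(shape)

# Example: groups (3, 1) and (0, 2) are each sorted independently.
print(groupsort([3.0, 1.0, 0.0, 2.0]))  # → [1. 3. 0. 2.]
```

Because sorting a group only permutes its entries, the activation is 1-Lipschitz and norm-preserving, which is what makes it compatible with the weight constraints used to control the network's overall Lipschitz constant.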
