Poster

On the number of linear functions composing deep neural network: Towards a refined definition of neural networks complexity

Yuuki Takai · Akiyoshi Sannai · Matthieu Cordonnier

Keywords: [ Deep Learning ] [ Theory ]

Wed 14 Apr 6 a.m. PDT — 8 a.m. PDT

Abstract:

The classical approach to measuring the expressive power of deep neural networks with piecewise linear activations is to count their maximum number of linear regions. This complexity measure is well suited to understanding general properties of the expressivity of neural networks, such as the benefit of depth over width. Nevertheless, it proves limited when it comes to comparing the expressivity of different network architectures. This limitation becomes particularly prominent for permutation-invariant networks, owing to the symmetric redundancy among their linear regions. To tackle this, we propose a refined definition of piecewise linear function complexity: instead of counting the number of linear regions directly, we first introduce an equivalence relation among the linear functions composing a piecewise linear function, and then count those linear functions relative to that equivalence relation. Our new complexity measure clearly distinguishes between the two aforementioned models, is consistent with the classical measure, and increases exponentially with depth.
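The gap between the two counts can be seen on a toy example (our own construction, not taken from the paper): the permutation-invariant function f(x) = Σᵢ ReLU(xᵢ) on R^d is linear on each orthant, so the classical measure counts 2^d linear pieces, while many of those pieces are the same linear function up to a permutation of input coordinates. A minimal sketch, identifying each piece by its gradient and each equivalence class by the sorted gradient:

```python
from itertools import product

# Toy permutation-invariant network: f(x) = sum_i ReLU(x_i) on R^d.
# On each orthant, f is linear with gradient g_i = 1 if x_i > 0 else 0.
d = 3

gradients = set()  # classical count: distinct linear pieces
classes = set()    # refined count: pieces up to coordinate permutation
for signs in product((-1.0, 1.0), repeat=d):       # one point per orthant
    grad = tuple(int(s > 0) for s in signs)        # gradient on this orthant
    gradients.add(grad)
    classes.add(tuple(sorted(grad)))               # permutation-equivalence class

print(len(gradients))  # 2**d = 8 linear pieces
print(len(classes))    # d + 1 = 4 equivalence classes
```

The classical region count grows exponentially in d, whereas the count modulo permutations grows only linearly, which is the kind of symmetric redundancy the refined measure is designed to factor out.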
