
Poster

Sequence Length Independent Norm-Based Generalization Bounds for Transformers

Jacob Trauger · Ambuj Tewari

Multipurpose Room 2 - Number 144

Abstract:

This paper provides norm-based generalization bounds for the Transformer architecture that do not depend on the input sequence length. We employ a covering-number-based approach to prove our bounds, using three novel covering number bounds for the function class of bounded linear mappings to upper bound the Rademacher complexity of the Transformer. Furthermore, we show that these generalization bounds apply to the common Transformer training technique of masking and then predicting the masked word. We also run a simulation study on a sparse majority dataset that empirically validates our theoretical findings.
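For context, a standard route from covering numbers to Rademacher complexity is Dudley's entropy integral, shown below in a common form (constants vary by source). This is the generic chaining relation, not the paper's exact statement; under this relation, covering number bounds for the Transformer function class that do not grow with sequence length yield Rademacher complexity bounds, and hence generalization bounds, with the same property:

```latex
\widehat{\mathfrak{R}}_S(\mathcal{F})
\;\le\;
\inf_{\alpha > 0}\left(
4\alpha \;+\; \frac{12}{\sqrt{n}}
\int_{\alpha}^{\sup_{f \in \mathcal{F}} \|f\|_2}
\sqrt{\log \mathcal{N}\!\left(\mathcal{F}, \varepsilon, \|\cdot\|_2\right)}
\, d\varepsilon
\right)
```

The abstract does not describe the exact construction of the sparse majority dataset, so the following is only a hypothetical sketch of one plausible version: sequences of ±1 tokens whose label is the majority vote over a small fixed subset of positions. The function `make_sparse_majority` and all its parameters are assumptions for illustration:

```python
import numpy as np

def make_sparse_majority(n_samples, seq_len=64, k=5, seed=0):
    """Hypothetical sparse majority data: the label is the majority vote
    over a fixed sparse subset of k token positions (assumed construction,
    not taken from the paper)."""
    rng = np.random.default_rng(seed)
    # Fixed sparse support shared by all samples; k is odd so ties cannot occur.
    relevant = rng.choice(seq_len, size=k, replace=False)
    # Sequences of +/-1 tokens.
    X = rng.choice([-1.0, 1.0], size=(n_samples, seq_len))
    # Label = sign of the sum over the sparse support (majority vote).
    y = np.sign(X[:, relevant].sum(axis=1))
    return X, y, relevant

X, y, support = make_sparse_majority(1000)
```

A task of this shape is a natural probe for sequence length independence: the label depends on a fixed number of positions regardless of `seq_len`, so one can grow the sequence length while holding the intrinsic difficulty fixed.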
