Sequence Length Independent Norm-Based Generalization Bounds for Transformers

Jacob Trauger · Ambuj Tewari

MR1 & MR2 - Number 144
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT


This paper provides norm-based generalization bounds for the Transformer architecture that do not depend on the input sequence length. We employ a covering-number-based approach to prove our bounds, using three novel covering number bounds for the function class of bounded linear mappings to upper bound the Rademacher complexity of the Transformer. Furthermore, we show this generalization bound applies to the common Transformer training technique of masking and then predicting the masked word. We also run a simulated study on a sparse majority data set that empirically validates our theoretical findings.
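The abstract does not specify the construction of the sparse majority data set. As a hedged illustration only (the exact dimensions and construction in the paper may differ), a common form of such a task uses binary inputs whose label is the majority vote over a small subset of the coordinates, so the label is independent of most of the sequence:

```python
import numpy as np

def sparse_majority_dataset(n, d, k, seed=None):
    """Generate n examples of dimension d whose label is the
    majority vote over the first k coordinates (k odd avoids ties).
    This is an assumed construction for illustration, not the
    paper's exact setup."""
    rng = np.random.default_rng(seed)
    # Binary +/-1 inputs; only the first k coordinates are relevant.
    X = rng.choice([-1.0, 1.0], size=(n, d))
    y = np.sign(X[:, :k].sum(axis=1))
    return X, y

X, y = sparse_majority_dataset(n=1000, d=64, k=7, seed=0)
```

Because the label depends on a fixed small subset regardless of `d`, such a task is a natural probe for whether a learned model's generalization degrades with sequence length.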
