Poster
Tensor Network-Constrained Kernel Machines as Gaussian Processes
Frederiek Wesel
In this paper we establish a new connection between Tensor Network (TN)-constrained kernel machines and Gaussian Processes (GPs). We prove that the outputs of Canonical Polyadic Decomposition (CPD) and Tensor Train (TT)-constrained kernel machines converge in the limit of large ranks to the same GP, which we fully characterize, when specifying appropriate i.i.d. priors across their components. We show that TT-constrained models achieve faster convergence to the GP compared to their CPD counterparts for the same number of model parameters. The convergence to the GP occurs as the ranks tend to infinity, as opposed to the standard approach, which introduces TNs as an additional constraint on the posterior. This implies that the newly established priors allow the models to learn features more freely, as they necessitate infinitely more parameters to converge to a GP, which is characterized by a fixed feature representation and thus no feature learning. As a consequence, the newly derived priors yield more flexible models which can better fit the data, albeit at an increased risk of overfitting. We demonstrate these considerations by means of two numerical experiments.
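The convergence-in-distribution claim can be illustrated numerically. The sketch below, which is not from the paper, simulates a CPD-constrained kernel machine f(x) = Σ_r Π_d ⟨φ(x_d), w_r^(d)⟩ with i.i.d. zero-mean Gaussian priors on the factor entries, using an assumed per-entry variance scaling of R^(-1/D) so that the output variance stays O(1); the paper's exact prior specification may differ. The names `phi`, `R`, and `cpd_kernel_machine_samples` are hypothetical. As the rank R grows, the Kolmogorov-Smirnov distance between the sampled outputs and a fitted Gaussian shrinks, consistent with a CLT-style convergence to a GP marginal.

```python
import numpy as np
from scipy import stats

def cpd_kernel_machine_samples(phi, R, n_samples, rng):
    """Sample outputs f(x) = sum_r prod_d <phi(x_d), w_r^(d)> of a
    CPD-constrained kernel machine, with i.i.d. N(0, R^(-1/D)) priors
    on the factor entries (an assumed scaling, chosen so that the
    output variance is independent of the rank R)."""
    D, M = phi.shape  # D modes, feature dimension M per mode
    # Factor entries: shape (n_samples, R, D, M), i.i.d. Gaussian.
    W = rng.normal(scale=R ** (-0.5 / D), size=(n_samples, R, D, M))
    # Inner products <phi(x_d), w_r^(d)> for every sample, rank, mode.
    inner = np.einsum('srdm,dm->srd', W, phi)
    # Product over the D modes, then sum over the R rank-one terms.
    return inner.prod(axis=-1).sum(axis=-1)

rng = np.random.default_rng(0)
phi = rng.normal(size=(3, 10))  # fixed feature vectors phi(x_d) at one input x
phi /= np.linalg.norm(phi, axis=1, keepdims=True)  # unit norm per mode

for R in (1, 10, 1000):
    f = cpd_kernel_machine_samples(phi, R, 20_000, rng)
    # KS distance between standardized outputs and a standard Gaussian;
    # it decreases as the rank R grows.
    ks = stats.kstest((f - f.mean()) / f.std(), 'norm').statistic
    print(f"R={R:5d}  KS distance to Gaussian: {ks:.3f}")
```

For small R the output distribution is heavy-tailed (a sum of few products of Gaussians), which is where the extra flexibility, and the overfitting risk, of the finite-rank prior resides; large R washes this out into a fixed Gaussian marginal.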