Doubly Mixed-Effects Gaussian Process Regression

Jun Ho Yoon · Daniel Jeong · Seyoung Kim

[ Abstract ]
Wed 30 Mar 8:30 a.m. PDT — 10 a.m. PDT
Oral presentation: Oral 10: Gaussian processes / Optimization / Online ML
Wed 30 Mar 6 a.m. PDT — 7 a.m. PDT


We address the multi-task Gaussian process (GP) regression problem with the goal of decomposing input effects on outputs into components shared across or specific to tasks and samples. We propose a family of mixed-effects GPs, including doubly and translated mixed-effects GPs, that performs such a decomposition while also modeling complex task relationships. Instead of the tensor product widely used in multi-task GPs, we combine task and sample covariance functions with the direct sum and with the Kronecker sum over the Cartesian product of their input spaces. With this kernel, the overall input effects on outputs decompose into four components: fixed effects shared across tasks and shared across samples, and random effects specific to each task and to each sample. We describe an efficient stochastic variational inference method for the proposed models that also significantly reduces the cost of inference for existing mixed-effects GPs. On simulated and real-world data, we demonstrate that our approach yields higher test accuracy and an interpretable decomposition.
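The contrast between the usual tensor-product construction and the Kronecker-sum construction can be illustrated on toy covariance matrices. The sketch below is not the paper's implementation; it only shows, under the assumption of finite task and sample covariance matrices `K_task` and `K_sample`, how the Kronecker product yields a multiplicative interaction while the Kronecker sum (K_task ⊗ I + I ⊗ K_sample) yields the additive structure that mixed-effects models rely on.

```python
import numpy as np

def kron_product(K_task, K_sample):
    # Tensor-product kernel common in multi-task GPs:
    # multiplicative interaction of task and sample covariances.
    return np.kron(K_task, K_sample)

def kron_sum(K_task, K_sample):
    # Kronecker-sum kernel: K_task (+) K_sample
    #   = K_task kron I_n + I_t kron K_sample,
    # giving additive task and sample effects over the
    # Cartesian product of tasks and samples.
    t = K_task.shape[0]
    n = K_sample.shape[0]
    return np.kron(K_task, np.eye(n)) + np.kron(np.eye(t), K_sample)

# Toy 2-task and 2-sample covariance matrices (illustrative values only).
K_task = np.array([[2.0, 1.0], [1.0, 2.0]])
K_sample = np.array([[3.0, 0.5], [0.5, 3.0]])

print(kron_product(K_task, K_sample))  # 4x4, multiplicative combination
print(kron_sum(K_task, K_sample))      # 4x4, additive combination
```

A useful property of the Kronecker sum is that its eigenvalues are all pairwise sums of the eigenvalues of the two factors, which is one reason such structured kernels admit cheaper inference than a dense covariance over all task-sample pairs.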