

Poster

Flexible Copula-Based Mixed Models in Deep Learning: A Scalable Approach to Arbitrary Marginals

Nir Sharon


Abstract:

We introduce copula-based neural networks (COPNN), a novel framework that moves beyond the limitation of Gaussian marginals for random effects in mixed models. COPNN combines the flexibility of Gaussian copulas, which capture rich dependence structures under arbitrary marginal distributions, with the expressive power of deep neural networks (DNNs), allowing it to model large non-Gaussian data in both regression and classification settings while using batch learning and stochastic gradient descent. Unlike traditional linear and non-linear mixed models, which assume Gaussianity for the random effects, COPNN leverages copulas to decouple the marginal distribution from the dependence structure induced by spatial, temporal, and high-cardinality categorical features. This is achieved by minimizing a batch negative log-likelihood (NLL) loss in the continuous case and a batch negative pairwise log-likelihood in the binary case. We demonstrate COPNN's effectiveness through extensive experiments on both simulated and real datasets. COPNN reduces NLL and MSE in the regression setting and improves predictive accuracy in the classification setting, compared to previous state-of-the-art methods that integrate random effects into DNNs. Our real-world experiments, conducted on datasets from automotive pricing and retail traffic forecasting, further validate COPNN's ability to improve performance over traditional methods for handling high-cardinality categorical features.
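The core decoupling idea described above, a Gaussian copula tying together arbitrary marginal distributions, can be sketched in a few lines. The snippet below is a minimal NumPy/SciPy illustration of a Gaussian-copula log-likelihood, not COPNN's actual implementation: the function name, the shared marginal, and the fixed correlation matrix are all simplifying assumptions for exposition (in COPNN the marginals and dependence parameters would be learned jointly with the DNN).

```python
import numpy as np
from scipy import stats

def gaussian_copula_nll(x, marginal, corr):
    """Batch NLL of a Gaussian copula with an arbitrary marginal.

    x        : (n, d) array of observations.
    marginal : a frozen scipy.stats distribution, shared across the d
               dimensions here purely for simplicity (an assumption).
    corr     : (d, d) correlation matrix encoding the dependence structure.
    """
    # Map each observation to uniforms via the marginal CDF,
    # then to Gaussian scores via the standard normal quantile function.
    u = np.clip(marginal.cdf(x), 1e-10, 1 - 1e-10)  # numerical safety at the tails
    z = stats.norm.ppf(u)

    # Gaussian copula log-density: joint normal log-density of the scores
    # minus the independent standard-normal log-densities.
    mvn = stats.multivariate_normal(mean=np.zeros(corr.shape[0]), cov=corr)
    log_copula = mvn.logpdf(z) - stats.norm.logpdf(z).sum(axis=1)

    # Full log-likelihood = copula term + sum of marginal log-densities,
    # so the marginal choice is decoupled from the dependence structure.
    log_marginals = marginal.logpdf(x).sum(axis=1)
    return -(log_copula + log_marginals).sum()
```

With `corr` equal to the identity, the copula term vanishes and the NLL reduces to the independent marginal NLL, which makes the decoupling explicit: the marginal controls each coordinate's distribution, while `corr` alone controls the dependence.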
