Noise Contrastive Meta-Learning for Conditional Density Estimation using Kernel Mean Embeddings

Jean-Francois Ton · Lucian Chan · Yee Whye Teh · Dino Sejdinovic

Keywords: [ Models and Methods ] [ Kernel Methods ]

[ Abstract ]
Thu 15 Apr 7:30 a.m. PDT — 9:30 a.m. PDT


Current meta-learning approaches focus on learning functional representations of relationships between variables, i.e. estimating conditional expectations in regression. In many applications, however, the conditional distributions cannot be meaningfully summarized by their expectations alone (due to, e.g., multimodality). We introduce a novel technique for meta-learning conditional densities, which combines neural representations and noise contrastive estimation with the well-established literature on conditional mean embeddings into reproducing kernel Hilbert spaces. By leveraging shared representations across multiple conditional density estimation tasks, the method shows significant improvements over standard density estimation methods on synthetic and real-world data.
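To give a rough feel for the noise contrastive estimation (NCE) ingredient in isolation (this is a minimal illustrative sketch, not the authors' meta-learning method, which also involves neural representations and conditional mean embeddings), the snippet below fits an unnormalized conditional Gaussian model by logistic discrimination between data pairs and pairs drawn from a known noise density. The synthetic task, model parameterization, and all variable names are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic conditional task (assumed for illustration): y | x ~ N(1.5 * x, 1).
n = 4000
x = rng.uniform(-2.0, 2.0, n)
y = 1.5 * x + rng.normal(size=n)

# Noise distribution q(y) = N(0, 3^2), independent of x, with known log-density.
y_noise = rng.normal(0.0, 3.0, n)

def log_q(v):
    return -0.5 * (v / 3.0) ** 2 - np.log(3.0) - 0.5 * np.log(2.0 * np.pi)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Unnormalized conditional model: f(x, y) = -0.5 * (y - theta * x)^2 + c.
# As in NCE, the offset c is learned and absorbs the log-normalizer.
theta, c = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    # NCE logits: log-odds of "data" vs "noise" for each pair.
    g_d = -0.5 * (y - theta * x) ** 2 + c - log_q(y)
    g_n = -0.5 * (y_noise - theta * x) ** 2 + c - log_q(y_noise)
    # Gradients of the logistic NCE loss with respect to theta and c.
    w_d = -(1.0 - sigmoid(g_d))  # dLoss/dlogit on data pairs
    w_n = sigmoid(g_n)           # dLoss/dlogit on noise pairs
    grad_theta = (np.mean(w_d * x * (y - theta * x))
                  + np.mean(w_n * x * (y_noise - theta * x)))
    grad_c = np.mean(w_d) + np.mean(w_n)
    theta -= lr * grad_theta
    c -= lr * grad_c

print(theta)  # recovered slope, close to the true value 1.5
```

Because the noise density is known in closed form, the logistic regression above is consistent for both the model parameter and the normalizing constant, which is what makes NCE attractive for unnormalized (conditional) density models.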
