Two-Sample Test with Kernel Projected Wasserstein Distance

Jie Wang · Rui Gao · Yao Xie

Abstract
Mon 28 Mar 10:15 a.m. PDT — 11:45 a.m. PDT
Oral presentation: Oral 11: Learning theory / Kernels
Wed 30 Mar 7 a.m. PDT — 8 a.m. PDT


We develop a kernel projected Wasserstein distance for the two-sample test, an essential building block in statistics and machine learning: given two sets of samples, determine whether they are drawn from the same distribution. The method operates by finding a nonlinear mapping in the data space that maximizes the distance between the projected distributions. In contrast to existing work on the projected Wasserstein distance, the proposed method circumvents the curse of dimensionality more efficiently. We present practical algorithms for computing this distance, together with non-asymptotic uncertainty quantification for its empirical estimates. Numerical examples validate our theoretical results and demonstrate the good performance of the proposed method.
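The abstract does not give the authors' exact algorithm, but the core idea of projecting through a kernel map and maximizing a one-dimensional Wasserstein distance can be illustrated with a simplified sketch. The version below is an assumption-laden stand-in, not the paper's method: it approximates a Gaussian-kernel feature map with random Fourier features and searches over random directions (a "max-sliced" heuristic) instead of solving the paper's optimization problem; the function names, bandwidth, and search strategy are all illustrative choices.

```python
import numpy as np

def wasserstein_1d(u, v):
    # Wasserstein-1 distance between two equal-size 1-D empirical samples:
    # the mean absolute difference of the sorted values.
    return np.mean(np.abs(np.sort(u) - np.sort(v)))

def random_fourier_features(x, W, b):
    # Random Fourier features approximating a Gaussian (RBF) kernel map.
    return np.sqrt(2.0 / W.shape[1]) * np.cos(x @ W + b)

def kernel_projected_w1(x, y, n_features=200, n_directions=500,
                        bandwidth=1.0, seed=0):
    # Illustrative max-sliced surrogate: lift both samples into an
    # approximate RKHS feature space, then take the largest 1-D
    # Wasserstein distance over randomly drawn projection directions.
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    W = rng.normal(scale=1.0 / bandwidth, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    phi_x = random_fourier_features(x, W, b)
    phi_y = random_fourier_features(y, W, b)
    best = 0.0
    for _ in range(n_directions):
        v = rng.normal(size=n_features)
        v /= np.linalg.norm(v)
        best = max(best, wasserstein_1d(phi_x @ v, phi_y @ v))
    return best
```

As a sanity check, samples drawn from the same distribution should yield a smaller value than samples whose means differ, which is the signal a two-sample test would threshold on.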
