

Learning Adaptive Kernels for Statistical Independence Tests

Yixin Ren · Yewei Xia · Hao Zhang · Jihong Guan · Shuigeng Zhou

MR1 & MR2 - Number 4
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT


We propose a novel framework for kernel-based statistical independence tests that enables adaptively learning parameterized kernels to maximize test power. Our framework effectively addresses a pitfall inherent in the existing signal-to-noise-ratio criterion by modeling the change of the null distribution during the learning process. Based on the proposed framework, we design a new class of kernels that can adaptively focus on the significant dimensions of variables when judging independence, which makes the tests more flexible than those using simple kernels that are adaptive only in length-scale, and especially suitable for high-dimensional complex data. Theoretically, we demonstrate the consistency of our independence tests and show that the non-convex objective function used for learning satisfies the L-smoothness condition, which benefits the optimization. Experimental results on both synthetic and real data show the superiority of our method. The source code and datasets are available at \url{}.
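To make the idea concrete, here is a minimal sketch of a kernel independence test with per-dimension (ARD-style) kernel weights. It is not the paper's method: it uses the standard biased HSIC estimator with a Gaussian kernel, selects weights by a crude grid search on a held-out split (a stand-in for the paper's learned, power-maximizing criterion), and approximates the null distribution by permutation. All function and variable names are illustrative.

```python
import numpy as np

def ard_gram(X, w):
    """Gaussian ARD Gram matrix: k(x, x') = exp(-sum_d w_d (x_d - x'_d)^2)."""
    D = (((X[:, None, :] - X[None, :, :]) ** 2) * w).sum(-1)
    return np.exp(-D)

def hsic(K, L):
    """Biased HSIC estimator: trace(K H L H) / n^2 with centering matrix H."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

def permutation_test(X, Y, w_x, w_y, n_perm=200, seed=0):
    """Permutation p-value: shuffle Y to sample the null distribution of HSIC."""
    rng = np.random.default_rng(seed)
    K, L = ard_gram(X, w_x), ard_gram(Y, w_y)
    stat = hsic(K, L)
    null = [hsic(K, L[np.ix_(idx, idx)])
            for idx in (rng.permutation(len(Y)) for _ in range(n_perm))]
    p = (1 + sum(s >= stat for s in null)) / (1 + n_perm)
    return stat, p

# Toy data: X has 3 dimensions, but Y depends only on dimension 0.
rng = np.random.default_rng(1)
n, d = 200, 3
X = rng.normal(size=(n, d))
Y = X[:, [0]] + 0.1 * rng.normal(size=(n, 1))

# Crude "adaptive" step: on a held-out half, pick the per-dimension weight
# vector (here, one-hot candidates) that maximizes HSIC, then test on the rest.
tr, te = np.arange(n // 2), np.arange(n // 2, n)
candidates = np.eye(d)          # each candidate focuses on a single dimension
w_y = np.ones(1)
scores = [hsic(ard_gram(X[tr], w), ard_gram(Y[tr], w_y)) for w in candidates]
best = int(np.argmax(scores))   # the significant dimension (0) should win
stat, p = permutation_test(X[te], Y[te], candidates[best], w_y)
```

Splitting the data before selecting weights matters: tuning the kernel and testing on the same sample inflates the statistic and invalidates the permutation null, which is related to the null-distribution shift the abstract's framework models explicitly.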
