

Poster

Learning Adaptive Kernels for Statistical Independence Tests

Yixin Ren · Yewei Xia · Hao Zhang · Jihong Guan · Shuigeng Zhou

Multipurpose Room 1 - Number 4

Abstract:

We propose a novel framework for kernel-based statistical independence tests that enables adaptively learning parameterized kernels to maximize test power. Our framework effectively addresses the pitfall inherent in the existing signal-to-noise ratio criterion by modeling the change of the null distribution during the learning process. Based on the proposed framework, we design a new class of kernels that adaptively focus on the significant dimensions of variables when judging independence, which makes the tests more flexible than those using simple kernels that are adaptive only in length-scale, and especially suitable for high-dimensional complex data. Theoretically, we prove the consistency of our independence tests and show that the non-convex objective function used for learning satisfies the L-smoothness condition, which benefits the optimization. Experimental results on both synthetic and real data show the superiority of our method. The source code and datasets are available at \url{https://github.com/renyixin666/HSIC-LK.git}.
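To make the idea of dimension-adaptive kernels concrete, here is a minimal, hypothetical sketch (not the authors' implementation; see the linked repository for that) of a biased HSIC estimator combined with a per-dimension weighted Gaussian kernel. The function names, the weight parameterization, and the toy data are illustrative assumptions only; they show how weighting informative dimensions can raise the dependence measure relative to uniform weights.

```python
# Hypothetical sketch: biased HSIC with a per-dimension weighted Gaussian kernel.
# This is NOT the paper's method, only an illustration of the general idea.
import numpy as np

def ard_gaussian_kernel(X, weights):
    """Gaussian kernel with per-dimension weights (assumed parameterization)."""
    Xw = X * weights                          # scale each dimension by its weight
    sq = np.sum(Xw ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Xw @ Xw.T
    return np.exp(-0.5 * np.maximum(d2, 0.0))

def hsic_biased(K, L):
    """Biased HSIC estimate from kernel matrices K (for X) and L (for Y)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy usage: only the first dimension of X drives Y, so a weight vector
# concentrated on that dimension yields a larger HSIC than uniform weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(200, 1))
L = ard_gaussian_kernel(Y, np.ones(1))
K_focused = ard_gaussian_kernel(X, np.array([2.0, 0.1, 0.1, 0.1, 0.1]))
K_uniform = ard_gaussian_kernel(X, np.ones(5))
print(hsic_biased(K_focused, L), hsic_biased(K_uniform, L))
```

In the paper's framework, such kernel weights would be learned to maximize test power while accounting for the change of the null distribution; the sketch above only evaluates fixed weight choices.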
