Poster
Ordered $\mathcal{V}$-information Growth: A Fresh Perspective on Shared Information
Danqi Liao · Szu Hui Ng
Abstract:
Mutual information (MI) is widely employed as a measure of shared information between random variables. However, MI assumes unbounded computational resources, a condition rarely met in practice, where predicting a random variable $Y$ from $X$ must rely on finite resources. $\mathcal{V}$-information addresses this limitation by employing a predictive family $\mathcal{V}$ to emulate computational constraints, yielding a directed measure of shared information. Focusing on the mixed setting (continuous $X$ and discrete $Y$), here we highlight the upward bias of empirical $\mathcal{V}$-information, $\hat{I}_{\mathcal{V}}(X \to Y)$, even when $\mathcal{V}$ is low-complexity (e.g., shallow neural networks). To mitigate this bias, we introduce $\mathcal{V}$-Information Growth (VI-Growth), defined as $\hat{I}_{\mathcal{V}}(X \to Y) - \hat{I}_{\mathcal{V}}(X' \to Y')$, where $X'$ and $Y'$ represent independent variables. While VI-Growth effectively counters over-estimation, more complex predictive families may lead to under-estimation. To address this, we construct a sequence of predictive families of increasing complexity and compute the maximum of VI-Growth across these families, yielding the ordered VI-Growth (O-VIG). We provide theoretical results that justify this approach, showing that O-VIG is a provably tighter lower bound for the true $\mathcal{V}$-information than empirical $\mathcal{V}$-information itself, and exhibits stronger convergence properties than $\mathcal{V}$-information. Empirically, O-VIG alleviates bias and consistently outperforms state-of-the-art methods in both MI estimation and dataset complexity estimation, demonstrating its practical utility.
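To make the construction concrete, here is a minimal NumPy sketch of the pipeline the abstract describes, under assumptions not taken from the paper: the predictive families are multinomial logistic regressors over polynomial feature expansions of increasing degree (standing in for the paper's choice of families), the independent pair $(X', Y')$ is obtained by permuting $X$ against $Y$, and $\hat{I}_{\mathcal{V}}$ is the empirical plug-in estimate $\hat{H}(Y) - \hat{H}_{\mathcal{V}}(Y \mid X)$ evaluated in-sample. All function names (`v_information`, `vi_growth`, `o_vig`) are illustrative, not from the paper.

```python
import numpy as np

def fit_softmax(X, y, n_classes, lr=0.3, steps=500):
    """Multinomial logistic regression by gradient descent
    (a stand-in for the paper's predictive family)."""
    n, d = X.shape
    W, b = np.zeros((d, n_classes)), np.zeros(n_classes)
    Y1 = np.eye(n_classes)[y]                      # one-hot labels
    for _ in range(steps):
        Z = X @ W + b
        Z -= Z.max(axis=1, keepdims=True)          # numerical stability
        P = np.exp(Z); P /= P.sum(axis=1, keepdims=True)
        G = (P - Y1) / n                           # mean cross-entropy gradient
        W -= lr * (X.T @ G); b -= lr * G.sum(axis=0)
    return W, b

def cond_entropy(X, y, n_classes):
    """Empirical H_V(Y|X): in-sample cross-entropy of the fitted model.
    Evaluating in-sample is exactly what produces the upward bias."""
    W, b = fit_softmax(X, y, n_classes)
    Z = X @ W + b
    Z -= Z.max(axis=1, keepdims=True)
    logP = Z - np.log(np.exp(Z).sum(axis=1, keepdims=True))
    return -logP[np.arange(len(y)), y].mean()

def v_information(X, y, n_classes):
    """Empirical V-information: H(Y) - H_V(Y|X)."""
    p = np.bincount(y, minlength=n_classes) / len(y)
    h_y = -(p[p > 0] * np.log(p[p > 0])).sum()
    return h_y - cond_entropy(X, y, n_classes)

def vi_growth(X, y, n_classes, rng):
    """VI-Growth: subtract the estimate on an independent pair,
    here built by permuting X against Y."""
    Xp = X[rng.permutation(len(X))]
    return v_information(X, y, n_classes) - v_information(Xp, y, n_classes)

def poly_features(X, degree):
    """Nested families of increasing complexity via polynomial features
    (an illustrative choice; the paper's families may differ)."""
    F = np.hstack([X**k for k in range(1, degree + 1)])
    return (F - F.mean(axis=0)) / (F.std(axis=0) + 1e-8)

def o_vig(X, y, n_classes, max_degree=3, seed=0):
    """Ordered VI-Growth: maximum of VI-Growth across the family sequence."""
    rng = np.random.default_rng(seed)
    return max(vi_growth(poly_features(X, d), y, n_classes, rng)
               for d in range(1, max_degree + 1))
```

On data where $Y$ is a deterministic function of $X$ (e.g., $Y = \mathbb{1}[X > 0]$ with true MI $\log 2$), `o_vig` returns a clearly positive value, while on independent $X, Y$ it stays near zero because the permuted baseline cancels the optimistic in-sample fit.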