

Poster

Ordered V-information Growth: A Fresh Perspective on Shared Information

Danqi Liao · Szu Hui Ng


Abstract: Mutual information (MI) is widely employed as a measure of shared information between random variables. However, MI assumes unbounded computational resources, a condition rarely met in practice, where predicting a random variable $Y$ from $X$ must rely on finite resources. V-information addresses this limitation by employing a predictive family $\mathcal{V}$ to emulate computational constraints, yielding a directed measure of shared information. Focusing on the mixed setting (continuous $X$ and discrete $Y$), we highlight the upward bias of empirical V-information, $\hat{I}_{\mathcal{V}}(X \to Y)$, even when $\mathcal{V}$ is low-complexity (e.g., shallow neural networks). To mitigate this bias, we introduce V-Information Growth (VI-Growth), defined as $\hat{I}_{\mathcal{V}}(X \to Y) - \hat{I}_{\mathcal{V}}(X' \to Y')$, where $X', Y' \sim P_X P_Y$ are independent variables. While VI-Growth effectively counters over-estimation, more complex predictive families may lead to under-estimation. To address this, we construct a sequence of predictive families $\mathcal{V}_1, \mathcal{V}_2, \ldots, \mathcal{V}_L$ of increasing complexity and take the maximum of VI-Growth across these families, yielding the ordered VI-Growth (O-VIG). We provide theoretical results that justify this approach, showing that O-VIG is a provably tighter lower bound for the true V-information than empirical V-information itself, and exhibits stronger convergence properties than V-information. Empirically, O-VIG alleviates bias and consistently outperforms state-of-the-art methods in both MI estimation and dataset complexity estimation, demonstrating its practical utility.
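The estimation pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' released implementation: the predictive-family interface (fit_conditional / fit_marginal, assumed to fit a model from the family and return its mean held-out log-likelihood) is a hypothetical placeholder, and the independent pair (X', Y') is obtained by permuting Y.

import numpy as np

def v_information(X, Y, family):
    # Empirical V-information: gain in log-likelihood of Y from
    # conditioning on X, under predictive family `family`.
    # fit_conditional / fit_marginal are hypothetical helpers that
    # return mean held-out log-likelihoods.
    ll_cond = family.fit_conditional(X, Y)   # E[log f(Y | X)]
    ll_marg = family.fit_marginal(Y)         # E[log f(Y)]
    return ll_cond - ll_marg

def vi_growth(X, Y, family, rng):
    # VI-Growth: empirical V-information minus the V-information of an
    # independent copy (X', Y') ~ P_X P_Y, here simulated by permuting
    # Y to break the dependence on X.
    Y_perm = rng.permutation(Y)
    return v_information(X, Y, family) - v_information(X, Y_perm, family)

def o_vig(X, Y, families, seed=0):
    # Ordered VI-Growth (O-VIG): maximum of VI-Growth over a sequence
    # of predictive families V_1, ..., V_L of increasing complexity.
    rng = np.random.default_rng(seed)
    return max(vi_growth(X, Y, fam, rng) for fam in families)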
