Membership Testing in Markov Equivalence Classes via Independence Queries

Jiaqi Zhang · Kirankumar Shiragur · Caroline Uhler

MR1 & MR2 - Number 39
Sat 4 May 6 a.m. PDT — 8:30 a.m. PDT
Oral presentation: Oral: Causality
Fri 3 May 2:30 a.m. PDT — 3:30 a.m. PDT

Abstract: Understanding causal relationships between variables is a fundamental problem with broad impact in numerous scientific fields. While extensive research has been dedicated to _learning_ causal graphs from data, the complementary concept of _testing_ causal relationships has remained largely unexplored. In our work, we take the initiative to formally delve into the _testing_ aspect of causal discovery. While _learning_ involves recovering the Markov equivalence class (MEC) of the underlying causal graph from observational data, our _testing_ counterpart addresses a critical question: _Given a specific MEC, can we determine if the underlying causal graph belongs to it with observational data?_ We explore constraint-based testing methods by establishing bounds on the required number of conditional independence tests. Our bounds are in terms of the size of the maximum clique ($s'$) and the size of the maximum undirected clique ($s$) of the given MEC. In the worst case, we show a lower bound of $\exp(\Omega(s))$ independence tests. We then give an algorithm that resolves the task with $\exp(O(s'))$ independence tests. Notably, our lower and upper bounds coincide for moral MECs ($s'=s$). Compared to _learning_, where algorithms often use a number of independence tests that is exponential in the maximum in-degree, our results show that _testing_ is a relatively more manageable task. In particular, it requires exponentially fewer independence tests in graphs featuring high in-degrees and small clique sizes.
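To make the constraint-based setting concrete, here is a minimal, hypothetical sketch (not the paper's $\exp(O(s'))$ algorithm): a conditional-independence oracle implemented via d-separation in a known ground-truth DAG (using the moralized-ancestral-graph criterion), plus a naive exhaustive skeleton check that tests whether the skeleton implied by the oracle matches a candidate MEC's skeleton. All function and variable names are illustrative assumptions; the exhaustive subset search is exponential and serves only to illustrate what "testing with independence queries" means.

```python
from itertools import chain, combinations

def ancestors(dag, nodes):
    """dag maps each node to its set of parents; return nodes plus all ancestors."""
    seen, stack = set(nodes), list(nodes)
    while stack:
        for p in dag[stack.pop()]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def d_separated(dag, x, y, z):
    """Test x _||_ y | z via the moralized ancestral graph criterion."""
    sub = ancestors(dag, {x, y} | set(z))
    adj = {n: set() for n in sub}
    for n in sub:
        parents = [p for p in dag[n] if p in sub]
        for p in parents:                      # keep edges within the ancestral set
            adj[n].add(p); adj[p].add(n)
        for a, b in combinations(parents, 2):  # "marry" co-parents (moralize)
            adj[a].add(b); adj[b].add(a)
    # x and y are d-separated iff removing z disconnects them in the moral graph
    blocked, seen, stack = set(z), {x}, [x]
    while stack:
        n = stack.pop()
        if n == y:
            return False
        for m in adj[n]:
            if m not in seen and m not in blocked:
                seen.add(m)
                stack.append(m)
    return True

def skeleton_from_oracle(nodes, ci):
    """Two nodes are adjacent iff no subset of the remaining nodes separates them."""
    edges = set()
    for x, y in combinations(nodes, 2):
        rest = [n for n in nodes if n not in (x, y)]
        subsets = chain.from_iterable(
            combinations(rest, k) for k in range(len(rest) + 1))
        if not any(ci(x, y, s) for s in subsets):
            edges.add(frozenset((x, y)))
    return edges

# Ground-truth DAG is the v-structure A -> C <- B (maps node -> parents).
dag = {"A": set(), "B": set(), "C": {"A", "B"}}
ci = lambda x, y, z: d_separated(dag, x, y, z)

skel = skeleton_from_oracle(["A", "B", "C"], ci)
candidate = {frozenset(("A", "C")), frozenset(("B", "C"))}  # skeleton of the MEC
print(skel == candidate)  # True: the oracle's skeleton matches the candidate MEC
```

The paper's contribution is precisely to avoid this kind of brute-force search: its testing algorithm needs only $\exp(O(s'))$ independence queries, where $s'$ can be far smaller than the maximum in-degree that learning algorithms typically pay for.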
