Efficient Quantum Agnostic Improper Learning of Decision Trees

Sagnik Chatterjee · Tharrmashastha SAPV · Debajyoti Bera

MR1 & MR2 - Number 79
Fri 3 May 8 a.m. PDT — 8:30 a.m. PDT

Abstract: The agnostic setting is the hardest generalization of the PAC model, since it is akin to learning with adversarial noise. In this paper, we give a $\mathrm{poly}(n, t, 1/\epsilon)$-time quantum algorithm for learning size-$t$ decision trees over $n$-bit inputs with uniform marginal over instances, in the agnostic setting, without membership queries (MQ). This is the first algorithm (classical or quantum) for efficiently learning decision trees without MQ. First, we construct a quantum agnostic weak learner by designing a quantum variant of the classical Goldreich-Levin algorithm that works with strongly biased function oracles. Next, we show how to quantize the agnostic boosting algorithm of Kalai and Kanade (2009) to obtain the first efficient quantum agnostic boosting algorithm, which achieves a polynomial speedup over existing adaptive quantum boosting algorithms. We then use the quantum agnostic boosting algorithm to boost the weak quantum agnostic learner constructed previously, obtaining a quantum agnostic learner for decision trees. Using the above framework, we also give quantum decision tree learning algorithms without MQ in weaker noise models.
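The Goldreich-Levin subroutine at the heart of the weak learner searches for a parity $\chi_S$ with non-trivial correlation with the target $f$ under the uniform distribution, i.e. a large Fourier coefficient $\hat{f}(S) = \mathbb{E}_x[f(x)\chi_S(x)]$. As an illustration only, here is a brute-force classical analogue for small $n$ that computes every coefficient via its defining average and returns the best parity as a weak hypothesis; this exhaustive $O(4^n)$ search is a toy stand-in, not the paper's poly-time quantum variant (which, unlike this sketch, also tolerates strongly biased oracles):

```python
def fourier_coefficients(f, n):
    """Fourier coefficients of f: {0,1}^n -> {-1,+1} (inputs as bitmasks).

    hat{f}(S) = E_x[f(x) * chi_S(x)], where chi_S(x) = (-1)^{|S & x|}.
    Brute force over all 2^n subsets S and 2^n inputs x.
    """
    N = 1 << n
    coeffs = {}
    for S in range(N):  # S encoded as a bitmask over the n coordinates
        total = 0.0
        for x in range(N):
            chi = -1.0 if bin(S & x).count("1") % 2 else 1.0
            total += f(x) * chi
        coeffs[S] = total / N
    return coeffs

def best_parity(f, n):
    """Weak hypothesis: the parity (up to sign) most correlated with f."""
    coeffs = fourier_coefficients(f, n)
    S = max(coeffs, key=lambda s: abs(coeffs[s]))
    sign = 1.0 if coeffs[S] >= 0 else -1.0
    return S, sign

# Toy example (hypothetical target): f is exactly the parity of bits 0 and 2
# on 3-bit inputs, in the +/-1 convention, so best_parity should recover S = 0b101.
f = lambda x: -1.0 if bin(x & 0b101).count("1") % 2 else 1.0
S, sign = best_parity(f, 3)
```

Decision trees of size $t$ have their Fourier mass concentrated on few coefficients, which is why recovering a single well-correlated parity suffices as a weak learner to feed into the boosting stage.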