Poster

Two-Timescale Linear Stochastic Approximation: Constant Stepsizes Go a Long Way

Liu Yang


Abstract: Previous studies on two-timescale stochastic approximation (SA) mainly focused on bounding mean-squared errors under diminishing stepsize schemes. In this work, we investigate constant stepsize schemes through the lens of Markov processes, proving that the iterates of both timescales converge to a unique joint stationary distribution in the Wasserstein metric. We derive explicit geometric and non-asymptotic convergence rates, as well as the variance and bias introduced by constant stepsizes in the presence of Markovian noise. Specifically, with two constant stepsizes α < β, we show that the biases scale linearly with both stepsizes as Θ(α) + Θ(β) up to higher-order terms, while the variance of the slower iterate (resp., faster iterate) scales only with its own stepsize as O(α) (resp., O(β)). Unlike previous work, our results require no additional assumptions, such as a condition on the ratio β²/α, nor extra dependence on dimensions. These fine-grained characterizations allow tail-averaging and extrapolation techniques to reduce variance and bias, improving the mean-squared error bound to O(β⁴ + 1/t) for both iterates.
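To make the setup concrete, here is a minimal sketch of two-timescale linear SA with two constant stepsizes α < β, followed by tail-averaging of the second half of the trajectory. The specific linear operators, fixed point (1, 2), noise model (i.i.d. Gaussian rather than Markovian), and all numerical values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Constant stepsizes: alpha for the slow iterate, beta for the fast one.
alpha, beta = 0.01, 0.05   # alpha < beta, as in the abstract
T = 200_000

# Hypothetical coupled linear system with fixed point (x*, y*) = (1, 2):
# the fast iterate y tracks 2 + (x - 1) for the current x,
# the slow iterate x drifts toward 1. Noise here is i.i.d. for simplicity.
x, y = 0.0, 0.0
tail = []                  # iterates kept for tail-averaging
for t in range(T):
    noise_x = rng.normal(scale=0.1)
    noise_y = rng.normal(scale=0.1)
    y += beta * ((2.0 + (x - 1.0) - y) + noise_y)   # fast timescale
    x += alpha * ((1.0 - x) + noise_x)              # slow timescale
    if t >= T // 2:        # discard burn-in, average the tail
        tail.append((x, y))

# Tail-averaging reduces the O(alpha) / O(beta) stationary variance.
x_bar, y_bar = np.mean(tail, axis=0)
```

Under these (assumed) dynamics, the tail averages land close to the fixed point (1, 2); with Markovian noise, the abstract's extrapolation step over two stepsize choices would additionally cancel the Θ(α) + Θ(β) bias terms.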