Restart perturbations for lazy, reversible Markov chains: trichotomy and pre-cutoff equivalence

07/05/2019
by   Daniel Vial, et al.

Given a lazy, reversible Markov chain with n states and transition matrix P_n, a distribution σ_n over the states, and some α_n ∈ (0,1), we consider restart perturbations, which take the following form: with probability 1-α_n, sample the next state from P_n; with probability α_n, sample the next state from σ_n (i.e. "restart" at a random state). Our main object of study is ‖π_n - π̃_n‖, where π_n and π̃_n are the stationary distributions of the original and perturbed chains and ‖·‖ denotes total variation. Our first result characterizes ‖π_n - π̃_n‖ in terms of the ϵ-mixing times t_mix^(n)(ϵ) of the P_n chain, assuming these mixing times exhibit cutoff. Namely, we show that if α_n t_mix^(n)(ϵ) → 0, then ‖π_n - π̃_n‖ → 0 for any restart perturbation; if α_n t_mix^(n)(ϵ) → ∞, then ‖π_n - π̃_n‖ → 1 for some restart perturbation; and if α_n t_mix^(n)(ϵ) → c ∈ (0,∞), then lim sup_{n→∞} ‖π_n - π̃_n‖ ≤ 1 - e^(-c) for any restart perturbation (and the bound is tight). Similar "trichotomies" have appeared in several recent results; however, these existing results consider generative models for the chain, whereas ours applies more broadly. Our second result shows that the weaker notion of pre-cutoff is (almost) equivalent to a certain notion of "sensitivity to perturbation", in the sense that ‖π_n - π̃_n‖ → 1 for certain perturbations. This complements a recent result by Basu, Hermon, and Peres, which shows that cutoff is equivalent to a certain notion of "hitting time cutoff".
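The restart construction above is easy to check numerically. The following is a minimal sketch (not from the paper) using NumPy: it builds a small lazy random walk on a path, applies a restart perturbation P̃ = (1-α)P + α·𝟙σᵀ with a uniform restart distribution σ, and computes the total variation distance between the two stationary distributions. The chain, α, and σ are illustrative choices, not the paper's examples.

```python
import numpy as np

def stationary(P):
    # Left eigenvector of P for eigenvalue 1, normalized to a distribution.
    w, v = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(w - 1.0))
    pi = np.real(v[:, idx])
    return pi / pi.sum()

# Lazy random walk on a 4-state path (reversible by construction).
n = 4
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
W = A / A.sum(axis=1, keepdims=True)   # simple random walk
P = 0.5 * np.eye(n) + 0.5 * W          # lazy version

alpha = 0.1                            # restart probability (illustrative)
sigma = np.full(n, 1.0 / n)            # restart distribution (uniform here)

# Restart perturbation: with prob. 1-alpha follow P, with prob. alpha
# restart from sigma; as a matrix, each row mixes in sigma.
P_tilde = (1 - alpha) * P + alpha * np.outer(np.ones(n), sigma)

pi = stationary(P)
pi_tilde = stationary(P_tilde)
tv = 0.5 * np.abs(pi - pi_tilde).sum() # total variation distance
print(round(tv, 4))
```

For this walk, π is proportional to the vertex degrees (1, 2, 2, 1)/6, and the restart pulls π̃ toward uniform, so the TV distance is small but nonzero; the paper's trichotomy concerns how such distances behave as n → ∞.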

