Tight Regret Bounds for Noisy Optimization of a Brownian Motion

01/25/2020
by Zexin Wang, et al.

We consider the problem of Bayesian optimization of a one-dimensional Brownian motion in which the T adaptively chosen observations are corrupted by Gaussian noise. We show that the smallest possible expected simple regret and the smallest possible expected cumulative regret scale as Ω(1 / √(T log T)) ∩ O(log T / √(T)) and Ω(√(T / log T)) ∩ O(√(T) · log T), respectively. Thus, our upper and lower bounds are tight up to a factor of O((log T)^1.5). The upper bound is achieved by an algorithm based on confidence bounds and the Markov property of Brownian motion, and the lower bound follows from a reduction to binary hypothesis testing.
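To make the setting concrete, the following is a minimal simulation sketch: a Brownian motion sample path is discretized on a uniform grid, each query returns the path value corrupted by Gaussian noise, and a generic upper-confidence-bound rule selects where to sample. This is a simplified illustration of confidence-bound-based optimization on a fixed grid, not the paper's algorithm (which exploits the Markov property to refine the search adaptively); all function names and parameter values here are illustrative.

```python
import numpy as np

def brownian_motion(n_grid, rng):
    # Discretized standard Brownian motion on [0, 1]:
    # W(0) = 0, increments i.i.d. N(0, 1/n_grid).
    steps = rng.normal(0.0, np.sqrt(1.0 / n_grid), size=n_grid)
    return np.concatenate(([0.0], np.cumsum(steps)))

def ucb_optimize(W, T, noise_sd, rng):
    # Generic UCB rule over the grid points of the sample path W.
    # Each query of index i returns W[i] + N(0, noise_sd^2).
    k = len(W)
    counts = np.zeros(k)
    means = np.zeros(k)
    for t in range(T):
        if t < k:
            i = t  # query every grid point once to initialize
        else:
            radius = noise_sd * np.sqrt(2.0 * np.log(T) / counts)
            i = int(np.argmax(means + radius))
        y = W[i] + rng.normal(0.0, noise_sd)  # noisy observation
        counts[i] += 1
        means[i] += (y - means[i]) / counts[i]  # running mean update
    # Recommend the most-queried point; simple regret is the gap
    # between the path maximum and the value at the recommendation.
    return int(np.argmax(counts))

rng = np.random.default_rng(0)
W = brownian_motion(50, rng)
best = ucb_optimize(W, T=2000, noise_sd=0.5, rng=rng)
regret = W.max() - W[best]
```

On a fixed grid this rule pays for every grid point at initialization; the paper's bounds concern algorithms that choose observation points adaptively over the whole interval, which is what allows the near-1/√T simple-regret rate.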
