Learning Mixtures of Spherical Gaussians via Fourier Analysis

04/13/2020
by   Hariharan Narayanan, et al.

Suppose that we are given independent, identically distributed samples x_l from a mixture μ of no more than k spherical Gaussian distributions μ_i in d dimensions with variance 1, such that the minimum ℓ_2 distance between two distinct centers y_l and y_j is greater than √(d)·Δ for some Δ ≥ c, where c ∈ (0,1) is a small positive universal constant. We develop a randomized algorithm that learns the centers y_l of the Gaussians to within an ℓ_2 distance of δ < Δ√(d)/2, and the weights w_l to within c·w_min, with probability greater than 1 - exp(-k/c). The number of samples and the computational time are bounded above by poly(k, d, 1/δ). Such a bound on the sample and computational complexity was previously unknown when ω(1) ≤ d ≤ O(log k). When d = O(1), the result follows from work of Regev and Vijayaraghavan. These authors also show that the sample complexity of learning a random mixture of Gaussians in a ball of radius Θ(√(d)) in d dimensions, when d is Θ(log k), is at least poly(k, 1/δ), showing that our result is tight in this case.
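To make the setup concrete, here is a minimal sketch (not the paper's Fourier-based algorithm) of the generative model described above: k unit-variance spherical Gaussians in R^d whose centers are pairwise separated by more than √(d)·Δ, from which we draw i.i.d. samples. All parameter values and the rejection-sampling helper below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_separated_centers(k, d, delta, rng):
    """Rejection-sample k centers whose pairwise l2 distance exceeds sqrt(d)*delta.

    Centers are drawn from a box of side ~ Theta(sqrt(d)), mimicking the
    ball-of-radius-Theta(sqrt(d)) regime discussed in the abstract.
    """
    centers = []
    while len(centers) < k:
        y = rng.uniform(-np.sqrt(d), np.sqrt(d), size=d)
        if all(np.linalg.norm(y - c) > np.sqrt(d) * delta for c in centers):
            centers.append(y)
    return np.array(centers)

def sample_mixture(n, centers, weights, rng):
    """Draw n i.i.d. samples: pick a component by its weight, add N(0, I_d) noise."""
    k, d = centers.shape
    idx = rng.choice(k, size=n, p=weights)
    return centers[idx] + rng.standard_normal((n, d)), idx

# Toy instance: k = 3 components in d = 8 dimensions, separation parameter Δ = 0.5.
k, d, delta = 3, 8, 0.5
centers = make_separated_centers(k, d, delta, rng)
weights = np.full(k, 1.0 / k)
samples, labels = sample_mixture(2000, centers, weights, rng)
print(samples.shape)  # (2000, 8)
```

The learning problem is then to recover `centers` (to within ℓ_2 error δ) and `weights` (to within c·w_min) from `samples` alone, without access to `labels`.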
