Agnostic Sample Compression for Linear Regression

10/03/2018
by Steve Hanneke et al.

We obtain the first positive results for bounded sample compression in the agnostic regression setting. We show that for p ∈ {1, ∞}, agnostic linear regression with the ℓ_p loss admits a bounded sample compression scheme. Specifically, we exhibit efficient sample compression schemes for agnostic linear regression in R^d of size d+1 under the ℓ_1 loss and size d+2 under the ℓ_∞ loss. We further show that for every other ℓ_p loss (1 < p < ∞), there does not exist an agnostic compression scheme of bounded size. This refines and generalizes a negative result of David, Moran, and Yehudayoff (2016) for the ℓ_2 loss. We close by posing a general open question: for agnostic regression with the ℓ_1 loss, does every function class admit a compression scheme of size equal to its pseudo-dimension? This question generalizes Warmuth's classic sample compression conjecture for realizable-case classification (Warmuth, 2003).
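The ℓ_1 case can be illustrated numerically. The sketch below (not the paper's construction, and assuming NumPy and SciPy are available) solves empirical ℓ_1 linear regression as a linear program; for generic data, the optimum sits at a vertex of the LP and therefore interpolates d of the sample points exactly. A small determining subset of this kind is the sort of object a bounded-size compression scheme can store.

```python
# Hedged sketch: empirical l_1 linear regression in R^d solved as an LP.
# Generically, the optimal predictor interpolates d sample points, which
# hints at why a small subset of the sample can encode the optimum.
import numpy as np
from scipy.optimize import linprog

def l1_regression(X, y):
    """Minimize sum_i |y_i - <w, x_i>| over w in R^d via an LP.

    Variables: w (d free coordinates) and slacks t (n nonnegative
    coordinates) with t_i >= |y_i - <w, x_i>|; objective sum_i t_i.
    """
    n, d = X.shape
    c = np.concatenate([np.zeros(d), np.ones(n)])  # minimize sum of slacks
    # t_i >= y_i - <w, x_i>  ->  -x_i.w - t_i <= -y_i
    # t_i >= <w, x_i> - y_i  ->   x_i.w - t_i <=  y_i
    A_ub = np.block([[-X, -np.eye(n)], [X, -np.eye(n)]])
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * d + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    assert res.success
    return res.x[:d]

rng = np.random.default_rng(0)
n, d = 12, 2
X = rng.normal(size=(n, d))
y = X @ np.array([1.5, -0.5]) + rng.normal(scale=0.3, size=n)

w = l1_regression(X, y)
residuals = y - X @ w
# Generically, the l_1 optimum fits at least d points with zero residual.
interpolated = np.flatnonzero(np.abs(residuals) < 1e-6)
print(f"points interpolated by the l_1 optimum: {len(interpolated)}")
```

Note that this only illustrates the LP-vertex phenomenon behind the ℓ_1 result; the paper's scheme must additionally handle which points to encode and how to decode them in the agnostic setting.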
