A Transfer Principle: Universal Approximators Between Metric Spaces From Euclidean Universal Approximators

04/24/2023
by Anastasis Kratsios, et al.
We build universal approximators of continuous maps between arbitrary Polish metric spaces 𝒳 and 𝒴 using universal approximators between Euclidean spaces as building blocks. Earlier results assume that the output space 𝒴 is a topological vector space. We overcome this limitation by "randomization": our approximators output discrete probability measures over 𝒴. When 𝒳 and 𝒴 are Polish without additional structure, we prove very general qualitative guarantees; when they have suitable combinatorial structure, we prove quantitative guarantees for HΓΆlder-like maps, including maps between finite graphs, solution operators to rough differential equations between certain Carnot groups, and continuous non-linear operators between Banach spaces arising in inverse problems. In particular, we show that the required number of Dirac measures is determined by the combinatorial structure of 𝒳 and 𝒴. For barycentric 𝒴, including Banach spaces, ℝ-trees, Hadamard manifolds, or Wasserstein spaces on Polish metric spaces, our approximators reduce to 𝒴-valued functions. When the Euclidean approximators are neural networks, our constructions generalize transformer networks, providing a new probabilistic viewpoint of geometric deep learning.
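As a hedged sketch (the notation here is illustrative and not taken verbatim from the paper), the randomized approximators described above can be viewed as maps into discrete probability measures on 𝒴 of the form

```latex
\hat{F}(x) \;=\; \sum_{n=1}^{N} p_n(x)\,\delta_{y_n(x)},
\qquad p_n(x) \ge 0,\quad \sum_{n=1}^{N} p_n(x) = 1,
```

where $\delta_y$ denotes the Dirac measure at $y \in \mathcal{Y}$, and the mixture weights $p_n$ and atoms $y_n$ are assumed to be parameterized through Euclidean universal approximators (e.g., softmax-normalized neural network outputs) composed with suitable feature and readout maps. On this reading, the quantitative results bound the number $N$ of Dirac measures in terms of the combinatorial structure of 𝒳 and 𝒴, and in the barycentric case the measure $\hat{F}(x)$ can be collapsed to a single point of 𝒴, recovering an ordinary 𝒴-valued function.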
