A short tour of operator learning theory: Convergence rates, statistical limits, and open questions
arXiv:2603.00819v1 (cross-listed)

Abstract: This paper surveys recent developments at the intersection of operator learning, statistical learning theory, and approximation theory. First, it reviews error bounds for empirical risk minimization, with a focus on holomorphic operators and neural network approximations. Next, it illustrates fundamental performance limits in terms of sample size by adopting a minimax perspective and considering various notions of regularity beyond holomorphy. The paper ends with a discussion of the interplay between these two perspectives and related open questions.