[IEEE Trans. on Information Theory, January 1996, pp. 48-54]
Concept Learning using Complexity Regularization
Gábor Lugosi and Kenneth Zeger
Abstract
We apply the method of complexity regularization to learn concepts from large
concept classes. The method is shown to automatically find a good balance
between the approximation error and the estimation error. In particular, the
error probability of the obtained classifier is shown to decrease as
$O(\sqrt{\log n/n})$ to the achievable optimum, for large nonparametric classes
of distributions, as the sample size $n$ grows. We also show that if the Bayes
error probability is zero and the Bayes rule is in a known family of decision
rules, the error probability is $O(\log n/n)$ for many large families, possibly
with infinite VC dimension.