# Empirical processes for readers familiar with statistical learning theory

# Goal

This page describes basic facts and references on empirical process theory for readers who are already familiar with statistical learning theory.

## Rademacher complexity

Recall the definition of **Rademacher complexity**.
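As a reminder, one common form of the definition (conventions differ across references by an absolute value or a factor of 2) is, for a function class $\mathcal{F}$ and an i.i.d. sample $X_1, \dots, X_n$:

```latex
\mathfrak{R}_n(\mathcal{F})
  = \mathbb{E}_{X,\sigma}\left[\sup_{f \in \mathcal{F}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(X_i)\right],
\qquad \sigma_i \overset{\text{i.i.d.}}{\sim} \mathrm{Unif}\{-1, +1\}.
```

The empirical Rademacher complexity $\hat{\mathfrak{R}}_n(\mathcal{F})$ conditions on the sample instead of taking the expectation over $X$.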

It is useful to know that the Rademacher complexity arises as an intermediate quantity in proving a uniform deviation bound. **Symmetrization** reduces the problem of bounding the uniform deviation to that of bounding the Rademacher complexity. Once the problem is reduced in this way, various techniques can be applied to the Rademacher complexity itself.
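As a minimal numerical sketch of the quantity that symmetrization asks us to bound, one can estimate the empirical Rademacher complexity of a small finite class by Monte Carlo over the Rademacher signs. The threshold class and all names below are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a finite class of threshold functions f_t(x) = 1{x <= t}
# evaluated on a fixed sample (sizes here are arbitrary choices).
n = 200
x = rng.uniform(0.0, 1.0, size=n)
thresholds = np.linspace(0.0, 1.0, 21)
# F[j, i] = f_{t_j}(x_i): row j holds the j-th function's values on the sample.
F = (x[None, :] <= thresholds[:, None]).astype(float)

def empirical_rademacher(F, n_mc=2000, rng=rng):
    """Monte Carlo estimate of E_sigma sup_f (1/n) sum_i sigma_i f(x_i)."""
    n = F.shape[1]
    total = 0.0
    for _ in range(n_mc):
        sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
        total += np.max(F @ sigma) / n           # sup over the finite class
    return total / n_mc

r_hat = empirical_rademacher(F)
# For a finite class with |f| <= 1, Massart's lemma predicts
# at most sqrt(2 log 21 / n), roughly 0.17 here.
massart = np.sqrt(2.0 * np.log(len(thresholds)) / n)
```

The estimate typically lands well below the finite-class (Massart) bound, since the threshold functions are highly correlated on the sample.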

## Margin-based bounds, discretization, chaining, Dudley integral

There are various ways to bound the Rademacher complexity.

Historically, bounds based on **discretization** were developed first. Discretization bounds the Rademacher complexity by the metric entropy of the function class. However, it does not yield sharp rates: for example, the class of D-variate Lipschitz-continuous functions on [0, 1]^D gives the rate n^{-1/(2+D)}. For D = 1 this is n^{-1/3}, which falls short of the typical rate for smooth function classes, n^{-1/2}.
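For concreteness, the single-scale discretization bound (stated up to absolute constants) controls the Rademacher complexity via the covering number $N(\varepsilon, \mathcal{F}, L_2(P_n))$:

```latex
\mathfrak{R}_n(\mathcal{F})
  \lesssim \inf_{\varepsilon > 0}
    \left( \varepsilon
      + \sqrt{\frac{\log N\bigl(\varepsilon, \mathcal{F}, L_2(P_n)\bigr)}{n}} \right).
```

For D-variate Lipschitz functions on $[0,1]^D$ the metric entropy scales as $\log N(\varepsilon) \asymp \varepsilon^{-D}$; balancing the two terms gives $\varepsilon \asymp n^{-1/(2+D)}$ and hence the rate $n^{-1/(2+D)}$ quoted above.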

Then the **chaining argument** appeared, which yields a better rate dependence, essentially for free. The **Dudley integral** is the standard tool used in the chaining argument; although it was invented after the basic chaining idea, it is the default formulation and is attributed to Dudley.
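The Dudley entropy integral bound reads as follows (the constant 12 is one common normalization; it varies across references):

```latex
\mathfrak{R}_n(\mathcal{F})
  \le \frac{12}{\sqrt{n}} \int_0^{\infty}
      \sqrt{\log N\bigl(\varepsilon, \mathcal{F}, L_2(P_n)\bigr)}\;\mathrm{d}\varepsilon.
```

Summing discretization errors over all scales at once, rather than at a single scale, is what yields the improvement: for Lipschitz functions on $[0,1]$, the entropy $\varepsilon^{-1}$ makes the integral converge, restoring the $n^{-1/2}$ rate that single-scale discretization misses.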

Lastly, margin-based bounds, such as those for kernel methods, appeared. They bound the Rademacher complexity directly, so they do not require explicit treatment of the metric entropy. As a result, they do not necessarily exhibit a dependence on the input dimensionality.
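A standard example of such a direct bound: for an RKHS ball $\mathcal{F}_B = \{ f : \|f\|_{\mathcal{H}} \le B \}$ with kernel $k$, the empirical Rademacher complexity satisfies

```latex
\hat{\mathfrak{R}}_n(\mathcal{F}_B)
  \le \frac{B}{n} \sqrt{\sum_{i=1}^{n} k(x_i, x_i)}
  \le \frac{B \kappa}{\sqrt{n}}
  \qquad \text{if } \sup_x k(x, x) \le \kappa^2.
```

The input dimension never appears explicitly; it is absorbed into the norm bound $B$ and the kernel values.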

## Dimension dependence in the upper bounds

You may be wondering about the connection between margin-based bounds and classical upper bounds such as n^{-1/(2+d)}. In many statistics textbooks the input dimension appears explicitly in the rates, while you often don't encounter it in margin-based bounds.

In short, whether the dimensional dependence appears explicitly depends on the function class and on how you bound the Rademacher complexity.

Many classical bounds analyze the complexity of nonparametric smooth classes such as Hölder classes, which carry explicit smoothness parameters. Margin-based bounds, on the other hand, hide the smoothness of the functions inside the norms of the parameters. As a result, although the two approaches are parallel in that they impose different assumptions / classes in order to bound the Rademacher complexity, the dependence on dimension and smoothness is explicit in one (metric-entropy-based bounds) and implicit in the other (margin-based bounds).
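To make the explicit dependence concrete: for the Hölder class $C^\beta([0,1]^d)$ with a bounded Hölder norm, the classical metric entropy estimate is

```latex
\log N\bigl(\varepsilon, C^{\beta}([0,1]^d), \|\cdot\|_{\infty}\bigr)
  \asymp \varepsilon^{-d/\beta},
```

so plugging this into chaining gives, up to logarithmic factors, the rate $n^{-1/2}$ when $d < 2\beta$ and $n^{-\beta/d}$ when $d > 2\beta$: both the smoothness $\beta$ and the dimension $d$ appear explicitly.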

The important point is that the two approaches are parallel: they target different classes of functions and use different quantities (smoothness or parameter norms), but both are methods for bounding the Rademacher complexity.