[IEEE Trans. on Information Theory, May 1995, pp. 665-676]

Fixed Rate Universal Lossy Source Coding and Rates of Convergence for Memoryless Sources

Tamás Linder, Gábor Lugosi, and Kenneth Zeger

Abstract

A fixed-rate universal lossy coding scheme is introduced for i.i.d. sources. It is shown for finite-alphabet sources and arbitrary single-letter distortion measures that, as the sample size n grows, the expected distortion obtained using this universal scheme converges to Shannon's distortion-rate function D(R) at a rate $O(\log n / n)$. The scheme can be extended to universal quantization of real i.i.d. sources subject to a squared error criterion. It is shown in this case that the per-letter distortion converges to D(R) at a rate $O(\sqrt{\log n / n})$, both in expectation and almost surely, for any real-valued bounded i.i.d. source.
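To give a rough sense of how the two convergence rates compare, the following minimal sketch evaluates $\log n / n$ (finite-alphabet case) and $\sqrt{\log n / n}$ (bounded real-valued case) at a few sample sizes. The hidden constants in the $O(\cdot)$ bounds are set to 1 purely for illustration; the actual constants depend on the source and distortion measure.

```python
import math

def finite_alphabet_gap(n):
    """Illustrative O(log n / n) excess-distortion bound, constant set to 1."""
    return math.log(n) / n

def real_source_gap(n):
    """Illustrative O(sqrt(log n / n)) excess-distortion bound, constant set to 1."""
    return math.sqrt(math.log(n) / n)

# The square-root rate decays much more slowly than log n / n.
for n in (10**2, 10**4, 10**6):
    print(f"n = {n:>7}: log n / n = {finite_alphabet_gap(n):.2e}, "
          f"sqrt(log n / n) = {real_source_gap(n):.2e}")
```

Both quantities vanish as n grows, but the gap for real-valued sources shrinks only as the square root of the finite-alphabet gap.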