[IEEE Trans. on Information Theory, submitted November 13, 2023]
Competitive Advantage of Huffman and Shannon-Fano Codes
Spencer Congero and Kenneth Zeger
Abstract
For any finite discrete source, the competitive advantage of prefix code C1 over prefix code C2 is the probability that C1 produces a shorter codeword than C2, minus the probability that C2 produces a shorter codeword than C1. For any source, a prefix code is competitively optimal if it has a nonnegative competitive advantage over all other prefix codes.
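As a small illustrative sketch (ours, not from the paper), the competitive advantage of one prefix code over another depends only on the source pmf and the two lists of codeword lengths, so it can be computed directly; all names below are our own:

```python
def competitive_advantage(p, len1, len2):
    """P(C1 produces a shorter codeword than C2) minus P(C2 shorter than C1),
    where p is the source pmf and len1, len2 are the codeword lengths."""
    c1_shorter = sum(pi for pi, a, b in zip(p, len1, len2) if a < b)
    c2_shorter = sum(pi for pi, a, b in zip(p, len1, len2) if a > b)
    return c1_shorter - c2_shorter

# Dyadic source: every probability is a negative integer power of 2.
p = [1/2, 1/4, 1/8, 1/8]
huffman = [1, 2, 3, 3]   # Huffman codeword lengths for this source
fixed   = [2, 2, 2, 2]   # a fixed-length prefix code on 4 symbols
print(competitive_advantage(p, huffman, fixed))   # 0.25
```

Here the advantage is 0.5 - 0.25 = 0.25 ≥ 0, consistent with Cover's result that Huffman codes are competitively optimal on dyadic sources.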
In 1991, Cover proved that Huffman codes are competitively optimal for all dyadic sources. We prove the following asymptotic converse: as the source size grows, the probability that a Huffman code for a randomly chosen non-dyadic source is competitively optimal converges to zero.
We also prove: (i) for any source, competitively optimal codes cannot exist unless a Huffman code is competitively optimal; (ii) for any non-dyadic source, a Huffman code has a positive competitive advantage over a Shannon-Fano code; (iii) for any source, the competitive advantage of any prefix code over a Huffman code is strictly less than 1/3; (iv) for each integer n ≥ 3, there exist a source of size n and a prefix code whose competitive advantage over a Huffman code is arbitrarily close to 1/3; and (v) for each positive integer n, there exist a source of size n and a prefix code whose competitive advantage over a Shannon-Fano code approaches 1 as n → ∞.
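To make result (ii) concrete, here is a self-contained sketch (ours, not from the paper) that computes binary Huffman codeword lengths with a heap, takes Shannon-Fano lengths as ⌈log2(1/p_i)⌉, and checks that the Huffman code has a positive competitive advantage on a non-dyadic source:

```python
import heapq
import math

def huffman_lengths(p):
    """Binary Huffman codeword lengths for pmf p (minimal heap-based sketch)."""
    heap = [(pi, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    while len(heap) > 1:
        pa, ia = heapq.heappop(heap)   # two least probable subtrees
        pb, ib = heapq.heappop(heap)
        for i in ia + ib:              # every symbol under the merged node
            lengths[i] += 1            # gains one bit of depth
        heapq.heappush(heap, (pa + pb, ia + ib))
    return lengths

def shannon_fano_lengths(p):
    """Shannon-Fano codeword lengths: ceil(log2(1/p_i))."""
    return [math.ceil(-math.log2(pi)) for pi in p]

def competitive_advantage(p, len1, len2):
    """P(C1 shorter than C2) minus P(C2 shorter than C1)."""
    return (sum(pi for pi, a, b in zip(p, len1, len2) if a < b)
            - sum(pi for pi, a, b in zip(p, len1, len2) if a > b))

p = [0.4, 0.3, 0.2, 0.1]                   # non-dyadic source
hl = huffman_lengths(p)                    # [1, 2, 3, 3]
sf = shannon_fano_lengths(p)               # [2, 2, 3, 4]
print(competitive_advantage(p, hl, sf))    # positive, as in (ii)
```

On this source the Huffman code is strictly shorter on the symbols of probability 0.4 and 0.1 and never longer, so its advantage over the Shannon-Fano code is 0.5 > 0.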