COMPARING DIFFERENT ALGORITHMS is hard. For
almost any pair of algorithms and measure of
algorithm performance like running time or solution
quality, each algorithm will perform better than the
other on some inputs.a For example, the insertion sort
algorithm is faster than merge sort on already-sorted
arrays but slower on many other inputs. When two
algorithms have incomparable performance, how can
we deem one of them “better than” the other?
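The incomparability of insertion sort and merge sort can be seen concretely by counting comparisons instead of measuring wall-clock time. The following is a minimal sketch (not from the article); the two implementations are illustrative, not a standard library API.

```python
def insertion_sort(a):
    """Sort a copy of `a`; return (sorted list, number of comparisons)."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1                      # compare a[j-1] with a[j]
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comps

def merge_sort(a):
    """Sort a copy of `a`; return (sorted list, number of comparisons)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps

n = 1024
sorted_input = list(range(n))
reversed_input = list(range(n, 0, -1))

# On already-sorted input, insertion sort makes only n-1 comparisons,
# while merge sort still makes on the order of n log n.
print(insertion_sort(sorted_input)[1], merge_sort(sorted_input)[1])

# On reverse-sorted input the ranking flips: insertion sort makes
# roughly n^2/2 comparisons, far more than merge sort.
print(insertion_sort(reversed_input)[1], merge_sort(reversed_input)[1])
```

Neither algorithm dominates the other across all inputs, which is exactly the incomparability the text describes.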
Worst-case analysis is a specific modeling choice
in the analysis of algorithms, where the overall
performance of an algorithm is summarized by its
worst performance on any input of a given size. The
“better” algorithm is then the one with superior worst-case performance. Merge sort, with its worst-case
asymptotic running time of Θ(n log n) for arrays of
length n, is better in this sense than insertion sort,
which has a worst-case running time of Θ(n²).
a In rare cases a problem admits an instance-optimal algorithm, which is as good as every other algorithm on every input, up to a constant factor. For most problems, there is no instance-optimal algorithm, and there is no escaping the incomparability of different algorithms.
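Worst-case analysis summarizes an algorithm by its maximum cost over all inputs of a given size. For small sizes, that maximum can be computed exhaustively. The sketch below (my illustration, not the article's) confirms that insertion sort's worst case over all permutations of size n is n(n−1)/2 comparisons, matching its Θ(n²) worst-case bound.

```python
from itertools import permutations

def insertion_sort_comparisons(a):
    """Number of comparisons insertion sort makes on input `a`."""
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comps

for n in range(2, 7):
    # Worst-case analysis: take the maximum over every input of size n.
    worst = max(insertion_sort_comparisons(p) for p in permutations(range(n)))
    # The maximum is attained on reverse-sorted input: n(n-1)/2 comparisons.
    assert worst == n * (n - 1) // 2
    print(n, worst)
```

This "maximum over all inputs" is precisely the modeling choice worst-case analysis makes; the rest of the input distribution is ignored.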
While crude, worst-case analysis
can be tremendously useful, and it is
the dominant paradigm for algorithm
analysis in theoretical computer science. A good worst-case guarantee
is the best-case scenario for an algorithm, certifying its general-purpose
utility and absolving its users from understanding which inputs are relevant
to their applications. Remarkably, for
many fundamental computational
problems, there are algorithms with
excellent worst-case performance
guarantees. The lion’s share of an undergraduate algorithms course comprises algorithms that run in linear or
near-linear time in the worst case.
For many problems a bit beyond
the scope of an undergraduate course,
however, the downside of worst-case
analysis rears its ugly head. Here, I
review three classical examples where
worst-case analysis gives misleading
or useless advice about how to solve a
problem; further examples in modern
machine learning are described later.
These examples motivate the alternatives to worst-case analysis described
in the article.b
The simplex method for linear
programming. Perhaps the most famous failure of worst-case analysis
concerns linear programming, the
problem of optimizing a linear func-
b For many more examples, analysis frameworks, and applications, see the author’s lecture notes.
The need for deeply understanding
when algorithms work (or not)
has never been greater.
BY TIM ROUGHGARDEN
˽ Worst-case analysis takes a “Murphy's
Law” approach to algorithm analysis,
which is too crude to give meaningful
algorithmic guidance for many
important problems, including linear
programming, clustering, caching,
and neural network training.
˽ Research going “beyond worst-case
analysis” articulates properties of
realistic inputs, and proves rigorous and
meaningful algorithmic guarantees for
inputs with these properties.
˽ Much of the present and future
research in the area is motivated by
the unreasonable effectiveness
of machine learning algorithms.