and Linear Equations
By Bruce Hendrickson
Algorithmic advances can come
from the most unexpected places.
The following paper by Koutis, Miller,
and Peng is an elegant case in point.
It describes an emerging approach to
solving linear systems of equations
that relies heavily on techniques from graph theory.

OK, I admit it. Solving systems of
equations is one of those “supposed
to be good for you” topics that we all
had to suffer through but most of us
quickly forgot. But it also happens to
be hugely important. For decades, scientists and engineers have simulated
the world by solving linear systems.
Systems with billions of variables are
commonly solved, and this computation almost certainly consumes the
majority of supercomputing cycles in
the world. In recent years, linear systems have played a key role in page
ranking, data analysis, recommendation systems, and numerous other aspects of our data-centric world.
Back in high school we were all
taught how to solve such systems by
repeatedly subtracting one equation
from another to eliminate a variable.
Unfortunately, this simple approach
takes O(n³) time for n equations with
n variables. These days, large systems
are usually solved using a different
type of approach in which better and
better approximations to the answer
are computed until an acceptable
tolerance is achieved. The traditional
theory behind these iterative methods
relies on numerical analysis, not a
favorite class for most computer science students.
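The two approaches contrasted above can be sketched in a few lines of Python. This is a generic illustration, not code from the paper: plain elimination with back-substitution (the O(n³) method from school, here without pivoting for brevity), and a simple Jacobi iteration that refines an approximation until a tolerance is met. The matrix and the choice of Jacobi as the iterative method are illustrative assumptions.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve Ax = b by elimination: O(n^3) work.
    Assumes nonzero pivots (no pivoting, for brevity)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for i in range(n):                      # eliminate column i below the pivot
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):          # back-substitution
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

def jacobi(A, b, tol=1e-10, max_iter=10_000):
    """Iterate x <- D^{-1}(b - (A - D) x) until successive
    approximations agree to within the tolerance."""
    D = np.diag(A)                          # diagonal of A
    R = A - np.diag(D)                      # off-diagonal part
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# A small diagonally dominant system, for which Jacobi converges.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x_direct = gaussian_elimination(A, b)
x_iter = jacobi(A, b)
```

For a 3×3 system the difference is invisible, but at scale the direct method's cubic cost is prohibitive, which is why iterative methods dominate large-scale practice.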
In recent years, a small cadre of researchers has been building a novel
theoretical framework for analyzing
a very important class of linear systems. With this new approach, the
performance of an algorithm can
be evaluated using techniques from
graph theory—a discipline quite different from numerical analysis. The
authors describe the quirky history of
this line of research, starting with the
conceptual breakthrough by Vaidya
and continuing with the theoretical
heavy lifting provided by Spielman,
Teng, and collaborators.
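To make the graph–matrix connection concrete: the class of systems at the heart of this line of work involves graph Laplacians, symmetric diagonally dominant matrices built directly from weighted graphs, so graph-theoretic tools apply to the matrix itself. A minimal sketch, using an assumed toy edge list:

```python
import numpy as np

# A small weighted graph; each edge (u, v, w) contributes to the Laplacian L.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 1.0), (2, 3, 1.0)]
n = 4

L = np.zeros((n, n))
for u, v, w in edges:
    L[u, u] += w          # diagonal: weighted degree of each vertex
    L[v, v] += w
    L[u, v] -= w          # off-diagonal: minus the edge weight
    L[v, u] -= w

# L is symmetric and diagonally dominant, and every row sums to zero,
# so the all-ones vector lies in its null space.
row_sums = L @ np.ones(n)
```

Because the matrix and the graph are two views of the same object, questions about solving Lx = b can be recast as questions about the graph's structure, which is exactly the leverage the solvers discussed here exploit.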
Beyond its novelty value, this research affords insights that traditional
approaches cannot. Numerical analysis techniques (and the algorithms that
come from them) often have to assume
that the system of equations comes
from a grid with very simple connectivity. These assumptions are clearly not
true for data-centric applications like
those mentioned above. By contrast,
the new techniques are quite general
and impose no constraints on the
structure of the system of equations.
They lead to algorithmic approaches
and analysis tools that are unlike anything that has come before. In addition, theoretical computer scientists
think about computational complexity
in a different way than numerical analysts do, and so provide a new framework for algorithmic analysis.
This is a challenging area of research because it cuts across disciplines and requires a depth of expertise
in multiple areas. This paper provides
an accessible introduction to this rapidly evolving set of ideas. Its key contribution is a significantly simpler algorithm that retains desirable theoretical
properties. Unfortunately, “simpler”
is still probably too complex for the
numerical computing community to
adopt. But this work brings these ideas
much closer to practical realization.
Bruce Hendrickson (email@example.com) is Senior
Manager for Computational Sciences and Mathematics
at Sandia National Laboratories. He is also an affiliated
faculty member in the Computer Science Department at
the University of New Mexico.