formation about our pattern language is available at http://parlab.eecs.berkeley.edu/wiki/patterns/patterns.):
- Structural patterns describe the overall structure of a computation without constraining the actual computation itself. These include patterns such as pipe-and-filter, agent-and-repository, map-reduce, and static task graph, among others.
- Computational patterns describe several important classes of computation that arise frequently in computationally intensive applications. Computational patterns include linear algebra, spectral methods, and branch-and-bound.
- Algorithm strategy patterns describe ways of decomposing a computation into parallel units. These include data parallelism, speculation, and pipeline parallelism.
- Implementation strategy patterns describe ways of implementing parallel computations and their corresponding data structures. These include patterns such as loop parallelism, single program multiple data, master-worker, and shared queue.
- Concurrent execution patterns form the lowest level of our pattern language, and describe ways of interacting with parallel hardware. Examples include single instruction multiple data, thread pool, and message passing.
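As a concrete illustration of the map-reduce structural pattern named above, here is a minimal sequential sketch in Python. The function names are our own invention, not part of the pattern language; the pattern only constrains the structure (independent per-element maps, followed by a combining reduction), not the computation itself:

```python
from functools import reduce

def map_reduce(data, mapper, reducer):
    """Map-reduce structural pattern: apply `mapper` to each element
    independently, then combine the partial results with `reducer`.
    Because the maps are independent, a parallel implementation is
    free to distribute them across processors."""
    return reduce(reducer, map(mapper, data))

# Example: sum of squares.
total = map_reduce([1, 2, 3, 4], lambda x: x * x, lambda a, b: a + b)
# total == 30
```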
The pattern language helps us understand the parallelism in the applications we write, and it also helps us build tools that help other programmers take advantage of parallelism. Once we have a common vocabulary for discussing the parallelism we have found in our applications, it is only natural to build tools and libraries that use what we know about the various patterns to help programmers implement computations that conform to them. We call these tools and libraries frameworks.
FRAMEWORKS
Frameworks help programmers implement applications by providing libraries of useful computations, as well as
support for the intelligent composition
of these computations. Pattern-oriented frameworks allow computations to
be expressed only in harmony with a specific composition of patterns from our pattern language. By focusing on a particular composition of patterns, a pattern-oriented framework can take the computation expressed by the programmer and restructure it, using the knowledge it gains from the restrictions those patterns impose. This gives the framework the ability to take advantage of parallel hardware, while keeping the programmer focused on the problem domain they are interested in, rather than on the details of the parallel implementation.
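To sketch what such restructuring might look like, the following hypothetical example (all class and function names are invented for illustration; this is not an actual Berkeley framework API) shows a tiny pipe-and-filter framework. Because the pattern guarantees that stages are independent per-element transformations, the framework is free to restructure the user's code; here it fuses the stages into a single pass, though a real framework could equally run them in parallel:

```python
class PipeAndFilter:
    """Hypothetical pattern-oriented framework sketch: the programmer
    registers pure per-element filters, and the framework decides how
    to execute the composition."""

    def __init__(self):
        self.stages = []

    def stage(self, fn):
        """Register a filter stage; usable as a decorator."""
        self.stages.append(fn)
        return fn

    def run(self, data):
        # Restructure the computation: fuse all stages into one pass
        # over the data instead of materializing an intermediate list
        # after every stage.
        def fused(x):
            for fn in self.stages:
                x = fn(x)
            return x
        return [fused(x) for x in data]

pipeline = PipeAndFilter()

@pipeline.stage
def double(x):
    return 2 * x

@pipeline.stage
def increment(x):
    return x + 1

result = pipeline.run([1, 2, 3])   # [3, 5, 7]
```

The programmer writes only the domain logic (`double`, `increment`); the execution strategy lives entirely inside the framework, which is exactly the separation of concerns the paragraph above describes.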
We divide parallel frameworks into two main classes: application frameworks and programming frameworks. Application frameworks take advantage of domain-specific knowledge about a particular computational domain. They provide a library of components useful in that domain, following the computational as well as the structural patterns that are useful for composing these components into applications. We are in the process of creating application frameworks for speech recognition and computer vision, and early results are promising: applications created using these frameworks achieve both high performance and high programmer productivity.
SEJITS
There are many ways to build frameworks that capture high-level descriptions of a computation, construct parallel implementations of the computation, and then execute them on parallel hardware. A group of projects at Berkeley has joined together to use a common methodology and infrastructure for these frameworks, which we call SEJITS: selective embedded just-in-time specialization [2].
Since there will be many frameworks, each targeting different domains and compositions of patterns, it is important to reduce confusion over minor syntactic issues, in order to make learning how to use a new framework as familiar and low-overhead as possible. For this reason we create frameworks that are expressed in a subset of existing productivity languages, such as Python or Ruby. We