Technology | DOI: 10.1145/1897816.1897823
Chipping Away at Greenhouse Gases
Power-saving processor algorithms have the potential
to create significant energy and cost savings.
The information technology industry is in the vanguard of “going green.” Projects such as a $100 million hydro-powered high-performance data center planned for Holyoke, MA, and green corporate entities
such as Google Energy, the search giant’s new electrical power subsidiary,
are high-profile examples of IT’s big
moves into reducing the greenhouse
gases caused by computers.
However, the true benefits of such
projects are likely to be limited; most
users in areas supplied by coal, oil, or
natural gas-fired power plants would
likely find it difficult to change to a fully sustainable supply source.
Screenshot courtesy of MiserWare, Inc.
These market dynamics have not
been lost on government research directors. Agencies such as the U.S. National Science Foundation (NSF) have
begun encouraging just the sort of
research into component-level power
management that might bring significant energy savings and reduced climatic impact to end users everywhere
without sacrificing computational performance.
An intelligent power-management application, Granola uses predictive algorithms to dynamically manage frequency and voltage scaling in the chips of consumer PCs.

In fact, the NSF has held two workshops in the newly emphasized science of power management, one in 2009 and one in 2010. Krishna Kant, a
program director in the Computer Systems Research (CSR) cluster at the NSF,
says the power management project is
part of the NSF’s larger Science, Engineering, and Education for Sustainability (SEES) investment area.
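The dynamic frequency and voltage scaling that applications like Granola manage can be sketched as a simple predictive governor. The Python below is a hypothetical illustration, not MiserWare’s actual algorithm: it predicts the next interval’s CPU utilization with an exponentially weighted moving average, then selects the lowest frequency step whose capacity covers the predicted demand plus some headroom. The frequency table, `alpha`, and `headroom` values are all assumptions for the sketch.

```python
# Toy predictive DVFS governor (hypothetical sketch, not MiserWare's
# actual algorithm). Frequencies are assumed available P-states.
FREQ_STEPS_MHZ = [800, 1600, 2400, 3200]

def choose_frequency(util_history, alpha=0.5, headroom=0.1):
    """Pick the lowest frequency (MHz) that covers predicted demand.

    util_history: recent CPU utilizations (0.0-1.0), oldest first,
    measured relative to the fastest frequency step.
    """
    # Exponentially weighted moving average as the utilization predictor.
    prediction = util_history[0]
    for u in util_history[1:]:
        prediction = alpha * u + (1 - alpha) * prediction
    # Convert predicted utilization (plus headroom) into required MHz.
    demand_mhz = (prediction + headroom) * FREQ_STEPS_MHZ[-1]
    # Lowest step that meets demand saves power without starving the load.
    for f in FREQ_STEPS_MHZ:
        if f >= demand_mhz:
            return f
    return FREQ_STEPS_MHZ[-1]  # demand exceeds all steps: run flat out
```

Under light, steady load the governor settles on a low step; a sustained utilization spike pushes it to the top frequency, which is the intuition behind saving energy “without sacrificing computational performance.”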
“There are some fundamental questions that haven’t been answered, and
NSF funding might help answer them,”
Kant says. “These have been lingering for quite some time. For instance,
when you look at the question of how
much energy or power you really need
to get some computation done, there
has been some research, but it tends