news
Technology | DOI: 10.1145/2347736.2347742
Gregory Mone
Redesigning the Data Center
Faced with rising electricity costs, leading companies
have begun revolutionizing the way data centers work,
from the hardware to the buildings themselves.
Late last year, Stanford University researcher Jonathan Koomey released a report detailing a few surprising trends about the energy squanderers known as data centers. Previous
estimates suggested that electricity
consumption in massive server farms
would double between 2005 and 2010.
Instead, the number rose by 56% worldwide, and merely 36% in the U.S. The
slower-than-expected growth stemmed
from a number of changes, including a
stagnant economy and the rise of virtualization software.
Yet experts say a more fundamental
change is also starting to take effect—
one that could lead to much greater
improvements in efficiency. Over the
past seven or so years, leading companies have begun revising the way they
design, maintain, and monitor data
centers, from the physical building all
the way down to the hardware doing
the computation. Recently, Google,
Facebook, and other major companies
have begun releasing details on the efficiency of their facilities, and revealing
a few of the technological tricks they
have devised to achieve those gains.
Still, these leaders are the exception rather than the rule. There are no
solid estimates of the total number of
data centers in the U.S., and the Silicon Valley giants are secretive about
exactly how many they operate, but
they hardly dominate from an energy
standpoint. In all, U.S. facilities consume between 65 and 88 billion kilowatt hours per year, and Google, for
instance, accounts for less than 1% of
that figure.
The fact remains that the average
data center is still largely inefficient.
The standard measure of a data center’s efficiency is its PUE, or power
usage effectiveness. PUE is the total
energy used to operate a data center
divided by the amount devoted to actual computing. That total includes
lighting, fans, air conditioners, and
even electrical losses as power is transferred from the grid to physical hardware. Ideally, a data center would run
at a PUE of 1.0, and all of the electricity
would go toward computing. Yahoo!,
Facebook, and Google have all touted
facilities scoring below 1.1. Across industries, though, these numbers are
hardly common. “What happens in the
typical data center is that it’s more like
2.0,” explains Koomey.
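To make the arithmetic concrete, here is a worked example; the megawatt-hour figures are hypothetical, chosen only to match the typical and best-case PUE values cited above:

\[
\mathrm{PUE} \;=\; \frac{\text{total facility energy}}{\text{energy devoted to computing}}
\;=\; \frac{2.0\ \text{MWh}}{1.0\ \text{MWh}} \;=\; 2.0
\]

By the same arithmetic, a facility reporting a PUE of 1.1 spends only 0.1 MWh on lighting, cooling, and electrical losses for every 1.0 MWh of computing, so roughly 90% of the power it draws reaches the servers.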
Until recently, most companies did not even bother measuring PUE. They had little sense of how and where energy was used or lost within the facility. “The primary reason all of this happens is because there’s not great accounting of the energy in data centers,” says Raju Pandey, the chief technical officer of Synapsense, a Folsom, CA-based company that performs data center optimizations. “There’s an incredible amount of wastage.”
Heating Up
If you had walked into the average
data center 10 years ago, you would
have needed a sweater. The American
Society of Heating, Refrigerating and
Air-Conditioning Engineers recommended these facilities be maintained
at temperatures between 60 and 65 degrees Fahrenheit to prevent the equipment inside from overheating. And the
machines that cool the space are often
inefficient. “Traditional data centers
basically have the same air-conditioning unit you’d put in your house,” says
Bill Weihl, Facebook’s manager of energy efficiency and sustainability.
The rationale was that warmer temperatures could lead to hardware failures, but several experts doubted this was actually the case. When Google began planning a new breed of data center in 2004, the company started testing the temperature limits of its hardware. “We started running our servers warmer and monitoring the failure rates,” says Joe Kava, Google’s director of data center operations. In the end, Google simply did not see any major problems. “The servers ran just fine,” he adds, “and if you know your servers can run at 80 degrees, you can redesign your cooling system entirely.”
Google found that it could avoid relying on giant air-conditioning units, as did other companies. The most efficient data centers now hover at temperatures closer to 80 degrees Fahrenheit, and instead of sweaters, the technicians walk around in shorts. Facebook’s data centers in Lulea, Sweden and Prineville, OR, have no me-