amount of electrical power and generate a tremendous amount of heat. To
support these technologies, worldwide spending on data center power exceeded
US$30 billion in 2008 [3], when an average data center consumed as much energy
as 25,000 households [4]. About 15% of these costs are due to removing the heat
generated throughout the infrastructure [5]. The situation is critical since
the numbers are growing. In 2010, worldwide data center consumption
reached 1.5% of global electricity use, having increased by 56% since 2005 [6].
To meet this demand, major players in the data center and high-end computing
markets often negotiate energy deals with electricity suppliers to build or
upgrade power substations near, or immediately next to, their computing
facilities. Alternatively, when sufficient power infrastructure cannot be built
at or near a computing facility, many companies move their computing
facilities to the power source (e.g., Google [7] and Microsoft [8]).
In addition to the economic impact of excessive energy consumption, the
environmental impact has also affected the data center community. The heat
and the carbon footprint of powering and cooling these systems are
dramatically harming the environment. According to Mullins [9], US data
centers use about 59 billion kWh of electricity, costing more than US$4.1
billion and releasing 864 million metric tons of CO2 into the atmosphere.
Both research and industry have recently proposed several approaches to
tackle the power consumption issue in data center facilities. Industry has
begun to shift the goal from raw performance to energy efficiency, reporting
not only FLOPS but also FLOPS per watt, measuring the average power
consumption while executing the LINPACK (HPL) benchmark [10]. Today,
rankings such as the Green500 list [2] are gaining importance. Likewise,
leading companies around the world, such as Google, IBM, and Amazon, are
implementing measures to make their data centers more efficient and
beginning to measure the power usage effectiveness (PUE) of their facilities.
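As a rough illustration (a sketch, not from the source), the FLOPS-per-watt metric used by the Green500 is simply the sustained benchmark performance divided by the average system power draw during the run; the machine figures below are hypothetical:

```python
def flops_per_watt(sustained_flops: float, avg_power_watts: float) -> float:
    """Energy-efficiency metric: sustained performance (FLOP/s, e.g.
    measured while running the HPL benchmark) divided by the average
    system power draw (watts) over the same run."""
    if avg_power_watts <= 0:
        raise ValueError("average power must be positive")
    return sustained_flops / avg_power_watts

# Hypothetical system: 33.9 PFLOP/s sustained at 17.8 MW average power.
efficiency = flops_per_watt(33.9e15, 17.8e6)
print(f"{efficiency / 1e9:.2f} GFLOPS/W")  # prints 1.90 GFLOPS/W
```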
PUE is one of the most representative metrics; it is defined as the facility's
total power consumption divided by the power consumed by the computing
(IT) equipment. A PUE close to 1 means the data center devotes most of its
power to the computing infrastructure rather than losing it or spending it on
cooling devices. The average PUE for 2011 was around 1.83, which does not
represent a sufficient reduction for sustainable infrastructures. According to
Amazon data center estimations [11], operational costs of the servers reach
53% of the budget, while energy costs add up to 42%, broken down into
cooling (19%) and power consumption of the infrastructure (23%). Therefore,
the cooling problem must be addressed to restrain this upward trend [3]
and to keep these technologies within the limits of sustainability.
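The PUE metric described above can be sketched as follows; the facility figures are hypothetical, chosen to reproduce the 2011 average cited in the text:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by the
    power delivered to the IT (computing) equipment. A value of 1.0
    would mean every watt drawn reaches the computing infrastructure."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility drawing 915 kW in total, of which 500 kW reaches
# the servers; the rest is lost to cooling, power conversion, and lighting.
print(f"PUE = {pue(915.0, 500.0):.2f}")  # prints PUE = 1.83
```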
Researchers have done a substantial amount of work to address these issues
and provide energy-aware computing environments. From the data room
perspective, previous work addressed the power consumption problem at the
resource manager level, optimizing cooling costs by assigning longer tasks to
servers with lower inlet temperatures [5]. From the information technology
(IT) perspective, research has proposed solutions to