Answer
We multiply the critical load of 8 MW by the PUE and by the average power
usage from Figure 6.13 to calculate the average power usage:

8 MW x 1.45 x 80% = 9.28 MW
The monthly cost for power then goes from $475,000 in Figure 6.14 to $205,000
at $0.03 per kilowatt-hour and to $1,015,000 at $0.15 per kilowatt-hour. These
changes in electricity cost change the hourly server costs from $0.11 to $0.10 and
$0.13, respectively.
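The arithmetic above can be checked with a short script. The PUE of 1.45 and the 80% average power usage are the Figure 6.13 inputs (not reproduced in this excerpt), and 730 is the average number of hours in a month:

```python
# Sketch of the electricity-cost calculation above; the PUE and the average
# power usage come from Figure 6.13 (not reproduced in this excerpt).
CRITICAL_LOAD_KW = 8_000   # 8 MW critical load
PUE = 1.45                 # power usage effectiveness (Figure 6.13)
AVG_UTILIZATION = 0.80     # average power usage (Figure 6.13)
HOURS_PER_MONTH = 730      # 8760 hours per year / 12 months

# Average power actually drawn from the utility.
avg_power_kw = CRITICAL_LOAD_KW * PUE * AVG_UTILIZATION   # 9280 kW

def monthly_power_cost(price_per_kwh):
    """Monthly electricity bill at a given $/kWh rate."""
    return avg_power_kw * HOURS_PER_MONTH * price_per_kwh

for price in (0.03, 0.07, 0.15):
    print(f"${price:.2f}/kWh -> ${monthly_power_cost(price):,.0f}/month")
```

At $0.03, $0.07, and $0.15 per kilowatt-hour this yields roughly $203,000, $474,000, and $1,016,000 per month, which round to the $205,000, $475,000, and $1,015,000 figures quoted in the text.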
Example
What would happen to monthly costs if the amortization times were all made
to be the same—say, 5 years? How does that change the hourly cost per server?
Answer
The spreadsheet is available online at http://mvdirona.com/jrh/TalksAndPapers/
PerspectivesDataCenterCostAndPower.xls. Changing the amortization time to 5
years changes the first four rows of Figure 6.14 to
Servers                            $1,260,000    37%
Networking equipment                 $242,000     7%
Power and cooling infrastructure   $1,115,000    33%
Other infrastructure                 $245,000     7%
and the total monthly OPEX is $3,422,000. If we replaced everything every 5
years, the cost would be $0.103 per server hour, with more of the amortized costs
now being for the facility rather than the servers, as in Figure 6.14.
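The amortized rows above follow the standard level-payment (annuity) formula. A minimal sketch, assuming a server CAPEX of roughly $66.7 million and a 5% annual cost of money as the spreadsheet's inputs (neither figure is stated in this excerpt):

```python
# Monthly amortized payment via the standard annuity formula. The $66.7M
# server CAPEX and the 5% annual cost of money are assumed spreadsheet
# inputs, not figures stated in this section.
def monthly_payment(principal, annual_rate, years):
    """Level monthly payment that retires `principal` over `years`."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

servers_capex = 66_700_000        # assumed server CAPEX
print(f"${monthly_payment(servers_capex, 0.05, 5):,.0f}/month")
```

With a 5-year term this comes to roughly $1.26 million per month, matching the Servers row in the modified table; shortening the term raises the monthly charge, which is why longer amortization shifts cost share away from the servers.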
The rate of $0.11 per server per hour can be much less than the cost for many companies that
own and operate their own (smaller) conventional datacenters. The cost advantage of WSCs
led large Internet companies to offer computing as a utility where, like electricity, you pay
only for what you use. Today, utility computing is better known as cloud computing.