My teams have invested substantial time and effort into improving our power usage effectiveness (PUE), the ratio of total facility power consumption, including cooling and transformer losses, to the power actually used by computing equipment.
As the graphic above shows, BIDMC has achieved a PUE of 1.82, which is low compared with many corporations. We've implemented cold aisle containment, floor tile ventilation, and hot air recapture to substantially reduce our Computer Room Air Conditioning (CRAC) load, matching the average of most green computing initiatives.
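For readers who want to try the calculation themselves, here's a minimal sketch of the PUE ratio described above. The kilowatt figures are illustrative assumptions chosen to land near our 1.82, not our actual metered loads.

```python
# Minimal sketch of the PUE calculation: total facility power divided by
# the power consumed by the computing equipment itself.
# The kW figures below are illustrative, not BIDMC's metered values.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 1,000 kW overall while computing equipment
# consumes 550 kW comes out at roughly the 1.82 figure cited above.
print(round(pue(1000, 550), 2))  # 1.82
```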
Despite all our efforts, we are limited by the constraints of the standard commercial hardware we run and the building we use.
Facebook has designed its own buildings and created its own servers via its Open Compute Project. Initial power usage effectiveness ratios are 1.07, compared with an average of 1.5 for its existing facilities.
Here's an overview of how they did it.
They've removed uninterruptible power supplies and centralized chilling units, which we cannot do because of architectural/engineering limitations of our building design. We're likely to achieve a PUE of 1.5, but could only achieve 1.07 by opening a new, fresh-built data center.
Here's a look at the kind of energy efficiency that cloud providers are achieving by creating dedicated mega data center buildings.
On April 28, I'm keynoting the Markley Group's annual meeting and you can be sure that I'll include power and cooling in my list of the things that keep me up at night.
Congratulations, Facebook!
2 comments:
John, those PUE figures are interesting, but I can't take PUEs to the bank. How about translating them into a business case? For example: what would be the payback time on your investment if you DID build a "fresh-built data center" and lowered your figure from 1.5 to something like 1?
Given a $250,000 per year electrical spend in one of my data centers, a PUE of 1.07 is more than $100,000 in annual savings.
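For context, here is the back-of-the-envelope arithmetic behind that estimate, a rough sketch assuming the $250,000 annual spend reflects the current 1.82 PUE:

```python
# Rough savings estimate, assuming the $250,000 annual electrical spend
# corresponds to today's PUE of 1.82 and the target is 1.07.

current_spend = 250_000   # total annual electrical spend ($)
current_pue = 1.82        # today's ratio
target_pue = 1.07         # Open Compute-class facility

it_load_cost = current_spend / current_pue        # ~$137,000 for the IT load alone
spend_at_target = it_load_cost * target_pue       # ~$147,000 total at PUE 1.07
annual_savings = current_spend - spend_at_target  # ~$103,000 per year

print(f"Estimated annual savings: ${annual_savings:,.0f}")
```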