Data centers are a linchpin of the modern economy. Server rooms power small- to medium-sized businesses, enterprise data centers support major corporations, and server farms host cloud computing services. Keeping up with the explosive growth of digital content, big data, e-commerce, and Internet traffic has made data centers one of the fastest-growing consumers of electricity in developed countries.
In fact, data centers consume nearly 2 percent of the world's electricity supply at any given time, and 37 percent of that amount goes to keeping computing equipment cool. Not only is this a drain on the power grid, but it also taxes the water supply. A 15-megawatt data center can use up to 360,000 gallons of water a day, more than half the water in an Olympic-size swimming pool.
Data center power consumption is on the rise, increasing 56 percent worldwide and 36 percent in the U.S. from 2005 to 2010. These substantial energy demands come at a price, and controlling operational costs in data centers has been a persistent challenge. IT systems are designed to ramp up and down with a business's use, yet data center cooling systems historically were not designed to do the same.
Traditional data centers can incur excessive energy expenses from three main cost drivers:
- Over-building a data center
- Underutilizing the data center that has been built
- Inefficiently using cooling technology