The Present: Data Centers Today
Today, data centers consume about 2% of U.S. power generation. As a result, everyone from the company CFO to the Environmental Protection Agency (EPA) is interested in a data center’s footprint and wants to reduce it, both economically and environmentally. How much power do all data centers consume in total?
Just one major company data center may draw approximately 40 megawatts. To put that in perspective, that’s power enough for approximately 20,000 to 40,000 homes. Energy used to run servers and cool them within all U.S. data centers is the equivalent of a full year’s output from five 1,000 megawatt power plants. Or put another way, enough to power 5 million homes. To put it bluntly … modern data centers are “energy hogs!”
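The household-equivalence figures above can be sanity-checked with simple arithmetic. A minimal sketch, assuming an average continuous load of about 1 kW per home (an illustrative assumption; actual per-home averages vary by region and season, which is why the text gives a 20,000 to 40,000 range):

```python
# Quick consistency check of the household-equivalence figures,
# assuming a continuous average household load of ~1 kW.
# AVG_HOME_KW is an assumption for illustration, not a cited figure.

AVG_HOME_KW = 1.0  # assumed average continuous draw per home, in kW

# One major company data center at ~40 MW:
data_center_mw = 40.0
homes_per_data_center = data_center_mw * 1000 / AVG_HOME_KW
print(f"One 40 MW data center  ~ {homes_per_data_center:,.0f} homes")

# Five 1,000 MW power plants running year-round:
plants_mw = 5 * 1000.0
homes_from_plants = plants_mw * 1000 / AVG_HOME_KW
print(f"Five 1,000 MW plants   ~ {homes_from_plants:,.0f} homes")
```

At 1 kW per home this reproduces the upper end of the 40,000-home figure and the 5-million-home equivalence; at 2 kW per home the data center figure drops to the 20,000-home lower end.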
Unfortunately, from 2000 to 2010, not much happened in the way of intelligent data center evolution. A major disconnect emerged, and still remains, among component manufacturers, data center builders, facility managers, and users in terms of needs and solutions.
Regardless of the industry involved, the first and most important step in building a new or expanded data center is a commitment to careful planning and collaboration among its many stakeholders. At the very least, this should involve the organization’s IT department leadership, its facilities management department, and the project’s CM/GC, architect, and data center planning consultants.
The typical data center cooling system consumes up to 50% of the total facility power. Up to 50% of the cooling system power is wasted due to design flaws and overly conservative safety margins for the IT equipment. These common conditions make the cooling system one of the best opportunities to increase overall data center efficiency. After full accounting, up to 25% of total power can be recovered by improving cooling system design and operation for the typical data center.
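The 25% figure above follows directly from multiplying the two preceding percentages. A minimal sketch of that arithmetic, using the 40 MW facility cited earlier as an illustrative example (the 50% fractions are the "up to" bounds from the text, not measurements of any particular site):

```python
# Sanity check of the cooling-savings arithmetic: up to 50% of facility
# power goes to cooling, and up to 50% of that cooling power is wasted,
# so up to 50% * 50% = 25% of total power is recoverable.

def recoverable_power_mw(total_mw, cooling_fraction=0.50, waste_fraction=0.50):
    """Power (MW) recoverable by eliminating cooling-system waste.

    cooling_fraction: share of total facility power spent on cooling.
    waste_fraction:   share of that cooling power lost to design flaws
                      and overly conservative safety margins.
    """
    return total_mw * cooling_fraction * waste_fraction

total = 40.0  # MW, the major-company data center figure cited earlier
saved = recoverable_power_mw(total)
print(f"Cooling power: {total * 0.50:.0f} MW")
print(f"Recoverable:   {saved:.0f} MW ({saved / total:.0%} of total)")
```

For the 40 MW example this yields 20 MW of cooling power, of which 10 MW, or 25% of the facility total, is recoverable in the best case; real-world savings depend on how far a given site sits from those upper bounds.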