The Internet's Hidden Energy Hogs: Data Servers

Data servers run by Google, Yahoo, and others use lots of power, but there's a new push to conserve

CHANTILLY, VA.—The scene is one part Battlestar Galactica, one part detention center. Inside, picture a tall, metal fence—a cage, really—with a doorway guarded by a uniformed security officer. Behind it, row after row of whirring, blinking machines. Data servers, stacked on top of one another, stand about 8 feet high. Neon lights glow. Fiber-optic cables debouch like grapevines.

SoftLayer's newest data center sits in a nondescript building in an office park in suburbia. No big sign announces its presence. Deliberately anonymous, the facility gives no hint of its essential role in the digital age. Or of its appetite for energy.

In the paper-to-pixel revolution, server farms like SoftLayer's are the foot soldiers, and Northern Virginia is one of the major hubs of activity. More than 50 percent of all Internet traffic in the United States flows through this tech-rich region. Twenty-four hours a day, seven days a week, hundreds of thousands of servers here rapidly transmit E-mails, process Internet search queries, safeguard classified data, handle online financial transactions, and store videos and medical records. And suck up megawatts.

These server farms are growing fast, fed by an apparently recession-proof demand for electronic information. Virginia-based Dominion Power estimates that by 2012, fully 10 percent of all the electricity it sends to Northern Virginia will be gobbled up by these data centers. Ashburn, Va., about 30 miles west of Washington, D.C., is home to servers used by Yahoo, AOL, AT&T, and dozens of others, and it is a particularly big energy hog. Nationally, data center electricity use more than doubled between 2000 and 2006 and is expected to double again by 2011.
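
Those doubling figures translate into double-digit annual growth. A quick back-of-the-envelope check, written as a small Python sketch; the compound-growth formula is standard, and the doubling spans are the article's own:

```python
# Compound annual growth rate implied by "doubling over N years."
# Illustrative arithmetic only; the spans come from the article's figures.

def annual_growth_rate(multiple: float, years: int) -> float:
    """Return the yearly rate that produces `multiple` growth over `years`."""
    return multiple ** (1 / years) - 1

print(f"2000-2006, more than doubled: at least {annual_growth_rate(2, 6):.1%} per year")
print(f"2006-2011, doubling again:    about {annual_growth_rate(2, 5):.1%} per year")
```

That works out to at least 12 percent a year in the first span and roughly 15 percent a year in the second.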

The energy implications of this growth are huge. Power companies are starting to fret about predictions that they could run short of generating capacity as early as 2011, raising the specter of rolling blackouts. Energy costs for the companies that operate these constantly running machines are climbing. And with increasing concern about greenhouse gases, server farms are attracting the same sort of furrowed-brow scrutiny as other heavy energy users.

All these factors are prompting a frenzied, ambitious effort to make these facilities more energy efficient. There have been isolated attempts at efficiency for the better part of the past decade, but in the past year or two, some of the field's giants have started to share data. One is Google, which is notorious for being tight-lipped about its data-center operations, refusing to reveal even the locations of most of its facilities. Lately, though, the company has begun to speak out about energy efficiency as the industry grows. "We've got close to 1.5 billion people online, which is a lot, but it's only 20 percent of the world's population," says Erik Teetzel, Google's energy program manager. "So, yes, we are going to need more computing power, but you don't necessarily need to use more energy."

Cooling towers. There's obviously ample room for improvement. In 2005, researchers at the Lawrence Berkeley National Laboratory found that "a single high-powered rack of servers consumes enough energy in a single year to power a hybrid car across the United States 337 times." In the average data center, in fact, only about half of the energy is used by processors themselves; the other half is gobbled up to cool the facility. Google says it has been able to reduce the energy needed to regulate the temperature in its facilities to about 20 percent of the industry average, in part by using water-based cooling towers.
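
The industry's shorthand for this ratio is power usage effectiveness, or PUE: total facility power divided by the power that actually reaches the computing gear. Here is a minimal sketch of that arithmetic using the round numbers above; the 1-megawatt load and the function are hypothetical illustrations, not figures from Google or SoftLayer:

```python
# PUE = total facility power / IT (computing) power.
# Cooling overhead is expressed as a fraction of the IT load.
# All inputs are illustrative assumptions, not measured data.

def pue(it_load_kw: float, overhead_fraction: float) -> float:
    """Power usage effectiveness given a cooling/overhead fraction."""
    total_kw = it_load_kw * (1 + overhead_fraction)
    return total_kw / it_load_kw

it_load = 1000.0  # a hypothetical 1 MW of servers

# Average facility, per the article: cooling roughly equals the IT load.
print(f"Typical data center:        PUE = {pue(it_load, 1.0):.1f}")

# Google's claim: cooling cut to about 20% of the industry average.
print(f"Cooling at 20% of average:  PUE = {pue(it_load, 0.2):.1f}")
```

By that rough math, a facility whose cooling bill matches its computing bill runs at a PUE of 2.0, while one that cuts cooling to a fifth of that lands near 1.2.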

Some energy gains can come from simple structural changes. At SoftLayer's Chantilly facility, which opened last May, the servers are stacked in dense vertical racks, the racks are arranged in rows, and the rows alternate between hot and cold aisles. Vents in the floor of the cold aisles release chilled air across the front faces of the servers; the hot aisles collect the exhaust from their back sides. The racks are also tightly packed, holding up to 44 servers apiece. The higher the density of the servers, the fewer square feet there are to cool.
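
The floor-space logic behind that density push can be sketched in the same back-of-the-envelope style. In this hypothetical comparison, the 10,000-server count, the 25-square-foot rack footprint, and the 20-per-rack alternative are all made-up inputs; only the 44-servers-per-rack figure comes from the article:

```python
import math

# Hypothetical floor-space comparison: same server count, two rack densities.
# Only the 44-per-rack figure is from the article; the rest are assumptions.

def floor_area_sqft(servers: int, per_rack: int, sqft_per_rack: float = 25.0) -> float:
    """Floor space to cool: each rack plus its share of the surrounding aisles."""
    racks = math.ceil(servers / per_rack)
    return racks * sqft_per_rack

servers = 10_000
print(f"20 servers per rack: {floor_area_sqft(servers, 20):,.0f} sq ft to cool")
print(f"44 servers per rack: {floor_area_sqft(servers, 44):,.0f} sq ft to cool")
```

Packing the same machines into fewer, denser racks more than halves the floor area the chillers have to serve, though denser racks also concentrate the heat, which is exactly what the alternating hot and cold aisles are there to manage.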