Monday, November 12, 2007

Taming the Guzzlers That Power the World Wide Web

NY Times
November 7, 2007

By MATTHEW L. WALD

BEHIND every Google search, direct deposit, MapQuest request and rant on a blog is a data center crammed with machines called servers and, behind them, a power plant.

And much as computer-processing speed doubles every 18 to 24 months, energy consumption by data centers keeps climbing, too. It doubled between 2001 and 2006 and — unless energy efficiency becomes a priority — will do so again by 2011, according to a study this year by the Environmental Protection Agency.
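
For scale, doubling over five years works out to roughly 15 percent growth a year. The quick calculation below makes that arithmetic explicit; it uses only the round figures quoted here, not the E.P.A.'s underlying data.

```python
# Rough arithmetic behind the doubling figures quoted from the E.P.A. study.
# The five-year span is taken from the article; everything else follows from it.
years_to_double = 5
annual_growth = 2 ** (1 / years_to_double) - 1   # ~0.149, about 15 percent a year
print(f"Implied annual growth rate: {annual_growth:.1%}")

# If consumption doubles again between 2006 and 2011, the 2011 level
# would be roughly four times the 2001 level.
print(f"2011 level relative to 2001: {2 * 2}x")
```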

By last year, data centers scattered from northern Virginia to Washington State were consuming 1.5 percent of the nation’s electricity supply, the E.P.A. study says, straining the system in areas where power demand is high.

“The amount of energy spent on data centers is huge, and it’s not really very well understood,” said Brian Brouillette, a vice president of Hewlett-Packard, one of several companies that supply data centers with energy-efficient equipment and information.

Companies tend to be secretive about how much it costs to run their servers, but several experts said that energy can account for 40 percent of the cost of operating a data center. In three years, the cost of running a server can top its purchase price.
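
A back-of-the-envelope calculation shows how a few years of electricity can rival a server's purchase price. The wattage, electricity rate and price in the sketch below are illustrative assumptions, not figures from the article.

```python
# Hypothetical back-of-the-envelope check of the claim that three years of
# electricity can top a server's purchase price. Every number here is an
# illustrative assumption, not a figure from the article.
server_watts = 450          # assumed average draw, including a share of cooling
hours_per_year = 24 * 365
price_per_kwh = 0.10        # assumed electricity rate, dollars per kWh
purchase_price = 1_000      # assumed purchase price of the server, dollars

annual_cost = server_watts * hours_per_year / 1000 * price_per_kwh
three_year_cost = 3 * annual_cost

print(f"Electricity over three years: ${three_year_cost:,.0f}")
print(f"Purchase price:               ${purchase_price:,}")
```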

Not that data center managers, eager to expand their facilities, think about energy first.

“It’s not misunderstood,” Mr. Brouillette said. “It’s just nonunderstood.”

In the dark ages, circa 2000, servers consisted of stand-alone computers, each drawing hundreds of watts, stacked in racks 19 inches wide and 6 feet high. Today the demand for the Web is so high that the racks are organized in ever-expanding “farms” — like rows of corn — with each rack holding perhaps dozens of “blades,” or motherboards with a single processor.

A farm typically draws 10,000 watts or more, generating heat with every watt it consumes. For every watt used to run the computers, the farm may need nearly another watt for air-conditioning.
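
That near watt-for-watt cooling overhead roughly doubles a farm's draw from the grid. The sketch below works out the total; the 10,000-watt computing load is the figure quoted above, while the cooling ratio and electricity rate are assumptions.

```python
# Minimal sketch of the farm-level arithmetic: the 10,000-watt computing load
# comes from the article; the cooling ratio and electricity rate are assumptions.
it_load_watts = 10_000      # computing load of one farm, from the article
cooling_ratio = 0.9         # assumed: nearly a watt of cooling per watt of computing
price_per_kwh = 0.10        # assumed electricity rate, dollars per kWh

total_watts = it_load_watts * (1 + cooling_ratio)
annual_kwh = total_watts * 24 * 365 / 1000
annual_cost = annual_kwh * price_per_kwh

print(f"Total draw including cooling: {total_watts:,.0f} W")
print(f"Annual energy: {annual_kwh:,.0f} kWh, about ${annual_cost:,.0f} per year")
```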

It costs $4.5 billion a year for the electricity to run the nation’s server farms, according to the E.P.A., a sum ultimately picked up by consumers.

The environmental news isn’t all bad. When people use the Internet to shop or to work at home, they are using less energy to move themselves and their goods around, offsetting some of the energy used for computers. But without improvements, the study said, server farms will require new power plants around the country, something environmentalists and some power companies want to avoid.

With aggressive improvements, consumption in 2011 could fall back to the levels of 2001, the study said. But even an enlightened data center manager would have problems identifying which servers are most efficient. Unlike cars, refrigerators and washing machines, servers are not required to meet federal energy standards. The Energy Star program, which identifies especially efficient products, evaluates many pieces of computer equipment, but not servers.

Managers who buy data center equipment, including servers and related hardware, “need objective, credible energy performance information,” the E.P.A. said.

Besides the savings from lower utility bills and the benefits from reducing power plant emissions, many data centers have a pressing reason to improve efficiency. “An increasing number of data centers in both the private and public sectors are hitting their limits in terms of space, power and cooling capacity,” Lowell Sachs, the senior manager of federal government affairs at Sun Microsystems, said in a written statement. In other words, many centers cannot increase capacity without increasing energy efficiency (or spending a fortune to build more farms).

One problem, again, has to do with the priorities of data center managers, who choose servers based on speed, performance and reliability but generally not on energy use.

“In many data centers, those responsible for purchasing and operating the I.T. equipment are not the same people that are responsible for the power and cooling infrastructure, who, in turn, typically pay the utility bills,” the E.P.A. said. “This leads to a split incentive, in which those who are most able to control the energy use of the I.T. equipment (and therefore the data center) have little incentive to do so.”

Or, as Mr. Brouillette put it, “Facilities professionals and I.T. professionals don’t have a lot of reason to talk to each other.”

Unbeknown to most consumers (although obvious to anyone who has opened up a computer), computers convert alternating current into direct current, which in turn is converted to various voltages. In the process, “a desktop PC will waste half the energy it pulls from the wall,” said William E. Weihl, the director of energy strategy at Google. That is one reason Google assembles its own servers, choosing components that lose less energy, he said.

Manufacturers are beginning to pay attention. “Not many years ago, typical was 75 to 80 percent efficient,” said Richard DuBois, the senior vice president of marketing at Emerson, a St. Louis company that makes power and cooling equipment for data centers. “We’re pretty religiously running in excess of 90 percent right now.”
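
The gap between a 75 percent and a 90-plus percent efficient power supply adds up across a data center, since every wasted watt also has to be removed as heat. The sketch below compares the two at the quoted efficiencies; the 10-kilowatt load is an assumption for illustration.

```python
# Comparison of wall-socket draw for an older and a newer power supply at the
# efficiencies quoted in the article. The 10-kilowatt load is an assumption.
load_watts = 10_000                      # assumed load delivered to the electronics

def wall_draw(load_w, efficiency):
    """Watts pulled from the wall to deliver load_w to the components."""
    return load_w / efficiency

for label, eff in [("older supply, ~75% efficient", 0.75),
                   ("newer supply, ~92% efficient", 0.92)]:
    drawn = wall_draw(load_watts, eff)
    print(f"{label}: draws {drawn:,.0f} W, wastes {drawn - load_watts:,.0f} W as heat")
```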

And Hewlett-Packard has said that by 2010 it will improve the energy efficiency of some of its servers by 50 percent over 2005 models.

Personal computers use some energy-saving strategies that do not work well for servers. Some save power by shutting down idle disk drives, so a laptop may hesitate for a moment when the user wants data. Servers cannot afford that wait, Mr. DuBois said.

Efficiency experts see savings potential in the chips that drive servers and PCs alike. Some chips use software to turn off sections that are not in use, reducing immediate power consumption and cooling requirements; improved hardware is reducing electricity leaks. There is also room for improvement in the windowless rooms that house servers, which must be kept cool or risk breakdowns. These chambers tend to use modern air-conditioning equipment but distribute the cool air in ways that are not optimal, experts say.

“If you don’t have a profound understanding of where the hot spots are, you just overchill everything,” Mr. Brouillette said. “It’s like if in your house you kept the air-conditioning on full tilt because one room had poor air circulation.”

His company and others offer thermal mapping — or color-coded pictures indicating temperatures around the room — to help data centers direct cool air to the right places.

Meanwhile, Google is planning to put one of its server farms out in farm country: Council Bluffs, Iowa. It’s a good location for windmills, the company said.
