An exclusive gaming-industry community targeted at, and designed for, professionals, businesses,
and students in the gaming, new media, and web sectors, and the businesses and industries
closely related to them.
A rich, content-driven service including articles,
contributed discussion, news, reviews, networking, downloads,
and debate.
We strive to cater to cultural influencers,
technology decision-makers, early adopters, and business leaders in the gaming industry.
A medium to share or contribute your ideas,
experiences, questions, and points of view, or to network
with colleagues here at iVirtua Community.
A new study from the Lawrence Berkeley National Laboratory, released today, reveals that the electricity used by server computers doubled between 2000 and 2005. The report, which appears to be the most definitive assessment of data center energy consumption yet produced, underscores the growing importance of energy efficiency in effective IT management. The report's author, Jonathan Koomey, told Technology Review,
Quote:
"I was surprised by the doubling. I expected some growth, but not quite as large."
The increase in power consumption is largely attributable to the proliferation of cheap servers, according to the study:
Quote:
Almost all of this growth was the result of growth in the number of the least expensive servers, with only a small part of that growth being attributable to growth in the power use per unit. Total power used by servers represented about 0.6% of total U.S. electricity consumption in 2005. When cooling and auxiliary infrastructure are included, that number grows to 1.2%, an amount comparable to that for color televisions. The total power demand in 2005 (including associated infrastructure) is equivalent (in capacity terms) to about five 1000 MW power plants for the U.S. and 14 such plants for the world. The total electricity bill for operating those servers and associated infrastructure in 2005 was about $2.7 B and $7.2 B for the U.S. and the world, respectively.
The estimate that servers account for 1.2 percent of overall power consumption in the U.S. is, as the San Francisco Chronicle reports, considerably lower than some previous estimates, which put data center power consumption as high as 13 percent of total U.S. consumption. It should be noted that the study, underwritten by AMD, looks only at power consumption attributable to servers, which represents about 60% to 80% of total data center power consumption. Electricity consumed by storage and networking gear is excluded. The study also excludes custom-built servers, such as the ones used by Google. The number of servers Google runs is unknown but is estimated to be in the hundreds of thousands.
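The capacity and cost figures quoted from the report hang together under some quick arithmetic. A minimal sanity check, assuming the five 1000 MW plants run continuously, U.S. 2005 retail electricity sales of roughly 3,660 TWh, and an average electricity rate of about $0.06/kWh (the latter two figures are assumptions, not from the report):

```python
# Back-of-envelope check of the report's capacity and cost figures.
HOURS_PER_YEAR = 8760

# Five 1000 MW plants running flat-out for a year, converted to TWh:
us_server_twh = 5 * 1000e6 * HOURS_PER_YEAR / 1e12
print(f"U.S. server + infrastructure energy: {us_server_twh:.1f} TWh/yr")

# Assumed: approximate U.S. retail electricity sales in 2005.
us_total_twh = 3660
share = us_server_twh / us_total_twh
print(f"Share of U.S. consumption: {share:.1%}")  # ~1.2%, matching the report

# Assumed average rate of ~$0.06/kWh:
cost_billion = us_server_twh * 1e9 * 0.06 / 1e9
print(f"Implied annual cost: ${cost_billion:.1f} B")
```

The result lands near the 1.2 percent share and $2.7 billion cost the report cites, which suggests the "five 1000 MW plants" framing is a direct restatement of the consumption estimate rather than an independent number.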
We need to find more energy-efficient resources quickly! I read somewhere that 0.200 g of uranium-235 can power an entire city for a year. Why don't we switch to an energy source like this?
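That uranium figure is worth a quick check. A minimal sketch, assuming complete fission of every atom at roughly 200 MeV per fission and 100% conversion to electricity (both wildly optimistic for a real reactor), suggests 0.200 g of U-235 holds only a few megawatt-hours, closer to a household's annual use than a city's:

```python
# Rough check of the "0.200 g of U-235 powers a city for a year" claim.
AVOGADRO = 6.022e23        # atoms per mole
MOLAR_MASS_U235 = 235.0    # g/mol
MEV_PER_FISSION = 200.0    # assumed energy release per fission event
JOULES_PER_MEV = 1.602e-13

atoms = 0.200 / MOLAR_MASS_U235 * AVOGADRO
energy_j = atoms * MEV_PER_FISSION * JOULES_PER_MEV
energy_mwh = energy_j / 3.6e9  # 1 MWh = 3.6e9 J
print(f"Energy in 0.200 g of U-235: ~{energy_mwh:.1f} MWh")
```

At a few MWh, even under these ideal assumptions, the figure is far short of a city's annual demand, though the broader point about energy density stands: gram for gram, fission dwarfs any chemical fuel.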