In the past few years, the rapid adoption of electronics and computing in both the consumer marketplace and the commercial sector has led to exploding energy consumption. According to a new study from leading IT consulting firm Gartner, over 70% of IT operations managers are now examining the issues of power and cooling more closely. The reason for this is quite simple: as servers and computers become more powerful, they consume a great deal more energy. At present, it is estimated that power can account for between 40% and 50% of the cost of running a data center.
There are also significant gains to be had by switching to newer-generation processors from the two major manufacturers on the market, AMD and Intel. Both have new generations of chips that boast twice the performance while using half the power. Another strategy is to examine peak usage: a new IBM study suggests that, on average, only about 10% of PC processing power is used during the day. These strategies, among a host of others being laid out for companies to examine, could result in savings of 50% or more for a data centre, meaning that the average data centre could save up to $1 million USD.
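As a rough illustration of how these figures combine, here is a minimal back-of-envelope sketch in Python. The $2 million annual operating cost is an assumed figure chosen only to make the arithmetic concrete; the power-share and power-reduction fractions are taken from the estimates discussed above, and the exact interpretation of the "50% savings" claim is an assumption rather than something the article spells out.

```python
# Back-of-envelope estimate of data centre savings from power reductions.
# The operating cost below is a hypothetical figure; the power-share range
# (40-50%) and the idea of halving the power bill come from the article.

def estimate_savings(annual_operating_cost, power_share, power_reduction):
    """Return estimated annual savings (USD) from cutting power consumption.

    annual_operating_cost: total yearly cost of running the data centre (USD)
    power_share: fraction of that cost attributable to power (e.g. 0.4-0.5)
    power_reduction: fraction of the power spend eliminated by the strategies
    """
    power_cost = annual_operating_cost * power_share
    return power_cost * power_reduction


if __name__ == "__main__":
    # Hypothetical $2M/year facility where power is 50% of costs and the
    # combined strategies halve the power bill.
    savings = estimate_savings(2_000_000, power_share=0.5, power_reduction=0.5)
    print(f"Estimated annual savings: ${savings:,.0f}")  # -> $500,000
```

Under these assumed inputs the savings come to $500,000 per year; reaching the article's $1 million figure would require a larger facility or deeper cuts, which is consistent with its framing of "up to" $1 million for an average data centre.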