To many of its users, the internet simply seems to be “there.” Web surfers may be aware of the energy consumption of their own computers and mobile devices, but they rarely if ever consider that the data they are accessing is being served by expansive server farms, some of which use more energy than a small town. Data centers consume about 1.5 percent of all electricity generated in the United States, and that share is expected to rise year after year.

One Third of California’s Electricity Must Be Solar

North Carolina’s Catawba County has approved Apple’s plan to regrade more than 170 acres for a solar panel array that will feed electricity to its energy-hungry new local data center, and many server farm operators are looking to solar as their zero-carbon-footprint renewable source of choice. Inland California is swiftly becoming one of the world’s primary solar power centers, largely thanks to Governor Jerry Brown’s commitment to generate one third of the state’s massive electrical needs through solar energy. The main problem with locating server farms in the Mojave Desert is the heat: temperatures regularly soar past 120 degrees F, demanding enormous current draws just to keep the servers cool enough to operate. And if the electricity is instead transmitted hundreds of miles to California’s coastline, which enjoys somewhat milder ambient temperatures, transmission losses eat into the benefits of the solar generation.
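
To get a feel for how much transmission distance matters, here is a back-of-the-envelope resistive-loss (I²R) sketch. The voltage, line resistance, and distance figures are illustrative assumptions, not data for any actual California transmission corridor.

```python
# Back-of-the-envelope transmission loss estimate using a simple
# resistive model. The voltage, line resistance, and distance below
# are illustrative assumptions, not real grid data.

def line_loss_fraction(power_mw: float, voltage_kv: float,
                       ohms_per_km: float, distance_km: float) -> float:
    """Fraction of transmitted power lost as resistive (I^2 * R) heat."""
    power_w = power_mw * 1e6
    voltage_v = voltage_kv * 1e3
    current_a = power_w / voltage_v              # I = P / V
    loss_w = current_a ** 2 * ohms_per_km * distance_km
    return loss_w / power_w

# Example: 500 MW sent 400 km at 345 kV on a 0.03 ohm/km line
frac = line_loss_fraction(500, 345, 0.03, 400)
print(f"Estimated transmission loss: {frac:.1%}")   # roughly 5%
```

Even at high transmission voltage, a several-hundred-kilometer run can shave off a mid-single-digit percentage of the power, which is exactly the margin that makes siting generation near the load attractive.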

Hours of Sunshine, Ray Angle & Ambient Temperatures

Locations such as the highlands around Arica, Chile, or Ica, Peru, have the advantage of nearly 99 percent of possible sunshine hours, a near-equatorial position for an optimal sun ray angle, and sufficient elevation to avoid extremely high temperatures, but their remoteness means they have not yet been tapped for solar server farms. Sun ray angle is also the prime factor eliminating the very arid Arctic territories from consideration, even though their extremely favorable cold temperatures would minimize the energy required to cool hot-running server farms. Another advantage of these sorts of locations is that they are often subject to consistently higher-than-average wind speeds, which allows wind turbines to supplement the solar production.
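
A crude noon-on-the-equinox model shows why latitude, as a proxy for sun ray angle, matters so much. The latitudes below are approximate, and the model ignores atmosphere, season, and panel tilt; it is only a sketch of the cosine effect.

```python
import math

# Why sun ray angle matters: at solar noon on an equinox, the solar
# zenith angle roughly equals the site's latitude, and irradiance on a
# horizontal surface scales with cos(zenith). This ignores atmosphere,
# season, and panel tilt; latitudes below are approximate.

SOLAR_CONSTANT = 1361.0  # W/m^2 at the top of the atmosphere

def noon_equinox_irradiance(latitude_deg: float) -> float:
    """Horizontal-surface irradiance at solar noon on an equinox."""
    zenith = math.radians(abs(latitude_deg))
    return SOLAR_CONSTANT * math.cos(zenith)

for site, lat in [("Arica, Chile", -18.5), ("Ica, Peru", -14.1),
                  ("Mojave Desert", 35.0), ("Arctic Circle", 66.5)]:
    print(f"{site:>14}: {noon_equinox_irradiance(lat):5.0f} W/m^2")
```

The near-equatorial sites collect close to the full solar constant at noon, while a panel at the Arctic Circle sees only about 40 percent of it even before weather and polar winters are taken into account.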

Iceland Is Becoming a Primary Server Farm Choice

British cloud computing company Colt is building a data center in Iceland, of all places, but it’s not as crazy as it may seem at first. Iceland rides astride a volcanic hotspot and derives virtually all of its electrical supply from renewable geothermal and hydroelectric sources, and the low ambient temperatures year round help hold down the air conditioning costs. Reykjavik’s temperatures consistently average between 30 and 55 degrees F, and the capital has not seen an 80 degree day in recorded history. Its position halfway across the North Atlantic Ocean also allows companies in both North America and Europe to operate their web services from a single optimal location.
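
The cooling advantage can be quantified with the industry’s power usage effectiveness (PUE) metric, the ratio of total facility power to IT power. The PUE values and server load below are illustrative assumptions, not figures for Colt’s facility or any other.

```python
# PUE (power usage effectiveness) = total facility power / IT power.
# The PUE values and server load below are illustrative assumptions:
# roughly 1.8 for a heavily chilled hot-climate site versus 1.15 for
# a free-cooled northern site.

IT_LOAD_KW = 2000        # hypothetical server load
HOURS_PER_YEAR = 8760

def annual_kwh(pue: float, it_kw: float = IT_LOAD_KW) -> float:
    """Total facility energy per year for a given PUE."""
    return pue * it_kw * HOURS_PER_YEAR

hot_site = annual_kwh(1.8)
cold_site = annual_kwh(1.15)
print(f"Hot-climate site: {hot_site:,.0f} kWh/yr")
print(f"Cold-climate site: {cold_site:,.0f} kWh/yr")
print(f"Savings: {hot_site - cold_site:,.0f} kWh/yr "
      f"({(hot_site - cold_site) / hot_site:.0%})")
```

Under those assumptions, a site that can cool itself with outside air trims total facility energy by roughly a third for the exact same computing workload.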

Modern Server CPUs Slash Electricity Requirements

Server-specific CPUs such as Intel’s 8-core Sandy Bridge Xeons and AMD’s 16-core Interlagos Opterons have taken massive leaps forward in minimizing energy consumption as their performance-per-watt ratios skyrocket. Applying the latest microprocessor design technologies can cut a data center’s power consumption by up to 90 percent compared with older CPUs. Still, the data needs of the world’s burgeoning online population continue to grow, and with the advent of cloud computing, demand for server farms is forecast to keep growing indefinitely.
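
The arithmetic behind such savings is straightforward: if a new part does several times the work per watt, the same workload needs proportionally less power. The workload and improvement figures below are hypothetical, not benchmarks of the Xeon or Opteron parts named above.

```python
# How performance-per-watt gains translate into facility power savings.
# The workload and the 5x improvement factor are hypothetical, chosen
# to illustrate the arithmetic rather than benchmark any real CPU.

OLD_PERF_PER_WATT = 1.0    # normalized units of work per watt
NEW_PERF_PER_WATT = 5.0    # assumed generational improvement
FLEET_LOAD = 100_000.0     # total units of work the farm must sustain

old_watts = FLEET_LOAD / OLD_PERF_PER_WATT
new_watts = FLEET_LOAD / NEW_PERF_PER_WATT
print(f"Power for the same workload: {old_watts:,.0f} W to {new_watts:,.0f} W")
print(f"Reduction: {1 - new_watts / old_watts:.0%}")
```

A fivefold efficiency gain cuts power for a fixed workload by 80 percent; reaching a 90 percent cut requires roughly a tenfold improvement, which is plausible when replacing hardware that is several generations old.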

The majority of data centers and server farms, unfortunately, still rely on primary grid power and consume only an insignificant amount of solar or other renewable energy. The newest data centers are tripping all over each other to declare themselves fully renewable, but most server farms are facilities that were designed in the last century and do not reflect the realities of today’s preferred energy-generating profiles. It costs millions of dollars to update servers to the most modern and energy-efficient microprocessors, so it is an expenditure that some data centers try to delay as long as possible… even though the electricity savings could go a long way towards paying for the upgrades.
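
A simple payback calculation makes that last point concrete. Every figure below (upgrade cost, continuous power saved, and electricity price) is a hypothetical assumption chosen for illustration.

```python
# Simple payback estimate for a server upgrade. Every figure here is
# a hypothetical assumption: upgrade cost, continuous power saved,
# and the industrial electricity rate.

UPGRADE_COST_USD = 5_000_000
POWER_SAVED_KW = 1_500      # continuous power reduction after upgrade
PRICE_PER_KWH = 0.10        # assumed industrial rate, USD
HOURS_PER_YEAR = 8760

annual_savings = POWER_SAVED_KW * HOURS_PER_YEAR * PRICE_PER_KWH
print(f"Annual electricity savings: ${annual_savings:,.0f}")
print(f"Simple payback period: {UPGRADE_COST_USD / annual_savings:.1f} years")
```

Under those assumptions the upgrade pays for itself in under four years, which suggests the delay has more to do with capital budgeting than with the underlying economics.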