Google – 1.68m tonnes CO2 in 2011

Google has released data about its carbon footprint, saying it emitted 1.68m tonnes of carbon dioxide in 2011, not counting carbon offsets.

The breakdown was 1.44m tonnes from purchased electricity to run data centres and offices; 29,500 tonnes from vehicles including Street View cars; and 208,000 tonnes from business travel, employee commuting, server manufacturing, data centre construction and fuel for offices.

Emissions per $1m revenue were 44.3 tonnes, compared to 49.3 tonnes in 2010 and 54.7 tonnes in 2009.
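
As a rough cross-check, the intensity figure is simply total emissions divided by revenue in millions of dollars. A minimal sketch, assuming Google’s reported 2011 revenue of roughly $37.9bn (a figure not given in the article):

```python
# Back-of-envelope check of the emissions intensity figure.
# The emissions total is from the article; the revenue is an assumption.

total_emissions_tonnes = 1_680_000   # 1.68m tonnes CO2 in 2011
revenue_millions_usd = 37_900        # ~$37.9bn 2011 revenue (assumed)

intensity = total_emissions_tonnes / revenue_millions_usd
print(f"{intensity:.1f} tonnes CO2 per $1m revenue")  # ~44.3, matching the article
```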

“Without efficiency measures in our data centers our footprint would have been about twice as big,” it said.

Its data centres use 50 per cent of the energy of ‘typical’ data centres, thanks to ‘smart’ temperature controls, the use of outside air or water for cooling, and comprehensive measurement of energy use.

The “Power Usage Effectiveness”, calculated as total energy use divided by the energy needed to drive the servers, is 1.14, compared with a typical industry figure of 2, the company says.
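
To make the ratio concrete, here is a minimal sketch of the PUE calculation; the energy figures are purely illustrative, and only the 1.14 ratio comes from the article:

```python
# Illustrative PUE calculation. A PUE of 1.14 means roughly 14% extra
# energy goes on cooling, power conversion and other overheads for
# every unit consumed by the servers themselves.

it_energy_kwh = 1_000_000   # energy used by the servers (illustrative)
overhead_kwh = 140_000      # cooling, conversion losses, lighting (illustrative)

pue = (it_energy_kwh + overhead_kwh) / it_energy_kwh
print(f"PUE = {pue:.2f}")   # 1.14, versus a 'typical' industry figure of 2
```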

It minimises the number of times power is converted from one type of current to another, and keeps power supplies as close to the load as possible.

The company buys renewable energy and “high quality” carbon offsets, which it says bring its net carbon impact to zero.

100 Google searches generate 20g of CO2, as much carbon dioxide as you’d generate drying your hands with an electric dryer, ironing a shirt or making 1.5 tablespoons of orange juice, the company says.

3 weeks of watching YouTube non-stop would generate 3kg of CO2, equivalent to the emissions from one load of laundry.
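
Those figures reduce to simple per-unit numbers; the sketch below is just unit conversion applied to the quantities in the article:

```python
# Per-unit arithmetic from the article: 100 searches = 20g CO2,
# and 3 weeks of non-stop YouTube viewing = 3kg CO2.

g_per_search = 20 / 100                  # 0.2 g CO2 per search
viewing_hours = 3 * 7 * 24               # 3 weeks = 504 hours
g_per_hour = 3 * 1000 / viewing_hours    # 3 kg spread over those hours

print(f"{g_per_search:.1f} g CO2 per search")          # 0.2 g
print(f"{g_per_hour:.1f} g CO2 per hour of YouTube")   # ~6.0 g
```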

SynapSense – helping to optimise data centres

SynapSense Corporation of Folsom, California, says it has made improvements to its data centre optimisation system.

The company provides systems to improve management of data centres, to reduce energy consumption, and also to ensure system reliability.

Its “ThermaNode DX” can monitor the evaporator and cooling coils in computer room air conditioning units, helping to identify preventative maintenance opportunities.

The system has been used by GE to monitor energy use of its data centres.

“While pressure mounts to reduce operating costs, managing a critical facility has always been and always will be about operational resiliency,” says Steve Doublett, Principal Technologist, Datacenters at GE.

“The SynapSense Data Center Optimization Platform provides the visibility and trending necessary to do both.

“At GE, we used the SynapSense solution to improve the distribution of cooling throughout the data center, reduce cooling costs, and all the while, ensure that server inlet temperatures continuously comply with ASHRAE and internal thresholds established by IT.”

There are also tools to optimise loads, recover stranded power and identify available power in real time.

Green Mountain – the world’s greenest data centre?

Green Mountain, a colocation data centre facility just north of Stavanger, Norway, opening in early 2013, claims to be the world’s “greenest data centre”.

The facility was formerly used for NATO ammunition storage; no one knows for sure, but it may have held nuclear munitions. It is buried deep inside a mountain, starting 100m from the tunnel entrance.

Today, the facility is being offered as a place to run your high performance computing, with 21,500 m2 of space.

You get the benefit of Norway’s hydroelectric electricity supply, so the data centre’s enormous power consumption produces no fossil fuel emissions.

Cooling uses water drawn from 100m deep in the neighbouring fjord, which stays at 8 degrees C all year round, so the only power consumed in cooling the high density racks is for pumping the chilled water around the facility.

The pumping energy is minimised by using the siphon effect, like a toilet flush: because the holding tank, the size of an Olympic swimming pool, sits below sea level, water is drawn up from the depths with very little pumping.

The overall “Power Usage Effectiveness”, calculated as the total power consumption of the plant (including cooling, lighting, pumping and IT equipment) divided by the power consumption of the IT equipment, is around 1.19, well below the typical UK range of 1.4-1.8, so significantly less power is required to run the data centre.

The site is designed so that if someone launches a missile into it from the sea, the missile will travel straight into a special missile holding area and miss your servers; to hit them, it would need to turn right once inside the mountain.

The centre is owned by Peder Smedvig, a local entrepreneur who previously founded Smedvig, an operator of offshore oil rigs that was subsequently sold to Seadrill in 2006.

The storage site is sealed and kept under a special air mixture, with oxygen at a level you can breathe but not high enough to support a fire (similar to the air on aeroplanes), which means fires cannot start.

There are 3 separate fibre optic paths from the centre to Stavanger city, and from there, 2 different routes to the UK and 2 routes to Denmark.

Power availability at the site is estimated at 99.99997 per cent, with 3 independent feeds of grid power plus standby diesel generators installed for ultimate power security.
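
To see what that availability figure means in practice, a quick conversion into expected downtime per year (straight arithmetic on the quoted percentage):

```python
# Convert the quoted availability into expected downtime per year.

availability = 0.9999997                   # 99.99997 per cent
seconds_per_year = 365.25 * 24 * 3600      # ~31.6m seconds

downtime_seconds = (1 - availability) * seconds_per_year
print(f"~{downtime_seconds:.0f} seconds of downtime per year")  # ~9 seconds
```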

Power costs have been estimated at 40 per cent less than UK costs, and can be fixed for 10 years.

Data latency from London is 6.5 milliseconds and from Amsterdam 12 milliseconds.
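
As a sanity check on the latency figure, the implied fibre path length can be estimated; this assumes the 6.5 milliseconds is one-way (the article does not say) and that light travels at roughly 200,000 km/s in optical fibre:

```python
# Rough sanity check on the quoted London latency.
# Assumptions: one-way latency, ~200,000 km/s signal speed in fibre.

speed_in_fibre_km_s = 200_000
latency_s = 6.5e-3

implied_path_km = speed_in_fibre_km_s * latency_s
print(f"Implied fibre path: ~{implied_path_km:.0f} km")  # ~1300 km
```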

The company argues that it makes much better sense to run your data centres near a source of hydroelectric power, because there are losses associated with moving power around the world, but no losses associated with moving data.

Or to put it another way: if you were based in London and wanted to use ‘green’ power for your data centre, siting the centre in London would mean transmission losses in bringing the power from Norway, whereas siting it in Norway would mean no transmission losses in moving the data from Norway to London.
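
A toy comparison makes the argument concrete; the load and the 7 per cent transmission-loss figure below are illustrative assumptions, not numbers from the article:

```python
# 'Move the data, not the power': comparing generation needed in Norway
# for a London-sited versus Norway-sited data centre. The load and the
# 7% line-loss figure are illustrative assumptions.

demand_mw = 10.0     # hypothetical data centre load
line_loss = 0.07     # assumed loss moving power from Norway to London

needed_if_sited_in_london = demand_mw / (1 - line_loss)
print(f"Centre in London: generate {needed_if_sited_in_london:.2f} MW in Norway")
print(f"Centre in Norway: generate {demand_mw:.2f} MW and move only the data")
```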

Also involved as “partners” are Norwegian IT company Evry Stavanger, engineering consultancy COWI Nordic, and power company LYSE.

There are 4 separate underground halls, and an administration building some distance away.

The space is sold by the square metre at rates competitive with European cities, and can accommodate server cabinets with power requirements of 20kW or more.

National Australia Bank white paper on reducing carbon

The National Australia Bank has published a white paper on how it is applying technical solutions to reduce carbon emissions.

It has calculated the projected “power usage effectiveness” of a new data centre it is building, showing that the ‘worst case scenario’ will be 1.35. Power usage effectiveness is calculated as the total power used by the centre (including cooling and lighting) divided by the power used to run the servers.

The company says it has achieved a great deal of efficiency through ‘virtualisation’ (consolidating many workloads onto fewer physical servers, instead of running each application on its own machine).

“The goal is to realize 90% virtualization of servers across the company by the time the project is completed,” the bank says.

The company says it has been carbon neutral since 2010.

Projects include heating, cooling and lighting adjustments, major refurbishments at office buildings, automatic switch-off of PCs not in use, and the design and installation of a trigeneration plant (providing power, heat and cooling).