At first glance, the old barge docked at the Mare Island Naval Shipyard looks like it's just taking a breather on its slow journey to the scrap yard.
Most of the browned, pitted metal panels that form the deck are slightly warped, creating hundreds of tiny reflective pools. A few have been removed, creating man-sized holes that nearly landed this unwary journalist in the bowels of the ship. Below deck, machines roar and hiss as workers scoop out the barge's insides. Over the next couple of months, the vessel will be transformed into a state-of-the-art world first. Other companies have tried to build their own versions of this ship, and until now, all have failed.
"You're seeing the future," says Kirk Horton. "You're seeing the revolution."
Seeing the revolution takes some imagination.
The vice-president of Nautilus Data Technologies leads me across the deck carefully, avoiding the puddles and holes. The age and condition of the barge are part of the plan; the company says it intends to retrofit only pre-owned vessels as part of its commitment to environmental sustainability. Horton says in about five months, this ship — certified as sea-worthy by the U.S. Coast Guard — will be fully operational. Already companies have bought space for their servers.
"This is the world's first highly efficient, highly sustainable waterborne data centre," says Horton.
Whether they're the size of airplane hangars or tiny closets tucked away in the basement, data centres house rows and rows of disk arrays and routers — the building blocks of the internet — that store and transmit our data.
Using the ocean for efficiency
What makes this data centre so special isn't just that it's in the ocean, but the fact that it will be cooled by the very water upon which it floats.
"So this is our heat exchange," says Arnold Magcale, the company's CEO. We step into a small shack next to the barge which houses a miniature version of what will be installed on the barge. It was used to prove that his concept actually works.
He points to the pipes that run behind the server racks. The water in the pipes absorbs heat, then is expelled back into the ocean while cool water is drawn in. A virtuous circle, he says, that has passed every environmental assessment so far.
"What we're doing here is moving water versus moving air, which is five times more efficient," says Magcale.
It can save companies as much as 40 per cent on their energy bill, adds Horton.
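The 40 per cent figure can be framed with back-of-the-envelope arithmetic using power usage effectiveness (PUE), the industry's ratio of total facility energy to IT energy. The PUE values below are illustrative assumptions for the sketch, not numbers from Nautilus:

```python
# Illustrative sketch of how lower cooling overhead yields ~40% savings.
# The PUE figures and the 1 MW load are assumptions, not Nautilus data.

def annual_energy_kwh(it_load_kw: float, pue: float, hours: float = 8760) -> float:
    """Total facility energy = IT load x PUE x hours of operation."""
    return it_load_kw * pue * hours

it_load_kw = 1000.0      # hypothetical 1 MW of servers
pue_air = 2.0            # older air-cooled facility: one unit of cooling per unit of IT
pue_water = 1.2          # assumed figure for direct water cooling

air = annual_energy_kwh(it_load_kw, pue_air)
water = annual_energy_kwh(it_load_kw, pue_water)
savings = 1 - water / air

print(f"Air-cooled:   {air:,.0f} kWh/year")
print(f"Water-cooled: {water:,.0f} kWh/year")
print(f"Savings:      {savings:.0%}")
```

With these assumed PUEs the savings come out to exactly 40 per cent, matching the scale of Horton's claim.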
"With the advent of big data, as cloud technology further progresses, you're going to see more and more advanced IT technology — the server infrastructure, the equipment, the storage devices — they will continue to draw more and more power," says Horton.
The Nautilus barge — located about 40 kilometres northeast of San Francisco — is an attempt to solve a problem most people didn't even know existed.
'The new modern-day factories'
Every time you update your Facebook profile, every time you email a friend, every time you stream your favourite show, somewhere in a dark room in a building far away, lights flicker, servers whir and air conditioners roar. Every year, we use more data. Every year, the number of data centres grows. And every year, those data centres use more electricity.
"Data centres are the new modern-day factories," says Mukesh Khattar, technical executive with the Electric Power Research Institute, an organization funded by the electrical utility industry.
In 2000, before the prevalence of streaming companies like Netflix, data centres accounted for one per cent of U.S. power consumption, he says. By 2015, that number tripled.
"That number's increasing continuously," says Khattar. "And you can see that. Everybody has a cellphone these days, everybody has a portable device. All of these devices are connected in the back-end to a data centre."
Inside the data centres, the servers generate so much heat that if they're not kept cool, they melt.
"For every unit of energy that goes into powering IT in an average data centre, you need another unit of energy to cool the data centre down," says Pierre Delforge, director of high-tech sector efficiency at the Natural Resources Defense Council, a non-profit environmental advocacy group.
If you think of each data centre as a plane taking off, Delforge says, only about 10 per cent of the seats — the servers — are used. That's because data centres are designed to handle "peak load," which is the maximum amount of traffic they're expected to experience, like a rush of customers on Cyber Monday.
"The problem is the other 364 days in the year, they're still running all the servers," says Delforge. "They're not powering them down when they're not needed."
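Delforge's airplane analogy reduces to simple arithmetic: a fleet sized for its single busiest day runs nearly empty the rest of the year. The demand and capacity numbers below are hypothetical, chosen only to reproduce the roughly 10 per cent utilization he describes:

```python
# Hypothetical sketch of the peak-provisioning problem Delforge describes.
# A data centre sized for its busiest day keeps every server powered on,
# even when a typical day's demand is a small fraction of the peak.

peak_demand = 10_000        # requests/s the site must survive (e.g. Cyber Monday)
per_server_capacity = 100   # requests/s one server can handle (assumed)
avg_demand = 1_000          # an ordinary day: one-tenth of peak (assumed)

servers_provisioned = peak_demand // per_server_capacity       # always running
servers_needed_daily = avg_demand // per_server_capacity       # actually busy

utilization = avg_demand / peak_demand
print(f"Provisioned servers:        {servers_provisioned}")
print(f"Needed on an average day:   {servers_needed_daily}")
print(f"Average utilization:        {utilization:.0%}")
```

Ninety of the hundred "seats" fly empty on a normal day, yet all hundred draw power year-round.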
Energy efficiency as an afterthought?
Out there, he says, is a robot army close to 15 million strong, waiting for orders that rarely come.
Khattar believes it's because those who run corporate data centres aren't responsible for how much energy their IT systems use; they're judged on reliability and speed.
"Do you want to wait for a few seconds to get your picture downloaded? No!" says Khattar. "We want it instantaneously. And companies are just responding to that."
You might think the villains in black hats would be internet giants like Facebook. To handle the company's one trillion page views each month, Facebook operates several server farms, some of which are about the size of six football fields. But big companies have big energy bills, so they have an incentive to cut the amount of power their data centres use.
"The estimate — I think it was about a year ago — was that we saved over $2 billion," says Facebook's director of sustainability Bill Weihl. "Which means it's well worth investing the time and money."
That investment led to the development of new, stripped-down, highly efficient servers that produce less heat. Weihl says three of Facebook's data centres run entirely on clean energy. At the other end of the high-tech spectrum, the company uses something called "free cooling." Basically … windows.
"We open up the window on one side and blow the hot air out. We open up the door on the other side and bring in cool air from outside," says Weihl. "The amount of energy we've saved is the equivalent of the energy used in a year by about 78,000 U.S. homes, and avoided emissions [are] the same as taking about 95,000 cars off the road."
Small server closets can add up
Experts like Delforge say it may be counter-intuitive, but the Facebooks of the world aren't really the problem.
"The cloud-computing companies like Facebook, Google, and others, they're only responsible for collectively about five per cent of all data centre energy use," Delforge says. "Individually they use a lot of energy, but there's relatively few of them compared to all the small server closets and small rooms that you find in virtually every floor of every office building in the country."
Those account for about half of the energy used by data centres, he says.
"This is a very old data centre," says Khattar, taking me on a short tour of the classroom-sized data centre being phased out by his own organization.
To keep their servers cool, most companies with small data centres simply blast the air conditioning. Most of the people running those IT departments aren't aware that the industry standard has changed: new data shows the air used to cool servers can actually be about 11 degrees Celsius (20 F) warmer.
"The mechanical equipment — the hardware — doesn't require you to be as cold as in the past," Khattar says. "You can use much warmer air … and your system will work very efficiently under those conditions."
And there's more good news. While older data centres use as much energy for cooling as they do to run their servers, new ones need only about one-tenth as much.
"The newer ones being built by the large companies are already more efficient," Khattar says. "There's a big, big improvement happening in the infrastructure side."
But Delforge is still skeptical.
"At the moment we're seeing a few leaders in the high-tech industry and other sectors pioneer new technology that can significantly reduce data-centre energy, but we need more than just a few shining examples," Delforge says. "We need the majority — and eventually all data-centre operators — to use these best practices."
Even if they do, there's another challenge: the Jevons paradox. Nineteenth-century economist William Stanley Jevons observed that when technology improves efficiency, consumption doesn't go down, it goes up.
And that, Delforge fears, seems to be the case with data centres.
"Progress is being outpaced by the rapid growth of the industry," he says.
Watch Kim Brunhuber's report on data centres and power consumption on The National, Friday at 9 p.m.