DC in the sea...

Iain C

Fairly bonkers idea from Microsoft here!

https://www.bbc.co.uk/news/technology-44368813

I'll have another think about it. No, as a sailor and IT professional I still think it's bonkers. Looking forward to seeing a little purple UKHO symbol for "Data Centre" appear on my charts.

So what's first to hit it? Anchor? Submarine? Galvanic corrosion?
 
A nice warm object sitting quietly on the sea bed, hmmm. I wonder what antifouling they use?


Perhaps they could put it on one of those big tanks for fish farming, and breed turtles or Portuguese Man-of-War jellyfish. Ah, Mr Gates, I have a suggestion...
 
Not entirely bonkers.

These cloud-style data centres have been pretty commoditised and containerised for some years. I presume the point is to reduce the cost by using the sea to cool the centre - and given what a high proportion of the cost of a DC is dedicated to cooling, why not?

Of course it requires that you can lose the entire unit without causing an outage, but that is pretty common for DCs these days anyway.
 
I'm sure there are very convincing arguments for this. MSFT aren't daft, and even the "no-brainer" of siting DCs in places like Iceland is starting to become much harder financially and environmentally. But I'd love to see the stats around it... sure, no real estate costs, reduced cooling costs etc, but putting the links in is not a job for a bloke with a JCB, and changing a duff server blade has never needed a diver before, has it? And is the sea actually cold enough to make all of the other headaches such as waterproofing/corrosion/access/setup costs worth the effort?
 
This commoditisation is common in DCs these days - all the compute power is built into a standard unit that is simply plugged in to power/data and then ready to go. Much cheaper than wiring bits up individually on site. It is pretty much as simple as lifting it in with a crane and connecting a couple of plugs. These units probably only have a design life of 2-3 years: the speed of hardware development means most grids replace their CPUs at that sort of rate anyway, as it is more cost-effective than running older cores.

You would never need to get inside - if a blade fails you simply don't use it.
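
That fail-in-place approach is easy to sketch. A minimal illustration (the class and method names here are invented for the example, not Microsoft's actual management layer): the unit's scheduler just stops handing work to a dead blade, and nobody ever opens the box.

```python
# Minimal fail-in-place sketch - names are invented, not a real DC API.
from dataclasses import dataclass

@dataclass
class Blade:
    blade_id: int
    healthy: bool = True

class SealedUnit:
    def __init__(self, blade_count: int):
        self.blades = [Blade(i) for i in range(blade_count)]

    def mark_failed(self, blade_id: int) -> None:
        # No diver required: the blade is just taken out of rotation.
        self.blades[blade_id].healthy = False

    def schedulable(self) -> list:
        # New workloads only ever land on blades still reporting healthy.
        return [b for b in self.blades if b.healthy]

unit = SealedUnit(blade_count=24)
unit.mark_failed(7)                 # a blade dies on the sea bed...
print(f"{len(unit.schedulable())} of 24 blades still in service")  # ...23 carry on
```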

Making waterproof steel boxes that can survive underwater is not a complex engineering task.

I doubt that at the moment the economics actually make it cheaper than a land-based alternative, but it is certainly worth investigating.
 
Cooling makes up a fair chunk of the cost of Data Centres (25% ish - look up PUE). Google have several European DCs that use passive cooling.
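
For anyone who hasn't met PUE (Power Usage Effectiveness): it is total facility power divided by the power the IT kit itself draws, so a perfect DC would score 1.0. A rough worked example with invented numbers:

```python
# Rough PUE illustration - every number here is invented for the example.
it_power_kw = 1000.0   # power drawn by the servers themselves
cooling_kw = 250.0     # the "25% ish" cooling overhead mentioned above
other_kw = 80.0        # lighting, UPS losses, etc. (assumed figure)

# PUE = total facility power / IT power; 1.0 would be a perfect DC.
pue = (it_power_kw + cooling_kw + other_kw) / it_power_kw
print(f"PUE = {pue:.2f}")  # prints: PUE = 1.33
```

Sink the box in cold seawater and the cooling term shrinks towards zero, which is the whole pitch.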

Also they may be near wind/hydro/cheap power sources.

And many network links run along the shore anyway, so there is bound to be a relatively local place to plumb in to the internet.

Cloud servers have been designed to fail in place for around 10 years now, and Cloud providers also replicate services/data between locations (or the decent ones do).
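
A toy sketch of that replication point (the region names and fetch function are entirely hypothetical - real providers use far more machinery than this): a client simply falls over to the next site when one is unreachable, so losing a whole underwater unit need not mean an outage.

```python
# Toy cross-site failover - region names and fetch_from are hypothetical.
def fetch_from(region: str, key: str) -> str:
    if region == "seabed-north-1":   # pretend the underwater unit is offline
        raise ConnectionError(f"{region} unreachable")
    return f"value-of-{key}@{region}"

def fetch(key: str, replicas: list) -> str:
    last_err = None
    for region in replicas:
        try:
            return fetch_from(region, key)   # first healthy replica wins
        except ConnectionError as err:
            last_err = err                   # note the failure, try the next site
    raise RuntimeError("all replicas down") from last_err

print(fetch("chart-tile-42", ["seabed-north-1", "onshore-backup-1"]))
# -> value-of-chart-tile-42@onshore-backup-1
```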

Finally - why bother with anti-foul? It's not moving...
 
It's a brilliant idea - and why does it need anti-fouling if it is not going anywhere? So says a sailor, engineer and occupant of Orkney for a few years.
 
As said, the mechanical cooling requirement for Data Centres is massive, so immersing one in cold water makes huge sense, if you can deal with the issues of immersion in seawater.

Not good for my business, but hey ho (large capacity water chillers).
 