Microsoft Testing Underwater Datacenters with Project Natick
Underwater plants, fish, and other living organisms have been around far longer than we have, but for the first time Microsoft is bringing us underwater datacenters.
The Redmond giant took the cloud to the depths of the Pacific Ocean, about a kilometer from shore, in a 38,000-pound container as part of an experiment dubbed Project Natick.
According to the researchers, the idea is to find better approaches to datacenter deployment, with benefits on several fronts such as cooling, latency, and power sourcing; an underwater datacenter, for instance, could draw on hydrokinetic energy from the surrounding ocean.
The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users. Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.
Project Natick was started back in 2015 so that Microsoft could test the waters (literally, in this case) for future adoption of the solution on a much larger scale.
For those of you who do not know, cooling is one of the biggest costs of maintaining a datacenter, and it is greatly reduced when the datacenter sits in cold ocean water.
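To put that cooling claim in rough numbers: datacenter efficiency is commonly expressed as PUE (power usage effectiveness), the ratio of total facility power to IT equipment power, where everything above 1.0 is overhead such as cooling. The sketch below uses purely illustrative figures (a PUE of 1.5 for a conventional air-cooled facility versus 1.1 for an ocean-cooled one; these are assumptions, not Microsoft's published measurements) to show how much overhead energy disappears when the ocean does the cooling.

```python
# Rough PUE arithmetic. PUE = total facility power / IT equipment power,
# so the non-IT overhead (cooling, power distribution, etc.) implied by
# a given PUE is it_load * (pue - 1). All figures below are illustrative
# assumptions, not measurements from Project Natick.

def overhead_kw(it_load_kw: float, pue: float) -> float:
    """Return the non-IT overhead load (kW) implied by a PUE value."""
    return it_load_kw * (pue - 1)

it_load = 1000.0  # assume 1 MW of IT equipment

land_overhead = overhead_kw(it_load, 1.5)  # assumed air-cooled facility
sea_overhead = overhead_kw(it_load, 1.1)   # assumed ocean-cooled facility

print(f"Land overhead: {land_overhead:.0f} kW")
print(f"Sea overhead:  {sea_overhead:.0f} kW")
```

Under these assumed numbers, the overhead drops from 500 kW to roughly 100 kW for the same IT load, which is the kind of saving the article is gesturing at.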
Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, and high responsiveness while being more environmentally sustainable.
The ever-growing demand for datacenters around the world also makes it imperative that they be geographically spread out; once again, underwater datacenters, which can be deployed off the coast near population centers, can meet much of that requirement.
As Managing Editor of TechFrag, Sarmad splits his time between keeping up with the latest news in technology and gaming, and other awesome things like unearthing the merits of staying up at night and Californication!