Our lives are tangled with technology whether we like it or not. The most expensive commodity on the planet is no longer precious metals and stones; it is data. From giant corporations to small start-ups, everyone is chasing data.
Data literally drives everything around you. Checked out a pair of boots or a newly launched book online? You will most likely be haunted by images, ads, and targeted messages for the next couple of days. So, what does technology giant Microsoft have to add to this endless collection of data?
What is Microsoft's underwater data center?
Microsoft conducted an experiment to check whether it is feasible to place a data center underwater, and the experiment was a huge success. In the near future, you may well find data centers at the bottom of the sea. The operation, named "Project Natick", could lead to cloud computing infrastructure near cities that border seas or lakes.
The aim of the project
The primary aim of "Project Natick" was to determine whether the underwater environment would allow a data center to function properly. The secondary aim was to demonstrate that building underwater data centers is feasible from a logistical, environmental, and economic point of view.
Why did Microsoft put a data center under water?
Microsoft recovered an underwater data center off the coast of the Scottish Orkney Islands and later published the results of the trial. This data center consisted of 864 servers placed inside a sealed container and connected to the mainland by a submarine cable. It had been sitting in the North Sea since spring 2018.
The center sat around 35 meters below the surface, and every detail was monitored to build a clear picture of performance and reliability. After the recovery, engineers from the Redmond company studied the data center and even sampled the air inside it to gauge the effectiveness of this innovative idea.
What was the result of the experiment?
The data revealed that the servers performed remarkably well. Surprisingly, they showed up to eight times the reliability of comparable infrastructure on land.
Engineers and researchers are now trying to determine the origin of this higher reliability. They aim to replicate the same conditions in terrestrial servers to test whether doing so yields the same efficiency and performance gains.
The advantage of the project
So, why did the submarine data center perform better than terrestrial ones? The team has been investigating, and it believes the advantage may come down to the different atmosphere inside the capsule.
In fact, the container was meticulously built by Naval Group, in collaboration with naval engineers and technicians specialized in renewable energy. The center was filled with nitrogen, which is considered less corrosive than oxygen.
The prospects of the project
There are more benefits to an underwater data center. It consumes far less energy than land-based centers in regions where the power grid is unreliable for long stretches, and a center with greater energy efficiency helps cut costs as well.
In underwater centers, there is no need to artificially cool the servers, which is one of the costliest items in data center management. The favorable temperature of the sea, combined with the use of 100% renewable energy, made the project a success. The energy came from a network of wind and solar plants covering the Orkney Islands.
With the success of Project Natick, Microsoft may soon build smaller data centers in coastal areas around the globe. These could host the cloud and edge computing services of the Microsoft Azure division. This could be the next big thing in the transfer and management of data.