The business of data centres is changing, or at least trying to: technological advances have now far outpaced the available bandwidth and infrastructure.
IoT is advancing too, with an estimated 20.8 billion connected devices expected by 2020. This brings new challenges for data processing, and potentially changes to how data centres themselves are designed and located. One proposed solution is to situate data centres under water, closer to dense population hubs, enabling more efficient edge computing.
Microsoft is already on the case, moving to Phase Two of its experimental data centre project after the successful completion of the first phase. The project, titled Project Natick, was born after research demonstrated that most people live within 200km of the coast; it would therefore make sense to locate data centres closer to where they’re needed most. Whilst most of us can’t imagine how exactly water and computers mix (we’ve all spilled a cup of tea on a keyboard and lived to regret it, right?), there are many good reasons to set up servers at the bottom of the ocean – at least, Microsoft’s developers seem to think so.
Firstly, let’s talk about how this slightly dystopian idea will theoretically function. A location off the coast of Scotland, near Orkney, was chosen for its proximity to the European Marine Energy Centre, allowing the existing renewable energy infrastructure (onshore and offshore) to power the vessel. The underwater data centre is filled with nitrogen to prevent corrosion of the servers inside, but the technology itself isn’t new or cutting-edge; all the tech in use is already well understood. The vessels are deliberately the size of a shipping container to make future manufacturing and transportation easier should the experiment prove successful; the units were in fact made in France and shipped to England on a truck to test the logistics chain.
The experiment will last for five years, during which time no changes or repairs can be made to the hardware inside the vessels, even if something malfunctions. What remains to be seen is the effect this will have on sea life. Microsoft insists that Phase One caused no disruption, with negligible noise levels and practically no heating of the surrounding water – the water immediately around the unit measured only a few thousandths of a degree warmer than the water a few feet away.
Close monitoring will take place to measure environmental impact as the state of the oceans continues to decline; there is even potential to use underwater data centres as artificial reefs – something already done with multiple sunken vessels – to improve marine environments and promote biodiversity.
Still, even if the experiment succeeds, experts believe it will only provide short-term relief to the growing problem of data centre energy use. Promising progress is also being made by DeepMind, Google’s AI research company, whose machine-learning systems have cut the energy Google’s data centres use for cooling by an impressive 40%.
Moving forward, Microsoft envisions creating multiple data centres off the shores of countries around the world, bringing faster connectivity to increasingly data-hungry devices and their consumers.
About the Author
Matthew Walker-Jones specializes in content covering topics including data-driven marketing, online data protection, data recovery and cyber security. With a passion for all things data, Matthew constantly stays up to date with the latest news on data security.