
Innovation By Design

Why Microsoft Is Testing Data Centers That Live On The Ocean Floor

The ocean is cold, dark, and awash in kinetic energy—which makes it surprisingly ideal for data centers.


Last year, a dive crew dropped a perfectly fine server rack encased in steel into the Pacific Ocean. When they brought it back up a few months later, it looked like something that had been underwater for years, covered in algae and barnacles. The electronics inside it, which you could find at any conventional data center, were cool and dry. Is this the future of data center design? A team of engineers at Microsoft, operating under the name Project Natick, thinks it could be.

Keeping Data Cold—And Close
At first glance, the idea of operating data centers on the very cold ocean floor has an elegant logic to it: a warehouse full of server racks needs constant and very pricey temperature control. Why not outsource that job to the ocean floor, where water temperatures tend to hover around 32 degrees Fahrenheit?

But according to the Natick team, there's actually a different reason for pursuing ocean-bound data storage: the fact that "half the world's population lives within 200 kilometers [about 125 miles] of the sea," as one engineer puts it in a video. By shrinking the distance that users' data must travel to reach their homes or offices, Microsoft's cloud computing arm could cut latency and lower the cost of building and maintaining the network that connects its data centers to those users.
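To see why proximity matters, here's a back-of-the-envelope sketch (ours, not Microsoft's): light in optical fiber travels at roughly 200,000 kilometers per second, so every 200 kilometers of one-way distance adds about a millisecond of delay in each direction, before any routing or processing overhead.

# Rough, illustrative estimate (not Microsoft's figures): minimum propagation
# delay over optical fiber, which carries signals at about 200,000 km/s.
FIBER_KM_PER_MS = 200.0  # roughly 200 km of fiber per millisecond

def round_trip_ms(distance_km):
    """Minimum time for a request and its response over a straight fiber run."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (200, 1000, 3000):
    print(f"{km:>5} km away -> at least {round_trip_ms(km):.0f} ms round trip")

At 200 kilometers the physics floor is about 2 milliseconds round trip; at 3,000 kilometers it's closer to 30, which is why parking capacity near coastal population centers pays off.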

Yesterday, Microsoft published an account of the project's first field test, which took place last fall off the Pacific coast, presumably near Microsoft's Redmond, Washington, headquarters. Inside a steel capsule, engineers installed a server rack, along with all the necessary accoutrements, from a steady power supply to temperature regulation. While all the internal electronics were off-the-shelf, the massive tank that encased them was, needless to say, a custom job, more akin to submarine design than data center architecture. (In fact, Microsoft says a good deal of insight came from a Cloud Infrastructure & Operations team member who had worked on subs for the Navy.)

In the end, the finished product weighed 38,000 pounds and spent two months at the bottom of the ocean, during which the team could monitor everything from its operational capacity to the humidity inside its steel tank. Microsoft says the project's next step will be a version of the hulking steel tube that's four times as large and stays submerged for a full year.

But while Microsoft is nowhere near rolling out such a solution at any real scale in its business, the project hints at some of the challenges data center operators are facing.

It's Not A Data Center, It's A Data Bunker
Take one of Microsoft's cloud competitors, Amazon Web Services. In a story last month, Ingrid Burrington described the incredible expansion of the company's anonymous network of data centers, which shepherd as much as 80% of internet traffic. Amazon even builds its own power substations for these centers, and it is rushing to add enough capacity to meet demand.

Similarly, Microsoft's cloud computing arm, Azure, has struggled to keep up with demand. Azure runs on more than 100 data centers around the world, from India to Iowa, and the company depends on these fragile facilities to keep everything from Bing to Xbox One running; when they falter, the result can be very public outages.

The design of everything from the building itself to the server hardware will affect how well these data centers can meet demand. So it's no surprise to see Microsoft experimenting with new ideas for the data centers of the future; Facebook, too, is pouring resources into experiments aimed at rethinking its own data centers.

In Project Natick's case, the big payoffs could come from two places. First, there's the simplicity and speed of manufacturing one of these underwater tanks. The company describes it as a "lights out" system, meaning it doesn't require any humans to act as stewards and, therefore, doesn't need to meet traditional human-occupancy requirements. That not only made it fast to build (90 days), but also turned the job "from a construction project," which requires permits and other time-consuming steps, "to a manufacturing one," Microsoft says.

Second, there's the fact that, as Microsoft suggests, an underwater data hub could be powered by kinetic energy harvested from the ocean itself. Similar projects have been proposed in the U.K. and elsewhere, where power stations could capture energy from tides and waves in coastal waters. Microsoft hints that its data hubs could one day run on the same kind of technology, writing that "this could make data centers work independently of existing energy sources, located closer to coastal cities, powered by renewable ocean energy." That's on top of the reduced cost of cooling the servers, thanks to near-freezing ocean water.

Coastal cities, already threatened by climate change, will continue to boom over the next century. Their data demands will only increase, as will the severe weather and flooding bound to plague them. In that light, designing a data center that can operate underwater, without access to a power grid, begins to seem less like a moonshot and more like a very shrewd idea.
