Between August and November of last year, technology giant Microsoft visited the Central Coast to research whether submerging data centers underwater could lower system cooling costs and, one day, increase Web speeds for millions of users worldwide.
Typically, electronics and water don’t mix. But the company tested the viability of submerging sealed computing equipment, a test data center, in the ocean off the Cal Poly Pier in Avila Beach.
The company wanted to know whether the ocean can effectively serve as a cooling system for data centers, the cloud servers that transmit video streams, social networking, email and other digital communications. A 38,000-pound container protected the technology from the ocean elements. The cloud server contained computing power equivalent to 300 desktop computers.
The research, dubbed Project Natick, tested a 10-foot-by-7-foot capsule submerged about 30 feet underwater off the pier for 105 days. Microsoft christened the capsule Leona Philpot, a nod to a character in the company’s “Halo” video game. (A curious choice, since Philpot’s back story is that she broke her neck while diving into a pool before being anointed homecoming queen as she recovered in a wheelchair through spinal column regeneration.)
Data centers are currently housed on land, in states including Iowa, Virginia, Illinois, Texas and California. During operation, they generate substantial heat, so air conditioning is required to prevent a system crash. If the ocean’s cooling effect can be harnessed without leaks or damage to the equipment, the potential cost savings at a larger scale could signal an innovative approach to data delivery.
“Putting the gear under cold ocean water could fix the problem,” a New York Times article on the project stated. “It may also answer the exponentially growing energy demands of the computing world because Microsoft is considering pairing the system either with a turbine or a tidal energy system to generate electricity.”
The Microsoft researchers chose the long Cal Poly Pier in Port San Luis Harbor because it’s closed to the public and is equipped with an industrial electrical system, which the university uses for marine research and other activities. Microsoft used the system to provide power and data cables to the Leona Philpot capsule.
“They liked the idea of a pier that extends a kilometer out to sea, where there’s 30 feet of water depth and power to plug into,” said Dean Wendt, Cal Poly’s director of Coastal Marine Sciences, who oversees the pier’s operations. “The restricted access also was important as well. It’s much harder to do this kind of project on a public pier.”
Microsoft declined last week to answer specific questions from The Tribune about the project, including whether the company plans to conduct future tests or move forward with permanent infrastructure.
On its website, however, the company notes, “... the knowledge gained from the three months this vessel was underwater could help make future data centers more sustainable, while at the same time speeding data transmission and cloud deployment. And yes, maybe even someday, data centers could become commonplace in seas around the world.”
Cal Poly’s role
Microsoft worked on the technical side of the research along with a subcontractor, Santa Barbara-based Aquantis, which tested the vessel that housed and sealed the technology.
Cal Poly’s role was peripheral — aiding researchers in using the instrumentation on the pier and powering the site.
“We provided help with executing the project,” Wendt said. “This was not a money-making operation. We have these kinds of relationships with all sorts of entities that use the pier facility. The rates we charged essentially cover the costs for our staff time and maintenance.”
Wendt said Cal Poly received $130,000 from Microsoft for the associated staffing, maintenance and operational costs. The money will go toward pier maintenance and operations. The pier has more than $2 million in deferred maintenance needs, Wendt said.
Wendt said the university tries to line up projects that have an educational component, and one of the university’s mechanical engineering students participated in the study.
Tom Moylan, the university’s marine operations manager, said Microsoft’s research “was the largest” outside project ever undertaken at the Cal Poly Pier.
“The device was too heavy to launch from the pier,” Moylan said. “They came in with the pressure vessel on a barge and it sat on the ocean floor. I saw it on the first day when they lowered it in and on the last day.”
Data cables were hooked up from the pier to the vessel. Microsoft used its own wireless modem technology on the pier.
“The feedback is that they had a positive experience with Cal Poly,” Wendt said. “We haven’t heard of a phase II of Natick. But I think that Microsoft was pleased with how we supported the project.”
Bill Toman, the program manager of the California Wave Energy Test Center Project for the Institute for Advanced Technology and Public Policy at Cal Poly, said the Central Coast could be a viable location for Microsoft to potentially launch a larger underwater data center operation for commercial use.
“There is a tremendous amount of data connectivity that comes ashore right here in Morro Bay,” Toman said. “The National Oceanic and Atmospheric Administration has charts for mariners that show the location of numerous telecommunication cables that come ashore in the Morro Bay area. Using those cables could be the least cost in routing from Asia. So you can see where Microsoft may want to use our area.”
Microsoft noted on its website: “There are many subsea cables which allow the Internet to span the oceans, connecting devices and data centers around the world. Project Natick (at the Cal Poly Pier) was also connected via a cable to land and then to the Internet.”
Microsoft sees big potential
According to Microsoft’s website, more than half of the world’s population lives by the coast, so submerging data centers in coastal waters could increase the speed of data transmission delivery to users. A successful underwater data center could “enable rapid response to market demand, quick deployment for natural disasters and special events such as the World Cup.”
“Data centers are the backbone of cloud computing, and contain groups of networked computers that require a lot of power for all kinds of tasks: storing, processing and/or distributing massive amounts of information,” the company states. “... When data centers are closer to where people live and work, there is less ‘latency,’ which means that downloads, Web browsing and games are all faster.”
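To give a rough sense of why proximity matters for latency, consider that light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, about 200,000 kilometers per second. The figures below are illustrative assumptions, not from the article or Microsoft; they show only the minimum propagation delay, ignoring routing and processing time:

```python
# Back-of-the-envelope illustration of how distance adds latency.
# Assumption: signals in fiber travel at roughly 200,000 km/s
# (about two-thirds the speed of light in a vacuum).

FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed in km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation delay over fiber,
    ignoring routing, queueing and processing time."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_MS

# Hypothetical distances: a nearby coastal data center, a cross-country
# hop, and a trans-Pacific link.
for km in (50, 1000, 8000):
    print(f"{km:>5} km -> at least {round_trip_ms(km):.1f} ms round trip")
```

Even this idealized floor shows why a data center 50 kilometers offshore could respond far faster than one an ocean away.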
A full-scale project would also be environmentally conscientious, according to the company.
“Natick data centers are envisioned to be fully recycled, made from recycled material which in turn is recycled at the end of life of the data center,” according to Microsoft. “... A Natick data center co-located with offshore renewable energy sources could be truly zero emission: no waste products, whether due to the power generation, computers, or human maintainers, are emitted into the environment.”
However, the company makes it clear it’s in the early stages.
“Project Natick is currently at the research stage,” the website states. “It’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.”