You’ve probably heard of the “Internet of Things”. It’s a connected network of everyday objects that talk to each other, such as cars, kitchen appliances, and heart monitors. But did you know that the Internet of Things also extends deep underwater off Canada’s three coasts?
Think of it as a Fitbit for the ocean. Made possible by world-leading Oceans 2.0 data management software, Ocean Networks Canada’s (ONC) infrastructure is continuously monitoring the pulse and vital signs of our deep-sea and coastal environments. Thousands of Internet-connected sensors gather continuous real-time data: everything from temperature, salinity, tides, and seismic activity to underwater noise levels and video footage.
Oceans 2.0 helps us #knowtheocean
Gathering 250 GB per day from an expanding network of Internet-connected instruments, Oceans 2.0 makes ocean data available to scientists, communities, and leaders, helping them to make informed decisions about climate change, earthquake and tsunami detection, marine safety, life in the ocean, and more (Figure 1). And it’s starting to attract international attention.
Oceans 2.0 operates like an Amazon shopping cart, except it’s free. Browse the data from ONC’s hundreds of underwater and land-based sensors, select the information you want, and confirm your order; the data is then downloaded to your computer.
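To make the “shopping cart” idea concrete, here is a minimal sketch of how such a data-download request might be assembled. The endpoint and parameter names below are invented for illustration; they are not Oceans 2.0’s actual API.

```python
from urllib.parse import urlencode

# Hypothetical endpoint for illustration only -- NOT ONC's real Oceans 2.0 API.
BASE_URL = "https://example.org/oceans2/dataProducts"

def build_data_request(sensor: str, start: str, end: str, fmt: str = "csv") -> str:
    """Assemble a shopping-cart-style download URL for one sensor's data."""
    params = {
        "sensor": sensor,    # the instrument you browsed to, e.g. a CTD
        "dateFrom": start,   # ISO 8601 timestamps bounding the order
        "dateTo": end,
        "format": fmt,       # delivery format for the download
    }
    return f"{BASE_URL}?{urlencode(params)}"

url = build_data_request(
    "saanich-inlet-ctd", "2018-01-01T00:00:00Z", "2018-01-02T00:00:00Z"
)
print(url)
```

Selecting data, confirming the order, and downloading the result then amounts to issuing this request and saving the response.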
Developing robust software systems like Oceans 2.0 that can reliably collect, process, analyze, and archive huge volumes of data is probably the single biggest challenge facing the Internet of Things. Who are the visionaries and architects behind ONC’s complex and continuously expanding ocean database, and how has it evolved over the last dozen years?
“The initial vision was to develop a new way to interact with the deep sea by extending the Internet into the ocean,” explains Benoît Pirenne, ONC’s director of user engagement. “Gathering real-time data from the sea bed hundreds of kilometres offshore had never been done before.” Benoît has been overseeing the in-house development of this massive data management system since 2004. His previous experience archiving data from the Hubble Space Telescope gave Benoît a unique understanding of how to gather data from a remote underwater laboratory (Figure 2).
In 2005, more than a year before ONC’s first underwater infrastructure was deployed in Saanich Inlet, a prototype data archiving system was developed and tested. “This was something new,” says software engineer Eric Guillemot, hired as the project’s software architect (Figure 3). “There weren’t too many places we could look to find something similar that had already been done and try to adapt it for our needs. So, we had to build it from the ground up.”
Eric describes software engineering as the most challenging of engineering projects. “A physical object, such as a bridge, can be planned and designed in detail before it’s built. But software code, made up of nothing but ones and zeroes, is more dynamic, with many more opportunities for error,” says Eric. Building a complex software system requires an agile approach: iterative cycles of development, testing, and improvement as new instruments, locations, and protocols are continually added to ONC’s growing underwater observatories.
Following the deployment of the Saanich Inlet (VENUS) observatory in 2006 and the northeast Pacific offshore array (NEPTUNE) in 2009, scientists from a variety of disciplines started using ONC’s continuous real-time data for their research. As they adopted this new way of doing science, it offered new challenges for the data management team. “Biologists and geophysicists, for example, have totally different requirements,” comments Eric. “For some, one decimal point is enough. Others require maximum precision.” Dealing with these complex and sometimes contradictory science requirements keeps Eric and his software team on their toes.
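Those differing precision requirements can be pictured with a small sketch (invented for illustration, not ONC code): the same stored reading is served at different precisions depending on who is asking.

```python
# Decimal places required by each discipline -- values chosen for illustration.
PRECISION = {
    "biology": 1,      # for some users, one decimal place is enough...
    "geophysics": 6,   # ...while others require maximum precision
}

def format_reading(value: float, discipline: str) -> str:
    """Round a stored sensor reading to the precision a discipline requires."""
    return f"{value:.{PRECISION[discipline]}f}"

temperature = 9.4327815  # degrees Celsius from a hypothetical CTD sensor
print(format_reading(temperature, "biology"))
print(format_reading(temperature, "geophysics"))
```

Keeping full precision in the archive and rounding only on delivery lets one dataset serve both audiences without contradiction.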
Ultimately, the complexity of Oceans 2.0 has been its biggest challenge and its biggest success. (Figure 4). “It has been exciting to see the system evolve as we deal with day-to-day needs, challenges, and improvements,” says Benoît.
From the beginning, a key goal was to develop a system that would minimize data loss. To accomplish this, Benoît understood the importance of capturing the underwater data as soon as it reached dry land. Land-based shore stations, located as close to the underwater instruments as possible, continue to be a key design feature of ONC’s successful observatory networks (Figure 5).
“The reason why Oceans 2.0 is so flexible and reliable is the sophisticated messaging queue system,” says Ben Biffard, ONC’s senior scientific programmer. The data travels along fibre-optic cable from the underwater sensor to the shore station, where a driver creates an initial message or data signature, before being sent on to the main servers at the University of Victoria. “Once this message is in the data system, it’s pretty hard for it to get lost. The messaging system or the server can go down, but it always retains this data.”
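The durability idea behind that messaging queue can be sketched in a few lines. This is an illustrative pattern only, not ONC’s implementation: each reading is journaled to durable storage at the shore station before any attempt to forward it, so a downstream server outage cannot lose data.

```python
import json
import os
import tempfile

class ShoreStationQueue:
    """Toy persistent queue: journal every message before forwarding it."""

    def __init__(self, journal_path: str):
        self.journal_path = journal_path

    def enqueue(self, reading: dict) -> None:
        # Append the message to an on-disk journal and force it to disk
        # *before* any upstream send is attempted.
        with open(self.journal_path, "a") as f:
            f.write(json.dumps(reading) + "\n")
            f.flush()
            os.fsync(f.fileno())

    def pending(self) -> list[dict]:
        # Replay journaled messages, e.g. after a crash or server outage.
        if not os.path.exists(self.journal_path):
            return []
        with open(self.journal_path) as f:
            return [json.loads(line) for line in f if line.strip()]

journal = os.path.join(tempfile.mkdtemp(), "journal.ndjson")
q = ShoreStationQueue(journal)
q.enqueue({"sensor": "ctd-01", "t": "2018-03-01T00:00:00Z", "salinity": 31.2})
q.enqueue({"sensor": "ctd-01", "t": "2018-03-01T00:00:01Z", "salinity": 31.3})
print(len(q.pending()))  # both readings survive even if the server was down
```

Real message brokers add acknowledgements and retries on top of this, but the core guarantee is the same: once the message is on disk at the shore station, “it’s pretty hard for it to get lost.”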
Now that Oceans 2.0 has successfully weathered more than a decade of agile development, ONC’s sophisticated, proven, reliable, and scalable data management and instrument control system is making waves internationally.
In 2014, Oceans 2.0 was selected to be part of the International Council for Science World Data System, whose mission is to promote long-term stewardship of, and universal and equitable access to, quality-assured scientific data and data services, products, and information across a range of disciplines in the natural and social sciences, and the humanities.
Since 2016, ONC has been partnering with the US National Oceanic and Atmospheric Administration’s (NOAA) Office of Ocean Exploration and Research (OER) to integrate Oceans 2.0’s video browsing and annotation systems, SeaTube and SeaScribe, into their video data acquisition process (Figure 6).
NOAA approached ONC to help resolve their video annotation challenges during expeditions.
"Prior to SeaScribe, we had to make notes on spreadsheets, notebooks, or chat logs, which required a lot of time and effort. We risked missing observations and we did not capture the exact position or depth. Now we use SeaScribe to rapidly record observations linked with position, depth, and a video clip of the observation,” says oceanographer Michael Ford, who works with NOAA Fisheries and Smithsonian Environmental Research Center. “SeaTube allows us to review our observations, assemble 3D representations, and refine a multi-level taxonomic identification. These are critical stepping off points for species discovery, analyses of environmental tolerances, and geographic range. The improvement with the SeaScribe and SeaTube pairing is that we can get to that stepping off point much faster than ever before.”
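The kind of record this implies, an observation tied to time, position, depth, taxonomy, and a video reference, can be sketched as follows. The field names here are invented for illustration and are not SeaScribe’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical annotation record -- field names invented, not SeaScribe's schema.
@dataclass
class Annotation:
    timestamp: str               # ISO 8601 time of the observation
    latitude: float              # vehicle position when the note was made
    longitude: float
    depth_m: float               # metres below the surface
    taxonomy: list[str] = field(default_factory=list)  # broad to specific
    comment: str = ""
    video_clip: str = ""         # reference to the clip of the observation

note = Annotation(
    timestamp="2017-08-15T14:32:05Z",
    latitude=47.95,
    longitude=-129.09,
    depth_m=2190.0,
    taxonomy=["Cnidaria", "Anthozoa", "Actiniaria"],
    comment="Anemone on basalt outcrop",
    video_clip="dive042_143205.mp4",
)
print(note.depth_m)
```

Because position, depth, and the clip travel with the observation, reviewing and refining identifications later (the SeaTube step Ford describes) needs no manual cross-referencing against spreadsheets or chat logs.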
This important international collaboration was an opportunity for ONC to take its data management system to the next level. Thanks to NOAA’s feedback, a major Oceans 2.0 update, due for release in March 2018, will significantly improve SeaTube’s and SeaScribe’s functionality, ease of use, and layout.
What’s next for Oceans 2.0?
Oceans 2.0 is a dynamic system that is constantly growing and changing. In addition to integrating new sensors, locations, and protocols into the system, ONC’s data teams are developing specialty data products and web services, turning data into knowledge to help us #knowtheocean. Stay tuned as we further evolve Oceans 2.0 into an ocean Fitbit that will accommodate British Columbia’s earthquake early warning system in 2019.
Curious to know more? Take Oceans 2.0 for a test drive yourself: preview or download live data from one of our hundreds of Internet-connected sensors.