Published on: 13 June 2023 | Categories: Digital Twin Ocean, News

By harnessing the power of data and cutting-edge technology, the European Digital Twin of the Ocean aims to revolutionize ocean research, support policymakers and other stakeholders, and make ocean knowledge more open and impactful than ever before.

Solutions were the focus of European Commission President Ursula von der Leyen’s speech at the One Ocean Summit last year. “Today, we know the ocean is vulnerable and is threatened by our misdeeds… Europe can make a huge difference as a maritime power. But only together will we turn the tide, and let our oceans teem with life again.” Among the European solutions brought to the table at the Summit was the Digital Twin of the Ocean.

Developed as part of the EU Mission ‘Restore our Ocean and Waters by 2030’, the European Digital Twin of the Ocean (DTO) brings together physical, chemical, biological, and socio-economic data and models to create a virtual replica of the ocean. By coupling artificial intelligence tools such as machine learning with modelling techniques, European High-Performance Computing, and simulation capabilities, the DTO will deepen our knowledge of the ocean and our understanding of the potential impact of the actions and decisions we take, helping to address crucial environmental challenges such as those posed by climate change, biodiversity loss and more. The construction of the DTO is well underway, with its operational infrastructure, digital framework and core model suite being developed through the EDITO projects, EDITO-Infra and EDITO-Model Lab, supported by several other related projects such as Iliad, Blue Cloud, AquaInfra and many others.

An open and collaborative platform

Although the DTO will propel scientific understanding of the ocean, the aim is that by bringing together multiple sources of data, models and other tools into an open and accessible platform, the DTO will also be useful to “the larger society, the policymakers, and the actors in the blue economy,” says Dr Zoi Konstantinou, policy officer at the European Commission’s Directorate-General for Maritime Affairs and Fisheries (DG MARE).

To ensure that the DTO can be valuable to as many different stakeholders as possible, its creation is “by principle, an inclusive and open process,” says Dr Nicolas Segebarth, policy officer at DG Research & Innovation. “The Digital Ocean Forum is one of the platforms where we try to have this happen.”

The first Digital Ocean Forum held last year brought together experts from different fields to discuss the development of the DTO. This month, the second instalment of the Forum will include a technical workshop that brings together experts from over 50 European-funded projects. “All these projects are working on things relevant to the DTO – their acquired data, their work on data sharing infrastructure, or on physical models, ecological models, or socio-economic models,” Segebarth explains.

Powered by data

Robust data is a crucial part of any model, and the DTO is no different: it will be fed by continuous observations from thousands of sensors across the world’s oceans and numerous satellites, integrating existing as well as new data flows. For in situ ocean observation data, sources include the Euro-Argo European Research Infrastructure Consortium for Observing the Ocean (Euro-Argo ERIC) programme, which deploys and maintains autonomous floats that collect key oceanographic data, making up about 25% of the international Argo programme. “The floats are freely drifting in the ocean. Every ten days, it dives down to 2000m, some down to 6000m, and then, on its way up, collects data,” explains Dr Yann-Hervé De Roeck, programme manager of Euro-Argo ERIC. The first floats collected salinity and temperature data, but the newer floats also collect biogeochemical information such as pH, dissolved oxygen content or chlorophyll.
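For readers who want to explore this kind of data themselves, the short sketch below shows one possible way to pull a small subset of Argo profiles using the open-source argopy Python library; the region, depth range and dates are purely illustrative and are not tied to any Euro-Argo or DTO workflow.

```python
# Minimal sketch: fetching a small subset of Argo profiles with the open-source
# argopy library. The box, depth range and dates below are illustrative only.
from argopy import DataFetcher as ArgoDataFetcher

# Region syntax: [lon_min, lon_max, lat_min, lat_max, pres_min, pres_max, date_start, date_end]
ds = (
    ArgoDataFetcher()
    .region([-30.0, -20.0, 40.0, 45.0, 0.0, 1000.0, "2022-01", "2022-06"])
    .to_xarray()
)

# Core Argo variables follow standard names such as PRES (pressure, dbar),
# TEMP (in situ temperature, °C) and PSAL (practical salinity).
print(ds)
```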

Not only is the depth data useful for understanding ocean processes, but “one of the main operational uses is meteorology and climate prediction,” De Roeck explains. For example, the data “gives information about the variations of heat dissipation of the ocean, which gives you better weather forecasts. With deeper measurements, it can help for seasonal climate forecasts.”

The historical and future data collected by the Argo fleet will also be essential for understanding how the ocean changes over space and time, particularly in light of climate change. “The development of DTO must foster the sustainability of this type of measuring network,” De Roeck says.

Bringing together multiple sources

The DTO isn’t just about ocean data. “We also need to think about how we’re going to combine this data and other kinds of data coming from society, the economy, the business sector, etc.,” says Konstantinou. The EU Public Infrastructure for the European Digital Twin Ocean (EDITO-Infra) project is charged with meeting this challenge. “We will build the public infrastructure backbone of the DTO to integrate key data services,” explains Dr Marina Tonani, project manager at Mercator Ocean International. Its sister project, EDITO-Model Lab, will in parallel develop a new generation of models and simulation tools for the DTO to manipulate, analyse, and understand marine information, with enhanced ocean forecasting and ocean climate prediction capabilities.

One of the infrastructure components EDITO-Infra is working on is data lakes – repositories for storing and processing large amounts of data in different formats. The challenge is to get all those different data formats to ‘talk’ to each other. The first phase will bring together Copernicus Marine Service and EMODnet data.

“These two assets have their own standards that differ in terms of organisation of the data,” explains Tonani, noting that communities often have different standards designed to meet their specific needs. “We are developing a set of standards so that we will be able to interoperate the data, so we can benefit from the information coming from very different communities and have a comprehensive view of what is happening in the marine environment.”
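To give a flavour of what such harmonisation can involve (this is a simplified illustration, not the EDITO implementation), the sketch below maps variables from two hypothetical source conventions onto one shared vocabulary so the datasets can be combined; the variable names and file paths are assumptions made for the example.

```python
# Illustrative sketch only (not the EDITO implementation): map two hypothetical
# source naming conventions onto one shared vocabulary so datasets can be merged.
import xarray as xr

# Hypothetical per-source mappings from native variable names to common names.
COMMON_NAMES = {
    "copernicus_marine": {"thetao": "sea_water_temperature", "so": "sea_water_salinity"},
    "emodnet": {"TEMP": "sea_water_temperature", "PSAL": "sea_water_salinity"},
}

def harmonise(ds: xr.Dataset, source: str) -> xr.Dataset:
    """Rename any variables of `ds` that appear in the mapping for `source`."""
    mapping = {old: new for old, new in COMMON_NAMES[source].items() if old in ds}
    return ds.rename(mapping)

# Usage (file paths are placeholders):
# ds_a = harmonise(xr.open_dataset("copernicus_subset.nc"), "copernicus_marine")
# ds_b = harmonise(xr.open_dataset("emodnet_subset.nc"), "emodnet")
# combined = xr.merge([ds_a, ds_b])  # both now share one naming convention
```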

As the DTO develops, further data sources will be brought in. For example, on the agenda of the upcoming Digital Ocean Forum is a discussion with EU project representatives about the data lakes and other components of the DTO, to ensure that “what we will build will really be interoperable for the information they generate.”

“It is a challenge, but it is a good challenge. It is amazing what we are trying to do with the digital twin in general,” says Tonani. “This comprehensive information on the marine environment will have a big impact.”