The world’s coral reefs are dying. Increasing ocean temperatures and rising pollution levels are causing mass bleaching events, in which stressed corals eject the symbiotic algae that keep them alive. Scientists have been monitoring coral reef health for years, and the prognosis is not looking good.
To solve a problem, we first need to understand it. One of the biggest challenges in tackling issues like reef health is the sheer amount of data needed to work out what state the reefs are actually in.
Until now, this has been a slow process: scientists survey vast areas of reef, noting which coral species are present and how much area they cover. Doing all of this by hand takes a really long time.
Here’s where it gets fun: we now have technology that can do the boring work for us.
Here’s where it gets less fun: in order for this technology to work, it first must learn how to do its job.
Here’s where it gets fun again: NASA have specialised fluid lensing cameras, which can photograph the ocean floor by removing the distortion caused by the rippling water above. Since 2019 they’ve been used to map large areas of shallow-water reef in Puerto Rico, and those images now need to be analysed.
NASA’s supercomputer, Pleiades, is learning to recognise corals so that the mapping process can be automated. To teach Pleiades how to identify corals accurately, NASA have released a new game called NeMO-Net. In the game, players travel the oceans aboard a research vessel called the Nautilus, locating and identifying corals. The sea floor in the game comes from images captured by the fluid lensing cameras, so players are identifying real, living corals — and every label they provide becomes training data for the supercomputer.
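The core idea — player labels become training examples for a classifier — can be sketched in miniature. Everything below is illustrative, not NASA’s actual method: the data is synthetic, and NeMO-Net’s real pipeline uses a far more sophisticated neural network on fluid-lensing imagery. This toy version simply averages the labelled examples for each class, then assigns a new pixel to whichever class average it sits closest to.

```python
import numpy as np

def train_centroids(features, labels):
    """'Train' by averaging the feature vectors collected for each label."""
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def classify(centroids, pixel):
    """Assign a pixel to the class whose centroid is nearest."""
    return min(centroids, key=lambda lab: np.linalg.norm(pixel - centroids[lab]))

# Synthetic "player labels": RGB-like feature vectors tagged coral or sand.
rng = np.random.default_rng(0)
coral = rng.normal([0.8, 0.3, 0.3], 0.05, size=(50, 3))   # reddish patches
sand = rng.normal([0.9, 0.85, 0.6], 0.05, size=(50, 3))   # pale patches
features = np.vstack([coral, sand])
labels = np.array(["coral"] * 50 + ["sand"] * 50)

centroids = train_centroids(features, labels)
print(classify(centroids, np.array([0.78, 0.32, 0.28])))  # a coral-like pixel
```

The more labels players contribute, the better each class average represents its class — which is exactly why a crowdsourced game is such an effective way to gather training data.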