AI Speeds Rubbish Detection

The wealth of sonar data gathered by industry could help clean up the oceans. AI is making it easy.

By Wendy Laursen

Ghost nets kill a variety of marine life.

©Christian Howe, WWF

Some 50,000 tons of fishing gear, around 20 percent of the total, is lost in the world's oceans each year. Once adrift, these nets entangle marine organisms, which then act as bait, attracting predators into the net and perpetuating the cycle of capture and mortality known as ghost fishing.

WWF Germany, together with Accenture and Microsoft, has launched an initiative to combat this threat with new efficiency: using AI to search for nets in high-resolution sonar data.

Raw sonar signals are converted into grayscale images and then separated into left and right views to account for the dual-channel nature of side-scan sonar data. WWF selected DeepLabV3 to provide pixel-level classification, which has enabled a 90 percent success rate despite the irregular shape of ghost nets.
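In rough outline, that preprocessing and segmentation step might look like the sketch below in PyTorch. The file handling, normalization and two-class (background/net) labeling are assumptions made for illustration, not details of WWF's published pipeline, and a real model would first be fine-tuned on labeled sonar imagery.

```python
# Sketch: side-scan sonar strip -> grayscale halves -> DeepLabV3 segmentation.
# Normalization and the two-class (0 = background, 1 = ghost net) setup are
# assumed for illustration; in practice the model is fine-tuned on labeled
# sonar imagery before its output means anything.
import numpy as np
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

def split_channels(strip: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a side-scan strip into port (left) and starboard (right) views."""
    mid = strip.shape[1] // 2
    return strip[:, :mid], strip[:, mid:]

model = deeplabv3_resnet50(weights=None, num_classes=2)  # untrained placeholder
model.eval()

def segment(gray: np.ndarray) -> np.ndarray:
    """Return a per-pixel class map for one grayscale sonar view."""
    x = torch.from_numpy(gray).float() / 255.0        # HxW in [0, 1]
    x = x.unsqueeze(0).repeat(3, 1, 1).unsqueeze(0)    # 1x3xHxW (RGB-like input)
    with torch.no_grad():
        logits = model(x)["out"]                       # 1x2xHxW class scores
    return logits.argmax(dim=1).squeeze(0).numpy()     # HxW class ids

strip = np.random.randint(0, 256, (512, 1024), dtype=np.uint8)  # placeholder data
left, right = split_channels(strip)
net_mask = segment(left) == 1   # True wherever pixels are classified as "net"
```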

Crayton Fenn explains how sonar positioning data is obtained

©Christian Howe, WWF

Steel cable identified using sonar positioning.

©Christian Howe, WWF

Divers verifying sonar positions

©Christian Howe, WWF

Dr Zhongqi Miao, Applied Research Scientist at Microsoft’s AI for Good Lab, explains pixel-level classification: “It is one of the three fundamental computer vision tasks: image classification, object detection, and segmentation. Basically, the model is doing classification of every single pixel in an input image such that we would see groups of pixels that are classified into certain categories (either background or ghost nets in our case). And this can give us more accurate locations of ghost nets in the images, which we can use to calculate the actual geo-locations of these nets.”
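The last step Miao describes, turning pixel positions into geo-locations, can be illustrated with a simplified sketch. The function below places a detected pixel along the across-track axis of one sonar ping; the flat range scale and the omission of slant-range, layback and altitude corrections are simplifying assumptions, not how any particular survey system does it.

```python
# Sketch: convert a detected pixel in a side-scan image to an approximate
# geo-location, assuming a known towfish position and heading for each ping
# and a linear across-track range scale (a deliberate simplification).
import math

def pixel_to_latlon(row_lat, row_lon, heading_deg, col, ncols, swath_m):
    """row_lat/row_lon: towfish position for this ping (decimal degrees).
    heading_deg: towfish heading; starboard lies 90 degrees to its right.
    col, ncols:  pixel column and image width (port edge .. starboard edge).
    swath_m:     total across-track coverage in meters."""
    # Signed across-track offset: negative = port, positive = starboard.
    offset_m = (col / (ncols - 1) - 0.5) * swath_m
    bearing = math.radians(heading_deg + 90.0)
    d_north = offset_m * math.cos(bearing)
    d_east = offset_m * math.sin(bearing)
    # Small-offset conversion from meters to degrees of latitude/longitude.
    dlat = d_north / 111_320.0
    dlon = d_east / (111_320.0 * math.cos(math.radians(row_lat)))
    return row_lat + dlat, row_lon + dlon

# Example: a net pixel three quarters of the way to starboard on a 100 m swath.
lat, lon = pixel_to_latlon(54.32, 11.05, heading_deg=45.0,
                           col=768, ncols=1024, swath_m=100.0)
```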

WWF is now asking for more sonar data via the recently launched online platform GhostNetZero.ai.

“If we can specifically examine existing image data from heavily fished ocean zones, this will be a real game-changer for the search for ghost nets. We hope that research institutes, authorities, and companies will participate in this collaboration,” says Gabriele Dederer, research diver and ghost net project manager at WWF Germany.

The online platform allows WWF experts to review and validate what AI finds, but they only need to check a fraction of the images—AI speeds up the process by highlighting where to look. Once a ghost net is confirmed, qualified divers use an app to verify the net’s position and begin retrieval.

Beyond ghost nets, the organizations are now planning to include other types of lost fishing gear, such as crab pots, which also pose threats to marine life.

In a separate initiative, a research team at the Technical University of Munich (TUM) is using AI to search for and recover a range of litter, including ghost nets. The team has developed an autonomous diving robot that uses an AI system to analyze objects with sonar and cameras, pick them up and bring them to the surface.

TUM's Nicolas Hoischen, Zara Zothabayeva, Tzu-Yuan Huang and Hamish Grant (from left to right) discuss TUM's new diving robot at the port of Marseille. ©TUM
TUM's Nicolas Hoischen and Tzu-Yuan Huang inspect the diving robot. ©TUM

A service boat supplies the underwater robots with power and data connections via cable. It also sends ultrasonic waves into the depths to generate a rough map of the seabed. A dedicated search robot about 50 centimeters long scans the seabed as well. Armed with this information, a submarine powered by eight mini turbines dives to the locations where rubbish is detected and grabs it. A winch then loads the rubbish onto an additional autonomous boat that serves as a floating waste container.

“Since we first have to identify the rubbish and grasping objects requires a high degree of precision, we have a camera and sonar on board that enable orientation even in murky water,” explains Dr Stefan Sosnowski of the Chair of Information-oriented Control at TUM.

Identifying rubbish is no trivial matter, because hardly any image material of underwater objects is available to train neural networks. “That’s why the project partners have so far labelled over 7,000 images of objects that don’t belong on the seabed,” says Sosnowski.

Dr Bart De Schutter of TU Delft explains: “The AI models we use are object detection networks where we primarily use the so-called YOLO real-time object detection algorithm that is specifically trained for underwater environments. Once trained, the YOLO network finds where relevant objects (such as litter) are in an image and also what type of object it is (for example a plastic bottle).” The networks also detect various types of flora and fauna, to ensure that those are not harmed during litter collection.
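As a rough idea of what such a detector looks like in use, the sketch below runs the open-source Ultralytics YOLO package on a single frame. The weights file name, the image name and the class list are hypothetical stand-ins; the project's own trained model and tooling are not reproduced here.

```python
# Sketch: running a YOLO detector on one underwater frame with the Ultralytics
# package. "seaclear_litter.pt" and "frame_0042.jpg" are hypothetical stand-ins
# for a model trained on underwater litter imagery and a camera frame.
from ultralytics import YOLO

model = YOLO("seaclear_litter.pt")           # hypothetical custom weights
results = model("frame_0042.jpg", conf=0.4)  # keep only fairly confident hits

for box in results[0].boxes:
    cls_name = results[0].names[int(box.cls)]
    x1, y1, x2, y2 = box.xyxy[0].tolist()
    print(f"{cls_name}: ({x1:.0f}, {y1:.0f}) - ({x2:.0f}, {y2:.0f}), "
          f"confidence {float(box.conf):.2f}")
    # Detections classified as flora or fauna would simply be left alone
    # by the grasp planner rather than collected.
```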

De Schutter and his team are currently developing improved and extended object detection algorithms to further increase reliability and robustness, so that the system can ultimately detect litter wherever it may be deployed.

The four-fingered hand on the autonomous gripper developed by TUM has a volume of approximately one cubic meter, can squeeze with a force of 4,000 newtons, and can grasp objects weighing up to 250 kg.

TUM’s gripper can pick up objects weighing up to 250 kilograms. ©TUM

A stereo camera in the gripper’s “palm” images the objects to be picked up, and AI models from TU Delft use those images to produce a 3D model of each object so that optimal grasp points can be calculated.
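One common way to choose grasp points from such a 3D model is to close the fingers across the object's narrowest principal axis, near its centroid. The sketch below illustrates that generic heuristic on a point cloud; it is not TU Delft's actual planner, and the finger-opening limit is an assumed figure.

```python
# Sketch: pick an opposing pair of grasp points from a stereo-derived point
# cloud by closing across the object's smallest principal axis. A generic
# heuristic for illustration only; the 0.4 m finger gap is an assumption.
import numpy as np

def grasp_points(points: np.ndarray, finger_gap: float = 0.4):
    """points: Nx3 array of object surface points in meters."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Principal axes of the object; the smallest-variance axis is usually
    # the easiest direction to close a gripper across.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    grasp_axis = vt[-1]
    # Candidate contacts: the object's extremes along that axis.
    proj = centered @ grasp_axis
    p1 = points[np.argmin(proj)]
    p2 = points[np.argmax(proj)]
    width = float(np.linalg.norm(p2 - p1))
    if width > finger_gap:
        return None  # object too wide for the hand opening
    return p1, p2, width

cloud = np.random.rand(2000, 3) * [0.3, 0.2, 0.15]  # placeholder point cloud
grasp = grasp_points(cloud)
```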

Special sensors enable it to gauge how much force it can apply without causing damage, preventing plastic buckets from breaking or glass bottles from shattering, for example. This is currently done directly, using torque measurements from the sensors in each of the joints. The researchers are also working on using machine learning to enhance the sensitivity of force/torque interaction with objects, but this has not yet been rolled out in trials.
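In principle, torque-limited grasping of this kind reduces to a simple loop: close a joint a little, read its torque sensor, and stop as soon as the reading hits a ceiling chosen for fragile objects. The sketch below shows that idea only; the thresholds, step size and callback interface are assumptions, not the robot's real controller.

```python
# Sketch: a torque-limited closing loop for one gripper joint. Thresholds,
# step sizes and the sensor/actuator callbacks are assumed for illustration;
# the real controller runs on the robot's own joint hardware.
FRAGILE_TORQUE_NM = 2.0   # assumed per-joint torque ceiling for fragile items
CLOSE_STEP_RAD = 0.005    # assumed joint increment per control tick

def close_until_contact(read_joint_torque, command_joint_delta,
                        max_torque=FRAGILE_TORQUE_NM, max_steps=400):
    """Close a finger joint until measured torque reaches the ceiling."""
    for _ in range(max_steps):
        torque = read_joint_torque()          # Nm, from the joint's sensor
        if torque >= max_torque:
            return True                       # contact made within the limit
        command_joint_delta(CLOSE_STEP_RAD)   # keep closing slowly
    return False                              # never reached contact torque
```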

So far, the system has been trialed in the port of Marseille in France, and the researchers expect it to be profitable at depths of 16 meters or more.
