[Editor’s Note: Currently 80 percent of the dugongs can be detected in images.]
This post originally appeared on http://www.digitaltrends.com/cool-tech/endangered-sea-cow-drones-ml/.
It’s one thing to want to protect endangered animals, but another entirely to keep track of them. Case in point: the dugong, a medium-sized marine mammal often referred to as a sea cow. Cute they may be, but spotting them in large bodies of water is easier said than done.
Since marine researchers want to track population sizes, conservation status, and important habitat areas, that poses a bit of a problem.
Fortunately, this is where Dr. Amanda Hodgson of Australia’s Murdoch University comes in. A member of the university’s Cetacean Research Unit, Hodgson has been using drones and machine-learning technology to better identify dugongs in their natural environment.
Using drones for aerial photography offers a new way to get the necessary images for Hodgson’s work, but opens up the problem of how best to spot the sea cows in a massive number of photos. This is the point at which Hodgson turned to machine learning — and Queensland University of Technology computer scientist Frederic Maire — for help.
Together, they developed a detector using the free, open-source machine-learning platform TensorFlow, with the goal of identifying dugongs in photos automatically. This method had to work with images of varying complexity, such as ones where seagrass is visible on the seabed, or others where glare and whitecaps can be seen on the surface of the water.
“We developed an efficient machine-learning system for automating the detection of marine species in aerial imagery,” Maire told us. “The effectiveness of the approach can be credited to the combination of a well-suited region proposal method and the use of deep neural networks. Given a large image, the region proposal module generates a list of subwindows of the image, centered on candidate blobs. Each subwindow is then fed to a neural network classifier that predicts whether or not the subwindow contains a dugong.”
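To make the two-stage pipeline Maire describes concrete, here is a deliberately simplified sketch of the idea: a region-proposal step that finds candidate blobs and extracts fixed-size subwindows around them, followed by a classifier that accepts or rejects each subwindow. This is not the team's code; the thresholds, window size, and the stand-in classifier (a mean-intensity test in place of their deep neural network) are illustrative assumptions, using only NumPy.

```python
import numpy as np

def propose_regions(image, blob_threshold=0.5, window=8):
    """Crude region proposal: treat every pixel brighter than
    blob_threshold as a candidate blob centre, and return the
    fixed-size subwindow around each one."""
    half = window // 2
    padded = np.pad(image, half, mode="constant")
    ys, xs = np.where(image > blob_threshold)
    # padded[y:y+window, x:x+window] is centred (roughly) on (y, x).
    return [padded[y:y + window, x:x + window] for y, x in zip(ys, xs)]

def classify(subwindow, accept_threshold=0.2):
    """Stand-in for the neural-network classifier: call the
    subwindow a dugong if its mean intensity is high enough."""
    return subwindow.mean() > accept_threshold

# Tiny synthetic "aerial image": dark water with one bright 4x4 blob.
image = np.zeros((32, 32))
image[10:14, 10:14] = 1.0

candidates = propose_regions(image)
detections = [sub for sub in candidates if classify(sub)]
print(len(candidates), len(detections))
```

In the real system, the proposal stage is a well-suited blob detector rather than a per-pixel threshold, and the second stage is a trained deep network, but the division of labour is the same: cheap candidate generation over the whole image, then a more expensive yes/no decision per subwindow.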
The latest version of the detector can find 80 percent of the dugongs in images. That number will hopefully increase in the future.
“The better news is that as we feed the detector with more images of known dugongs, and tell it which ones it got wrong, the accuracy of detections will continue to improve,” Hodgson noted. “This technology could be applied to surveys of any species as long as you start off with a set of images to train the detector.”
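The feedback loop Hodgson describes — adding corrected examples to the training set and refitting — can be illustrated with a toy model. Everything here is hypothetical: a one-number "classifier" whose decision threshold sits halfway between the average intensities of labelled dugong and non-dugong subwindows, updated when a reviewer flags a miss.

```python
import numpy as np

def fit_threshold(pos_means, neg_means):
    """Toy 'training': put the decision threshold halfway between
    the average intensity of dugong and non-dugong examples."""
    return (np.mean(pos_means) + np.mean(neg_means)) / 2

# Initial labelled data: mean intensities of known subwindows.
pos, neg = [0.8, 0.7], [0.1, 0.2]
threshold = fit_threshold(pos, neg)

# A reviewer flags a missed dugong (a false negative) at 0.4;
# feeding the correction back into training lowers the threshold,
# so similar dugongs will be caught next time.
pos.append(0.4)
new_threshold = fit_threshold(pos, neg)
print(threshold, new_threshold)
```

A real detector retrains a neural network rather than recomputing one threshold, but the mechanism is the same: each corrected mistake shifts the decision boundary toward catching cases the previous model got wrong.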