Poachers are wiping out wildlife populations around the world, and conservationists and computer scientists are teaming up in the hope that AI can help them keep pace. Cornell University announced last week that its Elephant Listening Project (ELP), which tracks African forest elephants in dense, remote parts of central Africa, has started seeing promising results from its work with Conservation Metrics, an artificial-intelligence start-up.
In Nouabalé-Ndoki National Park in the Republic of Congo, a dozen acoustic sensors installed by ELP monitor the movement of elephants across an area of around 580 square miles. The network reports patterns in elephant calls, and the sounds of gunshots, to park rangers, who use the information to identify where herds are gathering and where poachers are working. But the process of collecting the audio, analyzing the enormous files, and reporting back to the rangers can take three to four months.
Conservation Metrics is using machine learning and deep neural networks in the hope of shifting the burden of sifting through the audio from humans to machines, cutting the time the analysis takes. ELP director Peter Wrege describes the development of AI tools as "critical" to conservation work. "The quicker we can get that kind of information to the people who are making the decisions that affect [the animals'] survival, obviously the better," says Wrege.
According to Wrege, ELP has been working with Conservation Metrics in some capacity for three years, and the start-up recently received funding from Microsoft's AI for Earth initiative that will enable it to focus on the technology. Wrege added that AI tools still have a long way to go in helping ecologists and park rangers deal with the on-the-ground challenges posed by poaching. ELP is most excited about Conservation Metrics' ability to develop an improved sound detector: better audio tools could more accurately distinguish between, say, a gunshot and an elephant stepping on a branch.
Running algorithms on thousands of audio clips and their associated words is how the voice recognition in Siri, Alexa, and other virtual assistants works today. Limiting the sounds of interest to gunshots and other noises associated with poachers narrows what the algorithm is looking for, making it more accurate.
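To make the idea concrete, here is a toy sketch (not ELP's or Conservation Metrics' actual system) of a nearest-centroid "sound detector" over made-up 2-D acoustic features. The class centroids, feature values, and class names are all invented for illustration; the point is only that a gunshot and a snapping branch can sit close together in feature space, while narrowing the label set changes what the detector has to decide.

```python
# Hedged toy example: a nearest-centroid sound classifier over fabricated
# acoustic features. All numbers and class names are assumptions made up
# for illustration, not real ELP data.
import math

# Hypothetical centroid features for each sound class: (impulsiveness, duration).
CENTROIDS = {
    "gunshot":         (9.0, 1.0),  # short, broadband impulse
    "elephant_rumble": (1.0, 8.0),  # long, low-frequency call
    "branch_snap":     (7.5, 1.5),  # impulse that can mimic a gunshot
}

def classify(feature, labels):
    """Return the label whose centroid is nearest to the feature vector,
    considering only the classes listed in `labels`."""
    return min(labels, key=lambda c: math.dist(feature, CENTROIDS[c]))

# A sound sitting between "gunshot" and "branch_snap" in feature space.
sample = (7.9, 1.4)

# With all three classes allowed, the sample lands nearest "branch_snap".
print(classify(sample, ["gunshot", "elephant_rumble", "branch_snap"]))

# Narrowed to only the sounds the article mentions rangers care about,
# the detector's job reduces to separating gunshots from elephant calls.
print(classify(sample, ["gunshot", "elephant_rumble"]))
```

The gunshot-versus-branch-snap confusion in this sketch is exactly the kind of fine distinction the article says an improved sound detector is needed for.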
How it works
Fei Fang, an assistant professor at Carnegie Mellon University in Pennsylvania, has been working on AI and conservation since 2013. Her work uses game theory and machine learning to help park rangers plan more efficient patrols. Fang's software analyzes different sources of data, like the geography of an area, where animal paths lie, and reports of where rangers have found poaching activity. The algorithm then tries to draw a route that increases the likelihood rangers on patrol will intersect with poachers and animals, while minimizing the elevation change, time of the expedition, and return trip to base camp. Juggling the impact of different demands on spatial navigation has already proven to be a task perfectly suited for machine learning.
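The trade-off described above can be sketched as a simple weighted objective. This is an assumed, heavily simplified illustration, not Fang's actual model (which also involves game theory against adaptive poachers): each candidate route gets a score of expected poacher encounters minus weighted penalties for elevation gain and patrol time, and the planner picks the best-scoring route. Route names, numbers, and weights are all hypothetical.

```python
# Hedged toy example of scoring patrol routes under competing demands.
# Each hypothetical route: (name, expected_encounters, elevation_gain_m, hours).
ROUTES = [
    ("river_loop",  0.9, 120, 6.0),
    ("ridge_line",  1.1, 650, 7.5),
    ("forest_edge", 0.7,  60, 4.0),
]

# Made-up weights trading encounter likelihood against physical cost.
W_ELEV, W_TIME = 0.001, 0.05

def score(route):
    """Higher is better: expected encounters minus climbing and time costs."""
    _, encounters, elev_m, hours = route
    return encounters - W_ELEV * elev_m - W_TIME * hours

def best_route(routes):
    """Pick the route name with the highest score."""
    return max(routes, key=score)[0]

print(best_route(ROUTES))
```

Note how the high-encounter ridge route loses to a flatter, shorter one once its climbing and time costs are charged against it; tuning such weights (and anticipating how poachers respond) is where the real research effort lies.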
Fang says AI researchers need to do more work to fully understand the challenges faced by park rangers and ecologists. "Clearly AI cannot do everything and it cannot replace the human effort," Fang says. "We would like to work more with the people on the ground to see what they might need, how they do things in the field, and what are the aspects that can potentially help?"