Santa Cruz Tech Beat

Big data for conservation: Conservation biologists find new applications for AI tools

By Tim Stephens
UC Santa Cruz

August 5, 2019 — Santa Cruz, CA

(Image above: Researchers use sound recordings to detect marbled murrelets flying to their nests in old-growth redwood forests. The endangered seabirds are extremely secretive around their nesting sites, but their calls can be heard as they fly to and from the ocean where they feed, and AI algorithms can be used to identify the murrelet calls in thousands of hours of recordings. Credit: Miranda Powell)

Deep learning algorithms can be trained to recognize anything from the types of vegetation in a coastal wetland to the sound of a bird hitting a power line

Automated cameras and other sensors deployed in the wild are transforming the way biologists monitor natural ecosystems and animal populations. These technologies can collect huge amounts of data, however, and conservation biologists are increasingly turning to the tools of artificial intelligence (AI) to sort through it all.

In particular, a machine learning method called “deep learning,” already widely used in face recognition and other image- and speech-recognition applications, is now being applied by conservation biologists to analyze images, videos, and sound recordings of everything from African elephants to aquatic insects.

Biologists Donald Croll and Bernie Tershy, who run the Conservation Action Lab at UC Santa Cruz, have been using deep learning in their work on seabird conservation. They began developing acoustic technologies for monitoring seabird populations as a research project and eventually started a company, Conservation Metrics, to provide wildlife monitoring services.

“We want to increase the efficiency of conservation monitoring,” said Croll, a professor of ecology and evolutionary biology. “We do these conservation interventions, like removing invasive species from an island, but ongoing monitoring to track how seabird populations recover after the intervention is expensive. AI tools can help us automate that.”

Automated acoustic sensors can record the sounds at a nesting site for weeks or months at a time. Then it’s a matter of finding the calls made by the species of interest in thousands of hours of recordings. Initially, the researchers used traditional pattern-matching tools to analyze the acoustic data, but they found that machine learning offers a much more powerful approach.
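To give a rough sense of what that shift looks like in practice, the sketch below shows one common way a deep learning call detector can be run over long field recordings: slice the audio into short windows, convert each window to a spectrogram, and let a small neural network score how likely each window is to contain the target call. The model architecture, window length, and detection threshold here are illustrative assumptions, not a description of Conservation Metrics' actual pipeline.

```python
# Illustrative sketch: scan a long field recording for target calls with a small
# CNN over mel-spectrogram windows. Architecture and thresholds are assumptions.
import librosa
import numpy as np
import torch
import torch.nn as nn

class CallDetector(nn.Module):
    """Tiny binary classifier: does this spectrogram window contain the call?"""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))

def scan_recording(path, model, sr=22050, window_s=2.0, hop_s=1.0, threshold=0.8):
    """Slide a fixed-length window over the recording and flag likely calls.
    In practice the model would be loaded with weights trained on labeled calls."""
    audio, _ = librosa.load(path, sr=sr)
    win, hop = int(window_s * sr), int(hop_s * sr)
    detections = []
    for start in range(0, len(audio) - win, hop):
        clip = audio[start:start + win]
        mel = librosa.feature.melspectrogram(y=clip, sr=sr, n_mels=64)
        mel_db = librosa.power_to_db(mel, ref=np.max)
        x = torch.tensor(mel_db, dtype=torch.float32)[None, None]  # (1, 1, mels, frames)
        with torch.no_grad():
            score = model(x).item()
        if score >= threshold:
            detections.append((start / sr, score))
    return detections  # (time in seconds, confidence) pairs for an analyst to review
```

The output is not a final answer but a list of time-stamped candidate detections, which a human analyst can then review and verify.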

“Our performance has vastly improved with the use of deep learning techniques,” said co-founder Matthew McKown, who began working on the project as a postdoctoral researcher in Croll’s lab and is now CEO of Conservation Metrics.

The company’s acoustic monitoring projects include studies of forest elephants and rare bats in Africa, endangered marbled murrelets nesting in old-growth redwood forests, and even a project with researchers in Hawaii to detect collisions of birds with power lines for more accurate assessments of where and how often they occur. The company is also starting to expand beyond acoustic monitoring to image analysis, as in a project with Duke University researchers using drone images to count Olive Ridley sea turtles off their nesting beaches in Costa Rica.

The demand for these tools is growing rapidly, McKown said. “When we started, other than a few academic labs, we were the only people we knew of applying machine learning to conservation monitoring data. And now it’s exploding.”

Long-term monitoring

At UC Santa Cruz, researchers are developing and applying techniques involving machine learning in a wide range of conservation-related projects. Peter Raimondi, professor of ecology and evolutionary biology, is just beginning to assess the potential of AI to help with long-term monitoring of intertidal ecosystems. For nearly 30 years he has overseen intertidal surveys along the west coast of North America, from Baja to Alaska. The traditional approach involves teams of researchers doing repeated sampling at established sites where they painstakingly identify and count every species at points on a grid.

“The question that has always plagued researchers doing this kind of survey work is whether we’re really capturing all the diversity when we’re only sampling a fraction of the total area,” Raimondi said. “Now we’re starting to fly drones over these sites to capture aerial images, and then we’ll see if we can train an algorithm to identify the species in those images.”

The process starts with aerial images of sites already sampled by people on the ground. Those data are used to train a deep learning algorithm to identify species in the images. Large amounts of data are needed for this iterative training process, but if the algorithm is accurate enough, then drones can be used to survey much larger areas than can be practically surveyed by people on the ground.
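A minimal sketch of that training step might look like the following, assuming image patches have been cropped from the aerial surveys and sorted into folders named for the species recorded at the corresponding ground-survey points. The dataset layout, choice of model, and hyperparameters are assumptions for illustration, not Raimondi's actual workflow.

```python
# Hedged sketch: fine-tune a pretrained image classifier on drone-image patches
# whose labels come from ground surveys. Folder layout and settings are assumed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Hypothetical layout: one folder per species, filled with labeled aerial patches.
train_set = datasets.ImageFolder("aerial_patches/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights="DEFAULT")  # start from generic ImageNet features
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

Starting from a pretrained network is what makes the iterative process tractable: far fewer labeled ground-truth patches are needed than if the model were trained from scratch.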

“It has real potential,” Raimondi said. “We have funding to do this in California, and we hope to do it for the whole west coast. We know the algorithms don’t do well if you try to apply them too broadly, so we can’t train it with data from the Central Coast and then take it to Oregon—you have to do local calibrations. So we’ll always need people doing sampling on the ground, but we may be able to automate a lot of it.”

Ross Davison, a graduate student in the Coastal Science & Policy Program, is also using aerial images from drones and machine learning in his research on coastal wetlands. “The algorithms are pretty robust for mapping vegetation and water inundation,” he said. “We’re hoping to use it for everything from initial biomass assessments and species delineation, to eventually even plant health assessments.”
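Mapping vegetation and water from drone imagery is typically framed as semantic segmentation, in which every pixel in an image receives a class label. The sketch below illustrates that idea with an off-the-shelf segmentation network; the class list, tile handling, and model choice are assumptions, not Davison's actual workflow.

```python
# Illustrative only: per-pixel classification (semantic segmentation) of a drone
# orthomosaic tile into hypothetical wetland classes.
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

CLASSES = ["open_water", "marsh_vegetation", "bare_ground"]  # hypothetical classes
model = deeplabv3_resnet50(weights=None, num_classes=len(CLASSES))
model.eval()  # in practice, load weights trained on hand-labeled tiles first

def classify_tile(tile):
    """tile: float tensor of shape (3, H, W), already normalized.
    Returns an (H, W) map of class indices, one label per pixel."""
    with torch.no_grad():
        out = model(tile.unsqueeze(0))["out"]   # (1, num_classes, H, W)
    return out.argmax(dim=1).squeeze(0)         # (H, W) label map

# Summing pixels per class across all tiles of the orthomosaic would then give
# area estimates for inundation extent or vegetation cover.
```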

As climate change alters wetlands and other dynamic ecosystems, the ability to rapidly collect this kind of information will be increasingly important for developing effective management strategies, he said. The combination of aerial imaging and machine learning has the potential to save a lot of time and effort.

“It can take out a lot of the barriers to doing this type of analysis. That’s what we’re excited about,” Davison said.

No silver bullet

McKown said he often gets asked by field biologists if he’s trying to replace them with technology, but field work is still an essential part of the process. “You have to have people in the field to be able to interpret the results,” he said. “This isn’t a silver bullet. Like any tool, you have to understand how to incorporate it into your project. We always have a person in the loop, an analyst who is sampling from the computer’s predictions to understand how it’s performing.”

The range of potential applications is constantly expanding. Andrew Port, a computer engineering graduate student in Professor Roberto Manduchi’s Computer Vision Lab in the Baskin School of Engineering, has been working with the conservation organization Rare on a sustainable fisheries project in Central America. There, the challenge is to certify the origins of fish to ensure that they were caught in accordance with agreements designed to protect certain areas. The fishermen say they can tell where a fish was caught just by looking at it, and research has shown that some species do have slight variations in shape depending on what area of the reef they come from.

“The problem then was to develop an automated system, like a cell phone app, that’s as good as the intuition of the fishermen,” Port said. “Newer models of cell phones have chips designed to do these kinds of calculations. What’s driving all this are huge advances in computer vision and machine learning.”

Port is also working with Nicholas Macias, a UCSC graduate student in ecology and evolutionary biology, who is deploying a camera system in streams to monitor insects drifting downstream. The goal is to measure the availability of prey for fish and how it changes over time. The system can acquire 30,000 images in an hour, just the kind of data analysis challenge that AI can help with.

“He developed a really effective method of photographing the insects that float through the stream,” Port said. “I’ve been working with him to develop a deep-learning model that can differentiate between different kinds of insects in the photos.”
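At 30,000 images an hour, much of the practical challenge is simply throughput. The sketch below shows, in rough form, how a trained classifier could be run over a folder of frames and its predictions logged for later review; the model weights, class names, and folder layout are placeholders, not Port's actual system.

```python
# Rough sketch: run a trained insect classifier over a folder of camera frames
# and log predictions to a CSV for an analyst to review. Paths and classes are
# placeholders.
import csv
from pathlib import Path
import torch
from PIL import Image
from torchvision import models, transforms

class_names = ["mayfly", "caddisfly", "stonefly", "other"]  # illustrative taxa
transform = transforms.Compose([transforms.Resize((128, 128)), transforms.ToTensor()])

model = models.resnet18(num_classes=len(class_names))
model.load_state_dict(torch.load("insect_classifier.pt"))  # assumed: weights trained earlier
model.eval()

with open("predictions.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["image", "predicted_class", "confidence"])
    with torch.no_grad():
        for path in sorted(Path("stream_frames").glob("*.jpg")):  # hypothetical folder
            x = transform(Image.open(path).convert("RGB")).unsqueeze(0)
            probs = torch.softmax(model(x), dim=1)
            conf, pred = probs.max(dim=1)
            writer.writerow([path.name, class_names[pred.item()], f"{conf.item():.3f}"])
```

In practice the frames would be processed in batches rather than one at a time, but the overall structure is the same.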

Port’s work in Manduchi’s lab has focused on assistive technologies for people with visual impairments. His connection with Rare came through Vikram Sahai, a UCSC Foundation trustee who worked at Google for 15 years.

“I left with the idea of doing these kinds of fun projects where I can use my background in big data and machine learning to help conservation organizations,” Sahai said. “Often the practitioners in the field don’t know what technology can do for them, and I don’t know what problems they need to solve.”

After meeting the CEO of Rare, Sahai visited their offices to learn about their work and identified some projects that might benefit from technological innovations. He said he always looks for opportunities to get UCSC students involved. “One thing about UCSC is the students are very aware of environmental challenges, and they want to work on meaningful projects,” Sahai said.

Sahai is also an adviser for Wildlife Insights, a new platform for processing images from wildlife camera traps developed by a partnership involving Google, Conservation International, and other organizations. Microsoft’s AI for Earth is a similar program that is working to put cloud services and AI tools in the hands of those working to solve global environmental challenges. (Conservation Metrics is among the groups funded by AI for Earth.)

“In the conservation space, what’s happened is the cost of cameras is coming down, and with deep learning we’re better at processing images and videos, so it’s now become relatively cheap to deploy large numbers of camera traps or video cameras and process the data by machine learning,” Sahai explained. “Conservationists collect all these images, but to go through and analyze it all is very time consuming. Now we can automate the processing, and all this data can be in the cloud, which really enables them to share it with other researchers and disseminate results.”

McKown said he is excited to see so much innovation in this area. “It’s great to see, and I’m excited to learn from how other people are approaching these same questions,” he said.

At UC Santa Cruz, researchers are also applying AI tools in fields ranging from astronomy to genomics. But Port said he’s been struck by the level of interest from ecologists. “I’ve noticed a real demand from the ecology community, because they have so many monitoring tasks, and these models are perfect for that,” he said.

Xavier Prochaska, a professor of astronomy and astrophysics, has co-founded a campus-wide Applied Artificial Intelligence Initiative to bring together faculty and students engaged in AI across all the divisions.

“A key goal is to get faculty talking across disciplines and sharing ideas,” Prochaska said. “We started a seminar series and offer tutorials for students. Machine learning is a rapidly evolving field, and building an interdisciplinary community will help us stay on the cutting edge of this.”

###
