When Erin Moreland set out to become a research zoologist, she envisioned days spent sitting on cliffs, drawing seals and other animals to record their lives for efforts to understand their activities and protect their habitats.

Instead, Moreland found herself stuck in front of a computer screen, clicking through thousands of aerial photographs of sea ice as she scanned for signs of life in Alaskan waters. It took her team so long to sort through each survey — akin to looking for lone grains of rice on vast mounds of sand — that the information was outdated by the time it was published.

“There’s got to be a better way to do this,” she recalls thinking. “Scientists should be freed up to contribute more to the study of animals and better understand what challenges they might be facing. Having to do something this time-consuming holds them back from what they could be accomplishing.”

NOAA scientist Erin Moreland felt sure there was a technological solution to help her team sort through millions of aerial images of ice each year. She hit the jackpot with artificial intelligence. (Photo provided by NOAA)

That better way is now here — an idea that began, unusually enough, with the view from Moreland’s Seattle office window and her fortuitous summons to jury duty. She and her fellow National Oceanic and Atmospheric Administration scientists now will use artificial intelligence this spring to help monitor endangered beluga whales, threatened ice seals, polar bears and more, shaving years off the time it takes to get data into the right hands to protect the animals.

The teams are training AI tools to distinguish a seal from a rock and a whale’s whistle from a dredging machine’s squeak as they seek to understand the marine mammals’ behavior and help them survive amid melting ice and increasing human activity.

Moreland’s project combines AI technology with improved cameras on a NOAA turboprop airplane that will fly over the Beaufort Sea north of Alaska this April and May, scanning and classifying the imagery to produce a population count of ice seals and polar bears that will be ready in hours instead of months. Her colleague Manuel Castellote, a NOAA affiliate scientist, will apply a similar algorithm to the recordings he’ll pick up from equipment scattered across the bottom of Alaska’s Cook Inlet, helping him quickly decipher how the shrinking population of endangered belugas spent its winter.

The data will be confirmed by scientists, analyzed by statisticians and then reported to people such as Jon Kurland, NOAA’s assistant regional administrator for protected resources in Alaska.

Scientist Manuel Castellote (right) goes out in Alaska’s Cook Inlet each spring and fall to collect microphones at the bottom of the sea. He and his team first ping the equipment, instructing it to release the microphone so it can resurface. Then they bring it onboard to download the data before guiding the equipment back down to the ocean floor, where it will listen for another six months. (Photo by Daniela Huson with Ocean Conservation Research)

Kurland’s office in Juneau is charged with overseeing conservation and recovery programs for marine mammals around the state and its waters and helping guide all the federal agencies that issue permits or carry out actions that could affect those that are threatened or endangered.

Of the four types of ice seals in the Bering Sea — bearded, ringed, spotted and ribbon — the first two are classified as threatened, meaning they are likely to become endangered within the foreseeable future. The Cook Inlet beluga whales are already endangered, having steadily declined to a population of only 279 in last year’s survey, from an estimate of about a thousand 30 years ago.

Individual groups of beluga whales are isolated and don’t breed with others or leave their home, “so if this population goes extinct, no one else will come in; they’re gone forever,” says Castellote. “Other belugas wouldn’t survive there because they don’t know the environment. So you’d lose that biodiversity forever.”

Yet recommendations by Kurland’s office to help mitigate the impact of human activities such as construction and transportation, in part by avoiding prime breeding and feeding periods and places, are hampered by a lack of timely data.

“There’s basic information that we just don’t have now, so getting it will give us a much clearer picture of the types of responses that may be needed to protect these populations,” Kurland says. “In both cases, for the whales and seals, this kind of data analysis is cutting-edge science, filling in gaps we don’t have another way to fill.”

Erin Moreland’s first ice seal survey was in 2007, flying in a helicopter based on an icebreaker. Scientists collected 90,000 images and spent months scanning them but only found 200 seals. It was a tedious, imprecise process. (Photo provided by NOAA)

The AI project was born years ago, when Moreland would sit at her computer in NOAA’s Marine Mammal Laboratory in Seattle and look across Lake Washington toward Microsoft’s headquarters in Redmond, Washington. She felt sure there was a technological solution to her frustration, but she didn’t know anyone with the right skills to figure it out. 

She hit the jackpot one week while serving on a jury in 2018. She overheard two fellow jurors discussing AI during a break in the trial, so she began talking with them about her work. One of them connected her with Dan Morris from Microsoft’s AI for Earth program, who suggested they pitch the problem as a challenge that summer at the company’s Hackathon, a week-long competition in which software developers, programmers, engineers and others collaborate on projects. Fourteen Microsoft engineers signed up to work on the problem.

“Across the wildlife conservation universe, there are tons of scientists doing boring things, reviewing images and audio,” Morris says. “Remote equipment lets us collect all kinds of data, but scientists have to figure out how to use that data. Spending a year annotating images is not only a bad use of their time, but the questions get answered way later than they should.”

Moreland’s idea wasn’t as simple as it may sound, though. While there are plenty of models that can recognize people in images, there were none — until now — that could find seals, especially in real time in aerial photography. But the hundreds of thousands of examples NOAA scientists had classified in previous surveys gave technologists a head start; they are using those examples to train AI models to recognize which photographs and recordings contain mammals and which don’t.

“Part of the challenge was that there were 20 terabytes of data of pictures of ice, and working on your laptop with that much data isn’t practical,” says Morris. “We had daily handovers of hard drives between Seattle and Redmond to get this done. But the cloud makes it possible to work with all that data and train AI models, so that’s how we’re able to do this work, with Azure.”

Can you spot the seals in this aerial photograph (left)? Look at the thermal image (right), and then back at the photo — can you even see them now? This is what AI will help NOAA scientists sort through. (Photo provided by NOAA, from a survey of Alaska’s Kotzebue Sound, where the ice had melted, forcing the seals closer together than normal.)

Moreland’s first ice seal survey was in 2007, flying in a helicopter based on an icebreaker. Scientists collected 90,000 images and spent months scanning them but only found 200 seals. It was a tedious, imprecise process.

Ice seals live largely solitary lives, making them harder to spot than animals that live in groups. Surveys are also complicated because the aircraft have to fly high enough to keep seals from getting scared and diving, but low enough to get high-resolution photos that enable scientists to differentiate a ringed seal from a spotted seal, for example. The weather in Alaska — often rainy and cloudy — further complicates efforts.

Subsequent surveys improved by pairing thermal and color cameras and by using modified planes that had greater range, letting scientists cover more area, and that could fly higher, making them quieter. Even so, thermal interference from dirty ice and reflections off jumbled ice made it difficult to determine what was an animal and what wasn’t.

And then there was the problem of manpower to go along with all the new data. The 2016 survey produced a million pairs of thermal and color images, which a previous software system narrowed down to 316,000 hot spots that the scientists had to manually sort through and classify. It took three people six months.
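For readers curious how a thermal “hot spot” filter of this kind might work in principle, here is a minimal, hypothetical sketch — not NOAA’s actual software: flag pixels in a thermal image that are warmer than a chosen threshold, group adjacent warm pixels into candidate regions, and report each region’s bounding box so the matching patch of the color photo can be reviewed or classified later. The thresholds, image sizes and temperatures below are invented for illustration.

```python
import numpy as np
from scipy import ndimage

def find_hot_spots(thermal, threshold):
    """Return bounding boxes (row_start, row_stop, col_start, col_stop)
    of connected warm regions in a 2-D thermal image."""
    mask = thermal > threshold                # flag warm pixels
    labeled, n_regions = ndimage.label(mask)  # group adjacent warm pixels
    boxes = []
    for sl in ndimage.find_objects(labeled):  # one (row, col) slice pair per region
        boxes.append((sl[0].start, sl[0].stop, sl[1].start, sl[1].stop))
    return boxes

# Toy example: a cold 100x100 scene with two small warm blobs standing in
# for animals against the ice.
thermal = np.zeros((100, 100))
thermal[10:14, 20:25] = 35.0   # hypothetical warm body
thermal[60:63, 70:74] = 34.0   # another one
spots = find_hot_spots(thermal, threshold=30.0)
print(len(spots))  # 2 candidate regions out of 10,000 pixels
```

The point of such a filter is the reduction it buys: instead of scanning every pixel of every image, humans (or a second-stage classifier) only need to look at the handful of flagged regions — which is why false positives from warm dirty ice were such a burden before AI took over the classification step.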
