WILL ARTIFICIAL INTELLIGENCE MAKE CITIZEN SCIENTISTS OBSOLETE?

Source – psmag.com

In Serengeti National Park, there are 225 hidden cameras constantly photographing the creatures that roam this Tanzanian wilderness.

To date, these camera traps have captured more than three million images. For a small team of scientists living in the park, it’s a treasure trove. Through Snapshot Serengeti, as the program is called, they’ve studied everything from the migrations of the region’s herbivores to the surprising co-existence of lions, hyenas, and cheetahs.

It’s work that wouldn’t have been possible without an army of 30,000 citizen scientists, who manually sorted the collection, identifying and naming the species in each frame. It’s time-consuming work, and the volunteers are doing it for kicks. The project lives on Zooniverse, a Web platform where citizen scientists can contribute to a wide range of research projects.

Now, a team of computer scientists at the University of Wyoming has found a way to automate this manual task, using artificial intelligence to identify the animals in the images quickly and accurately. By “training” the computer with the images that have been manually labeled by the volunteers, the team’s model has learned to process 99.3 percent of the images just as accurately as the human brain.
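To make that training step concrete, the recipe looks something like the sketch below: start from a standard image-recognition network and repeatedly nudge its weights until its guesses agree with the species names the volunteers assigned. This is an illustrative Python/PyTorch example, not the Wyoming team’s actual code; the folder layout, network choice, and settings are all assumptions.

    # Illustrative sketch of supervised training on volunteer-labeled camera-trap
    # images. The "labeled/" folder layout, ResNet-18 backbone, and hyperparameters
    # are assumptions for the example, not the published pipeline.
    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    # Assume the volunteer-labeled images are sorted into one folder per species,
    # e.g. labeled/lion/0001.jpg, labeled/zebra/0002.jpg, ...
    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    train_set = datasets.ImageFolder("labeled/", transform=transform)
    loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

    # Start from a network pre-trained on everyday photos, then replace its final
    # layer so it outputs one score per Serengeti species.
    model = models.resnet18(weights="IMAGENET1K_V1")
    model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):                    # a few passes over the collection
        for images, labels in loader:         # the labels come from the volunteers
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()                   # adjust weights to reduce mistakes
            optimizer.step()

Once trained this way, the network can be pointed at new, unlabeled camera-trap photos and asked to guess the species on its own.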

Getting these images manually classified presents “a very big bottleneck for ecologists,” says Mohammad Sadegh Norouzzadeh, a Ph.D. student at the University of Wyoming, and the first author of the paper. “[Artificial intelligence techniques] can save a tremendous amount of time for ecologists collecting the data they need.”

The benefits of this automation for science are clear: More data combined with more efficient analysis equals better scientific results, and hopefully better conservation of the natural environment.

But where does it leave the citizen scientist? A paper published on Wednesday highlights how, as people migrate to cities and lose touch with nature, citizen science “can increase emotional and cognitive connections to nature” and make participants more supportive of conservation efforts. The United States National Park Service has also found that engaging the public in hands-on work can spur further environmental action. As machines become more intelligent, and the need for human input declines, will the public’s engagement with science also suffer?

“Even though we’re hopeful that machine learning will reduce the number of images that we need people to look at, I don’t think there’s any risk that we’ll end up reducing the engagement of volunteers,” says Ali Swanson, founder of Snapshot Serengeti. “There are so many ecology and conservation projects producing so much data that I think the demand for volunteer effort is nowhere near being met.”

Far from eradicating the need for input from enthusiastic volunteers, the technology still depends on them: researchers will continue to rely on such volunteers to train their artificially intelligent models. Currently, only “supervised learning”—where machines are taught to identify animals through human-classified examples—can produce sufficiently accurate results. Until the machines can be weaned off this method, humans will still be responsible for the initial labeling process.

The easiest work is often the dreariest. Once machines have learned to classify the most obvious images, the valuable skills of human volunteers can be put to better use, Norouzzadeh says, on tasks such as classifying the 0.7 percent of images that the computer couldn’t.

“One of the problems was that the [Serengeti] project was getting boring for the citizen scientists,” Norouzzadeh says, emphasizing that the work slowed over time as volunteers dropped out. “If we use machine learning and it automatically processes most of the images, then the task could be more challenging for the citizen scientists, and more challenges mean more interest in the project.”
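The division of labor Norouzzadeh describes can be pictured as a simple confidence check: the model keeps only the predictions it is very sure about, and everything else goes back into the volunteers’ queue. The threshold and helper function below are hypothetical, sketched in Python to show the idea rather than to reproduce the Snapshot Serengeti pipeline.

    # Hypothetical routing step: let the trained model handle the obvious images
    # and send low-confidence ones to citizen scientists. The 0.99 cut-off is an
    # assumed value, not a figure from the project.
    import torch
    import torch.nn.functional as F

    CONFIDENCE_THRESHOLD = 0.99

    def route_image(model, image_tensor, class_names):
        """Return an automatic species label, or None to queue the image for a human."""
        model.eval()
        with torch.no_grad():
            probs = F.softmax(model(image_tensor.unsqueeze(0)), dim=1)
        confidence, index = probs.max(dim=1)
        if confidence.item() >= CONFIDENCE_THRESHOLD:
            return class_names[index.item()]   # machine takes the easy shot
        return None                            # too uncertain: a volunteer decides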

Still, many of the current opportunities for everyday people to contribute involve working out the kinks in this new technology. As it improves, and as more information leads to better-trained models, it seems possible that this army of volunteer labelers could be rendered obsolete. I wanted to know if there is something out there, some task so intrinsically human that it can’t be done by a computer.

“People bring a lot to the table that computers can’t,” Swanson says. “We talk a lot about ‘serendipitous discovery’ at the Zooniverse, and how volunteers often make really cool discoveries that are actually tangential to the main research question, such as the discovery of a new type of astronomical object, or unexpectedly tracking the outbreak of the Spanish Flu across the Royal Navy while looking at old ship logs for climatological data.”

Scientists are also relying more and more on the people who know their environment most intimately, such as indigenous footprint trackers. One such initiative is Footprint Identification Technology, a new approach developed by a group called WildTrack, based in North Carolina. It depends on indigenous people photographing animal footprints and identifying the species, something that most people, even most citizen scientists, can’t do. With this knowledge, WildTrack hopes to train an algorithm to automatically classify far more data than the group could handle manually, potentially revealing new insights into the various creatures roaming the planet.

Even if the need for human classification does decline, artificial intelligence is creating new opportunities for citizens to engage with nature more easily than ever before, generating an important stream of data for scientists in the process.

I discovered this for myself quite recently. It was Friday evening, and my boyfriend and I were out of milk. On our way to Target, we passed the tree that grows by my bus stop, the one I’d finally managed to identify as a honeylocust. I’d recently stumbled across iNaturalist, an app that uses deep-learning techniques to identify plants and animals. Based on the app’s description, it sounded as though it would be able to draw on previously uploaded images of other honeylocusts to classify the particular tree in question.

Even though I’ve spent the last five years of my life writing about the environment, I’m no botanist. My knowledge of the plant world peaked early, when I developed a passion for snapdragons, mainly because they were the coolest plant growing in my grandmother’s garden. My ignorance beyond this one flower is something I’ve long wanted to remedy, but it was difficult to know where to begin. Manually identifying the honeylocust had required me to individually Google every species on the city government’s official tree spreadsheet. It wasn’t a sustainable solution.

Could iNaturalist save me hours of botanical procrastination? I pointed the camera at the honeylocust and clicked, “What did you see?” A word popped up on the screen: “honeylocust.” In the 10-minute walk to Target, we added oak and elm to our collection. Since then, just in our neighborhood, we’ve spotted wild blue indigo, meadow anemone, western columbine, and hairy beardtongue, among others.

That’s not to say that the app works perfectly. It thinks our orchid is a radish. But that’s where humans take over. And the more images that are labeled by citizen scientists, the more the app will learn and improve.

The purpose of iNaturalist is explicitly to connect people more closely to nature. Additionally, individuals can build communal projects that transcend personal curiosity. For instance, Boston has invited locals to document the various species of oyster living in the harbor, with an eye toward determining whether invasive European oysters are encroaching on native varieties.

Developers at iNaturalist have also built a page for “research-grade observations,” which can be used in scientific studies. Data collected by citizen scientists through the app has been used to investigate topics including the effect of invasive pests (such as the Asian longhorned beetle) on carbon sequestration in forests, and the transmission of viruses via fruit bats in parts of Asia.

I’m not sure how useful our documentation of the wildflowers of Southside Chicago will be, but it has added texture to our own lives. Our streets are suddenly cloaked in color and diversity. Our grocery trips are an opportunity for exploration and discovery. In fact, they always were; we just didn’t know it.
