
Sunday, December 31, 2017

AI is learning from our encounters with nature — and that's a concern | ABC Online - Analysis & Opinion

This article was originally published on The Conversation.

"The idea seems wonderful — a phone app that allows you to take a photo of a plant or animal and receive immediate species identification and other information about it." says Andrew Robinson,  Communications Scientist and Scholar at the Australian National University.  

A "Shazam for nature" sounds wonderful, but what are its true implications?
Photo: ABC Open contributor merrijignic

A "Shazam for nature", so to speak.

We are building huge repositories of data related to our natural environments, making this idea a reality.

But there are ethical concerns that should be addressed: about how data is collected and shared, who has the right to share it, and how we use public data for machine learning.
And there's a bigger concern — whether such apps change what it means to be human.

Encounters with dandelions 
Oliver Sacks, the brilliant neurologist and author, once arranged to take a group of his patients on a field trip to the New York Botanic Garden. One of his patients, a severely autistic young man named Steve, hadn't stepped outside the facility for years. He never spoke; indeed, the doctors believed him incapable of speech.

In the gardens with Sacks, however, the invigorated Steve plucked a flower, and to the surprise of everyone, uttered the word "dandelion."
Over the last decade, this affinity so many of us feel for nature — what the famed biologist Edward Wilson termed "biophilia" — has resulted in an explosion of big data.
In the Global Biodiversity Information Facility (GBIF, an online database run out of Copenhagen) there are 682,447 records of human encounters with dandelions. Overall, the database holds more than 850 million observations of over 1 million different species of flora and fauna.
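GBIF's catalogue is queryable through its public occurrence-search REST API. As a minimal sketch (in Python, using only the standard library), the dandelion count cited above could in principle be retrieved by querying for the species' scientific name, Taraxacum officinale; note that the live count changes daily as new observations are added:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Public endpoint of GBIF's occurrence-search API
GBIF_API = "https://api.gbif.org/v1/occurrence/search"

def build_query(scientific_name: str, limit: int = 0) -> str:
    """Build a GBIF occurrence-search URL.

    limit=0 asks for no individual records, so the response
    carries only the total 'count' for the species.
    """
    params = urlencode({"scientificName": scientific_name, "limit": limit})
    return f"{GBIF_API}?{params}"

def occurrence_count(scientific_name: str) -> int:
    """Fetch the total number of occurrence records (requires network access)."""
    with urlopen(build_query(scientific_name)) as resp:
        return json.load(resp)["count"]
```

Calling `occurrence_count("Taraxacum officinale")` would return the current global tally of recorded dandelion encounters.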

It's an impressive achievement, a gestating, global catalogue of life. It allows us to see the world in new ways.

For example, just this year, thanks to more than 42,000 sightings recorded by more than 5,000 participants, we've gained unprecedented insight into the behaviour of the world's largest fish species, the whale shark.

Or, on a bigger scale, the millions of bird observations generated through an app called eBird have allowed us to visualise the precise migratory routes of over 100 different bird species.

At the same time, in an outcome largely unforeseen by its early collectors, information engineers are using the data to train artificial intelligence (AI), particularly computer-vision apps that help us interpret the plants and animals we see around us.

And these tools are raising some interesting, sometimes troubling questions.

Joseph Banks in your pocket 
In one sense, of course, such tools are magical. The fictional tricorder of Star Trek is a magnificent device, scanning alien life forms, making them familiar. If we had a version on Earth, it would be the equivalent of a pocket-sized Joseph Banks, a trusty sidekick of discovery, filling us with a sense of confidence and control.

In China, the latest version of the Baidu browser (sometimes called the Chinese Google) comes with a built-in plant-recognition feature. Point your camera at a dandelion and you'll see the Chinese name for it — 蒲公英.

Such apps are triggering a new wave of botanical interest among the general population in China.

But there are also questions about these AI tools interfering with our ability — perhaps a human need — to easily transfer our unique nature expertise to, or gain expertise from, other people. 
Read more... 

Source: ABC Online

If you enjoyed this post, make sure you subscribe to my Email Updates!