Teaching ‘common sense’ to artificial intelligence

Source: koin.com

CORVALLIS, Ore. (KOIN) — Ever wonder why the virtual assistant Siri can tell you the square root of 1,558 in an instant but can’t answer the question “What happens to an egg when you drop it on the ground?”

Artificial intelligence (A.I.) interfaces on devices like Apple’s iPhone or Amazon’s Alexa often fall flat on what many people consider to be basic questions, but can be speedy and accurate in their responses to complicated math problems. That’s because modern A.I. currently lacks common sense.

“What people who don’t work in A.I. every day don’t realize is just how primitive what we call ‘A.I.’ is nowadays,” machine-learning researcher Alan Fern of Oregon State University’s College of Engineering told KOIN 6 News. “We have A.I.s that do very specialized, specific things, specific tasks, but they’re not general purpose. They can’t interact in general ways because they don’t have the common sense that you need to do that.”

Fern is a researcher on a nearly $9 million grant just awarded to OSU by the Defense Advanced Research Projects Agency (DARPA) of the U.S. Department of Defense, the entity responsible for creating the internet. The four-year project aims to integrate common sense into A.I.

Fern defines common sense as “the obvious deductions made quickly and easily by most members of a population when behaving in the environment.”

An example of common sense knowledge that the average adult takes for granted, and that A.I. systems struggle to comprehend, is object permanence: the ability to know that an object we just saw still exists even when it is temporarily hidden from view. Babies under six months old seem genuinely surprised during a game of peek-a-boo for the same reason: they have not yet developed object permanence.
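To make the idea concrete for readers who code, here is a minimal, hypothetical Python sketch (not taken from the OSU project; the names and setup are invented for illustration): an agent that only trusts its current view “loses” a ball that rolls behind a screen, while an agent with even a simple memory keeps predicting the ball still exists.

```python
# Hypothetical illustration of object permanence (not code from the OSU/DARPA project).
# A ball rolls behind a screen; an agent without memory "forgets" it exists,
# while an agent that remembers the last observation still predicts it is there.

def visible(ball_x, screen_range):
    """The ball can be seen only when it is outside the occluding screen."""
    lo, hi = screen_range
    return not (lo <= ball_x <= hi)

def run_trial(steps=10, screen_range=(4, 7)):
    naive_belief = None      # believes only what is currently visible
    memory_belief = None     # keeps the last position at which the ball was seen
    for t in range(steps):
        ball_x = t           # the ball rolls one unit per step
        if visible(ball_x, screen_range):
            naive_belief = ball_x
            memory_belief = ball_x
        else:
            naive_belief = None   # "out of sight, out of mind"
            # memory_belief is left unchanged: the ball still exists somewhere
        print(f"t={t} ball_at={ball_x} visible={visible(ball_x, screen_range)} "
              f"naive={naive_belief} with_memory={memory_belief}")

if __name__ == "__main__":
    run_trial()
```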

Fern said the goal of the project is to get computers, and in turn robots, to have the common sense capabilities of an 18-month-old child.

He will be working with two collaborators to develop and train a “machine common sense service” that will learn about its environment in a manner similar to that of a toddler, using computer simulations.

One of the collaborators is behavioral psychologist Karen Adolph of New York University and another is roboticist Tucker Hermans of the University of Utah.

Adolph will provide videos of toddlers that Fern will use to build a computer model of how babies explore their environment. A “virtual toddler” will then be tested in a 3-D simulated environment, which will look like “a simple robot in a video game exploring a virtual space.”

As a baby goes from only being able to look around its environment, to being able to crawl, and then to grab and manipulate things, it begins a cycle of exploration that is arguably more sophisticated than the learning style of many of today’s top machine-learning systems.

“Every stage, there’s exploration going on. A type of exploration that’s enabled by your current capabilities, which enables you to learn new common sense principles and capabilities,” Fern explained.
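Fern’s description suggests a loop in which each new capability expands what the agent can try next. The toy Python sketch below is purely illustrative (the stage names and actions are invented, not the project’s actual design): each developmental stage unlocks more actions, and what the virtual toddler experiences at one stage carries into the next.

```python
# Hypothetical sketch of capability-gated exploration (stages and actions are
# invented for illustration, not the project's design). Each developmental stage
# unlocks new actions, and experience accumulates across stages.

import random

STAGES = {
    "look":  ["turn_head", "fixate"],
    "crawl": ["turn_head", "fixate", "move_forward", "turn_body"],
    "grasp": ["turn_head", "fixate", "move_forward", "turn_body", "reach", "grab"],
}

def explore(stage, knowledge, steps=5):
    """Take random actions allowed at this stage and record what was tried."""
    for _ in range(steps):
        action = random.choice(STAGES[stage])
        # A real system would observe the simulated world here; we only log the action.
        knowledge.setdefault(stage, set()).add(action)
    return knowledge

if __name__ == "__main__":
    knowledge = {}
    for stage in ["look", "crawl", "grasp"]:   # capabilities expand stage by stage
        knowledge = explore(stage, knowledge)
        print(f"after '{stage}' stage, experienced actions: {sorted(knowledge[stage])}")
```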

The eventual goal is for such developments in A.I., worked out in a simulated environment, to be usable in robots in the real world.

Fern said he sees practical assistance to humans as one of the main real-world applications for the technology.

“If you want a robot in your home that’s able to empty a dishwasher, or bring you a glass of water…there’s so many common sense principles that have to be employed and that cannot just be pre-programmed case-by-case.”
