Transterrestrial Musings  


Biting Commentary about Infinity, and Beyond!


Sociable Robots

A long but interesting article on the history and current state of the art:

Cog was designed to learn like a child, and that’s how people tended to treat it, like a child. Videos of graduate students show them presenting Cog with a red ball to track, a waggling hand to look at, a bright pink Slinky to manipulate — the toys children are given to explore the world, to learn some basic truths about anatomy and physics and social interactions. As the robot moved in response to the students’ instructions, it exhibited qualities that signaled “creature.” The human brain has evolved to interpret certain traits as indicators of autonomous life: when something moves on its own and with apparent purpose, directs its gaze toward the person with whom it interacts, follows people with its eyes and backs away if someone gets too close. Cog did all these things, which made people who came in contact with it think of it as something alive. Even without a face, even without skin, even without arms that looked like arms or any legs at all, there was something creaturelike about Cog. It took very little, just the barest suggestion of a human form and a pair of eyes, for people to react to the robot as a social being.

...The robot expressed a few basic emotions through changes in its facial expression — that is, through the positioning of its eyes, lips, eyebrows and pink paper ears. The emotions were easy for an observer to recognize: anger, fear, disgust, joy, surprise, sorrow. According to psychologists, these expressions are automatic, unconscious and universally understood. So when the drivers on Kismet’s motors were set to make surprise look like raised eyebrows, wide-open eyes and a rounded mouth, the human observer knew exactly what was going on.

There are videos to demonstrate.
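
For readers curious what "setting the drivers on Kismet's motors" amounts to in software, here is a toy sketch of the two mechanisms the excerpt describes: a lookup from an emotion label to facial actuator targets, and a control step that follows a detected face and leans away when someone gets too close. Everything in it (names, numbers, data structures) is an illustrative assumption, not the actual MIT code.

```python
from dataclasses import dataclass
from typing import Optional

# (a) Emotion -> facial pose. Each expression is just a set of actuator
# targets (eyebrow height, eyelid opening, mouth opening and curvature),
# normalized to the range -1..1. The numbers are made up for illustration.
EXPRESSIONS = {
    "surprise": {"brows": 1.0,  "lids": 1.0,  "mouth_open": 0.8, "mouth_curve": 0.0},
    "joy":      {"brows": 0.3,  "lids": 0.4,  "mouth_open": 0.3, "mouth_curve": 1.0},
    "anger":    {"brows": -0.8, "lids": 0.2,  "mouth_open": 0.1, "mouth_curve": -0.6},
    "sorrow":   {"brows": -0.3, "lids": -0.4, "mouth_open": 0.0, "mouth_curve": -1.0},
}

def pose_for(emotion: str) -> dict:
    """Look up the actuator targets that make the face read as `emotion`."""
    return EXPRESSIONS[emotion]

# (b) Gaze and personal space: follow a detected face with the eyes and
# back away when the person comes inside an assumed comfort distance.
COMFORT_DISTANCE_M = 0.6  # assumed personal-space threshold, in meters

@dataclass
class PersonEstimate:
    bearing_deg: float    # horizontal angle to the face
    elevation_deg: float  # vertical angle to the face
    distance_m: float     # estimated range

def social_step(person: Optional[PersonEstimate]) -> dict:
    """One control tick: where to point the eyes, how far to lean back."""
    if person is None:
        return {"gaze": None, "lean_back_m": 0.0}  # no one in view; sit idle
    return {
        "gaze": (person.bearing_deg, person.elevation_deg),               # follow with the eyes
        "lean_back_m": max(0.0, COMFORT_DISTANCE_M - person.distance_m),  # retreat if crowded
    }

if __name__ == "__main__":
    print(pose_for("surprise"))
    print(social_step(PersonEstimate(bearing_deg=-15.0, elevation_deg=5.0, distance_m=0.4)))
```

The point of the sketch is only that the expressive side of this work reduces to a mapping from a small set of internal states to actuator positions; the hard part, as the article makes clear, is deciding which state the robot should be in.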

At craft shows, I've always been impressed by the artistry of people who can take a bunch of hardware, nuts and bolts and other odds and ends, and weld or solder it into something that looks like a dog, or a cow, or even a person, and not just those figures, but ones with expressions on their faces and even in their bodies. It is amazing how quickly, and how much, one can infer, falsely or not, from very little.

Posted by Rand Simberg at July 29, 2007 02:13 PM
Comments

If it walks like a duck, etc. I would argue that a robot that behaved sufficiently like a human would be human. The trick is the "sufficiently like a human" part -- there is obviously a very long way to go until designers of robots can successfully mimic more than a very limited repertoire of human behavior.

Posted by Jonathan at July 29, 2007 06:28 PM

Sadly, we're a long way from bots like Six.

Posted by philw at July 29, 2007 06:41 PM

For a while now I've thought that the traditional belief that more intelligent robots would be all cold, hard logic with no emotion was precisely backwards. This is one area where the movies AI and Blade Runner got it more right than just about any other depiction. As much as I like the character of Lt. Commander Data, he just doesn't fit reality.

Throughout the development of automated computing we have always been blind to the difficulty of various problems in AI. We thought at first that the ability of early computers to tackle seemingly complex problems like chess and calculus meant that they were close to things like natural language processing and other hallmarks of human intelligence. But we were oh so wrong. Now, of course, we think that feelings and emotions are the hallmark of humanity, while reason and logic are easier things to grasp. It is interesting how little we know about who we really are and what it is that especially defines us.

It should be obvious to anyone who thinks about the problem for any length of time, with real effort and insight, that feelings and emotions are not the pinnacle of humanity and are not as difficult to achieve as many believe. Animals much less sophisticated than humans express feelings and emotions: dogs and cats and birds and elephants and even mice. Each has its individual character and idiosyncrasies; each has its moods, feelings, and emotions. Dogs can feel sad and lonely and jubilant and ashamed. Indeed, domesticated dogs have developed the ability to read and respond to human emotions; they can tell when their owner is sad or happy or angry. It seems preposterous to think that we could create a machine-borne intelligence that possessed faculties far beyond those of dogs or cats yet lacked the simple abilities to feel and to express. Children, too, develop their emotional abilities long before their cognitive abilities, which suggests something about the relative difficulty of the two problems.

I can come to no other conclusion than that the earliest generations of artificial intelligences will be more akin to pets and to children (and things we have no name for now) than to keen, emotionless intellects.

Posted by Robin Goodfellow at July 29, 2007 09:03 PM

The "robot" has nothing in common with AI. It's nothing more than a 3d, moving painting. In art, you employ the symbols that provoke an image in the person's brain that identify the object that has been painted. These symbols are generally learned, culturally dependent and not inherent.

I'm sure this setup gets government grants and will help design the next "Furby" toy, but as an attempt to create AI it's a boondoggle.

Posted by K at July 30, 2007 12:25 AM

Robotics like this is pretty cool and interesting. But maybe we need robots like the one in "The Day the Earth Stood Still", to stop wars and keep the peace.

Of course that wouldn't be necessary if WE acted more human.

Posted by Steve at July 30, 2007 05:59 AM

Actually, having wars is (unfortunately) a very human trait.

Posted by Rand Simberg at July 30, 2007 11:28 AM

Actually, having wars is (unfortunately) a very human trait.

Why "unfortunately"? War sucks for the 1-5% of the participants who get killed outright, and for the losing side, but for the species itself, war is a very successful and powerful method of growth, natural selection, and improvement. That's why we do it. If we prospered more as a species by being as placid and nonaggressive as cows, we would be, evolution being what it is.

Europeans are without doubt the most aggressive and warmongering subgroup of the species in the last 10,000 years. They are also the most successful, and have done by far the most to advance the general welfare of the species. Is this entirely coincidence? Seems doubtful.

Posted by Carl Pham at July 30, 2007 12:33 PM

