Meet Kismet

#1
Yazata Offline
Kismet was a computer-driven (sorta) humanoid robot head, designed to display human like emotional expressions and to interact with humans and to learn those expression and when to use them appropriately much the way human infants appear to. I'm inclined to think that there are important lessons to be learned about human psychology from this.


[Image: menzel-interest-web.JPG]



One lesson is that, as a stimulus-response mechanism, Kismet learns by observing how people react to what it does. It makes one kind of facial expression and humans respond one way; it makes another and they respond differently. So it can exert some control over its environment by making the appropriate expressions. Just program it to "want" to maximize certain kinds of responses and minimize others. Kismet was apparently programmed to "want" to foster an infant/caretaker kind of relationship, so it had the makings of an efficient little psychological-manipulation mechanism.
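That "try expressions, keep the ones that draw rewarding reactions" loop can be sketched as a simple bandit-style learner. This is a toy illustration under my own assumptions, not Kismet's actual architecture (which used a much richer drive/emotion system); the class name, expression labels, and reward scheme here are all hypothetical:

```python
import random

class ExpressionLearner:
    """Toy stimulus-response learner (hypothetical, not Kismet's real code):
    tries facial expressions and comes to favor whichever ones have
    historically drawn the most rewarding human reactions."""

    def __init__(self, expressions, epsilon=0.1, seed=0):
        self.rng = random.Random(seed)
        self.epsilon = epsilon                       # chance of exploring
        self.counts = {e: 0 for e in expressions}    # times each was tried
        self.values = {e: 0.0 for e in expressions}  # running mean reward

    def choose(self):
        # Occasionally explore a random expression; otherwise exploit
        # the expression with the best estimated payoff so far.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.counts))
        return max(self.values, key=self.values.get)

    def observe(self, expression, reward):
        # Incremental-mean update of that expression's estimated payoff.
        self.counts[expression] += 1
        n = self.counts[expression]
        self.values[expression] += (reward - self.values[expression]) / n
```

Run against a simulated human who rewards smiles and punishes frowns, the learner quickly settles on smiling, which is the manipulation-by-expression dynamic described above in miniature:

```python
learner = ExpressionLearner(["smile", "frown", "surprise"], epsilon=0.2, seed=1)
payoff = {"smile": 1.0, "frown": -0.5, "surprise": 0.2}  # simulated reactions
for _ in range(500):
    e = learner.choose()
    learner.observe(e, payoff[e])
# learner now prefers "smile"
```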

And as several people in the video below note, a great deal of what we take to be Kismet's social intelligence might be us projecting our own ideas about feelings and motives onto Kismet. Humans tend to anthropomorphize everything, to see faces in the clouds.

So the question naturally arises: are we really all that different from Kismet? Might we be doing much the same thing all the time with each other? (And with ourselves?)

Is it possible that other people's mental and emotional states aren't some mysterious things (spiritual substances?) that they possess in their heads or somewhere?

Might other people's mental and emotional states be more along the lines of interpretive categories that we attribute to those other people as part of our continuing efforts to make sense of their behavior?

Is it even possible that we are turning that same projective/interpretive process back on ourselves when we introspect? Is 'happiness' a mysterious occult something that we possess and intuit within ourselves in some unknown way, or more accurately an interpretive category that we apply to a certain set of behaviors that we display? (Even if they are neural behaviors not visible from outside.) Something that we might have evolved to want to maximize (pleasure) or minimize (pain).

Watch Kismet in action in this video:

https://www.youtube.com/watch?time_conti...KRZX5KL4fA

http://www.ai.mit.edu/projects/humanoid-...ismet.html
#2
Syne Offline
Social/emotional intelligence does seem to rely on the quality of a person's theory of mind, or behavior modeling.
We're different from any machine because we actually experience the motivators for expressions, and can thus attribute motivations in others. Only sociopaths are solely motivated to manipulate others, so Kismet may fail as a human but succeed as an artificial sociopath.
#3
C C Offline
(Aug 6, 2018 12:37 AM)Yazata Wrote: Kismet was a computer-driven (sorta) humanoid robot head, designed to display human-like emotional expressions, to interact with humans, and to learn those expressions and when to use them appropriately, much the way human infants appear to.


I first remember seeing Kismet years ago on some television science program, though I'm not sure it was the same decade this video dates from. Like the guy says, "Humans are suckers." This was aptly demonstrated way back in the 1960s by ELIZA, one of the earliest chatbot programs. Its users thought ELIZA really understood them, and some even sought advice from it, despite its inventor and supervisors repeatedly declaring that it was not intelligent.

It's easy to fool or captivate humans with outer appearances, which, to a lesser degree, is what all fictional and imaginary entertainment has relied on for centuries: hooking readers, viewers, and audiences into investing private characteristics in descriptive narratives, images, and objects. Praying to stone and wooden idols predates ELIZA and Kismet by millennia, and the idols didn't even have to provide conversation or changing facial expressions.

Even when an object or figure isn't a philosophical zombie, there can still be deception -- like in the cases of con artists, spies / imposters, and common liars.

The old version of radical behaviorism never actually denied private events, so it wasn't much of an invitation to solipsism or to nihilism about "other" people having minds. But these robots, with their emphasis on emotions and feelings as just outer body activity, might eventually revive or engender an offshoot that does tempt in that direction.

~