
Tuesday, June 10, 2014

Not a Real Boy

The news is full of stories about Eugene Goostman, a computer program that "passed" the Turing test. It passed by convincing 30% of the judges in the contest that it was a human being and not a computer.

After reading the conversation captured by Time, and after my own discussion with Eugene Goostman, I wonder whether the judges who were fooled by Eugene would pass the Turing test themselves. You can try it yourself here: http://www.princetonai.com/.

People have been writing programs that pretend to be people for decades. One of the most famous was Eliza, written by Joseph Weizenbaum in the 1960s. The conceit of Eliza is that it's a Rogerian psychotherapist. It works by looking for keywords that trigger specific response patterns, and by reflecting your own statements back at you. For example, if you said, "My mommy took my teddy away from me," Eliza would respond with, "Tell me more about your parents."
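It takes remarkably little machinery to do this. Here is a minimal Python sketch of the keyword-and-template idea; the rules and pronoun swaps are invented for illustration, and the real Eliza used a much richer keyword-ranking and transformation scheme than this:

    import random
    import re

    # Toy Eliza-style responder. Rules and pronoun table are invented
    # for illustration, not taken from Weizenbaum's program.
    PRONOUN_SWAPS = {"my": "your", "me": "you", "i": "you", "am": "are"}

    RULES = [
        # (regex, response templates; {0} echoes the captured text)
        (r".*\b(?:mother|mommy|father|daddy)\b.*", ["Tell me more about your parents."]),
        (r".*\bi am (.*)", ["How long have you been {0}?", "Why are you {0}?"]),
        (r".*\bi feel (.*)", ["Why do you feel {0}?"]),
    ]
    FALLBACKS = ["Please go on.", "I see.", "What does that suggest to you?"]

    def swap_pronouns(text):
        # Reflect the speaker's words back: "my teddy" -> "your teddy".
        return " ".join(PRONOUN_SWAPS.get(w, w) for w in text.lower().split())

    def reply(text):
        text = text.strip().rstrip(".!?")
        for pattern, templates in RULES:
            match = re.match(pattern, text, re.IGNORECASE)
            if match:
                groups = [swap_pronouns(g) for g in match.groups()]
                return random.choice(templates).format(*groups)
        return random.choice(FALLBACKS)

    print(reply("My mommy took my teddy away from me"))
    # -> Tell me more about your parents.
    print(reply("I am sad about my teddy"))
    # -> e.g. How long have you been sad about your teddy?

There is no understanding anywhere in that loop, just pattern matching and a stock of evasions for when nothing matches.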

In Computer Power and Human Reason (1976, p. 189) Weizenbaum describes how Eliza fooled people who knew it was a computer program into thinking that it actually understood something. He compares them to people who are taken in by fortune tellers, believing that these frauds have real insight and "know things." Eliza used a primitive form of the "cold reading" that mediums and psychics use to deceive their victims.

The conceit of the Eugene Goostman "chatbot" is that he's a 13-year-old Ukrainian boy. This is probably the key to the success of the deception, because it sets the bar for language comprehension and attention span very low.

Eugene appears to work much like Eliza, but with a bigger database of canned information it can discuss and a better natural language processor. When you ask it a question, it answers as best it can, which is easy enough if you ask about things it's programmed to know. Then it throws out another conversation starter about its pet guinea pig, or some other non sequitur, to keep the conversation going.
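Nobody has published Eugene Goostman's internals, so this is only a guess at its control flow: a minimal sketch of an answer-then-redirect loop. The deflections and conversation starters below are lines the bot actually produced in my session; the keyword database and canned answer are invented:

    import random

    CANNED_ANSWERS = {
        # Invented keyword -> canned answer.
        "guinea pig": "My guinea pig is the cleverest animal I know!",
    }
    DEFLECTIONS = [
        # Actual lines from my session with the bot.
        "I have no idea, sorry.",
        "I would rather not talk about it if you don't mind.",
    ]
    STARTERS = [
        # Actual lines from my session with the bot.
        "And I forgot to ask you where you are from..",
        "By the way, I still don't know your specialty - or, possibly, I've missed it?",
    ]

    def respond(question):
        q = question.lower()
        # Answer from the canned database if any keyword matches...
        for keyword, answer in CANNED_ANSWERS.items():
            if keyword in q:
                return answer + " " + random.choice(STARTERS)
        # ...otherwise deflect, then steer the conversation somewhere safe.
        return random.choice(DEFLECTIONS) + " " + random.choice(STARTERS)

    print(respond("What's the Ukrainian word for house?"))
    # -> e.g. "I have no idea, sorry. And I forgot to ask you where you are from.."

The trick of always tacking on a new question puts the burden of keeping the conversation going back on the human, which hides how little the program itself contributes.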

In the dialog I had with it, I ignored the conversation starters and just asked more questions. When I asked, "Can you answer yes or no questions?" it responded with, "Sure, anyone can. Anything else?" It gives similar non-responses to other questions about its capabilities. Ask it, "What's the Ukrainian word for house?" and you get "I have no idea, sorry. And I forgot to ask you where you are from.." Ask it in Ukrainian if it speaks Ukrainian (which it claims to, along with English, Russian, and a little Yiddish), and you get "Err... And what it was? Maybe it was something hexadecimal?" This is because it's a computer program scripted in English that doesn't know what to do with the Unicode characters Ukrainian is written in.
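To see why Cyrillic input would fall straight through to a canned deflection, consider what a purely English keyword matcher does with it. The keyword set below is hypothetical; the point is just that no Ukrainian word can ever match an English keyword, so the question looks like line noise (or, as Eugene would have it, "something hexadecimal"):

    import string

    # Hypothetical English-only keyword set; the real bot's vocabulary is unknown.
    KEYWORDS = {"house", "school", "pet", "ukraine"}

    def understood(utterance):
        # Strip ASCII punctuation, then look for any keyword hit.
        words = utterance.lower().translate(str.maketrans("", "", string.punctuation)).split()
        return bool(set(words) & KEYWORDS)

    print(understood("What's the Ukrainian word for house?"))  # True
    print(understood("Ти розмовляєш українською?"))  # "Do you speak Ukrainian?" -> False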

Around its third sentence it threw out a red herring about evil robots, as one might imagine a snarky 13-year-old would. When I asked, "Could you repeat what you said about evil robots?" it responded with, "I call all these chatter-bots "chatter-nuts" due to their extremely high intelligence. I hope you recognize irony."

And if you ask the most obvious question, "Are you a real boy?" you get, "I would rather not talk about it if you don't mind. By the way, I still don't know your specialty - or, possibly, I've missed it?" Repeat the question and you get, "Could you repeat it once more again? Wonna ask me something more?"

Clearly, Eugene Goostman is not a real boy. Or maybe the judges don't consider 13-year-old boys to be real human beings?

And I forgot to ask you where you are from.. I hope you recognize irony. And I still don't know your specialty.

1 comment:

Unknown said...

I am gonna try it. It sounds obvious he wasn't real from what you say, but eventually an android will fool everyone and artificial intelligence will have consciousness, mark my words...