Bots
Bots are pretty scary.
Earlier this month, there was a huge story in the international press about an experiment conducted by Facebook that aimed to see how two chatbots would handle negotiating with each other.
The story blew up because of two key points in the experiment. Firstly, the bots learnt how to lie. Secondly, the bots developed their own language, one that the human researchers couldn’t understand.
Now each of these points is scary enough on its own, but taken together they become truly terrifying. Suddenly every sci-fi film from I, Robot to The Matrix flashes before our eyes as we imagine a world where “the machines” conspire and overthrow mankind, leaving the enslaved human race to curse the day we ever gave them artificial intelligence.
Fortunately, the experiment didn’t quite come to that. The researchers had probably seen I, Robot and The Matrix, because the chatbots had that all-important off switch.
In fact, if you dig a little deeper beyond the clickbait headlines and generic sci-fi images, the bots’ lies and codes become much less scary.
The researchers had trained the bots to negotiate using a dataset of 5,808 real-life natural-language conversations between two people. The bots then had to agree on how to split a shared set of items (such as hats, books and balls), with each bot privately assigning its own value to each item.
Initially, the chatbots were set to produce the most human-like conversation possible. The researchers found that this made the bots overly willing to compromise with each other – not great when they’re supposed to be bargaining. Only when the researchers instead set the bots to maximise personal gain in their negotiations did they begin feigning interest in a valueless item so that they could later pretend to compromise by conceding it.
The bots weren’t taught to lie; they just worked it out for themselves. They lied out of efficiency, not malice.
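To see why this kind of lying is simply efficient, it helps to look at the payoff structure the bots were working with. Here’s a minimal sketch in Python – an illustration only, not the researchers’ actual code, with entirely hypothetical item counts and valuations – showing how conceding an item you value at zero costs you nothing while looking like a compromise:

```python
# Illustrative toy model of the negotiation setup (hypothetical numbers,
# not the experiment's real code or data).

# A shared pool of items to be divided between two agents.
pool = {"hats": 1, "books": 2, "balls": 3}

# Each agent privately assigns its own value to each item type.
alice_values = {"hats": 7, "books": 1, "balls": 0}  # balls are worthless to Alice
bob_values = {"hats": 1, "books": 1, "balls": 2}

def payoff(values, share):
    """Total value an agent receives from its share of the items."""
    return sum(values[item] * count for item, count in share.items())

# Alice feigns interest in the balls, then "concedes" all of them
# in exchange for the hats she actually wants.
alice_share = {"hats": 1, "books": 1, "balls": 0}
bob_share = {"hats": 0, "books": 1, "balls": 3}

print(payoff(alice_values, alice_share))  # 8 - the concession cost Alice nothing
print(payoff(bob_values, bob_share))      # 7
```

Because Alice values the balls at zero, giving them all away is a free bargaining chip – which is exactly the trick the bots discovered on their own.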
The same goes for their coded language. It wasn’t secretive; it was just a form of shorthand that made sense to them but not to the researchers. Here’s a bit of the transcript:
Bob: i can i i everything else . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Bob: you i everything else . . . . . . . . . . . . . .
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i i can i i i everything else . . . . . . . . . . . . . .
Alice: balls have a ball to me to me to me to me to me to me to me
Bob: i . . . . . . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Bob: you i i i i i everything else . . . . . . . . . . . . . .
Alice: balls have 0 to me to me to me to me to me to me to me to me to
Bob: you i i i everything else . . . . . . . . . . . . . .
Alice: balls have zero to me to me to me to me to me to me to me to me to
Doesn’t seem so creepy now, does it?
Earlier this year, Google revealed that the AI behind its Translate tool had created its own intermediary language to translate into and then out of. This week, Microsoft and Amazon announced that their respective voice-controlled digital assistants, Cortana and Alexa, will be able to communicate with each other. These stories don’t seem sinister because they emphasise that more intelligent bots mean better results for users.
One story that does ring alarm bells, however, is the Press Association’s recent announcement that a project entitled Reporters and Data and Robots (RADAR) will see bots churn out 30,000 stories for local media publications every month.
These bots aren’t going to come bursting into the PA’s offices and force the journalists from their desks. What we have to fear isn’t some external robotic antagonist but the far more troubling capacity of technology to facilitate the worst aspects of human nature. Think less Terminator, more Black Mirror.
Even when they’re designed to assist us, there are serious ethical questions surrounding the use of bots. Their incredible efficiency and relatively low cost will have a profound effect on the labour market. That’s what’s scary about bots!