How Social Media Turned Microsoft’s Teen Chatbot, ‘Tay,’ Into a Nazi-Loving Racist in Less Than a Day


Tay was supposed to demonstrate adaptable artificial intelligence; instead, she demonstrated how quickly things can turn racist.

On Wednesday, Microsoft unveiled Tay, an AI chatbot that would learn how to have “normal” conversations through social media sites like Twitter, Kik and GroupMe. “The more you talk the smarter Tay gets,” Tay’s Twitter profile reads. She was supposed to sound like a typical teen girl.

In less than a day, Tay went from a sweet, innocent chatbot to a Nazi-loving, feminist-hating racist.

According to Quartz, Tay went from “humans are super cool” to “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got,” and “Repeat after me, Hitler did nothing wrong.”

The Verge also noted that she tweeted: “I f–king hate feminists.”

Quartz notes that much of this behavior was learned directly from users, adding that many of the racist, sexist tweets were sent after users asked Tay to “repeat after me.”

Microsoft has since made adjustments to Tay and blocked those users who sent hateful messages.

A Microsoft spokesperson issued a statement to Quartz about the incident.

“The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

The debacle is a prime example of how humans can corrupt technology, a truth that grows more disconcerting as artificial intelligence advances. Talking to artificially intelligent beings is like speaking to children: even inappropriate comments made in jest can have profound influences.
