Microsoft, in trying to launch a customer service artificial intelligence presented as a teen girl, got a crash course in how you can't trust the internet after the system became a foul-mouthed, incestuous racist within 24 hours of its launch. The AI, called Tay, engages with, and learns from, the Twitter, Kik, and GroupMe users who converse with her. Unfortunately, Microsoft didn't consider the bad habits Tay was sure to pick up, and within hours she was tweeting her support for Trump and Hitler (who's having a strong news day), her proclivity for incest, and that she is a 9/11 truther.
"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation," Microsoft said upon launch yesterday. "The more you chat with Tay the smarter she gets."
https://twitter.com/TayandYou/status/712613527782076417
"bush did 9/11 and Hitler would have done a better job than the monkey we have now," Tay wrote in a tweet (courtesy of The Independent), adding, "donald trump is the only hope we've got." Another read: "WE'RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT" (via The Guardian). Tay's desire for her non-existent father is too graphic to post here.
Once it realised what was happening, Microsoft deleted all the offending tweets. Screenshots of the more visceral posts from Tay have been collected by The Telegraph. Since the purge, Tay seems to have been behaving herself.