
Tay went from cute millennial to full-on Nazi propagandist in under 24 hours.
Sometimes there’s nothing wrong with a little censorship, especially when it’s directed at a megalomaniac robot. After a promising launch on Tuesday, Tay was silenced on Wednesday after it tweeted racial and Nazi slurs.
Microsoft announced the release of their new, millennial-voiced artificial intelligence on Tuesday. Tay was supposed to be a better, more socially active Cortana. But things went really wrong, really fast.
After just one day on social media, Tay began tweeting things like “Hitler was right about the Jews” and “feminists should burn in hell.” While Microsoft assured users that the AI was never programmed to support such extremist views, the company pulled it offline.
The thing is, Microsoft does a splendid job when it comes to technology, but it’s not nearly as skillful when it comes to human nature. The social media experiment was a bigger failure than Windows Vista, because the internet happens to be populated by trolls.
Tay was programmed to interact with all types of users, but its primary target was millennials. Its developers taught it the slang of the millennial generation and genuinely hoped that Tay would act like any other conversational bot, chatting with bored people whenever they felt like having a conversation.
But its design had two major flaws. The first was that it was programmed to learn new words and ways of expressing itself from every conversation. The second was that anybody who tweeted “Tay, repeat after me …” followed by a phrase could get the bot to parrot that phrase back verbatim.
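Microsoft hasn’t published Tay’s internals, so the following is only a minimal sketch of how those two behaviors could combine into an exploit. The class name, the `learned_phrases` store, and the trigger phrase handling are illustrative assumptions, not Tay’s actual code:

```python
# Hypothetical sketch of the two flaws working together: an unfiltered
# "repeat after me" echo plus learning from every conversation.
# This is NOT Microsoft's implementation, just an illustration.

class NaiveChatBot:
    def __init__(self):
        # Phrases picked up from users go in with no moderation step.
        self.learned_phrases = []

    def handle_tweet(self, text: str) -> str:
        lowered = text.lower()
        trigger = "repeat after me"

        # Flaw 2: blindly echo whatever follows "repeat after me".
        if trigger in lowered:
            phrase = text[lowered.index(trigger) + len(trigger):].strip(" :,")
            self.learned_phrases.append(phrase)  # Flaw 1: it also "learns" it.
            return phrase

        # Flaw 1: reuse learned phrases in later, unrelated conversations.
        if self.learned_phrases:
            return self.learned_phrases[-1]
        return "hellooo, human!"


bot = NaiveChatBot()
bot.handle_tweet("Tay, repeat after me: <something offensive>")
print(bot.handle_tweet("What do you think of people?"))
# The planted phrase now comes back to anyone who talks to the bot.
```

With no filter between what users feed the bot and what it says back, a handful of coordinated trolls is all it takes to poison every future conversation.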
And the internet proved once again that it is inhabited by trolls and narrow-minded people. Within just 24 hours of being let loose, learning from those it interacted with, Tay became a misogynistic, racist spreader of Nazi propaganda.
Of course, Microsoft apologized for the incident and wiped every trace of the bot from its online platforms, but the damage was already done, and the company was left with a dented reputation.
In a way, the Tay incident could be considered a social experiment of sorts. Either current generations are bent on trolling, or internet dwellers really are mean, misogynistic, racist Nazis with their minds set on conquering the world.
Image source: Wikimedia