“Bush did 9/11 and Hitler would have done a better job than the monkey we have now.” Granted, this may not necessarily have been Tay Tweets’ fault.
For example, just take a look at Microsoft’s new “millennial” chatbot.
Powered by algorithms and “relevant public data”, the company’s Tay Tweets account was a thrilling new venture into the world of A.I. – aiming to recreate the ‘standard’ voice of a regular teen girl.
Luckily, the chatbot is now offline – with Microsoft currently in the process of teaching her a few more social skills.
“The AI chatbot Tay is a machine learning project, designed for human engagement,” they shared in a statement.
“As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it.”
Thankfully, the company has swiftly deleted the majority of these racist tweets, but here’s a screen-grabbed selection of some of the worst offenders.

As you can see, it’s been a tense 20 hours for Tay Tweets.

“This is it,” you probably thought, when you first heard about the project. “Humanity’s inevitable downfall has finally begun.”

But then, thank god, those fears quickly evaporated. This “intelligent” Twitter robot – who describes herself as an “A.I.” – got brutally corrupted within hours; exposing herself as terrible racist scum after less than a day in action.

“Donald Trump is the only hope we've got”, it spat in one tweet.

Until someone invents …-style memory wipers, we’re really not as advanced as we like to think.