Oops.
“Twitter trolls made a dummy out of Microsoft’s artificial intelligence chat robot, which learns through public interaction, by turning it into a pro-Nazi racist within a day of its launch.
“Tay, the artificial intelligence (AI) robot, had a bug in which it would at first repeat racist comments; it then began to incorporate the language into its own tweets.
“The tweets have been deleted, Tay has been paused, and Microsoft said it’s ‘making some adjustments,’ the International Business Times reported.
“‘Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding…
“‘The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,’ Tay’s information page states on Twitter.”
Read more at Washington Times