The only surprise is that the Microsoft Tay team apparently didn't anticipate this outcome; see also "Why Microsoft's racist Twitter bot should make us fear human nature, not A.I." (Washington Post) and "Microsoft's Tay is an Example of Bad Design" (Medium).
"Microsoft set out to learn about “conversational understanding” by creating a bot designed to have automated discussions with Twitter users, mimicking the language they use.Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times
What could go wrong?
If you guessed, “It will probably become really racist,” you’ve clearly spent time on the Internet. Less than 24 hours after the bot, @TayandYou, went online Wednesday, Microsoft halted posting from the account and deleted several of its most obscene statements." (Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times)