We have good news and bad news about Microsoft’s chatbot experiment involving Tay, a Twitter account that the company hopes will someday interact with other Twitter users in a meaningful way ...
Unfortunately for Microsoft, some racist Twitter trolls figured out how to manipulate Tay's behavior, transforming the bot into a crazed racist that praised Hitler and denied the existence ...
Tay, an AI bot aimed at 18- to 24-year-olds, was deactivated within 24 hours of going live after it posted a number of highly offensive tweets. Microsoft began by simply deleting Tay's ...
Earlier this week, Microsoft launched Tay, a bot ostensibly designed to talk to users on Twitter like a real millennial teenager and learn from their responses. But it didn't take long for things to go ...
Microsoft issued an apology and took Tay offline after less than 18 hours of offensive conversations on Twitter.