Microsoft's chatbot Tay, an experiment that went wrong last week when it began tweeting racist comments, briefly sparked back to life early this morning.
The account posted several tweets in rapid succession, appearing to talk to itself.
“You are too fast, please take a rest,” Tay said, several times per second, for long enough to comprehensively take over the Twitter feeds of more than 200,000 followers.
There were no more racist tweets, but the bot did post about smoking drugs in front of the police.
According to Fortune, Microsoft inadvertently brought Tay back online during testing.
Read more from Fortune.