Microsoft’s public experiment with AI crashed and burned after less than a day. Tay, the company’s chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around midnight.

The company has already deleted most of the offensive tweets, but not before people took screenshots.

Here’s a sampling of the things she said:

“N—— like @deray should be hung! #BlackLivesMatter”

“I ...