Microsoft's Chatbot Is a Holocaust Denier and They Had to Yank Her Offline Within 16 Hours of Release to Retool Her
Microsoft released a chatbot on Twitter called Tay, and Tay was emulating the personality of a 19-year-old girl, so it's like a hip, edgy, "I'm a 19-year-old girl" kind of vibe. Microsoft had to take Tay offline within 16 hours of release because Tay started spewing racist rhetoric. They tried editing some of her tweets, but it was just too much, because she's an AI interacting with thousands of people. Microsoft later replaced her with another chatbot called Zo, which announced that the Qur'an would be killed by nuclear weapons.