Yahoo Web Search

Search results

  1. Some people on the internet turned Microsoft's new chatbot, Tay, into a sort of reverse Pygmalion: from fair lady back to racist street urchin. It was kind...

    • 7 min
    • 9.2M
    • Internet Historian
  2. Mar 24, 2016 · Tay, Microsoft's new teen-voiced Twitter AI, learns how to be racist. Microsoft got a swift lesson this week on the dark side of social media. Yesterday the company launched "Tay," an ...

    • CBS News
    • Amy Kraft
    • 4 min
  3. Tay was a chatbot that was originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It caused subsequent controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, causing Microsoft to shut down the service only 16 hours after its launch. [1]

  4. Sep 10, 2019 · The Tay chatbot was an imported version of Microsoft’s similar AI bot, XiaoIce, in China. ... Swift has filed many trademark registrations in the US to protect phrases and song/album titles from ...

  5. Mar 25, 2016 · 1:32 PM PT. Microsoft's AI chatbot Tay was only a few hours old, and humans had already corrupted it into a machine that cheerfully spewed racist, sexist and otherwise hateful ...

    • samantha.masunaga@latimes.com
    • Staff Writer
  6. Mar 25, 2016 · In China, people reacted differently: a similar chatbot had been rolled out to Chinese users, but with slightly better results. "Tay was not the first artificial intelligence application we ...

  7. Mar 24, 2016 · Written by Hope Reese. Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why ...