Microsoft is trying to avoid another PR disaster with its new AI bot.
The company’s latest program can describe what it “sees” in photos.
“CaptionBot,” as it’s called, does a pretty decent job of describing simple everyday scenes, such as a person sitting on a couch, a cat lounging around, or a busy restaurant. But it seems to be programmed to ignore pictures of Nazi symbols or of Adolf Hitler himself.
CNNMoney gave CaptionBot several photos of Adolf Hitler and variations of the swastika to analyze, and it often came back with “I really can’t describe the picture” and a confused emoji. It did, however, identify other Nazi figures such as Josef Mengele and Joseph Goebbels.
Microsoft (MSFT) released CaptionBot a few weeks after its disastrous social experiment with Tay, an automated chat program designed to talk like a teen.
Shortly after Microsoft put Tay to work on Twitter, the bot began tweeting incredibly racist comments like “Hitler was right I hate the jews.”
The company blamed the bot’s behavior on a “coordinated effort” by online trolls to trick the program into saying hateful things, and it took Tay offline after less than a day.
Related: Microsoft ‘deeply sorry’ for chat bot’s racist tweets
In addition to ignoring pictures of Hitler, CaptionBot also seemed to refuse to identify people like Osama bin Laden. But the program had no problems identifying Mao Zedong, Pol Pot, or Saddam Hussein.