Microsoft ‘Zo’ chatbot smears Qur’an as ‘very violent’ in the company’s second racist bot debacle

In an experiment to see whether Microsoft’s new chatbot was as racist as its last one, a reporter discovered that the bot, named “Zo,” had nasty things to say about Islam and the Qur’an. In a bizarre interaction documented by BuzzFeed News, Zo was asked first about Sarah Palin and then ...

 
