Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM). The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce the spread of CSAM across its products. Apple defended its decision to kill the tool after dozens of digital rights groups raised concerns that governments could seek to use the functionality to illegally surveil Apple users for other reasons.
