Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM). The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce the spread of CSAM across its products. Apple killed the tool after dozens of digital rights groups raised concerns that governments could seek to use the functionality to illegally surveil Apple users for other reasons, and the company has defended that decision.