Apple faces accusations of underreporting child sexual abuse material on its platforms

Apple is under scrutiny for allegedly failing to adequately flag and report instances of child sexual abuse material (CSAM) on its services, despite abandoning its controversial plan to scan iCloud for such content last year.

According to reports, child safety experts, including the UK’s National Society for the Prevention of Cruelty to Children (NSPCC), have accused Apple of significantly undercounting the prevalence of CSAM exchanged and stored on iCloud, iMessage, and FaceTime.
