Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash


Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after sustained blowback over concerns that the tool could be weaponized for mass surveillance and erode user privacy.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the iPhone maker said in a statement on its website.

The changes were originally slated to go live with iOS 15 and macOS Monterey later this year.

In August, Apple detailed several new features intended to help limit the spread of CSAM on its platform, including scanning users' iCloud Photos libraries for illicit content, a Communication Safety feature in the Messages app to warn children and their parents when receiving or sending sexually explicit photos, and expanded guidance in Siri and Search when users attempt searches for CSAM-related topics.

The so-called NeuralHash technology would have worked by matching photos on users’ iPhones, iPads, and Macs just before they…
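For illustration only, the sketch below shows the general shape of on-device hash matching against a database of known image hashes. It is not Apple's NeuralHash implementation; the hash function, database format, and reporting threshold are hypothetical placeholders.

import Foundation

// Conceptual sketch of on-device matching against known image hashes.
// NOT Apple's actual NeuralHash; all types and values here are hypothetical.

// Hypothetical fixed-length perceptual hash of an image.
struct PerceptualHash: Hashable {
    let bytes: [UInt8]
}

// Hypothetical stand-in for a perceptual hashing step. A real system would
// derive the hash from image content so that visually similar images collide.
func computePerceptualHash(of imageData: Data) -> PerceptualHash {
    PerceptualHash(bytes: Array(imageData.prefix(16)))
}

struct OnDeviceMatcher {
    // Hashes of known illicit images, supplied as an opaque database.
    let knownHashes: Set<PerceptualHash>
    // Assumed threshold: number of matches required before anything is flagged.
    let reportingThreshold: Int

    // Hashes photos queued for upload and reports whether the number of
    // matches crosses the threshold.
    func shouldFlag(photosPendingUpload: [Data]) -> Bool {
        let matchCount = photosPendingUpload
            .map(computePerceptualHash(of:))
            .filter(knownHashes.contains)
            .count
        return matchCount >= reportingThreshold
    }
}

In broad terms, the design goal of such a scheme is that matching happens on the device before upload, and that nothing is surfaced unless the match count exceeds a threshold; the cryptographic machinery Apple described for achieving that is omitted here.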

