Apple delays plans to roll out CSAM detection in iOS 15

Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology, which it chaotically announced last month, citing feedback it has received.

That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon its plans to roll out the technology.

In a statement, Apple told TechCrunch:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

More soon…

Interview: Apple’s head of Privacy details child abuse detection and Messages safety features
