Apple has delayed the rollout of the child safety features it announced last month, following negative feedback.
The planned features include scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), a Communication Safety feature to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search, MacRumors reports.
Apple confirmed that feedback about the plans from customers, non-profit and advocacy groups, researchers, and others prompted the delay, giving the company time to make improvements.
Apple issued the following statement about its decision:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” it added.
Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, Facebook’s former security chief, and politicians.
Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various new documents, and giving interviews with company executives.
The suite of child safety features was originally set to debut in the US in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
It is now unclear when Apple plans to roll out the “critically important” features, but the company still appears to intend to release them.