Apple on Friday announced that the three features it revealed to stop the spread of Child Sexual Abuse Material (CSAM) will not be available at the release of iOS 15, iPadOS 15, watchOS 8, and macOS 12 Monterey later this fall. The company will instead make the CSAM features available in software updates later this year.
In a press release, Apple stated, "This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time."
Apple revealed the CSAM features in early August, and while many in the tech community applauded Apple's efforts to protect children, many also voiced concerns that the technology behind the CSAM features could be repurposed for other forms of surveillance. Over 90 policy and rights groups published an open letter urging Apple to cancel its CSAM features.
The main feature that sparked controversy is CSAM detection, in which images on your device are hashed and those hashes are checked against a list of known CSAM hashes. The argument made against this feature is that it could be adapted for other uses. For example, a government could demand that Apple create a similar process to check for images deemed detrimental to the government's policies.
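Conceptually, the matching step boils down to set membership on hashes. The sketch below is a deliberately simplified illustration using SHA-256 and a hypothetical blocklist; Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic protocols so that non-matching images are never revealed, which this toy example does not attempt to model.

```python
import hashlib

# Hypothetical blocklist of known hash digests. In Apple's real design
# these would be NeuralHash perceptual hashes, not SHA-256 digests.
KNOWN_HASHES = {
    # sha256(b"test")
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_matches_blocklist(image_bytes: bytes) -> bool:
    """Hash the image data and check it against the known-hash list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    print(image_matches_blocklist(b"test"))     # on the blocklist -> True
    print(image_matches_blocklist(b"holiday"))  # not on the list -> False
```

Note that an exact cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash is designed so that visually similar images (resized, recompressed) still produce matching hashes, which is what makes the real system effective and also what fuels the concern about repurposing it.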
Apple stated that it would turn down such a request, but that declaration did little to reassure critics. Governments can (and will) create consequences for not obeying an order, which could push Apple to change its policy. There's also the possibility that Apple could decide to use the technology for purposes other than CSAM detection, though doing so would weaken Apple's image as a company concerned about user privacy.
The other two CSAM features that are being delayed are:
- Messages communication safety: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
- Siri and Search: These functions will provide additional resources to help children and parents stay safe online and get help with unsafe situations.
Our CSAM detection FAQ has more information about these features, including explanations of how they work and the controversy they created.