Apple Might Be Ditching Its Controversial CSAM Image Scanning Program


Apple surprised many earlier this year when it announced a multi-pronged plan to combat child sexual abuse through several new technologies to be implemented in iOS 15. The most controversial was a program that would scan users’ iCloud photo libraries for CSAM, which stands for Child Sexual Abuse Material. With the release of iOS 15.2 this week, Apple rolled out one of these anticipated features, the ability to detect nude photos in the children’s version of Messages, but the aforementioned scanning technology was conspicuously absent. As of today, all references to the image-scanning portion of Apple’s plan appear to have been removed from its website, leading people to wonder whether Apple has scuttled the technology for good in response to the fierce backlash.

Earlier, Apple had announced that it was simply delaying the launch of the technology in response to criticism, saying it needed time to listen to feedback and review its implementation, according to MacRumors. In September, the company released the following statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

However, rather than stating that these plans are still in effect, MacRumors writes that the company has simply erased all mention of them from its Child Safety webpage. As you can see by visiting the link, the page now only discusses the newly released nudity-detection feature for Messages, which arrived in iOS 15.2 and is not enabled by default. As we noted in our coverage, “…once deployed to a device with a family sharing account, it will search for nudity in images sent and received by the Messages app. If nudity is detected, Messages blurs the image and displays an on-screen warning, explaining the dangers of explicit photo sharing and asking if the viewer wants to continue.” So far, this feature appears to have launched without much fuss, but the week isn’t over yet.


Apple rolled out nudity detection for kids using its Messages app this week, with seemingly little reaction.
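To make the described flow concrete, here is a minimal, purely illustrative Swift sketch of that decision logic. None of these names (MessageImage, NudityClassifier, decideDisplay) come from Apple’s actual frameworks; they are assumptions standing in for the on-device behavior described above: detect nudity, blur the image, warn the viewer, and require an explicit choice before showing it.

```swift
import Foundation

// Hypothetical stand-in for an image received or sent in Messages.
struct MessageImage {
    let data: Data
    var isBlurred = false
}

// Stand-in for an on-device classifier; Apple's real model is not public.
protocol NudityClassifier {
    func containsNudity(_ image: MessageImage) -> Bool
}

enum DisplayDecision {
    case showNormally
    case blurWithWarning(message: String)
}

// The flow the article describes: if nudity is detected, blur the image,
// warn the viewer, and ask whether they want to continue.
func decideDisplay(for image: MessageImage,
                   using classifier: NudityClassifier) -> DisplayDecision {
    guard classifier.containsNudity(image) else {
        return .showNormally
    }
    return .blurWithWarning(
        message: "This photo may contain nudity. Sharing explicit photos can be harmful. View anyway?"
    )
}

// Trivial demonstration classifier that always reports no nudity.
struct AlwaysCleanClassifier: NudityClassifier {
    func containsNudity(_ image: MessageImage) -> Bool { false }
}

let decision = decideDisplay(for: MessageImage(data: Data()),
                             using: AlwaysCleanClassifier())
print(decision) // showNormally
```

Again, this is only a sketch of the user-facing behavior; how Apple’s on-device model actually classifies images is not something the company has documented publicly.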

Interestingly, critics of Apple’s iCloud scanning technology put forward what essentially boils down to a “slippery slope” argument: if Apple can design an algorithm that scans for X, what’s stopping it from scanning for Y and Z in the future? As the Electronic Frontier Foundation put it, “All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” There was also concern that governments would co-opt Apple’s technology to police their citizens, a scenario the company firmly promised it would never allow.

Finally, even though Apple removed mentions of CSAM scanning from its child safety portal, we were still able to locate the original PDF Apple published when it announced the initiative, so perhaps the company simply forgot about that document. What is noteworthy, however, is that the newly updated child safety page mentions only nudity detection in Messages, not CSAM. Despite the removal of references to the controversial technology from its website, an Apple spokesperson told The Verge that the company’s plans have not changed, and that the feature is still merely delayed.

