"Technology that detects CSAM before it is sent from a child’s device can prevent that child from being a victim of sextortion or other sexual abuse, and can help identify children who are currently being exploited,” says Erin Earp, interim vice president of public policy at the anti-sexual violence organization RAINN. The feature is designed so Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity. Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh the benefits.Ĭommunication Safety for Messages is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. ![]() The goal is to stop child exploitation before it happens or becomes entrenched and reduce the creation of new CSAM.Īpple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding the protection for backups and photos stored on the cloud service. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help. Parents and caregivers can opt into the protections through family iCloud accounts. Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its “Communication Safety” features, which the company initially announced in August 2021 and launched last December. Now the company says that in response to the feedback and guidance it received, the CSAM-detection tool for iCloud photos is dead. At the beginning of September 2021, Apple said it would pause the rollout of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a launch was still coming. ![]() But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. 
We don’t knowingly collect data from or about children under 16 without the permission of their parent or guardian. When we do collect that data, we might do it directly, like when you sign up for a service. We might also collect it automatically if your child uses the products or services we offer.

The main reason we collect children’s data is to provide the product or service. We may also do other things with children’s data, like comply with and enforce legal and regulatory obligations and respond to government requests.

Our different products and services, like Kids’ Line, FamilyMode, and SyncUP KIDS Watch, collect different types of data. For example, with a Kids’ Line, we collect geolocation data that tells us the location of your child’s mobile device so we can deliver wireless service. We also collect unique identifiers so we can tell which mobile device on our network is your child’s. These are just examples of the data we collect from a Kids’ Line.

For a more comprehensive look at what data each of our products and services collects and how we use it, visit the Children’s Privacy Notice.