Fight for the Future: digital rights activists speak out on Apple's photo-scanning feature
Cupertino, California - Tech experts and activists slammed Apple's photo-scanning feature. Here is a look at what a digital rights activist group and its campaign director have to say.
TAG24 spoke to Fight for the Future Campaign Director Caitlin Seeley George to find out what an activist group thinks about Apple's photo-scanning feature.
The feature was supposed to help fight the spread of Child Sexual Abuse Material (CSAM).
According to Apple's Child Safety page, "The system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations."
"Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices."
"Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM."
Seeley George, who runs the campaign against Apple's proposed client-side scanning, has coordinated the activist organization's work for two years alongside a core team of 10 to 12 people.
Fight for the Future has a clear stance on photo-scanning technology, both in the cloud and on devices: "In general, we do not really support that. We understand that that is a tool that people use to scan for CSAM materials. We still think that is an infringement on privacy, but we think that in this case, Apple doing it on devices is just a huge leap forward."
"It's something that would set a huge precedent within the tech and comm industry, ends the concept of end-to-end encrypt if your device is being scanned from the start."
Scanning on users' devices is "a very clear line in the sand where we should not broach that, we should not cross that line, because it really takes us to a new world of surveillance. We think that's just a very clear place where we shouldn't allow Apple or other companies to expand into."
The groups at risk
The slippery surveillance slope is not just an abstract worry.
Authoritarian governments could pressure Apple to repurpose the scanning function to search for content associated with vulnerable groups, including activists, political opposition, or members of the LGBTQ+ community.
Some parents and members of the LGBTQ+ community have told Seeley George's organization that they feel like the scanning technology actually puts their children in danger.
If parents are notified about flagged content, and the family is not a safe space for a child who hasn't come out yet, or if the child lives in a violent or dangerous home, then this feature could put them directly in harm's way.
"If you use children as the messaging point or pawn, it is hard to argue the point, because we obviously all want them to be safe."
Seeley George was invited to join a monitoring tool from her kindergartener's school that keeps tabs on what her child looks at online at school. It struck her as very 1984, especially because, as the campaign director said, it could teach her child that digital surveillance is okay.
Seeley George also argued that tech might not be the answer to the problem of child abuse. "We need to expand the way that we're thinking about these issues."
Her organization is skeptical of algorithms that sort people into categories and can end up excluding them.
Algorithms putting people in categories is "really problematic when it is supposed to be about kids, and being as inclusionary as possible to let these kids figure out who they are and grow and develop in ways that are healthy."
Fight for the Future has supported Apple's past privacy policies, but Seeley George called the addition of on-device photo-scanning for CSAM clear hypocrisy after a global marketing effort to convince users that their phones and data are private and secure.
Cover photo: Collage: 123RF/rvlsoft, 123RF/magurok, IMAGO/agefotostock/xPandaVectorx ESY-052070278