Privacy, Schmivacy. Apple installs backdoors into iPhones

Apple has announced planned changes to its operating systems that sound to many security researchers like a massive privacy nightmare and a perfect invitation to unintended consequences.


Rafia Shaikh for WCCFTech:

While the move is raising concerns in the industry, the company argues it is making these changes to protect children and limit the spread of Child Sexual Abuse Material (CSAM).

The two main points of concern are:

• Apple plans to add a feature that will scan all photos as they are uploaded to iCloud Photos to see whether they match any image in the database of known CSAM maintained by the National Center for Missing & Exploited Children (NCMEC).

• It will also scan all iMessage images sent or received by child accounts (accounts designated as owned by a minor) for sexually explicit material, warn the child if they try to send or receive such photos, and notify the parent.

However, security researchers, while supportive of efforts to combat CSAM, are concerned that Apple is effectively giving governments worldwide access to user data, which could go beyond what Apple is currently planning, as is the case with all backdoors. While the system is purported to detect child sexual abuse material, it could be adapted to scan for other text and imagery without users' knowledge.

Security researchers around the globe have been writing about why this effectively marks the end of privacy at Apple, since every Apple user is now treated as a criminal until proven otherwise.

The new features are also concerning even without government meddling and could prove life-threatening for queer kids.
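For readers who want a concrete picture of what "matching against a database of known CSAM" means mechanically, here is a minimal Swift sketch. It is emphatically not Apple's implementation: Apple's system reportedly relies on a perceptual "NeuralHash" and private set intersection, whereas this sketch substitutes a plain SHA-256 lookup, and every name in it is hypothetical.

import CryptoKit
import Foundation

// Conceptual sketch only. Apple's actual system reportedly uses a perceptual
// hash ("NeuralHash") and private set intersection, not a plain SHA-256
// lookup; every name below is hypothetical and illustrative.

/// Placeholder loader; in reality the known-CSAM hashes ship inside the OS
/// in blinded form and are never exposed to apps or users.
func loadKnownFingerprints() -> Set<String> {
    []  // empty stand-in for the NCMEC-maintained database
}

/// Stand-in fingerprint: a cryptographic digest of the raw image bytes.
/// A real perceptual hash would also match resized or re-encoded copies.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Returns true if a photo queued for iCloud upload matches the blocklist.
func matchesKnownCSAM(_ imageData: Data, against database: Set<String>) -> Bool {
    database.contains(fingerprint(of: imageData))
}

let database = loadKnownFingerprints()
let flagged = matchesKnownCSAM(Data("example photo bytes".utf8), against: database)

The point of the sketch is simply that the check runs on the user's own device against a list the user cannot inspect, which is precisely why researchers worry the list could quietly be broadened.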

Emma McGowan for Avast:

While combating CSAM is extremely important, privacy and security experts are concerned about the possible unintended consequences of this technology. [Avast Chief Privacy Officer Shane] McNamee questions whether companies should scan people’s devices at all.

“Now that this is possible to have access, authorities will push for more access,” he says. “It’s like we’re peeking over your shoulder, but we’re wearing sunglasses and saying the sunglasses can only see bad things. And you have this little snooper on the device that’s just reading everything and checking it, not sending it to Apple unless you’re doing something wrong. That’s the problem — the definition of ‘doing something wrong’ could be broadened.”

Brianna Wu — a computer programmer, video game creator, online advocate, and Executive Director of Rebellion PAC who describes herself as “an Apple fan” — points out that the US government could theoretically create legislation giving them permission to use this technology without the general public ever knowing. There are “far less checks and balances” on behind the scenes deals between the US government and tech companies, in the name of national security, than the general public may believe.

“This would allow agencies to spy on our phones to find, say, pictures that the Pentagon says compromise national security or belong to terrorists,” Wu tells Avast. “And if you look at the specifics of Edward Snowden’s revelations, it’s clear that our national security agencies may stick to certain rules in the US, but outside there are no rules at all. I feel very confident this technology could be used to spy on people in other countries.”

The second big change is that Apple will allow parents to enable a feature on their children’s iMessage accounts that would blur any images containing nudity. It will also alert parents if the child chooses to view such an image or to send nude images themselves. While Wu says she “can live with the iCloud part” of these new changes, she feels that the message-scanning part leads down “a deeply Orwellian road” and she “would beg Apple to reconsider.”

Wu points to the fact that the majority of US teens are sexually active before the age of 18, and that “sexting” is not uncommon among teenagers. This technology, then, potentially infringes on teens’ right to sexual autonomy. It could also open the children up to charges of distributing child pornography if a parent reports them, or expose the parents themselves to such charges if they share the image with the other parents involved.

But even more concerning to Wu is the possibility that this technology could “out” LGBTQIA+ kids to their parents, potentially placing them in both psychological and physical danger.

“Name a product from Apple, I’ve got it,” Wu says. “I’m all-in on the Apple ecosystem because of privacy. I root for them to succeed, but this is by far the worst plan I’ve seen them put into effect. I think they’re going down a wrong path and it’s extremely concerning.”
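For what it’s worth, the iMessage change Wu objects to amounts to a gating decision made on the device. The Swift sketch below assumes a hypothetical on-device nudity classifier and a parental-notification hook; Apple has not published this code, so treat every name and the exact flow as illustrative only.

import Foundation

// Hypothetical decision flow for the iMessage child-safety feature described
// above. The classifier and the notification behavior stand in for Apple's
// undisclosed on-device machinery; all names are illustrative only.

struct ChildSafetySettings {
    let isChildAccount: Bool   // account designated as a minor's by a parent
    let parentOptedIn: Bool    // the feature is off unless a parent enables it
}

enum IncomingImageAction {
    case showNormally
    case blurWithWarning(notifiesParentIfViewed: Bool)
}

func handleIncomingImage(_ imageData: Data,
                         settings: ChildSafetySettings,
                         flagsNudity: (Data) -> Bool) -> IncomingImageAction {
    // The feature applies only to accounts a parent has set up for a child.
    guard settings.isChildAccount, settings.parentOptedIn else {
        return .showNormally
    }
    // The check runs on-device; nothing is sent to Apple at this step.
    guard flagsNudity(imageData) else {
        return .showNormally
    }
    // The image is blurred; if the child chooses to view it anyway, a parent
    // may be notified, which is exactly the "outing" risk Wu raises.
    return .blurWithWarning(notifiesParentIfViewed: true)
}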

MacDailyNews Take:

Interns: Make it a double.
