West Virginia sues Apple over iCloud’s alleged role in distribution of child sex abuse material

Apple's Advanced Data Protection

Apple has been hit with a lawsuit from West Virginia over the alleged use of its iCloud service to store and distribute child sexual abuse material (CSAM), but the company’s steadfast commitment to user privacy remains a cornerstone of its approach — one that experts and advocates argue prevents far greater abuses by bad actors.

In a complaint filed on February 19, 2026, in Mason County Circuit Court, West Virginia Attorney General JB McCuskey accused Apple of allowing iCloud to serve as a platform for storing and distributing illegal CSAM, citing internal communications and the company’s lower volume of reports to the National Center for Missing & Exploited Children (NCMEC) compared to competitors like Google and Meta.

The lawsuit claims Apple prioritized privacy over aggressive content scanning, pointing to a previously reported 2020 internal message in which an Apple executive described iCloud as the “greatest platform for distributing child porn” due to limited detection measures at the time. It also references Apple’s 2021 announcement of a proposed on-device scanning system (NeuralHash) for CSAM detection, which was ultimately abandoned in late 2022 amid widespread privacy backlash.

Apple has long maintained that introducing broad scanning mechanisms — even for illegal content — creates inevitable vulnerabilities. Any client-side or server-side scanning system capable of detecting specific material could be repurposed or compelled for other uses, such as government surveillance, censorship, or mass data exploitation by malicious hackers. Once a technical capability exists to scan user content at scale, it becomes a prime target for bad actors seeking to access private data, impersonate users, or conduct broader attacks on privacy.
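The scanning approach at issue can be sketched in a few lines. The real NeuralHash system used a perceptual hash designed to survive resizing and re-encoding; the SHA-256 stand-in and the database contents below are purely hypothetical simplifications for illustration:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash (real systems don't use SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known illegal images.
known_hashes = {fingerprint(b"known-bad-image-bytes")}

def scan(image_bytes: bytes, database: set[str]) -> bool:
    """Return True if the image's fingerprint appears in the database."""
    return fingerprint(image_bytes) in database

print(scan(b"known-bad-image-bytes", known_hashes))  # True
print(scan(b"vacation-photo-bytes", known_hashes))   # False
```

Note that nothing in `scan()` knows what the database contains — the mechanism is content-agnostic, which is precisely the repurposing concern critics raised: swap the database, and the same pipeline detects anything.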

By rejecting such systems and instead implementing end-to-end encryption for iCloud backups (rolled out in December 2022), Apple ensures that user data — including photos, messages, and documents — remains inaccessible to the company itself, law enforcement without user consent, or cybercriminals who might breach servers. This architecture fundamentally limits the risk of abuse: no backdoor means no universal key that could be stolen, leaked, or demanded under legal pressure in less democratic jurisdictions.
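The end-to-end property described above can be illustrated with a minimal sketch: the key is generated on and never leaves the device, so the server stores only ciphertext. A one-time pad stands in for the real authenticated encryption Apple uses; this is an illustration of the architecture, not of Apple's actual implementation:

```python
import secrets

def encrypt(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR; key must be as long as the data, used once."""
    return bytes(d ^ k for d, k in zip(data, key))

decrypt = encrypt  # XOR is its own inverse

backup = b"private photos and messages"
device_key = secrets.token_bytes(len(backup))  # generated and kept on-device

server_copy = encrypt(backup, device_key)      # all the cloud ever sees
assert server_copy != backup                   # server holds only ciphertext
assert decrypt(server_copy, device_key) == backup
```

A breach of the server yields `server_copy` but not `device_key`, so the plaintext is unrecoverable — the "no universal key" point made above.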

Apple emphasized in its response to the lawsuit that it continues to innovate for safety while upholding privacy. Features like Communication Safety in Messages automatically detect and blur nudity in communications involving children, with interventions designed to protect young users without compromising overall encryption or enabling routine content inspection. The company reports detected CSAM when required by law and has built one of the most secure ecosystems in the industry.

Privacy advocates have long warned that mandatory scanning requirements or backdoors erode protections for everyone. Journalists, dissidents, domestic violence survivors, and ordinary users rely on strong encryption to safeguard sensitive information. Introducing exceptions for CSAM detection would create a precedent that weakens these safeguards, potentially enabling authoritarian regimes or cybercriminals to exploit the same mechanisms.

While the West Virginia lawsuit seeks damages and court-ordered changes to Apple’s detection practices, Apple’s position underscores a principled stance: true protection against child exploitation must not come at the cost of creating tools that bad actors could weaponize against millions of innocent users worldwide.

Apple continues to lead in device security and privacy features, balancing child safety innovations with the fundamental right to private communication in an increasingly digital world.

MacDailyNews Take: When they stoop to the Think of the Children™ ruse, their desperation fairly screams.

Apple’s iMessage service is end-to-end encrypted. Apple cannot see data sent via its iMessage service. Apple’s Advanced Data Protection for iCloud allows users to protect important iCloud data, including iCloud Backup, Photos, Notes, and more. Apple cannot see data protected by Advanced Data Protection for iCloud.

In December 2022, after much opposition, including, voluminously, from us here at MacDailyNews, Apple killed an effort to design an iCloud photo scanning tool for detecting child sexual abuse material (CSAM) in the storage service.

As we wrote previously:

This sounds wonderful at first glance (everyone’s for detecting and rooting out purveyors of child pornography) and horrible once you think about it for more than a second (massive, awful potential for misuse)… It’s a huge can of worms. It’s a backdoor, plain and simple, and it neatly negates Apple’s voluminous claims of protecting users’ privacy. It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything. — MacDailyNews, August 6, 2021

Originally, Apple would use one database of hashes from the National Center for Missing and Exploited Children (NCMEC).

Then, after outcry, Apple changed their backdoor scanning to match “two or more child safety organizations operating in separate sovereign jurisdictions.”
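The revised rule amounts to a set intersection: a hash is actionable only if every participating organization's database contains it. A minimal sketch, with hypothetical placeholder hashes:

```python
# Hypothetical databases from child-safety organizations in two jurisdictions.
db_jurisdiction_a = {"hash1", "hash2", "hash3"}
db_jurisdiction_b = {"hash2", "hash3", "hash4"}

def flaggable(image_hash: str, databases: list[set[str]]) -> bool:
    """Flag only hashes present in ALL databases (set intersection)."""
    return all(image_hash in db for db in databases)

dbs = [db_jurisdiction_a, db_jurisdiction_b]
print(flaggable("hash2", dbs))  # True: present in both databases
print(flaggable("hash1", dbs))  # False: present in only one
```

The safeguard holds only if the jurisdictions are genuinely independent: governments that coordinate can seed the same entry into multiple databases, at which point the intersection requirement filters out nothing.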

Of course, Apple’s multi-country “safeguard” is no safeguard at all.

The Five Eyes (FVEY) is an intelligence alliance comprising the United States, Australia, Canada, New Zealand, and the United Kingdom. These countries are parties to the multilateral UKUSA Agreement, a treaty for joint cooperation in signals intelligence.

The FVEY further expanded their surveillance capabilities during the course of the “war on terror,” with much emphasis placed on monitoring the World Wide Web. The former NSA contractor Edward Snowden described the Five Eyes as a “supra-national intelligence organization that does not answer to the known laws of its own countries.”

Documents leaked by Snowden in 2013 revealed that the FVEY has been spying on one another’s citizens and sharing the collected information with each other in order to circumvent restrictive domestic regulations on surveillance of citizens.

Apple’s claim to backdoor scan only for CSAM was intended to be a trojan horse, introduced via the hackneyed “Think of the Children™” ruse, that would be bastardized in secret for all sorts of surveillance under the guise of “safety” in the future.

“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” — Benjamin Franklin

The fact that Apple ever considered this travesty in the first place, much less announced and tried to implement it in the fashion they did, has damaged the company’s reputation for protecting user privacy immensely; perhaps irreparably.

Hopefully, if Apple management has any sense whatsoever, is not hopelessly compromised, and can resist whatever pressure forced them into this ill-considered abject disloyalty to customers who value their privacy and security, the company will end this disastrous scheme promptly and double down on privacy by finally and immediately enabling end-to-end encryption of iCloud backups as a company which claims to be a champion of privacy would have done many years ago. — MacDailyNews, December 23, 2021

One year later, in December 2022, Apple did just as we advised.

MacDailyNews Note: Advanced Data Protection for iCloud offers our highest level of cloud data security and protects the majority of your iCloud data using end-to-end encryption.

If you choose to enable Advanced Data Protection, the majority of your iCloud data — including iCloud Backup, Photos, Notes, and more — is protected using end-to-end encryption. No one else can access your end-to-end encrypted data, not even Apple, and this data remains secure even in the case of a data breach in the cloud.

How to turn on Advanced Data Protection for iCloud

To turn on Advanced Data Protection, first update the iPhone, iPad, or Mac that you’re using to the latest software version.

Turning on Advanced Data Protection on one device enables it for your entire account and all your compatible devices.

On iPhone or iPad
1. Open the Settings app.
2. Tap your name, then tap iCloud.
3. Scroll down, tap Advanced Data Protection, then tap Turn on Advanced Data Protection.
4. Follow the onscreen instructions to review your recovery methods and enable Advanced Data Protection.

On Mac
1. Choose Apple menu > System Settings.
2. Click your name, then click iCloud.
3. Click Advanced Data Protection, then click Turn On.
4. Follow the onscreen instructions to review your recovery methods and enable Advanced Data Protection.




7 Comments

  1. I have a better idea:
    Leave private encrypted data alone, and start prosecuting pedophiles.

    Start with the pedophiles!!!!

    We know who they are (or will, once the DOJ starts obeying the law).

    1. ^^^ this isn’t controversial. It’s a completely non-partisan position.
      (Hopefully it doesn’t get deleted off of here.)

      BOTH parties created the law that DOJ is violating.

      1. “Start?”

        Here’s how many pedophile cases the DOJ has prosecuted since President Trump took office on January 20, 2025:

        In calendar year 2025, the FBI arrested over 1,700 child predators with many of these cases referred to DOJ for federal prosecution, including:

        Operation Restore Justice (May 2025): 205 offenders arrested and prosecuted.
        Operation Enduring Justice (August 2025): 234 offenders arrested and prosecuted.
        Operation Relentless Justice (December 2025): 293 offenders arrested and prosecuted.

        Other notable efforts include Operation Grayskull (July 2025), which led to 18 convictions (with sentences totaling over 300 years) for operators of dark web child abuse sites.

        TRAC analyzes DOJ case filings and categorizes them by program area, including Project Safe Childhood. Extrapolating conservatively across 13 months (January 2025–January 2026) suggests approximately 1,950–2,340 new federal prosecutions. This is an estimate and may not capture all related cases (e.g., those classified under human trafficking or other categories).

        For comparison, pre-2025 annual federal child exploitation prosecutions typically ranged from 1,500–2,000.

        Turn off CNN and MSDNC. It’s your first step to stop sounding so uninformed. It could lead to you actually having better ideas, bless your heart.

  2. Sick sick sick sick and tired of this bs “think of the children” government overreach over and over hoping people fall for it. Anytime you hear that, start wearing a steel pan on your bum because they aim to screw you but good.

    What’s next, people that say or plan things on the phone mean the phone company is to blame? How about people that plan bad things in the park, can we sue the park? How about people that speak in the open air, do we sue god?

    Um no. How about the piece of garbage DA does their job, investigates, finds these animals and brings them to justice. Which frankly on the left they want to make pedophiles great again by rebranding them as minor attracted people and the irony is they don’t want to prosecute them. The entire thing is a bs sham.

    Hey govt, go f yourself, you’re servants, do the job we tell you, and otherwise shut the hell up. We know what pieces of garbage you are and the nazi orwellian petty dictators you dream of being. The real people needing to be investigated are every loser that thinks going into government is a great grift.

    TLDR message to govt losers, jump off a cliff. No. I hope they lose and the piece of crap posing DA degenerates bringing this get humiliated and ruined.

