West Virginia's attorney general sued Apple on Thursday, February 19, 2026, accusing the iPhone maker of allowing its iCloud service to become what the company's own internal communications called the "greatest platform for distributing child porn."
The state alleges that the company facilitated the spread of child sexual abuse material by declining to deploy tools that scan the photos and videos in iCloud users' collections to detect such material.
Attorney General JB McCuskey, a Republican, said in a statement that Apple prioritized user privacy over child safety and over stopping the distribution of child sexual abuse material on its data storage platform.
"This conduct is despicable, and Apple's inaction is inexcusable," the statement added.
Apple said in a statement that it has implemented features that prevent children from uploading or receiving nude images and that it was "innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids."
"All of our industry-leading parental controls and features, like Communication Safety—which automatically intervenes on kids' devices when nudity is detected in Messages, shared Photos, AirDrop, and even live FaceTime calls—are designed with the safety, security, and privacy of our users at their core," said Apple.
The lawsuit comes amid a growing national reckoning in the U.S. over how smartphones and social media harm children. So far, the wave of litigation and public pressure has mostly targeted companies such as Meta, Snap, and Google's YouTube, leaving Apple largely insulated from scrutiny.
West Virginia's lawsuit focuses on Apple's move toward end-to-end encryption, putting digital files outside the reach of both Apple and law enforcement officials.
The state alleges Apple's use of such technology has allowed child abuse material to proliferate on its platform.
For decades, technology and privacy advocates have sparred with governments over end-to-end encryption. Advocates call it vital to ensuring privacy and preventing widespread digital eavesdropping; governments insist it hinders criminal investigations.
The lawsuit, filed in Mason County Circuit Court, seeks statutory and punitive damages and asks a judge to force Apple to implement safer product designs, including effective measures to detect abusive material.
Alphabet's Google, Microsoft, and other platform providers check uploaded photos or emailed attachments against a database of digital fingerprints, or hashes, of known child sex abuse material provided by the National Center for Missing and Exploited Children and other clearinghouses.
Apple once abandoned a plan to fully encrypt iCloud backups after the FBI complained it would harm investigations.
Federal law requires U.S.-based technology companies to report child sexual abuse material they discover to the National Center for Missing and Exploited Children, but it does not require them to search for it.
A separate law, Section 230 of the Communications Decency Act, gives internet companies broad protection from lawsuits over content generated by their users.