Social media addiction claims against Meta not barred by Section 230

Since the mid-1990s, Section 230 (47 U.S.C. § 230) has functioned as a kind of legal safety net for internet platforms, shielding them from lawsuits and liability tied to user-generated content. Many companies have built entire business models with that protection in mind. And one might reasonably conclude that social media would not be as omnipresent today but for the protections of Section 230, which allowed the ecosystem to develop.

But a recent decision from Massachusetts’ highest court suggests that this protection is narrower than many assume, and that courts are increasingly willing to look beyond content and into how platforms are actually designed and operated.

Quick overview of the dispute

The Commonwealth of Massachusetts brought suit against Meta, alleging that Instagram was intentionally designed to keep young users engaged in ways that could be harmful. The complaint focused on features such as infinite scrolling, constant notifications, and algorithmic content delivery — all tools that encourage users, particularly teenagers, to spend more time on the platform.

At the same time, the state alleged that Meta publicly downplayed or misrepresented the risks associated with Instagram, presenting it as safe while internal research suggested otherwise. It also claimed that Meta failed to effectively enforce age restrictions, despite representing that it did.

Meta’s response was predictable: it argued that Section 230 bars these claims entirely.

What the court decided

The Supreme Judicial Court of Massachusetts disagreed. While it acknowledged that Section 230 can provide powerful protection, including immunity from having to litigate certain claims at all, it concluded that the statute does not apply here, at least at the motion to dismiss stage.

The court allowed the case to proceed, holding that the Commonwealth’s claims were not based on third-party content, but rather on Meta’s own conduct — its design choices, its business practices, and its public statements.

Why the court reached this decision

At the heart of the court’s reasoning is a distinction that is becoming increasingly important in technology litigation: the difference between liability for content and liability for conduct.

Section 230 was designed to prevent platforms from being treated as the publisher of harmful content created by users. But the court emphasized that this protection has limits. It does not extend to situations where a company is being held accountable for its own actions, particularly where those actions involve how a product is designed or how risks are communicated to users.

Here, the claims did not depend on what users posted on Instagram. Instead, they focused on how the platform itself is structured. The alleged harm flowed from features that encourage compulsive use, not from any specific piece of content. In that sense, the platform’s design (and not its role as a publisher) was being challenged.

The court applied the same reasoning to the Commonwealth's deception claims, making clear that Section 230 does not shield a company from liability for its own statements. If a company represents that its product is safe or non-addictive, and those representations are alleged to be misleading, those claims stand on their own, independent of any user-generated content.

Commonwealth v. Meta Platforms, Inc., — N.E.3d —, 2026 WL 969430 (Mass. Apr. 10, 2026)

Colorado federal court upholds My Pillow founder defamation verdict

Section 230 immunity

The United States District Court for the District of Colorado left a jury’s verdict intact by denying defendants’ bid for judgment as a matter of law and denying plaintiff’s request to increase punitive damages.

Plaintiff Eric Coomer sued Michael J. Lindell, Frankspeech LLC, and My Pillow, Inc. for defamation and related claims based on statements accusing Coomer of helping rig the 2020 presidential election while he worked for Dominion Voting Systems. After trial, the jury found Lindell and Frankspeech liable on certain defamation claims, found Frankspeech liable for intentional infliction of emotional distress and awarded punitive damages against it, and found My Pillow not liable.

Frankspeech was a streaming and broadcasting platform that Lindell created. Rather than simply operating as a passive message board or social media site, it aired interviews, hosted shows such as “The Lindell Report,” and livestreamed events like the Cyber Symposium.

Post-trial requests

Defendant Lindell and defendant Frankspeech asked the court to enter judgment as a matter of law in their favor on several grounds, including that Frankspeech was immune under 47 U.S.C. § 230, that plaintiff had not proved economic damages, and that the evidence did not support actual malice. Plaintiff asked the court to amend the final judgment to increase the punitive damages award against defendant Frankspeech based on alleged continuing misconduct during the case.

Motions denied

The court ruled that both post-trial motions should be denied. It refused to overturn the jury’s findings against defendant Lindell and defendant Frankspeech, and it also refused to enlarge the punitive damages award against defendant Frankspeech.

Why the court ruled the way it did

The court concluded that there was sufficient evidence for a reasonable jury to find that Frankspeech was not entitled to Section 230 immunity because it was not merely hosting third-party content. Instead, Lindell, acting as Frankspeech’s agent, made defamatory statements on its broadcasts, and the company also promoted, sponsored, and livestreamed the Cyber Symposium where additional statements were made. This allowed the jury to find that Frankspeech participated in the development and dissemination of the content, rather than acting as a neutral intermediary. The court also found sufficient evidence of economic harm and actual malice, and it determined that plaintiff had not met the high burden required to justify increasing punitive damages under Colorado law. Finally, the court ordered defendants to show cause why additional Rule 11 sanctions should not be imposed for another inaccurate citation in their briefing.

Coomer v. Lindell, 2026 WL 817370 (D. Colo. Mar. 25, 2026)

Claims against porn sites dismissed because of Section 230 immunity

Section 230 immunity

Plaintiffs sued several internet pornography companies after discovering that videos of them, secretly recorded while they changed in a college locker room, had been uploaded online.

Plaintiffs asked the court to hold the defendants liable under several theories, including civil conspiracy, negligent monitoring, and violations of the Trafficking Victims Protection Reauthorization Act (TVPRA).

The court granted summary judgment in favor of defendants.

The court held that Section 230 of the Communications Decency Act shielded the defendants from liability for user-generated content, and plaintiffs failed to show that any of the defendants materially contributed to the illegal aspects of the videos. The court also found no evidence of a conspiracy or that defendants met the requirements to be considered beneficiaries of a sex trafficking venture under the TVPRA. Claims against defendants who merely licensed trademarks or placed ads were also rejected for lack of personal jurisdiction or insufficient evidence of wrongdoing.

Jane Does 1–9 v. Collins Murphy, et al., No. 7:20-cv-00947-DCC, 2025 WL 2533961 (D.S.C. Sept. 3, 2025)

Court gives X opportunity to raise Section 230 claim in deepfake case

X sued Minnesota Attorney General Keith Ellison over a state law that prohibits the dissemination of AI-generated political deepfakes, arguing the statute violates the First and Fourteenth Amendments and is preempted by the Communications Decency Act at 47 U.S.C. § 230. A related case challenging the same law is already on appeal in Kohls v. Ellison, leading the court to stay X’s constitutional claims while allowing its Section 230 claim to move forward. The court invited both parties to file motions for judgment on the pleadings within 30 days. If neither does so, the entire case will be stayed pending resolution of the Kohls appeal.

X Corp. v. Ellison, 2025 WL 1833455 (D. Minn. July 3, 2025)

Content moderation lapses did not make hookup app liable for misrepresentation

Section 230

App’s general statement that it would provide a “safe and secure environment” did not amount to a promise for which plaintiff could assert Barnes-style misrepresentation and thereby avoid the app’s Section 230 immunity. 

Plaintiff – an underage user – sued Grindr based on injuries he suffered from meeting up with four different men with whom he had connected on the platform. One of the claims plaintiff brought was for negligent misrepresentation. Defendant stated on the app that it was “designed to create a safe and secure environment for its users,” and plaintiff alleged that defendant failed to do so.

Defendant moved to dismiss this claim under 47 U.S.C. § 230, which provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). The district court granted the motion and plaintiff sought review with the Ninth Circuit.

On appeal, the Ninth Circuit affirmed the dismissal under Section 230. In certain situations, a promise by an online platform to do something can form the basis of a claim against the platform that will not be barred by Section 230 immunity. For example, in Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009), Section 230 did not protect Yahoo against a claim that it failed – despite its promise to do so – to take down indecent profiles impersonating the plaintiff in that case. And in Estate of Bride v. Yolo Technologies, Inc., 112 F.4th 1168 (9th Cir. 2024), plaintiffs’ negligent misrepresentation claims were not subject to Section 230 immunity where the platform promised to unmask anonymous harassing users but failed to do so.

In this case, however, the court saw the situation differently from Barnes and Estate of Bride. In those cases, plaintiffs sought to hold defendants liable for specific promises or representations. Here, by contrast, Grindr’s general statement that its app was designed to create a safe and secure environment was a description of its moderation policy and thus protected from liability under Section 230.

Doe v. Grindr, Inc., 128 F.4th 1148 (9th Cir. Feb. 18, 2025)

Section 230 protected Meta from Huckabee cannabis lawsuit

Mike Huckabee, the former governor of Arkansas, sued Meta Platforms, Inc., the parent company of Facebook, for using his name and likeness without his permission in advertisements for CBD products. Huckabee argued that these ads falsely claimed he endorsed the products and made misleading statements about his personal health. He asked the court to hold Meta accountable under various legal theories, including violation of his publicity rights and privacy.

Plaintiff alleged that defendant approved and maintained advertisements that misappropriated plaintiff’s name, image, and likeness. Plaintiff further claimed that the ads placed plaintiff in a false light by attributing statements and endorsements to him that he never made. Additionally, plaintiff argued that defendant had been unjustly enriched by profiting from these misleading ads. Defendant, however, sought to dismiss the claims, relying on the Communications Decency Act at 47 U.S.C. § 230, which grants immunity to platforms for third-party content.

The court granted Meta’s motion to dismiss. It determined that Section 230 shielded defendant from liability for the third-party content at issue. The court also noted that plaintiff’s allegations lacked the specificity needed to overcome the protections provided by Section 230. Furthermore, the court emphasized that federal law, such as Section 230, preempts conflicting state laws, such as Arkansas’s Frank Broyles Publicity Protection Act.

Three reasons why this case matters:

  • Defines Section 230 Protections: It reaffirms the broad immunity tech companies enjoy under Section 230, even in cases involving misuse of publicity rights.
  • Digital Rights and Privacy: The case highlights the tension between protecting individual rights and maintaining the free flow of online content.
  • Challenges for State Laws: It shows how federal law can preempt state-specific protections, leaving individuals with limited recourse.

Mike Huckabee v. Meta Platforms, Inc., 2024 WL 4817657 (D. Del. Nov. 18, 2024)

Disabled veteran’s $77 billion lawsuit against Amazon dismissed

gaming law

A disabled Army veteran sued Amazon alleging “cyberstalking” and “cyberbullying” on its gaming platform, New World. Plaintiff claimed Amazon allowed other players and employees to engage in harassment, culminating in his being banned from the platform after he had invested over 10,000 hours and $1,700 in the game. Plaintiff sought $7 billion in compensatory damages and $70 billion in punitive damages, asserting claims for intentional infliction of emotional distress, gross negligence, and unfair business practices. Plaintiff also filed motions for a preliminary injunction to reinstate his gaming account and to remand the case to state court.

The court, however, dismissed the case. It granted plaintiff in forma pauperis status, allowing him to proceed without paying court fees, but ruled that his complaint failed to state any claim upon which relief could be granted. The court found no grounds for allowing plaintiff to amend the complaint, as any amendment would be futile.

The court dismissed the case on several legal principles. First, it found that Amazon was immune from liability under the Communications Decency Act at 47 U.S.C. § 230 for any content posted by third-party users on the New World platform. Section 230 protects providers of interactive computer services from being treated as publishers or speakers of user-generated content, even if they moderate or fail to moderate that content.

Second, plaintiff’s claims about Amazon employees’ conduct were legally insufficient. His allegations, such as complaints about bad customer service and being banned from the platform, failed to meet the standard for intentional infliction of emotional distress, which requires conduct so outrageous it exceeds all bounds tolerated in a civilized society. Similarly, plaintiff’s gross negligence claims did not demonstrate any extreme departure from reasonable conduct.

Finally, in the court’s view, plaintiff’s claim under California’s Unfair Competition Law (UCL) lacked the necessary specificity. The court found that poor customer service and banning a user from a platform did not constitute unlawful, unfair, or fraudulent business practices under the UCL.

Three reasons why this case matters:

  • Clarifies Section 230 Protections: The case reinforces the broad immunity granted to online platforms for third-party content under Section 230, even when moderation decisions are involved.
  • Defines the Limits of Tort Law in Online Interactions: It highlights the high bar plaintiffs must meet to succeed on claims such as intentional infliction of emotional distress and gross negligence in digital contexts.
  • Sets Guidance for Gaming Platform Disputes: The decision underscores the limited liability of companies for banning users or providing subpar customer support, offering guidance for similar lawsuits.

Haymore v. Amazon.com, Inc., 2024 WL 4825253 (E.D. Cal. Nov. 19, 2024)

Section 230 saves eBay from liability for violation of environmental laws

The United States government sued eBay for alleged violations of environmental regulations, claiming the online marketplace facilitated the sale of prohibited products in violation of the Clean Air Act (CAA), the Toxic Substances Control Act (TSCA), and the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA). According to the government’s complaint, eBay allowed third-party sellers to list and distribute items that violated these statutes, including devices that tamper with vehicle emissions controls, products containing methylene chloride used in paint removal, and unregistered pesticides.

eBay moved to dismiss, arguing that the government had failed to adequately state a claim under the CAA, TSCA, and FIFRA, and further contended that eBay was shielded from liability under Section 230 of the Communications Decency Act (CDA), 47 U.S.C. § 230(c).

The court granted eBay’s motion to dismiss. It held that eBay was immune from liability because of Section 230, which protects online platforms in most situations from being held liable as publishers of third-party content. The court determined that, as a marketplace, eBay did not “sell” or “offer for sale” the products in question in the sense required by the environmental statutes, since it did not possess, own, or transfer title of the items listed by third-party sellers.

The court found that Section 230 provided broad immunity for eBay’s role as an online platform, preventing it from being treated as the “publisher or speaker” of content provided by its users. As the government sought to impose liability based on eBay’s role in hosting third-party listings, the court concluded that the claims were barred under the CDA.

United States of America v. eBay Inc., 2024 WL 4350523 (E.D.N.Y. Sept. 30, 2024)

No Section 230 immunity for Facebook on contract-related claims

Section 230

Plaintiffs sued Meta, claiming that they were harmed by fraudulent third-party ads posted on Facebook. Plaintiffs argued that these ads violated Meta’s own terms of service, which prohibit deceptive advertisements. They accused Meta of allowing scammers to run ads that targeted vulnerable users and of prioritizing revenue over user safety. Meta moved to dismiss, claiming that it was immune from liability under 47 U.S.C. § 230(c)(1) (a portion of the Communications Decency Act (CDA)), which generally protects internet platforms from being held responsible for third-party content.

Plaintiffs asked the district court to hold Meta accountable for five claims: negligence, breach of contract, breach of the covenant of good faith and fair dealing, violation of California’s Unfair Competition Law (UCL), and unjust enrichment. They alleged that Meta not only failed to remove scam ads but actively solicited them, particularly from advertisers based in China, who accounted for a large portion of the fraudulent activity on the platform.

The district court held that § 230(c)(1) protected Meta from all claims, even the contract claims. Plaintiffs sought review with the Ninth Circuit.

On appeal, the Ninth Circuit affirmed that § 230(c)(1) provided Meta with immunity for the non-contract claims, such as negligence and UCL violations, because these claims treated Meta as a publisher of third-party ads. But the Ninth Circuit disagreed with the district court’s ruling on the contract-related claims. It held that the lower court had applied the wrong legal standard when deciding whether § 230(c)(1) barred those claims. So the court vacated the dismissal of the contract claims, explaining that contract claims were different because they arose from Meta’s promises to users, not from its role as a publisher. The case was remanded back to the district court to apply the correct standard for the contract claims.

Three reasons why this case matters:

  • It clarifies that § 230(c)(1) of the CDA does not provide blanket immunity for all types of claims, especially contract-related claims.
  • The case underscores the importance of holding internet companies accountable for their contractual promises to users, even when they enjoy broad protections for third-party content.
  • It shows that courts continue to wrestle with the boundaries of platform immunity under the CDA, which could shape future rulings about online platforms’ responsibilities.

Calise v. Meta Platforms, Inc., 103 F.4th 732 (9th Cir. June 4, 2024)
