Social media addiction claims against Meta not barred by Section 230

Since the mid-1990s, Section 230 (47 U.S.C. § 230) has functioned as a kind of legal safety net for internet platforms, shielding them from lawsuits and liability tied to user-generated content. Many companies have built entire business models with that protection in mind. And one might reasonably conclude that social media would not be as omnipresent today but for the protections of Section 230, which allowed the ecosystem to develop.

But a recent decision from Massachusetts’ highest court suggests that this protection is narrower than many assume, and that courts are increasingly willing to look beyond content and into how platforms are actually designed and operated.

Quick overview of the dispute

The Commonwealth of Massachusetts brought suit against Meta, alleging that Instagram was intentionally designed to keep young users engaged in ways that could be harmful. The complaint focused on features such as infinite scrolling, constant notifications, and algorithmic content delivery — all tools that encourage users, particularly teenagers, to spend more time on the platform.

At the same time, the state alleged that Meta publicly downplayed or misrepresented the risks associated with Instagram, presenting it as safe while internal research suggested otherwise. It also claimed that Meta failed to effectively enforce age restrictions, despite representing that it did.

Meta’s response was predictable: it argued that Section 230 bars these claims entirely.

What the court decided

The Supreme Judicial Court of Massachusetts disagreed. While it acknowledged that Section 230 can provide powerful protection, including immunity from having to litigate certain claims at all, it concluded that the statute does not apply here, at least at the motion to dismiss stage.

The court allowed the case to proceed, holding that the Commonwealth’s claims were not based on third-party content, but rather on Meta’s own conduct — its design choices, its business practices, and its public statements.

Why the court reached this decision

At the heart of the court’s reasoning is a distinction that is becoming increasingly important in technology litigation: the difference between liability for content and liability for conduct.

Section 230 was designed to prevent platforms from being treated as the publisher of harmful content created by users. But the court emphasized that this protection has limits. It does not extend to situations where a company is being held accountable for its own actions, particularly where those actions involve how a product is designed or how risks are communicated to users.

Here, the claims did not depend on what users posted on Instagram. Instead, they focused on how the platform itself is structured. The alleged harm flowed from features that encourage compulsive use, not from any specific piece of content. In that sense, the platform’s design (and not its role as a publisher) was being challenged.

The court applied the same reasoning to the Commonwealth’s deception claims. The court made clear that Section 230 does not shield a company from liability for its own statements. If a company represents that its product is safe or non-addictive, and those representations are alleged to be misleading, those claims stand on their own, independent of any user-generated content.

Commonwealth v. Meta Platforms, Inc., — N.E.3d —, 2026 WL 969430 (Mass. Apr. 10, 2026)

Social media posts about divorce caused man to be found in contempt of court


The Court of Appeals of Wisconsin affirmed an order holding a husband in contempt for violating a family-court ban on social media posts about his divorce case and the parties’ children.

Petitioner-wife asked the circuit court to hold respondent-husband in contempt, enforce the social media restrictions, and impose sanctions. She requested that the husband be punished for violating the court’s directives and that the court order remedial steps to stop further violations.

What the court ruled

The family court entered the requested relief and the husband sought review. On appeal, the court affirmed the contempt order. It held that the husband had not shown any error in the circuit court’s finding that he willfully violated the court’s oral and written directives barring social media posts about the children and the case. The court also declined to consider the husband’s other challenges, including his First Amendment and overbreadth claims, because they were undeveloped and unsupported by legal authority and record citations.

Rationale

The appellate court concluded that the circuit court reasonably exercised its contempt power because the husband admitted posting a video after the oral ruling that barred posts about the children, and that recording included material from the hearing about custody-related recommendations. The court said this was enough notice and support for contempt. As to the rest of the appeal, the husband failed to present coherent, legally developed arguments, so those issues were not considered.

In re the marriage of Fox, 2026 WL 945666 (Wis. Ct. App. April 7, 2026)


Does the DMCA safe harbor cover infringing images in an email?


Plaintiff photographer sued Pinterest for copyright infringement, alleging Pinterest displayed his and other photographers’ copyrighted images in notifications sent outside of the Pinterest website. Pinterest moved for summary judgment, arguing it was protected under the safe harbor provisions of Section 512(c) of the Digital Millennium Copyright Act (“DMCA”). The court granted Pinterest’s motion and dismissed the case.

Pinterest is a familiar and massive social media platform, where individuals upload and share image-based “Pins” that function as visual bookmarks. The platform displays Pins in personalized feeds curated by algorithms, which also contain advertisements labeled as “promoted.” Pinterest delivers content through notifications such as emails, in-app alerts, and push notifications, which contain hyperlinks that trigger display of images hosted on its servers. One such notification that plaintiff received included his copyrighted photograph, prompting him to file suit six days later.

The court found that Pinterest’s actions fell within the DMCA’s Section 512(c) safe harbor, which shields service providers from copyright liability for content stored at the direction of users. Because Pinterest raised this as an affirmative defense, it had the burden to prove every element of the safe harbor criteria, and the court concluded it had met both the statutory threshold and all required conditions.

Statutory threshold requirements under the DMCA

To qualify for the DMCA safe harbor, Pinterest had to meet several threshold statutory requirements that are found in Sections 512(c) and (i): it had to be a service provider, maintain a designated agent, implement a repeat infringer policy, and accommodate standard technical measures. The court found that Pinterest satisfied all four. As “one of the largest social media platforms in the world,” it operated a qualifying online platform as defined by the statute. The evidence showed that Pinterest maintained a registered agent with the Copyright Office and that it enforced a strike-based policy for repeat infringers. And the court found that Pinterest did not interfere with any recognized standard technical measures that plaintiff implemented with his works. (Plaintiff had asserted that he embedded certain metadata in his photographs, but he did not argue that this metadata qualified as a “standard technical measure” under the DMCA, nor did he claim that Pinterest interfered with it — in fact, he alleged that Pinterest preserved the metadata on its servers.)

How Pinterest met the required conditions

After finding that Pinterest satisfied the DMCA’s threshold requirements, the court turned to whether Pinterest’s conduct of sending copyright-protected images in off-platform notifications was protected under Section 512(c). To qualify, Pinterest had to show three things:

  • the alleged infringement occurred due to user-directed storage;
  • Pinterest lacked actual or red flag knowledge of the infringement; and
  • Pinterest either had no right and ability to control the activity or did not receive a direct financial benefit from it.

The court evaluated each element in turn.

By reason of storage at the direction of a user

The court concluded that Pinterest met the first requirement for DMCA safe harbor protection: the alleged infringement occurred “by reason of the storage at the direction of a user.” It emphasized that the image at issue was not embedded in the notification itself but was instead hosted on Pinterest’s servers and accessed via a hyperlink contained in the notification. When a user opened the message, their software triggered a request to Pinterest’s server to retrieve and display the image, just as it would when accessing content directly through the platform. Because this method merely facilitated access to user-uploaded content without altering it, the court found the display was within the statutory definition.

No knowledge of infringement

The court found that Pinterest satisfied the second requirement for DMCA safe harbor protection by showing it lacked actual or red flag knowledge of the alleged infringement. Critically, plaintiff never sent Pinterest a DMCA takedown notice or otherwise identified the allegedly infringing material before filing suit. The DMCA operates on a notice-and-takedown system: platforms are not required to proactively monitor user content but must respond once they receive proper notice. Because plaintiff gave no such notice and offered no evidence that Pinterest otherwise knew about the specific image at issue, the court concluded there was no genuine dispute as to Pinterest’s lack of knowledge.

Control and financial benefit

The court found that Pinterest met the third and final requirement for DMCA safe harbor by showing it neither had the right and ability to control the alleged infringement nor received a financial benefit directly attributable to it. While Pinterest used algorithms to curate content and monetize its platform generally, the court held that this did not amount to the kind of “substantial influence” over user activity that would disqualify it under the DMCA. Pinterest did not direct users to upload specific content, nor did it participate in any purposeful conduct related to the display of plaintiff’s photo.

The court also rejected plaintiff’s claim that Pinterest profited directly from the infringement. Pinterest presented evidence that its notifications did not contain advertisements and that it earned no revenue specifically tied to the image in question. Plaintiff’s counter-evidence failed to show otherwise. Even if ads had appeared near the image, the law requires a direct connection between the infringing display and revenue, which was absent here. Therefore, Pinterest satisfied this final element of the DMCA safe harbor defense.

Harrington et al. v. Pinterest, Inc., No. 20-CV-5290, 2026 WL 25880 (N.D. Cal., January 5, 2026)

Ninth Circuit declines to impose broad injunction against California’s social media law for minors


NetChoice, an internet trade association representing companies such as Google, Meta, and X, sued the State of California over its Protecting Our Kids from Social Media Addiction Act, claiming that the law violates the First Amendment. The Act restricts how social media platforms interact with minors, particularly limiting access to algorithmic feeds, requiring certain default settings, and mandating age-verification procedures.

Plaintiff asked the court to block enforcement of several provisions of the law through a preliminary injunction, focusing on its claims that aspects of the Act unlawfully restrict speech and are unconstitutionally vague. The lower court declined to issue the injunction. Plaintiff sought review with the Ninth Circuit.

On appeal, the Ninth Circuit largely affirmed the district court’s refusal to issue a broad injunction but ruled that the provision of the law requiring platforms to hide like and share counts by default for minors is unconstitutional. It reversed the lower court on that point and instructed it to modify its injunction to prevent enforcement of that specific provision.

The court ruled this way because it found the like-count requirement to be content-based and therefore subject to strict scrutiny under the First Amendment. The government failed to show that hiding like counts was the least restrictive means to achieve its goal of protecting minors’ mental health. Other provisions, including those governing private-mode settings and age verification, either survived scrutiny or were deemed unripe for review.

NetChoice LLC v. Bonta, — F.4th —, 2025 WL 2600007 (9th Cir. Sept. 9, 2025)

TikTok and Meta terms granted other users remix rights


Plaintiff sued TikTok and Meta after other users on those platforms incorporated clips from her video into their own posts, allegedly without her permission. She claimed this was copyright infringement and also alleged that TikTok failed to protect her from harassment by users in the comments of her live videos. Plaintiff filed the lawsuit on her own, without a lawyer.

Plaintiff asked the court to hold TikTok and Meta liable for copyright infringement and to consider tort claims against TikTok for harassment. But both companies responded by asking the court to dismiss the case. They pointed to the user agreements plaintiff had accepted when she signed up. Those terms gave the platforms and their users broad rights to use, modify, and distribute any content she uploaded. TikTok also invoked immunity under 47 U.S.C. § 230, a provision in federal law protecting platforms from liability for user-generated content.

The court agreed with the platforms. It found that plaintiff had granted TikTok and Meta valid licenses to use her video, so there could be no copyright violation. The court also ruled that it had no authority to hear the tort claims because plaintiff had not shown that the court had jurisdiction over those parts of the case. The court rejected plaintiff’s arguments that she did not fully understand the contracts or that the agreements were unfair. On appeal, the Tenth Circuit upheld the decision, finding no clear error in how the lower court handled the case and ruling that plaintiff had waived her right to challenge the licensing issue by not objecting to it specifically.

In the end, the court dismissed all claims against both companies. The court also declined to take up any new claims plaintiff tried to raise during the appeal, saying she had not brought those up earlier and did not support them with enough detail.

Three reasons why this case matters:

  • It reinforces how powerful and far-reaching social media terms of service can be in protecting platforms from copyright claims.

  • It shows the importance of making specific objections and arguments in court—especially during appeals.

  • It highlights how courts apply procedural rules strictly, even when someone is representing themselves without a lawyer.

Sethunya v. TikTok, 2025 WL 1144776 (10th Cir. April 18, 2025)

Federal court says it was OK to fire CEO who criticized boy wearing prom dress


The fired CEO of a telehealth company sued his former employer’s customer, alleging that the customer wrongfully pressured his employer to fire him after a video went viral of him confronting a boy wearing a prom dress. The lower court granted summary judgment and dismissed the plaintiff’s tortious interference claims. Plaintiff sought review with the Sixth Circuit. On appeal, the court affirmed the summary judgment in favor of the former employer’s customer.

What happened

In April 2021, plaintiff encountered teenagers taking prom photos at a Tennessee hotel. During this encounter, plaintiff told a teenage boy wearing a red prom dress that he “looked like an idiot.” Another teen recorded the interaction and posted it online, where it quickly went viral. Actress Kathy Griffin shared the video with her two million Twitter followers, identifying plaintiff.

The video created significant problems for plaintiff’s former employer. The company’s board of directors expressed concern about how plaintiff’s behavior reflected on the company.

Defendant, the former employer’s largest customer, soon received many messages expressing disappointment about its business relationship with a company whose CEO behaved this way. Defendant arranged a call with the company to discuss the situation.

According to plaintiff, defendant threatened to end its contract with the company if the company did not fire plaintiff. Shortly after this call, the company’s directors voted to terminate plaintiff’s employment. The next day, defendant publicly stated that the company “stepped up to do the right thing” by firing plaintiff.

The lawsuit

Plaintiff sued defendant (but not his former employer) for tortious interference with his employment contract and tortious interference with his employment relationship under Tennessee law. Defendant asked the court for summary judgment, arguing that plaintiff couldn’t prove his claims even if all facts were viewed in his favor.

The Court’s decision

The Sixth Circuit affirmed the district court’s decision to grant summary judgment to defendant, rejecting plaintiff’s claims for two main reasons.

First, plaintiff’s tortious interference with contract claim failed because the company did not breach any contract when it fired him. Plaintiff’s employment contract allowed the company to fire him with or without cause. Since the company had the legal right to terminate plaintiff’s employment, it could not have breached the contract by doing so. Under Tennessee law, a claim for tortious interference with a contract requires an actual breach of contract.

Second, plaintiff’s tortious interference with employment relationship claim failed because he could not show that defendant acted with an improper motive or used improper means. The court found no evidence that defendant acted with the primary purpose of injuring plaintiff. Instead, the record showed defendant sought to protect its business from public criticism. Additionally, defendant’s contract with the company gave it the right to stop doing business with the company “for any reason or no reason,” so, in the Court’s mind, threatening to exercise this right was not improper.

Three reasons why this case matters:

  • It clarifies that claims for tortious interference with contracts require an actual breach of contract, which does not occur when an employer exercises its contractual right to terminate an at-will employee.
  • It demonstrates that businesses can take steps to protect their reputation without facing liability for tortious interference, as long as they act within their contractual rights.
  • It illustrates how viral videos capturing personal conduct can have significant professional consequences, especially for people in leadership positions.

Johnson v. University Hospitals Health System, Inc., 2025 WL 637442 (6th Cir. February 27, 2025)

Section 230 protected Meta from Huckabee cannabis lawsuit

Mike Huckabee, the former governor of Arkansas, sued Meta Platforms, Inc., the parent company of Facebook, for using his name and likeness without his permission in advertisements for CBD products. Huckabee argued that these ads falsely claimed he endorsed the products and made misleading statements about his personal health. He asked the court to hold Meta accountable under various legal theories, including violation of his publicity rights and privacy.

Plaintiff alleged that defendant approved and maintained advertisements that misappropriated plaintiff’s name, image, and likeness. Plaintiff further claimed that the ads placed plaintiff in a false light by attributing statements and endorsements to him that he never made. Additionally, plaintiff argued that defendant had been unjustly enriched by profiting from these misleading ads. Defendant, however, sought to dismiss the claims, relying on the Communications Decency Act at 47 U.S.C. § 230, which grants immunity to platforms for third-party content.

The court granted Meta’s motion to dismiss. It determined that Section 230 shielded defendant from liability for the third-party content at issue. The court also noted that plaintiff’s allegations lacked the specificity needed to overcome the protections provided by Section 230. Furthermore, the court emphasized that federal law, such as Section 230, preempts conflicting state laws, such as Arkansas’s Frank Broyles Publicity Protection Act.

Three reasons why this case matters:

  • Defines Section 230 Protections: It reaffirms the broad immunity tech companies enjoy under Section 230, even in cases involving misuse of publicity rights.
  • Digital Rights and Privacy: The case highlights the tension between protecting individual rights and maintaining the free flow of online content.
  • Challenges for State Laws: It shows how federal law can preempt state-specific protections, leaving individuals with limited recourse.

Mike Huckabee v. Meta Platforms, Inc., 2024 WL 4817657 (D. Del. Nov. 18, 2024)

Ex-wife held in contempt for posting on TikTok about her ex-husband


Ex-husband sought to have his ex-wife held in contempt for violating an order that the divorce court had entered. In 2022, the court had ordered the ex-wife to take down social media posts that could make the ex-husband identifiable.

The ex-husband alleged that the ex-wife continued to post content on her TikTok account which made him identifiable as her ex-husband. Ex-wife argued that she did not name the ex-husband directly and that her social media was part of her work as a trauma therapist. But the family court found that the ex-wife’s posts violated the previous order because they made the ex-husband identifiable, and also noted that the children could be heard in the background of some videos. As a result, the court held the ex-wife in contempt and ordered her to pay $1,800 in the ex-husband’s attorney fees.

Ex-wife appealed the contempt ruling, arguing that ex-husband did not present enough evidence to support his claim, and that she had not violated the order. She also disputed the attorney fees. On appeal, the court affirmed the contempt finding, agreeing that her actions violated the order, but vacated the award of attorney fees due to insufficient evidence of the amount.

Three reasons why this case matters:

  • It illustrates the legal consequences of violating court orders in family law cases.
  • It emphasizes the importance of clarity in social media use during ongoing family disputes.
  • It highlights the need for clear evidence when courts are asked to impose financial sanctions such as attorney fees.

Kimmel v. Kimmel, 2024 WL 4521373 (Ky. Ct. App. October 18, 2024)
