Colorado federal court upholds My Pillow founder defamation verdict


The United States District Court for the District of Colorado left a jury’s verdict intact by denying defendants’ bid for judgment as a matter of law and denying plaintiff’s request to increase punitive damages.

Plaintiff Eric Coomer sued defendant Michael J. Lindell, defendant Frankspeech LLC, and defendant My Pillow, Inc. for defamation and related claims based on statements accusing Coomer of helping rig the 2020 presidential election while he worked for Dominion Voting Systems. After trial, the jury found defendant Lindell and defendant Frankspeech liable on certain defamation claims, found defendant Frankspeech liable for intentional infliction of emotional distress and assessed punitive damages against it, and found defendant My Pillow not liable.

Frankspeech was a streaming and broadcasting platform that Lindell created. Rather than simply operating as a passive message board or social media site, it aired interviews, hosted shows such as “The Lindell Report,” and livestreamed events like the Cyber Symposium.

Post-trial requests

Defendant Lindell and defendant Frankspeech asked the court to enter judgment as a matter of law in their favor on several grounds, including that Frankspeech was immune under 47 U.S.C. § 230, that plaintiff had not proved economic damages, and that the evidence did not support actual malice. Plaintiff asked the court to amend the final judgment to increase the punitive damages award against defendant Frankspeech based on alleged continuing misconduct during the case.

Motions denied

The court ruled that both post-trial motions should be denied. It refused to overturn the jury’s findings against defendant Lindell and defendant Frankspeech, and it also refused to enlarge the punitive damages award against defendant Frankspeech.

Why the court ruled the way it did

The court concluded that there was sufficient evidence for a reasonable jury to find that Frankspeech was not entitled to Section 230 immunity because it was not merely hosting third-party content. Instead, Lindell, acting as Frankspeech’s agent, made defamatory statements on its broadcasts, and the company also promoted, sponsored, and livestreamed the Cyber Symposium where additional statements were made. This allowed the jury to find that Frankspeech participated in the development and dissemination of the content, rather than acting as a neutral intermediary. The court also found sufficient evidence of economic harm and actual malice, and it determined that plaintiff had not met the high burden required to justify increasing punitive damages under Colorado law. Finally, the court ordered defendants to show cause why additional Rule 11 sanctions should not be imposed for another inaccurate citation in their briefing.

Coomer v. Lindell, 2026 WL 817370 (D. Colo. Mar. 25, 2026)

Alex Jones gets partial win in Connecticut lawsuit over unfair trade practices 

Erica Lafferty, William Sherlach, and other family members of victims of the Sandy Hook Elementary School shooting sued Alex Jones, Free Speech Systems, LLC, and related entities. Plaintiffs sought damages for defamation, invasion of privacy, emotional distress, and violations of the Connecticut Unfair Trade Practices Act (CUTPA). Plaintiffs argued that defendants’ conspiracy theories about the Sandy Hook shooting violated CUTPA because defendants spread lies to attract audiences and sell products such as dietary supplements and survival gear. Plaintiffs asked the court to hold defendants liable for using false statements as a deceptive trade practice tied to their business interests.

The trial court ruling

The trial court sided with plaintiffs and allowed the CUTPA claim to proceed. It found that defendants’ false statements about the shooting being a hoax were tied to the sale of products advertised on their media platforms. According to the lower court, defendants’ spreading of falsehoods to increase product sales qualified as an unfair trade practice under CUTPA. The jury awarded plaintiffs substantial damages, including compensation for the CUTPA violation.

The appellate court reversal

Defendants appealed, and the appellate court reversed the trial court’s ruling on the CUTPA claim. The appellate court concluded that defendants’ defamatory statements were not directly tied to the sale of goods or services in a way that CUTPA covers. While defendants monetized their platforms, the court reasoned that the alleged lies about Sandy Hook were not themselves commercial conduct. The court ruled that the connection between the false statements and product sales was too weak to support a CUTPA violation. As a result, the appellate court directed the trial court to adjust the judgment by removing the damages associated with the CUTPA claim.

Three Reasons Why This Case Matters:

  • It’s Sensational: Anything involving Alex Jones and the Sandy Hook Massacre is attention-getting.
  • Protects Defamation Framework: By separating defamation from trade practices, the court preserved traditional tort remedies for harmful speech without expanding CUTPA.
  • Addresses Modern Media Monetization: The case underscores how courts assess financial gain from speech in an era of monetized platforms.

Lafferty v. Jones, — A.3d —, 2024 WL 5036021 (Conn. App. Ct. December 10, 2024)

 

K-Pop companies seek U.S. court’s help to unmask anonymous YouTubers

Three South Korean entertainment companies turned to a U.S. court to assist in identifying anonymous YouTube users accused of posting defamatory content. The companies sought permission to issue a subpoena under 28 U.S.C. § 1782, a law that allows U.S. courts to facilitate evidence collection for foreign legal proceedings.

Applicants alleged that the YouTube channels in question posted false claims about K-pop groups they manage, including accusations of plagiarism and deliberate masking of poor vocal performances. Applicants – who had already initiated lawsuits in South Korea – needed the subpoena to obtain identifying information from Google, the parent company of YouTube, to pursue these claims further. Google did not oppose the request but reserved the right to challenge the subpoena if served.

The court ruled in favor of applicants, granting the subpoena. It determined that the statutory requirements under § 1782 were met: Google operates within the court’s jurisdiction, the discovery was intended for use in South Korean legal proceedings, and applicants qualified as interested persons. The court also weighed discretionary factors, such as the non-involvement of Google in the South Korean lawsuits and the relevance of the requested information, finding them supportive of applicants’ request.

The court emphasized that the subpoena was narrowly tailored to identify the operators of the YouTube channels while avoiding unnecessary intrusion into unrelated data. However, it also sought to ensure procedural fairness, requiring Google to notify the affected individuals, who would then have 30 days to contest the subpoena.

Three Reasons Why This Case Matters:

  • International Legal Cooperation: The case illustrates how U.S. courts can assist in resolving international disputes involving anonymous online actors.
  • Accountability for Online Speech: It highlights the balance between free expression and accountability for potentially harmful content on digital platforms.
  • Corporate Reputation Management: The decision reflects how businesses can use legal avenues to protect their reputation across jurisdictions.

In re Ex Parte Application of HYBE Co., Ltd., Belift Lab Inc., and Source Music Co., Ltd., 2024 WL 4906495 (N.D. Cal. Nov. 27, 2024).

Section 230 protected President Trump from defamation liability


Plaintiff sued the Trump campaign, some of the President’s advisors, and several conservative media outlets, asserting claims for defamation. Plaintiff – an employee of voting systems maker Dominion – claimed defendants defamed him by asserting that he had said he was going to make sure Trump would not win the 2020 election.

The Trump campaign had argued that two retweets – one by Donald Trump and another by his son Eric – could not form the basis for liability because Section 230 shielded the two from liability. The lower court rejected the Section 230 argument. But on review, the Colorado Court of Appeals held that Section 230 immunity should apply to these retweets.

Section 230 shields users of interactive computer services from liability arising from information provided by third parties. The facts of the case showed that both President Trump and Eric Trump simply retweeted a Gateway Pundit article and a One America News Network article without adding any new defamatory content.

The court specifically rejected plaintiff’s argument that Section 230 immunity should not apply because of the Trump defendants’ knowledge that the retweeted information was defamatory. The court looked to a broader consensus of courts holding that such an idea is not woven into Section 230 immunity.

The case supports the proposition that defendants could repost verbatim content that someone else generated – even with knowledge that the content is defamatory – and not face liability.

Coomer v. Donald J. Trump for President, Inc., — P.3d —, 2024 WL 1560462 (Colo. Ct. App. April 11, 2024)

Second Circuit rules in favor of Barstool Sports in high profile online defamation case

The Second Circuit Court of Appeals has ruled in favor of Barstool Sports and certain of its employees in the longstanding defamation case brought by Michael Rapaport. The actor and comedian Rapaport and his company, Michael David Productions Inc., had appealed a lower court decision that had granted summary judgment to Barstool Sports and several of its employees, including founder David Portnoy.

Barstool Sports, a media and comedy brand established in 2004, is known for its unfiltered content across various platforms. Michael Rapaport, a prominent figure in entertainment, is similarly recognized for his candid commentary on social and political issues. The partnership between Rapaport and Barstool Sports began in 2017 but soon deteriorated, leading to a public and messy feud.

The Dispute

The conflict escalated when Rapaport had a disagreement with Barstool personality Adam Smith. This led to a series of derogatory exchanges on social media, ultimately resulting in Rapaport’s dismissal from Barstool. Portnoy publicly announced the split, citing Rapaport’s negative comments about Barstool’s fanbase. Following this, both parties continued to engage in a bitter exchange of insults online.

Lower Court Proceedings

Rapaport filed a lawsuit against Barstool, alleging defamation, among other claims. The defamation claim was based on multiple comments Barstool personalities made on various platforms. The district court, however, ruled in favor of Barstool, leading to Rapaport’s appeal.

The Appellate Court’s Decision

The appellate court reviewed the criteria under New York law for establishing defamation. The court differentiated between statements of fact and expressions of opinion, with the latter being protected and not actionable for defamation. The analysis focused on the context in which the statements were made, considering the nature of the language used and the broader setting of the dispute.

The court found that the statements made by Barstool, including accusations of racism, fraud, and other personal attacks, were part of a hyperbolic and vulgar feud, and were thus likely to be perceived as opinions rather than factual assertions. Moreover, the court noted that many statements were made on platforms where opinionated content is expected, further undermining the claim that they conveyed factual information about Rapaport.

Conclusion

The appellate court affirmed the district court’s judgment, emphasizing that the context and nature of the statements were key in determining their status as non-actionable opinions. The decision underlines the complexities of defamation claims in the digital era, where the line between fact and opinion can be blurred by the nature of the platform and the style of communication used.

This case serves as a reminder of the challenges in navigating defamation in the age of social media, where public figures often engage in heated exchanges that can have legal implications. The ruling reinforces the importance of context in evaluating such claims, setting a precedent for future defamation cases in the digital landscape.

Rapaport v. Barstool Sports Inc., 2024 WL 88636 (2d Cir. January 9, 2024)

Can a person be liable for retweeting a defamatory tweet?


Under traditional principles of defamation law, one can be liable for repeating a defamatory statement to others. Does the same principle apply, however, on social media such as Twitter, where one can easily repeat the words of others via a retweet?

Hacking, tweet, retweet, lawsuit

A high school student hacked the server hosting the local middle school’s website and modified the web page of plaintiff, a teacher, to make it appear she was seeking inappropriate relationships. Another student tweeted a picture of the modified web page, and several people retweeted that picture.

The teacher sued the retweeters for defamation and reckless infliction of emotional distress. The court dismissed the case, holding that 47 USC §230 immunized defendants from liability as “users” of an interactive computer service. Plaintiff sought review with the New Hampshire Supreme Court. On appeal, the court affirmed the dismissal.

Who is a “user” under Section 230?

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”. Importantly, the statute does not define the word “user”. The lower court held that defendant retweeters fit into the category of “user” under the statute and therefore could not be liable for their retweeting, because to impose such liability would require treating them as the publisher or speaker of information provided by another.

Looking primarily at the plain language of the statute, and guided by the 2006 California case of Barrett v. Rosenthal, the state supreme court found no basis in plaintiff’s arguments that defendants were not “users” under the statute. Plaintiff had argued that “user” should be interpreted to mean libraries, colleges, computer coffee shops and others who, “at the beginning of the internet” were primary access points for people. And she also argued that because Section 230 changed common law defamation, the statute must speak directly to immunizing individual users.

The court held that it was “evident” that Section 230 abrogated the common law of defamation as applied to individual users: “That individual users are immunized from claims of defamation for retweeting content they did not create is evident from the statutory language.”

Banaian v. Bascom, — A.3d —, 2022 WL 1482521 (N.H. May 11, 2022)


Is it defamation to accuse someone of sending a bogus DMCA takedown notice?


Esports aren’t only about 21st century video games. Apparently there is a relatively robust community of Tecmo Bowl enthusiasts who – though the game is three decades old – get together to compete in tournaments. A couple of members of that community got into it with one another online, and the spat spawned some fierce litigation. That scuffle raised the question of whether accusing someone of sending a bogus DMCA takedown notice is defamatory.

The online scuffle

Plaintiff was upset about posts defendant made in the forum of a Tecmo Bowl tournament website. One of plaintiff’s claims was that defendant had wrongfully accused plaintiff of sending bogus DMCA takedown notices to Facebook concerning a page related to a previous Tecmo Bowl tournament.

Claims of bogus DMCA takedown notices defamatory?

So plaintiff sued defendant in Texas state court for defamation, and lost. He believed that he had established a defamation claim, since defendant had – in plaintiff’s view – accused plaintiff of violating the law by abusing the DMCA process. So plaintiff sought review with the Court of Appeals of Texas. But that higher court agreed with the lower court. It was proper to dismiss the defamation case.

The court evaluated whether an objectively reasonable reader of the forum posts would draw the implication that plaintiff had committed a crime. Specifically, plaintiff had asserted that defendant accused plaintiff of committing perjury, since DMCA takedown notices have to be sworn to. See 17 U.S.C. §512(c)(3)(A)(vi).

But the court did not agree with plaintiff’s theory. It found that “the general public, or more accurately the reasonable reader, is not likely aware of what a ‘DMCA claim’ [is] or what the acronym DMCA even means.” So in this court’s view, and on these facts, accusing someone of sending a DMCA takedown notice that was bogus was not defamatory.

Hawkins v. Knobbe, 2020 WL 7693111 (Tex. Ct. App. December 28, 2020)


Need help with an online issue? Let’s talk.

About the author:

Evan Brown is an intellectual property and technology attorney in Chicago. This post originally appeared on evan.law.

Section 230 immunity protected Twitter from claims it aided and abetted defamation

Twitter enjoyed Section 230 immunity against claims that it aided and abetted defamation, because plaintiffs’ allegations on that point did not transform Twitter into a party that created or developed content.

An anonymous Twitter user posted some tweets that plaintiffs thought were defamatory. So plaintiffs sued Twitter for defamation after Twitter refused to take the tweets down. Twitter moved to dismiss the lawsuit. It argued that the Communications Decency Act (CDA) at 47 U.S.C. §230 barred the claim. The court agreed that Section 230 provided immunity to Twitter, and granted the motion to dismiss.

The court applied the Second Circuit’s test for Section 230 immunity as set out in La Liberte v. Reid, 966 F.3d 79 (2d Cir. 2020). Under this test, which parses Section 230’s language, plaintiffs’ claims failed because:

  • (1) Twitter was a provider of an interactive computer service,
  • (2) the claims were based on information provided by another information content provider, and
  • (3) the claims treated Twitter as the publisher or speaker of that information.

Twitter is a provider of an interactive computer service

The CDA defines an “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” 47 U.S.C. § 230(f)(2). The court found that Twitter is an online platform that allows multiple users to access and share the content hosted on its servers. As such, it is an interactive computer service for purposes of the CDA.

Plaintiffs’ claims were based on information provided by another information content provider

The court also found that the claims against Twitter were based on information provided by another information content provider. The CDA defines an “information content provider” as “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” 47 U.S.C. § 230(f)(3). In this case, the court found that plaintiffs’ claims were based on information created or developed by another information content provider – the unknown Twitter user who posted the alleged defamatory content. Plaintiffs did not allege that Twitter played any role in the “creation or development” of the challenged tweets.

The claim treated Twitter as the publisher or speaker of the alleged defamatory information

The court gave careful analysis to this third prong of the test. Plaintiffs alleged that Twitter had “allowed and helped” the unknown Twitter user to defame plaintiffs by hosting the user’s tweets on its platform, or by refusing to remove those tweets when plaintiffs reported them. The court found that either theory would amount to holding Twitter liable as the “publisher or speaker” of “information provided by another information content provider.” The court observed that making information public and distributing it to interested parties are quintessential acts of publishing. Plaintiffs’ theory of liability would “eviscerate” Section 230 protection because it would hold Twitter liable simply for organizing and displaying content exclusively provided by third parties.

Similarly, the court concluded that holding Twitter liable for failing to remove the tweets plaintiffs found objectionable would also hold Twitter liable based on its role as a publisher of those tweets, because deciding whether or not to remove content falls squarely within the exercise of a publisher’s traditional role and is therefore subject to the CDA’s broad immunity.

The court found that plaintiffs’ suggestion that Twitter aided and abetted defamation by arranging and displaying others’ content on its platform failed to overcome Twitter’s immunity under the CDA. In the court’s view, such a theory would be tantamount to holding Twitter responsible as the “developer” or “creator” of that content. But to impose liability on Twitter as a developer or creator of third-party content – rather than as a publisher of it – plaintiffs would have to show that Twitter directly and materially contributed to what made the content itself unlawful.

Plaintiffs in this case did not allege that Twitter contributed to the defamatory content of the tweets at issue, and thus pled no basis upon which Twitter could be held liable as the creator or developer of those tweets. Accordingly, plaintiffs’ defamation claims against Twitter also satisfied the final requirement for CDA immunity: the claims sought to hold Twitter, an interactive computer service, liable as the publisher of information provided by another information content provider. Ultimately, Twitter had Section 230 immunity for aiding and abetting defamation.

Brikman v. Twitter, Inc., 2020 WL 5594637 (E.D.N.Y., September 17, 2020)



Restraining order entered against website that encouraged contacting children of plaintiff’s employees

Plaintiff sued defendant (who was an unhappy customer of plaintiff) under the Lanham Act (for trademark infringement) and for defamation. Defendant had registered a domain name using plaintiff’s company name and had set up a website that, among other things, he used to impersonate plaintiff’s employees and provide information about employees’ family members, some of whom were minors.

Plaintiff moved for a temporary restraining order and the court granted the motion.

The Website

The website was structured and designed in a way that made it appear as though it was affiliated with plaintiff. For example, it included a copyright notice identifying plaintiff as the owner. It also included allegedly false statements about plaintiff. For example, it included the following quotation, which was attributed to plaintiff’s CEO:

Well of course we engage in bad faith tactics like delaying and denying our policy holders [sic] valid claims. How do you think me [sic], my key executive officers, and my board members stay so damn rich. [sic]

The court found that plaintiff had shown a likelihood of success on the merits of its claims.

Lanham Act Claim

It found that defendant used plaintiff’s marks for the purpose of confusing the public by creating a website that looked as though it was a part of plaintiff’s business operations. This was evidenced by, for example, the inclusion of a copyright notice on the website.

Defamation

On the defamation claim, the court found that the nature of the statements about plaintiff, plaintiff’s assertion that they were false, and the allegation that the statements were posted on the internet sufficed to satisfy the first two elements of a defamation claim, namely, that they were false and defamatory statements pertaining to the plaintiff and were unprivileged publications to a third party. The allegations in the complaint were also sufficient to indicate that defendant “negligently disregarded the falsity of the statements.”

Furthermore, the statements on the website concerned the way that plaintiff processed its insurance claims, which related to the business of the company and the profession of plaintiff’s employees who handled the processing of claims. Therefore, the final element was also satisfied.

First Amendment Limitations

The court’s limitation in the TRO is interesting to note. To the extent that plaintiff sought injunctive relief directed at defendant’s speech encouraging others to contact the company and its employees with complaints about the business, whether at the workplace or at home, or at public “ad hominem” comments, the court would not grant the emergency relief that was sought.

The court also would not prohibit defendant from publishing allegations that plaintiff had engaged in fraudulent or improper business practices, or from publishing the personally identifying information of plaintiff’s employees, officers, agents, and directors. Plaintiff’s submission failed to demonstrate to the court’s satisfaction how such injunctive relief would not unlawfully impair defendant’s First Amendment rights.

The court did, however, enjoin defendant from encouraging others to contact the children and other family members of employees about plaintiff’s business practices, because contact of that nature had the potential to cause irreparable emotional harm to those family members, who have no employment or professional relationship with defendant.

Symetra Life Ins. Co. v. Emerson, 2018 WL 6338723 (D. Maine, Dec. 4, 2018)

California Supreme Court rejects Yelp takedown injunction


The Supreme Court of California ruled that Yelp could not be ordered to remove defamatory reviews from its website because federal law immunized it from that form of relief.

Plaintiffs Dawn Hassell and the Hassell Law Group sued defendant Ava Bird for defamation, false light, and intentional infliction of emotional distress based on negative Yelp reviews that plaintiffs alleged Bird had posted about the law firm. After Bird did not appear, plaintiffs obtained a default judgment awarding damages and an injunction requiring defendant to remove the reviews, and the order also directed Yelp, which was not a defendant, to remove the reviews from its website.

Yelp’s request

Yelp asked the court to set aside or modify the default judgment so it would no longer require Yelp to remove the reviews. It argued that the order violated due process and was barred by Section 230 of the Communications Decency Act because Yelp had not been sued as a defendant and could not be treated as the publisher of third-party content.

Court’s ruling

The court ruled that Yelp was entitled to immunity under Section 230 and that the order had to be revised to delete any requirement that Yelp remove the challenged reviews or later comments by the reviewers. It therefore reversed the Court of Appeal insofar as that court had upheld the denial of Yelp’s motion.

Why the court ruled as it did

The court ruled this way because ordering Yelp to remove the reviews improperly treated it as the publisher of information provided by another content provider, which Section 230 forbids. The court reasoned that plaintiffs could not avoid that immunity by suing only defendant and then obtaining an injunction that indirectly compelled Yelp to take down third-party content, and because the statutory issue resolved the case, the court did not reach Yelp’s due process argument.

Hassell v. Bird, 5 Cal. 5th 522 (Cal., July 2, 2018)
