Oklahoma federal court keeps Paycom trademark and cybersquatting suit alive

domain name law

The United States District Court for the Western District of Oklahoma refused to dismiss Paycom's trademark and cybersquatting suit against Pay.com entities, finding that the case could proceed in Oklahoma. Plaintiff sued defendants for trademark infringement, false designation of origin, trademark dilution, cybersquatting, common-law trademark infringement, unfair competition, and violation of the Oklahoma Deceptive Trade Practices Act. Plaintiff alleged that defendants used Pay.com and Paycom-related branding in a way that confused consumers, suggested an affiliation with plaintiff, and harmed plaintiff’s PAYCOM marks.

Dismissal request

Defendants asked the court to dismiss the case for lack of personal jurisdiction and also asked it to dismiss the cybersquatting claim for failure to state a claim. They argued that their online activity did not create sufficient Oklahoma contacts and that plaintiff had not adequately pleaded the elements of an Anticybersquatting Consumer Protection Act (ACPA) claim.

Jurisdiction ruling

The court ruled that dismissal was not warranted. It held that plaintiff made a prima facie showing of specific personal jurisdiction in Oklahoma and also held that the amended complaint plausibly stated a cybersquatting claim, so defendants’ motion to dismiss was denied in full.

Why the court rejected dismissal

The court found that defendants had sent follow-up marketing emails directly to prospective Oklahoma merchants after those businesses began account applications through the pay.com website, and those contacts were enough to show purposeful direction toward Oklahoma that related to plaintiff’s alleged injuries. The court also found that defendants had not shown jurisdiction in Oklahoma would be unreasonable, and it concluded that the cybersquatting arguments turned on factual disputes and matters outside the pleadings that could not be resolved on a Rule 12(b)(6) motion.

Paycom Payroll, LLC v. Pay.com US, Inc., 2026 WL 810559 (W.D. Okla. March 24, 2026)

Supreme Court rejects contributory liability in long-running Cox case

copyright liability

Plaintiffs sued Defendant for secondary copyright infringement, alleging that Defendant, an internet service provider, was liable because it continued to provide internet access to subscribers whose accounts were associated with music piracy. Plaintiffs won a jury verdict for $1 billion, and the Fourth Circuit Court of Appeals agreed in part, concluding that Defendant could be contributorily liable for continuing to serve known infringers.

Defendant asked the U.S. Supreme Court to reverse the contributory infringement ruling, arguing that merely providing internet service to subscribers suspected of infringement does not make a service provider liable under the Copyright Act. It contended that secondary liability requires proof that Defendant induced infringement or provided a service designed for infringement, not just knowledge that some subscribers used the service unlawfully.

The Court ruled that Defendant was not contributorily liable and reversed the Fourth Circuit’s decision on that issue. It held that supplying internet access to the general public, even with knowledge that some users may infringe copyrights, is not enough by itself to establish contributory copyright infringement.

The Court ruled this way because contributory liability requires intent, which Plaintiffs could show only by proving that Defendant induced infringement or provided a service tailored to infringement. The Court concluded that Defendant did neither, since it did not encourage infringement and its internet service had substantial lawful uses. The Court also rejected Plaintiffs’ reliance on the DMCA, explaining that the statute creates safe harbors and does not itself impose liability on service providers that fail to qualify for them.

Cox Communications, Inc. v. Sony Music Entertainment, 607 U.S. ___, 2026 WL 815823 (Mar. 25, 2026).

FAFO in federal court: Hacker who bragged on Hulu documentary slammed with liability under federal law

fafo

Plaintiff sued defendant for unlawfully accessing plaintiff’s email account and publishing more than sixty private emails on social media. Defendant had repeatedly claimed credit for the hack in a Hulu documentary, on social media, and in podcast appearances. Plaintiff brought several claims in federal court, including claims under the Stored Communications Act, the Computer Fraud and Abuse Act, and invasion of privacy under Tennessee common law.

Plaintiff asked the court to enter summary judgment on liability, arguing that defendant’s own public statements confirmed every essential element of the Stored Communications Act and invasion of privacy tort claims.

The court ruled that defendant was liable under the Stored Communications Act and for public disclosure of private facts. It denied summary judgment on the Computer Fraud and Abuse Act claim because plaintiff had not presented sufficient evidence of economic loss; that issue remains open for trial.

The court ruled this way because it found that defendant gave repeated, detailed accounts of how he accessed plaintiff’s email account, changed the password, and took control. Plaintiff submitted additional evidence of having lost access to the account during the same period. The court held that this conduct met the elements of unauthorized access under the Stored Communications Act and that the publication of dozens of personal emails, including intimate messages and communications from family members, qualified as highly offensive under Tennessee law.

McKamey v. Yerace, No. 3:21-CV-00132, 2024 WL 7147987 (M.D. Tenn. January 15, 2026)

Did anti-ICE church protestors in Minnesota violate federal law?

A brazen and disruptive intrusion by anti-ICE activists during a Sunday worship service at Cities Church in St. Paul, Minnesota, has rightly drawn national outrage and sparked a federal investigation into potential violations of civil rights laws. The protesters, organized by various left-wing groups, stormed the sanctuary, chanting slogans and effectively shutting down the service. This understandably left congregants, including children, deeply shaken. The U.S. Department of Justice’s Civil Rights Division is now examining whether these actions violated the federal Freedom of Access to Clinic Entrances Act (FACE Act).

At first glance, applying the FACE Act (originally passed in the 1990s to combat violent blockades and threats at abortion clinics) might seem unexpected. But the law’s text is clear and broad. Congress deliberately extended protections to places of religious worship. It recognized that the same aggressive tactics used against medical facilities could be weaponized against houses of worship. The law exists to prevent raucous interference and other hostilities toward religious exercise.

What the FACE Act actually prohibits

Under 18 U.S.C. § 248(a)(2), it is unlawful for anyone to:

by force or threat of force or by physical obstruction, intentionally injure, intimidate or interfere with, or attempt to injure, intimidate or interfere with, any person lawfully exercising or seeking to exercise the First Amendment right of religious freedom at a place of religious worship.

This provision exists precisely because disruptions like the one in Minnesota threaten the fundamental right to peaceful worship. The activists did not simply express disagreement outside. They invaded the sanctuary mid-service, chanting demands and accusations. They turned a sacred space into a stage for their political theater.

Why the definitions are critical (and why this conduct looks troubling)

The FACE Act does not criminalize all protest or even offensive speech. It targets specific, harmful conduct with narrow definitions. Here are the key definitions for this situation:

  • To “interfere with” means restricting a person’s freedom of movement.
  • To “intimidate” means to place a person in reasonable apprehension of bodily harm to him- or herself or to another.
  • A “physical obstruction” means rendering ingress to or egress from a place of religious worship impassable, or unreasonably difficult or hazardous.

Reports and video evidence suggest the protesters crowded the sanctuary, positioned themselves in the middle of the room during the sermon, and frightened congregants. This was not peaceful picketing. It was a calculated invasion that terrified families and halted a Christian service. Given the current political climate and recent instances of violence, it seems the government should be able to prove that worshippers – who were clearly exercising a First Amendment right – were placed in reasonable apprehension that either they or their loved ones would be harmed.

Federal enforcement, civil options, and state inaction

The FACE Act allows not only criminal prosecution but also civil suits, including by state attorneys general. Minnesota’s AG could theoretically act to defend residents’ religious freedom. However, given the state’s political leadership (often sympathetic to disruptive protests), we have no reason to hold our breath awaiting enforcement from local or state officials.

Infringement case against OpenAI failed because there was no copyright registration

copyright dismissed

Thinking about suing an AI company for copyright infringement? Do not overlook the basics. Before any court will consider the merits of an infringement claim, the plaintiff needs to have an actual copyright registration in hand, not just a pending application.

That notion was confirmed in a recent unsuccessful lawsuit against OpenAI in federal court in California. Plaintiff sued OpenAI, alleging that OpenAI infringed the copyright in several artificial intelligence models and content that plaintiff claimed to have developed, and that OpenAI then destroyed evidence to conceal the alleged infringement.

Plaintiff asked the court to issue a temporary restraining order preventing defendant from deleting or altering data and documents related to the alleged infringement while the case proceeded. The court denied the request for a temporary restraining order and dismissed the complaint.

The court ruled this way because the Copyright Act bars any civil infringement action until copyright registration has been made, and courts interpret that requirement to mean the Copyright Office must have issued a registration certificate, not merely received an application. This left plaintiff in this case unable to show a likelihood of success on the merits.

Gholami v. OpenAI, Inc., No. 26-cv-00174, 2026 WL 61359 (N.D. Cal., January 8, 2026).

Does the DMCA safe harbor cover infringing images in an email?

DMCA safe harbor for notifications

Plaintiff photographer sued Pinterest for copyright infringement, alleging Pinterest displayed his and other photographers’ copyrighted images in notifications sent outside of the Pinterest website. Pinterest moved for summary judgment, arguing it was protected under the safe harbor provisions of Section 512(c) of the Digital Millennium Copyright Act (“DMCA”). The court granted Pinterest’s motion and dismissed the case.

Pinterest is a familiar and massive social media platform, where individuals upload and share image-based “Pins” that function as visual bookmarks. The platform displays Pins in personalized feeds curated by algorithms, which also contain advertisements labeled as “promoted.” Pinterest also delivers content through notifications such as emails, in-app alerts, and push notifications, which contain hyperlinks that trigger display of images hosted on its servers. One such notification that plaintiff received included his copyrighted photograph, prompting him to file suit six days later.

The court found that Pinterest’s actions fell within the DMCA’s Section 512(c) safe harbor, which shields service providers from copyright liability for content stored at the direction of users. Because Pinterest raised this as an affirmative defense, it had the burden to prove every element of the safe harbor criteria, and the court concluded it had met both the statutory threshold and all required conditions.

Statutory threshold requirements under the DMCA

To qualify for the DMCA safe harbor, Pinterest had to meet several threshold statutory requirements that are found in Sections 512(c) and (i): it had to be a service provider, maintain a designated agent, implement a repeat infringer policy, and accommodate standard technical measures. The court found that Pinterest satisfied all four. As “one of the largest social media platforms in the world,” it operated a qualifying online platform as defined by the statute. The evidence showed that Pinterest maintained a registered agent with the Copyright Office and that it enforced a strike-based policy for repeat infringers. And the court found that Pinterest did not interfere with any recognized standard technical measures that plaintiff implemented with his works. (Plaintiff had asserted that he embedded certain metadata in his photographs, but he did not argue that this metadata qualified as a “standard technical measure” under the DMCA, nor did he claim that Pinterest interfered with it — in fact, he alleged that Pinterest preserved the metadata on its servers.)

How Pinterest met the required conditions

After finding that Pinterest satisfied the DMCA’s threshold requirements, the court turned to whether Pinterest’s conduct of sending out copyright-protected images in off-platform notifications was protected under Section 512(c). To do so, Pinterest had to show three things:

  • the alleged infringement occurred due to user-directed storage;
  • Pinterest lacked actual or red flag knowledge of the infringement; and
  • Pinterest either had no right and ability to control the activity or did not receive a direct financial benefit from it.

The court evaluated each element in turn.

By reason of storage at the direction of a user

The court concluded that Pinterest met the first requirement for DMCA safe harbor protection: the alleged infringement occurred “by reason of the storage at the direction of a user.” It emphasized that the image at issue was not embedded in the notification itself but was instead hosted on Pinterest’s servers and accessed via a hyperlink contained in the notification. When a user opened the message, their software triggered a request to Pinterest’s server to retrieve and display the image, just as it would when accessing content directly through the platform. Because this method merely facilitated access to user-uploaded content without altering it, the court found the display was within the statutory definition.

No knowledge of infringement

The court found that Pinterest satisfied the second requirement for DMCA safe harbor protection by showing it lacked actual or red flag knowledge of the alleged infringement. Critically, Harrington never sent Pinterest a DMCA takedown notice or otherwise identified the allegedly infringing material before filing suit. The DMCA operates on a notice and takedown system: platforms are not required to proactively monitor user content but must respond once they receive proper notice. Because Harrington gave no such notice and offered no evidence that Pinterest otherwise knew about the specific image at issue, the court concluded there was no genuine dispute as to Pinterest’s lack of knowledge.

Control and financial benefit

The court found that Pinterest met the third and final requirement for DMCA safe harbor by showing it neither had the right and ability to control the alleged infringement nor received a financial benefit directly attributable to it. While Pinterest used algorithms to curate content and monetize its platform generally, the court held that this did not amount to the kind of “substantial influence” over user activity that would disqualify it under the DMCA. Pinterest did not direct users to upload specific content, nor did it participate in any purposeful conduct related to the display of plaintiff’s photo.

The court also rejected plaintiff’s claim that Pinterest profited directly from the infringement. Pinterest presented evidence that its notifications did not contain advertisements and that it earned no revenue specifically tied to the image in question. Plaintiff’s counter-evidence failed to show otherwise. Even if ads had appeared near the image, the law requires a direct connection between the infringing display and revenue, which was absent here. Therefore, Pinterest satisfied this final element of the DMCA safe harbor defense.

Harrington et al. v. Pinterest, Inc., No. 20-CV-5290, 2026 WL 25880 (N.D. Cal., January 5, 2026)

Ninth Circuit declines to impose broad injunction against California’s social media law for minors

social media ban

NetChoice, an internet trade association representing companies such as Google, Meta, and X, sued the State of California over its Protecting Our Kids from Social Media Addiction Act, claiming that the law violates the First Amendment. The Act restricts how social media platforms interact with minors, particularly limiting access to algorithmic feeds, requiring certain default settings, and mandating age-verification procedures.

Plaintiff asked the court to block enforcement of several provisions of the law through a preliminary injunction, focusing on its claims that aspects of the Act unlawfully restrict speech and are unconstitutionally vague. The lower court declined to issue the injunction. Plaintiff sought review in the Ninth Circuit.

On appeal, the Ninth Circuit largely affirmed the district court’s refusal to issue a broad injunction but ruled that the provision of the law requiring platforms to hide like and share counts by default for minors is unconstitutional. It reversed the lower court on that point and instructed it to modify its injunction to prevent enforcement of that specific provision.

The court ruled this way because it found the like-count requirement to be content-based and therefore subject to strict scrutiny under the First Amendment. The government failed to show that hiding like counts was the least restrictive means to achieve its goal of protecting minors’ mental health. Other provisions, including those governing private-mode settings and age verification, either survived scrutiny or were deemed unripe for review.

NetChoice LLC v. Bonta, — F.4th —, 2025 WL 2600007 (9th Cir. Sept. 9, 2025)

Claims against porn sites dismissed because of Section 230 immunity

Section 230 immunity

Plaintiffs sued several internet pornography companies after they discovered that videos secretly recorded of them while changing in a college locker room had been uploaded online.

Plaintiffs asked the court to hold the defendants liable under several theories, including civil conspiracy, negligent monitoring, and violations of the Trafficking Victims Protection Reauthorization Act (TVPRA).

The court granted summary judgment in favor of defendants.

The court held that Section 230 of the Communications Decency Act shielded the defendants from liability for user-generated content, and plaintiffs failed to show that any of the defendants materially contributed to the illegal aspects of the videos. The court also found no evidence of a conspiracy or that defendants met the requirements to be considered beneficiaries of a sex trafficking venture under the TVPRA. Claims against defendants who merely licensed trademarks or placed ads were also rejected due to lack of personal jurisdiction or insufficient evidence of wrongdoing.

Jane Does 1–9 v. Collins Murphy, et al., No. 7:20-cv-00947-DCC, 2025 WL 2533961 (D.S.C. Sept. 3, 2025).
