School district has to stop filtering web content

PFLAG v. Camdenton R–III School Dist., 2012 WL 510877 (W.D. Mo. Feb. 16, 2012)

Several website publishers that provide supportive resources directed at lesbian, gay, bisexual, and transgender (LGBT) youth filed a First Amendment lawsuit against a school district over the district’s use of internet filtering software. Plaintiffs asked the court for an injunction against the district’s alleged practice of preventing students’ access to websites that expressed a positive viewpoint toward LGBT individuals.

The court granted a preliminary injunction. It found that by using URL Blacklist software, the district (despite its assertions to the contrary) engaged in intentional viewpoint discrimination, in violation of the website publishers’ First Amendment rights. The URL Blacklist software, which relied in large part on dmoz.org, classified positive materials about LGBT issues under the software’s “sexuality” filter, while placing LGBT-negative materials under “religion,” a category that was not blocked.
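The skew the court identified can be sketched as a simple category-based filter. This is a hypothetical illustration only: the category names come from the opinion, but the domains, data structures, and function names below are invented and do not reflect URL Blacklist’s actual format.

```python
# Hypothetical sketch of category-based URL filtering, loosely modeled on
# the behavior the court described. All domains are invented examples.

CATEGORY_LISTS = {
    "sexuality": {"lgbt-support.example.org"},  # supportive LGBT resources
    "religion": {"anti-lgbt.example.org"},      # LGBT-negative materials
}

# Only the "sexuality" category was on the block list; "religion" was not.
BLOCKED_CATEGORIES = {"sexuality"}

def is_blocked(domain: str) -> bool:
    """Return True if the domain falls within any blocked category."""
    return any(domain in CATEGORY_LISTS[cat] for cat in BLOCKED_CATEGORIES)

# The asymmetry the court found: one viewpoint is filtered, the other is not.
print(is_blocked("lgbt-support.example.org"))  # True
print(is_blocked("anti-lgbt.example.org"))     # False
```

The point of the sketch is that the discrimination lives in the category assignments and the choice of which categories to block, not in any per-site judgment about content.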

The court also found that the plaintiffs had a fair chance of success on the merits of their First Amendment claims. The school district claimed it was simply trying to comply with a federal law requiring the blocking of content harmful to minors, but the court found that the district’s chosen method of filtering was not narrowly tailored to serve that interest.

One may wonder whether Section 230 of the Communications Decency Act could have protected the school district in this lawsuit. After all, 47 U.S.C. 230(c)(2)(A) provides that:

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected. . . . (Emphasis added.)

Section 230 would probably not have been much help, because the plaintiffs were seeking injunctive relief, not money damages. An old case called Mainstream Loudoun v. Bd. of Trustees of Loudoun, 24 F. Supp. 2d 552 (E.D. Va. 1998) tells us that:

[Section] 230 provides immunity from actions for damages; it does not, however, immunize [a] defendant from an action for declaratory and injunctive relief. . . . If Congress had intended the statute to insulate Internet providers from both liability and declaratory and injunctive relief, it would have said so.

One could understand the undesirability of applying Section 230 to protect filtering of this sort even without the Mainstream Loudoun holding. If Section 230 completely immunized government-operated interactive computer service providers, allowing them to engage freely in viewpoint-based filtering, free speech would suffer in obvious ways. And reading the statute that broadly would expose Section 230 to a severe risk of being held unconstitutional as applied.

Video: This Week in Law Episode 150

Had a great time hosting This Week in Law Episode 150, which we recorded on February 24. (Thanks to Denise Howell for handing over the hosting reins while she was off for the week.) It was a really fun conversation with three very smart panelists — Mike Godwin, Greg Sergienko and Jonathan Frieden. We talked about copyright and free speech, encryption and the Fifth Amendment, and the state of internet privacy.

If you’re not a regular listener or viewer of This Week in Law, I hope you’ll add it to your media diet. I’m on just about every week (sometimes I’m even referred to as a co-host of the show). We record Fridays at 1pm Central (that’s 11am Pacific, 2pm Eastern). The live stream is at http://live.twit.tv and the page with all the past episodes and various subscription options is http://twit.tv/twil.

No restraining order against uncle posting family photos on Facebook

Court refuses to consider common law invasion of privacy tort to support restraining order under Minnesota statute.

Olson v. LaBrie, 2012 WL 426585 (Minn. App. February 13, 2012)

Appellant sought a restraining order against his uncle, saying that his uncle engaged in harassment by posting family photos of appellant (including one of him in front of a Christmas tree) and mean commentary on Facebook. The trial court denied the restraining order. Appellant sought review with the state appellate court. On appeal, the court affirmed the denial of the restraining order.

The court found that the photos and the commentary were mean and disrespectful, but that they could not form the basis for a finding of harassment. Whether harassment occurred, the court held, depended only on a reading of the statute (which provides, among other things, that a restraining order is appropriate to guard against “substantial adverse effects” on the privacy of another). It was not appropriate, the court said, to look to tort law on privacy to determine whether the statute called for a restraining order.

Teacher fired over Facebook post gets her job back

Court invokes notion of “contextual integrity” to evaluate social media user’s online behavior.

Rubino v. City of New York, 2012 WL 373101 (N.Y. Sup. February 1, 2012)

The day after a student drowned at the beach while on a field trip, a fifth grade teacher updated her Facebook status to say:

After today, I am thinking the beach sounds like a wonderful idea for my 5th graders! I HATE THEIR GUTS! They are the devils (sic) spawn!

Three days later, she thought better of it and deleted the post. But the school had already found out about it and fired her. After exhausting the administrative channels, the teacher went to court to challenge her termination.

The court agreed that getting fired was too stiff a penalty. It found that the termination was so disproportionate to the offense, in the light of all the circumstances, that it was “shocking to one’s sense of fairness.” The teacher had an unblemished record before this incident, and what’s more, she posted the content outside of school and after school hours. And there was no evidence it affected her ability to teach.

But the court said some things about the teacher’s use of social media that were even more interesting. It drew on a notion of what scholars have called “contextual integrity” to evaluate the teacher’s online behavior:

[E]ven though petitioner should have known that her postings could become public more easily than if she had uttered them during a telephone call or over dinner, given the illusion that Facebook postings reach only Facebook friends and the fleeting nature of social media, her expectation that only her friends, all of whom are adults, would see the postings is not only apparent, but reasonable.

So while the court found the teacher’s online comments to be “repulsive,” having her lose her job over them went too far.

Six interesting technology law issues raised in the Facebook IPO

Patent trolls, open source, do not track, SOPA, PIPA and much, much more: Facebook’s IPO filing has a real zoo of issues.

The securities laws require that companies going public identify risk factors that could adversely affect the company’s stock. Facebook’s S-1 filing, which it sent to the SEC today, identified almost 40 such factors. A number of these risks are examples of technology law issues that almost any internet company would face, particularly companies whose product is the users.

(1) Advertising regulation. In providing detail about the nature of this risk, Facebook mentions “adverse legal developments relating to advertising, including legislative and regulatory developments” and “the impact of new technologies that could block or obscure the display of our ads and other commercial content.” Facebook is likely concerned about the various technological and legal restrictions on online behavioral advertising, whether in the form of mandatory opportunities for users to opt out of data collection or the more aggressive “do not track” idea. The value of the advertising is of course tied to its effectiveness, and any technological, regulatory or legislative measure to enhance user privacy is a risk to Facebook’s revenue.

(2) Data security. No one knows exactly how much information Facebook has about its users. Not only does it have all the content uploaded by its 845 million users, it has the information that could be gleaned from the staggering 100 billion friendships among those users. A data breach puts Facebook at risk of a PR backlash, regulatory investigations from the FTC, and civil liability to its users for negligence and other causes of action. But Facebook would not be left without remedy, having in its arsenal civil actions under the Computer Fraud and Abuse Act and the Stored Communications Act (among other laws) against the perpetrators. It is also likely the federal government would step in to enforce the criminal provisions of these acts as well.

(3) Changing laws. The section of the S-1 discussing this risk factor provides a laundry list of the various issues that online businesses face. Among them: user privacy, rights of publicity, data protection, intellectual property, electronic contracts, competition, protection of minors, consumer protection, taxation, and online payment services. Facebook is understandably concerned that changes to any of these areas of the law, anywhere in the world, could make doing business more expensive or, even worse, make parts of the service unlawful. Though not mentioned by name here, SOPA, PIPA, and do-not-track legislation are clearly on Facebook’s mind when it notes that “there have been a number of recent legislative proposals in the United States . . . that would impose new obligations in areas such as privacy and liability for copyright infringement by third parties.”

(4) Intellectual property protection. The company begins its discussion of this risk with a few obvious observations, namely, how the company may be adversely affected if it is unable to secure trademark, copyright or patent registration for its various intellectual property assets. Later in the disclosure, though, Facebook says some really interesting things about open source:

As a result of our open source contributions and the use of open source in our products, we may license or be required to license innovations that turn out to be material to our business and may also be exposed to increased litigation risk. If the protection of our proprietary rights is inadequate to prevent unauthorized use or appropriation by third parties, the value of our brand and other intangible assets may be diminished and competitors may be able to more effectively mimic our service and methods of operations.

(5) Patent troll lawsuits. Facebook notes that internet and technology companies “frequently enter into litigation based on allegations of infringement, misappropriation, or other violations of intellectual property or other rights.” But it goes on to give special attention to those “non-practicing entities” (read: patent trolls) “that own patents and other intellectual property rights,” which “often attempt to aggressively assert their rights in order to extract value from technology companies.” Facebook believes that as its profile continues to rise, especially in the glory of its IPO, it will increasingly become the target of patent trolls. For now it does not seem worried: “[W]e do not believe that the final outcome of intellectual property claims that we currently face will have a material adverse effect on our business.” Instead, those suits are a drain on resources: “[D]efending patent and other intellectual property claims is costly and can impose a significant burden on management and employees….” And there is also the risk that these lawsuits might turn out badly, and Facebook would have to pay judgments, get licenses, or develop workarounds.

(6) Tort liability for user-generated content. Facebook acknowledges that it faces, and will face, claims relating to information that is published or made available on the site by its users, including claims concerning defamation, intellectual property rights, rights of publicity and privacy, and personal injury torts. Though it does not specifically mention the robust immunity from liability over third party content provided by 47 U.S.C. 230, Facebook indicates a certain confidence in the protections afforded by U.S. law from tort liability. It is the international scene that gives Facebook concern here: “This risk is enhanced in certain jurisdictions outside the United States where our protection from liability for third-party actions may be unclear and where we may be less protected under local laws than we are in the United States.”

You have to hand it to the teams of professionals who have put together Facebook’s IPO filing. I suppose the billions of dollars at stake can serve as a motivation for thoroughness. In any event, the well-articulated discussion of these risks in the S-1 is an interesting read, and can serve to guide the many lesser-valued companies out there.

On the radio: Mobile devices and the Fourth Amendment

I was honored to be a guest on this morning’s episode of Oregon Public Broadcasting’s show Think Out Loud, talking with host Dave Miller about the recent case of Schlossberg v. Solesbee.


We talked about the Fourth Amendment and, more specifically, the exceptions to the warrant requirement for searches made incident to lawful arrests. Some courts have given special treatment to mobile devices when considering whether the information contained on them may be searched without a warrant, because of the vast amounts of personal information they contain.

Fair use, the DMCA, and presidential politics

The 2012 presidential election cycle is already giving internet law enthusiasts things to talk about. Last week it was Ron Paul’s grumblings about an unauthorized campaign ad on YouTube. Now NBC is moaning about a Mitt Romney ad comprised almost entirely of Tom Brokaw on the Nightly News in 1997.

NBC has asked that the ad be pulled, claiming it is a copyright infringement. Smart people are already saying the ad is fair use. It probably is.

And NBC knows that. Romney’s campaign posted the ad on YouTube five days ago, and it has yet to be the subject of a DMCA takedown notice. Though such a notice would be easy to draft and send, NBC is aware that the fallout could be expensive. Section 512(f) of the DMCA penalizes the senders of bogus takedown notices, and the courts have not taken kindly to purported victims of infringement who fail to fully consider fair use before having content taken off YouTube.

With the election still months away, we may yet see the news media take controversial action to disable political content, as we did in 2008. These situations underscore the problem posed by how long it takes to process DMCA counternotifications and Section 512(f) actions.

A candidate’s defeat makes these processes moot. So maybe we should hope for a longer Republican primary season just so we can see some good DMCA and fair use litigation. Come on, NBC, send that takedown notice!

Are nonpirate Megaupload users entitled to compensation from the government?

If I left my coat in a taxi that was later impounded because, unknown to me, the driver was transporting heroin in the trunk, would I be left out in the cold?

People who used Megaupload to lawfully store and transfer files are rightfully upset that their stuff is unavailable after last week’s raid. Some groups in other countries say they are going to sue the U.S. government. Would a lawsuit like that get anywhere in a U.S. court?

The Fifth Amendment — best known for its privilege against self-incrimination — says that “private property [shall not] be taken for public use, without just compensation.” (You can impress your legally-trained friends at parties by confidently and casually referring to the Takings Clause.) Does the Takings Clause give innocent Megaupload users a right to be paid the value of the files they are being deprived of while the feds use the servers on which those files are stored to prove their case against Kim Dotcom and company?

Back in 2008, Ilya Somin and Orin Kerr had a conversation on the Volokh Conspiracy discussing this question of whether the Fifth Amendment protects innocent third parties who lose property in a criminal investigation. If you read that commentary you will see that a case over the Megaupload takedown might be tough for a number of esoteric reasons, not the least of which is Supreme Court precedent.

There are some face-value problems with a case like this as well. Has the government taken the property for a “public use”? One could argue that the reason the servers (including the innocent content) were seized was for the so-called public good of going after piracy. But then the innocent content is not being “used” in connection with the prosecution — it just happens to be there.

I do not pretend to know the answers to this inquiry, and I’m relying on sharper constitutional minds than mine to leave some good comments. (If you know Ilya Somin or Orin Kerr, send them a link to this post!) All I know is that it does not seem fair that users of the cloud should so easily be deprived in the name of law enforcement.


There is no “generalized right to rummage” through an adversary’s Facebook account

Tompkins v. Detroit Metro. Airport, 2012 WL 179320 (E.D. Mich. January 18, 2012)

Plaintiff filed a personal injury lawsuit against defendants claiming she was impaired in her ability to work and enjoy life. One of the defendants filed a motion with the court asking it to order plaintiff to authorize access to her entire Facebook account. The court denied the motion. Finding that defendant had not made a “sufficient predicate” showing that the sought-after information was relevant, and that the request was overly broad, the court held that defendant “[did] not have a generalized right to rummage at will through information that [plaintiff had] limited from public view.”

The court distinguished two other well-known social media discovery cases, Romano v. Steelcase and McMillen v. Hummingbird Speedway. In those cases, the Facebook users had posted photos of themselves engaged in activities that were inconsistent with their claimed injuries (e.g., going fishing and traveling to Florida). The publicly visible photos that plaintiff in this case posted, which defendant argued made the rest of her account relevant, were of her holding a 2-pound dog, and standing with friends at a birthday party. “If [her] public Facebook page contained pictures of her playing golf or riding horseback,” the court noted, “[defendant] might have a stronger argument for delving into the nonpublic section of her account.”

The court made clear that its decision did not address the question of whether a Facebook user has a reasonable expectation of privacy in so-called private pages. (And there’s nothing in the decision to suggest that inquiry should be answered in the affirmative.) The court also noted that it was not answering the question of whether one could challenge a subpoena to Facebook under the Stored Communications Act (18 U.S.C. 2701 et seq.) as contemplated by Crispin v. Christian Audigier, 717 F. Supp. 2d 965 (C.D. Cal. 2010).

Other coverage from Eric B. Meyer.

Ron Paul not allowed to find out who posted mean video about Jon Huntsman on YouTube

Ron Paul 2012 Presidential Campaign Committee, Inc. v. Does, 12-00240 (N.D. Cal. January 25, 2012)

(Hat tip to Venkat for posting a link to this decision.)

Ron Paul’s campaign — Ron Paul 2012 Presidential Campaign Committee, Inc. — sued some John Doe defendants in federal court over an offensive video attacking his then-opponent Jon Huntsman (who has since left the race). The video demonstrated a gross insensitivity toward Chinese culture, and was posted to YouTube and promoted on Twitter by a user calling himself NHLiberty4Paul.

Since the campaign did not know the true identity of the John Doe defendants, it asked the court for leave to take “expedited discovery” so that it could serve subpoenas on YouTube and Twitter. (The Federal Rules of Civil Procedure do not allow early discovery like this unless the court specifically permits it.)

The court denied the campaign’s motion seeking early discovery. It held that the campaign failed to show the required “good cause” for expedited discovery set forth in Columbia Ins. Co. v. Seescandy.com, 185 F.R.D. 573, 577 (N.D. Cal. 1999).

Under the Seescandy.com standard, in determining whether there is good cause to allow expedited discovery to identify anonymous internet users named as Doe defendants, courts consider whether:

  • (1) the plaintiff can identify the missing party with sufficient specificity such that the court can determine that defendant is a real person or entity who could be sued in federal court;
  • (2) the plaintiff has identified all previous steps taken to locate the elusive defendant;
  • (3) the plaintiff’s suit against defendant could withstand a motion to dismiss; and
  • (4) the plaintiff has demonstrated that there is a reasonable likelihood of being able to identify the defendant through discovery such that service of process would be possible.

The court found that the campaign failed to address these required issues. One is left to wonder whether enough of Paul’s campaign remains to make it worthwhile to try again.