
How Civil Sex Trafficking Claims Impact the Adult Entertainment Industry

By: Lawrence G. Walters | May 17, 2023

Introduction: Civil Claims and Their Impact on the Adult Entertainment Industry

Numerous online platforms, including Kik, Twitter, Reddit, Craigslist, and MindGeek, have been sued by civil claimants seeking to hold them responsible for sex trafficking activities committed by their users. These lawsuits have significant consequences for adult industry website operators, billing processors, and performers. The legal standard to which online platforms are held in these cases may ultimately be decided by the U.S. Supreme Court. This article explores the types of allegations made against platform operators and the changes we are seeing in the adult industry as a result.

Understanding the Role of Online Platforms in Sex Trafficking Lawsuits

Why would a large online platform ever be involved with sex trafficking? Naturally, any association with such nefarious activities is bad for business, terrible for public relations, and inconsistent with basic human dignity. In reality, however, sex trafficking statutes are intentionally broad and seek to punish anyone who benefits from trafficking or participates in a sex trafficking venture – regardless of the level of involvement. Online platforms have been named as defendants in civil sex trafficking claims based on their age verification procedures, content moderation policies, takedown responses, and general business practices. These lawsuits have been filed even without any active involvement by the platform in the alleged sex trafficking activities, or any actual knowledge of those activities. Understanding how this can happen starts with a review of the underlying criminal laws.

Exploring the Legal Background: Key Factors and Statutes Involved in Sex Trafficking Cases

As with many criminal offenses, culpability as part of a criminal venture turns on whether the defendant had “knowledge” of the illegal activities. The federal prohibition on sex trafficking, 18 U.S.C. § 1591, prohibits anyone from “knowingly” recruiting, enticing, harboring, transporting, providing, obtaining, advertising, maintaining, patronizing, or soliciting a person, through force, fraud, or coercion, to engage in a commercial sex act. Further, the statute punishes anyone who benefits financially, or receives anything of value, from participating in a venture that has engaged in the activity described above. Id. The defendant must have acted with knowledge (or in reckless disregard) of the fact that force, fraud, or coercion was used to procure a commercial sex act from the victim. Bottom line: anybody remotely connected with sex trafficking can be brought into a sex trafficking case. This is particularly true given the existence of accomplice liability laws that prohibit conspiring to commit, or aiding and abetting, any substantive offense. Aiding and abetting only requires substantially assisting another in violating the law, while conspiracy simply requires an agreement to violate the law (plus an overt act in furtherance) without the need to prove any completed crime.

Expansion of Sex Trafficking Cases: From Physical Spaces to Online Platforms

Sex trafficking cases have been brought against businesses that are typically far removed from any such illegal activities, such as hotels or truck stops. Until FOSTA/SESTA (“FOSTA”) was passed in April 2018, online platforms were protected from civil sex trafficking claims by Section 230 immunity. However, FOSTA changed all that. Now, online platforms can be held liable in civil court “if the conduct underlying the claim constitutes a violation of section 1591” [the criminal sex trafficking statute]. This awkward wording, allowing for civil liability only if the claim constitutes a criminal violation, has led to conflicting interpretations in the courts as to when and how online platforms can be successfully sued for sex trafficking violations.

Shifting from Actual to Constructive Knowledge: The New Challenge for Online Platforms

In civil courts, defendants are often held responsible for their negligent actions even if they did not intend to cause harm. Failing to clean up a spill in a retail store, which results in a slip and fall by a customer, is an example of such liability. The store owner did not intend for anyone to be hurt but can be held financially liable to the injured party nonetheless. So, when it comes to sex trafficking violations, civil claims have been asserted where a defendant “should have known” of the illegal activity, even in the absence of actual knowledge. This illustrates the legal principle of “constructive knowledge.” Defendants are deemed to know the things they would have known upon reasonable inquiry. They can also be found liable if they turn a blind eye to unlawful activity. However, when it comes to online platforms that disseminate speech, there are legitimate First Amendment issues implicated by holding a platform operator responsible for things it arguably should have known but did not, in fact, know. Imposing such a broad standard of liability on platforms can have a “chilling effect” on speech, causing the platform to moderate or censor broad swaths of user speech in an effort to avoid potentially being held liable for activities that a civil claimant believes the platform should have known about.

Civil Claims and FOSTA: The Fine Balance between Knowledge Standards

Potentially in recognition of this concern, FOSTA requires that civil claimants demonstrate a criminal violation of sex trafficking laws before liability will attach to an online platform. However, the courts have been divided on whether the actual knowledge standard or the constructive knowledge standard prevails when suing a platform for sex trafficking violations.

Allegations Against Online Platforms: From Age Verification to Content Moderation

Plaintiffs have filed lawsuits against numerous online platforms based on allegations that they failed to promptly take down content in response to abuse notices, failed to impose sufficient age verification procedures on content uploaders, failed to aggressively monitor content for illegal activities, or failed to pay sufficient attention to allegations of unlawful conduct by their users. Notably missing from many of these cases is any allegation of actual criminal conduct by the platform operator itself. It would be highly unlikely for a large platform operator to knowingly participate in a sex trafficking venture through its own actions. Therefore, civil claimants have sought to rely on the concept of “constructive knowledge” in an attempt to attach liability for the unlawful conduct of third-party users.

“Constructive Knowledge”: A Pandora’s Box for Online Platforms

Any effort to determine whether a platform “should have known” that its users were involved in sex trafficking opens a Pandora’s Box of potential indicators of such knowledge. Did the platform miss one email, out of many thousands of routine support messages, demanding takedown of alleged recordings of forced commercial sex acts? Did the platform fail to implement robust age verification procedures, using the latest technology, before allowing a creator to upload content? A wide variety of operational decisions could be called into question under a constructive knowledge standard. Attempting to comply with that standard becomes a never-ending undertaking of enhancing legal compliance efforts, moderating content, and verifying users. The inclusion of online service providers, such as hosts and billing companies, as defendants in these cases can exacerbate the compliance burdens, as platforms try to comply with rules and guidelines imposed by their service providers in order to stay in business. The impact is ultimately borne by adult content creators, who face increasing barriers to monetizing their content.

A Ray of Hope: Ninth Circuit’s Decision in Does v. Reddit

However, a recent decision from the Ninth Circuit Court of Appeals offers some logic to the legal analysis, and a ray of hope to both platforms and content creators. On October 24, 2022, the Ninth Circuit decided Does v. Reddit, which addressed, head on, the level of knowledge required to assert a valid sex trafficking claim against an online platform. In Reddit, the platform was sued, on a class action basis, for allegedly allowing explicit images of minors to be posted on various “subreddit” accounts and failing to prevent the reposting of the materials after removal. Further, the plaintiffs alleged that Reddit permitted the labeling of accounts with terms that suggested underage content, such as: /r/BestofYoungNSFW, /r/teensdirtie, /r/TeenBeauties, and /r/YoungGirlsGoneWildd. Finally, they claimed that Reddit received advertising revenues from these channels while failing to track offending IP addresses and delaying implementation of potentially available content moderation tools such as PhotoDNA. The Ninth Circuit nonetheless held that, to overcome Section 230 immunity under FOSTA, a plaintiff must plausibly allege that the platform itself knowingly benefited from knowingly facilitating sex trafficking (actual knowledge and participation, not mere constructive knowledge), and it affirmed dismissal of the claims against Reddit.

Implications for the Adult Industry: Precautions and Moving Forward

While the Ninth Circuit’s decision in Does v. Reddit is not binding on other federal courts, it is highly influential and has the potential to provide much-needed clarification to online platforms, service providers, and content creators alike. However, several courts have yet to rule on this issue, and the final word will likely come from the U.S. Supreme Court in the next few years. In the meantime, platforms should ensure they have policies in place to promptly remove any content associated with unlawful activities, report such activities to the appropriate law enforcement agencies, and cooperate with those agencies when requested.

This ever-evolving landscape underlines the need for adult industry operators to remain vigilant in their compliance efforts and to work with legal counsel familiar with the latest developments in this challenging area of the law. The balance between freedom of speech, online content moderation, and the prevention of illegal activities is a delicate one, and platforms must tread carefully to avoid legal pitfalls while providing a safe and secure environment for their users.

Conclusion

The recent rise in civil sex trafficking claims against online platforms presents both challenges and opportunities for the adult entertainment industry. While the lawsuits have led to increased scrutiny and compliance costs, they have also fostered a dialogue around necessary reform in platform operation policies, age verification procedures, and content moderation. The Ninth Circuit’s ruling in Does v. Reddit, coupled with potential future rulings from other federal courts or the U.S. Supreme Court, will continue to shape the way the adult industry operates and responds to these pressing issues. Ultimately, the industry’s collective response to these challenges can lead to a safer and more responsible internet space, providing protection to vulnerable individuals while maintaining the freedom of expression that is at the heart of the internet’s appeal.
