A boy who was solicited and recruited for sex trafficking as a minor, and endured continued victimization after Twitter refused to remove material depicting his sexual abuse, has sued the social media company.
According to the lawsuit, the claimant, who is still a minor and is referred to as John Doe, “seeks to shine a light on how Twitter has enabled and profited from CSAM (child sexual abuse material) on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity.”
“Twitter is not a passive, inactive, intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in the dissemination and knowing promotion and distribution of this harmful material. Twitter’s own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material,” the lawsuit continues.
According to the lawsuit, Twitter refused to remove the material depicting Doe’s abuse, even after it was reported to the platform. The offending material received upwards of 167,000 views before the Department of Homeland Security forced Twitter into action.
The lawsuit states that Twitter ignored the claimant’s pleas and acted only after a mutual contact reached a DHS agent, who then contacted Twitter and requested the removal of the offending material.
“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children (“NCMEC”). This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children,” reads the lawsuit.
Doe is suing Twitter for damages for its failure to report child sexual abuse material as is mandated by law and its own terms of service agreement. The claimant alleges that Twitter “knowingly hosted sexual exploitation material, including child sex abuse material (referred to in some instances as child pornography), and allowed human trafficking and the dissemination of child sexual abuse material to continue on its platform, therefore profiting from the harmful and exploitive material and the traffic it draws.”
“Defendant has benefited financially and/or received something of value from participation in one or more sex trafficking ventures by allowing Twitter to become a safe haven and a refuge for, ‘minor-attracted people,’ human traffickers, and discussion of ‘child sexual exploitation as a phenomenon,’ to include trade and dissemination of sexual abuse material,” the lawsuit continues.
The lawsuit argues that Twitter already possesses a variety of tools to moderate content on its platform, including algorithms that limit the spread of offensive content and the ability to “shadowban” or suspend users for violating the site’s terms of service. The lawsuit claims that Twitter did not use the means at its disposal to deal with the CSAM in a timely fashion.
“Notwithstanding its stated policy, Twitter permits large amounts of human trafficking and commercial sexual exploitation material on its platform, despite having both the ability to monitor it, and actual and/or constructive knowledge of its posting on the platform,” the lawsuit alleges. “Twitter also contains significant pornographic content, including illegal child sexual abuse content. Twitter permits numerous profiles, posts, comments, and other content either advertising, soliciting, or depicting CSAM.”