A boy was solicited and recruited for sex trafficking as a minor. After escaping that nightmare, he discovered that child sexual abuse material depicting him was being promoted on Twitter. He sued the tech company.
The lawsuit says, “This lawsuit seeks to shine a light on how Twitter has enabled and profited from CSAM (child sexual abuse material) on its platform, choosing profits over people, money over the safety of children, and wealth at the expense of human freedom and human dignity.”
It continued, “Twitter is not a passive, inactive, intermediary in the distribution of this harmful material; rather, Twitter has adopted an active role in the dissemination and knowing promotion and distribution of this harmful material. Twitter’s own policies, practices, business model, and technology architecture encourage and profit from the distribution of sexual exploitation material.”
The boy is suing Twitter for damages “under the federal Trafficking Victims’ Protection Reauthorization Act, Failure to Report Child Sexual Abuse Material, Receipt and Distribution of Child Pornography, and related state law.” He claims that Twitter “knowingly hosted sexual exploitation material, including child sex abuse material (referred to in some instances as child pornography), and allowed human trafficking and the dissemination of child sexual abuse material to continue on its platform, therefore profiting from the harmful and exploitive material and the traffic it draws.”
“Defendant has benefited financially and/or received something of value from participation in one or more sex trafficking ventures by allowing Twitter to become a safe haven and a refuge for, ‘minor attracted people,’ human traffickers, and discussion of ‘child sexual exploitation as a phenomenon,’ to include trade and dissemination of sexual abuse material,” the lawsuit says.
The lawsuit explains that “Twitter has a variety of mechanisms used to moderate content on the platform. Upon information and belief, Twitter uses software and algorithms to ensure tweets reach a smaller audience, block users from tweeting, hide tweets from users in a specific country, hide user profiles, convert users into read-only mode, temporarily lock users out of their account until account verification, and permanently suspend accounts. Rather than act decisively by banning certain types of behavior and allowing others, Twitter’s policy and engineering teams sometimes de-emphasize content and allow users to hide content that may be offensive but not explicitly against the platform’s terms of service.”
Ironically, Twitter has a “zero-tolerance child sexual exploitation policy,” which the lawsuit highlights. The policy includes the following:
“Any content that depicts or promotes child sexual exploitation including, but not limited to:
- visual depictions of a child engaging in sexually explicit or sexually suggestive acts;
- illustrated, computer-generated or other forms of realistic depictions of a human child in a sexually explicit context, or engaging in sexually explicit acts;
- sexualized commentaries about or directed at a known or unknown minor; and
- links to third-party sites that host child sexual exploitation material.
The following behaviors are also not permitted:
- sharing fantasies about or promoting engagement in child sexual exploitation;
- expressing a desire to obtain materials that feature child sexual exploitation;
- recruiting, advertising or expressing an interest in a commercial sex act involving a child, or in harboring and/or transporting a child for sexual purposes;
- sending sexually explicit media to a child;
- engaging or trying to engage a child in a sexually explicit conversation;
- trying to obtain sexually explicit media from a child or trying to engage a child in sexual activity through blackmail or other incentives;
- identifying alleged victims of childhood sexual exploitation by name or image; and
- promoting or normalizing sexual attraction to minors as a form of identity or sexual orientation.”