Twitter tries to dismiss child sex trafficking lawsuit based on immunity


Twitter is asking a federal court in Florida to dismiss a lawsuit filed by a victim of child sex trafficking, who claims that the social media giant refused to immediately remove the child's explicit content from its platform.

The lawsuit, filed in January, accuses the Silicon Valley company of knowingly benefiting financially from the distribution of sexual abuse material, while not immediately responding to multiple complaints and correspondence asking it to remove the illegal and offensive content.

The content was eventually removed about nine days after the child and his family filed a complaint with Twitter. The family and child victim allege that Twitter took action only after a Department of Homeland Security agent issued a "take down request" to the company, while other methods of getting the company to remove the content did not work, according to the complaint.


"Twitter is not a passive, inactive agent in the distribution of this harmful material; Twitter has previously played an active role in the dissemination and deliberate promotion and distribution of this harmful material," the lawsuit states.

It further claims that "Twitter's proprietary policies, practices, business model, and technology architecture encourage and benefit the dissemination of sexual exploitation material."

The lawsuit also alleges that Twitter failed to fulfill its duty to report child sexual abuse material and was negligent when it took no action, allowing the content to be viewed more than 167,000 times and retweeted 2,200 times before being taken down.

"Twitter's conduct is an extreme departure from what a reasonably cautious person in the same situation would do to avoid harm to others," the lawsuit said.

In a motion filed on Wednesday, Twitter did not respond to allegations that it "refused" to remove the content immediately after the complaints were filed. Instead, it tried to characterize its actions as slowness in removing the offending content. The company defended the delay by stating that the "sheer number" of tweets posted on the platform "simply does not allow Twitter ... to immediately or accurately remove all objectionable content in all cases." The company says hundreds of millions of tweets are posted on the platform every day.

"The fact that nine days passed before the objectionable content was removed does not make Twitter liable under applicable law," Twitter said in its motion to dismiss.

The company also invokes immunity under Section 230 of the Communications Decency Act. The provision largely exempts online platforms from liability for content posted by users, although they may be held liable for content that violates sex trafficking or intellectual property laws.


"Since Twitter's alleged liability here rests on its failure to remove content from its platform, dismissal of the complaint with prejudice is justified on this ground alone," Twitter argued.

The company also argues that Congress's exception to Section 230, which allows civil liability claims against online platforms that knowingly participate in a sex trafficking venture, does not apply.

"The complaint does not come close to meeting this specific and demanding criminal law standard. It does not claim that Twitter knowingly participated in any kind of venture with the perpetrators, let alone a sex trafficking (i.e., commercial sex) venture," Twitter said.

The social media behemoth has also argued that the plaintiffs have failed to state a claim as to several of the allegations in the complaint.

According to the lawsuit, Twitter had sent the child an email on Jan. 28 stating that it had "reviewed the content and found no violation of our policies, so no action will be taken at this time." The child had first filed a complaint with Twitter on Jan. 21. The content was removed on or around Jan. 30.

"If Twitter had reviewed the material as they claimed in their response to John Doe, they would have seen the above comments, which clearly acknowledge that the material depicts minors," the lawsuit states.

The lawsuit against Twitter aims to prevent the platform from continuing to take advantage of the illegal content posted on its site, and to seek damages for the harm done to the child.

A hearing on the motion to dismiss is scheduled for June 4.

 
