TikTok must face a lawsuit for recommending the viral ‘blackout challenge’

TikTok’s algorithmic recommendations on the For You Page (FYP) constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

But the appeals court said the speech at issue is TikTok’s own and sent the case back to the lower court to reconsider. It will be up to that court to determine whether TikTok can be held responsible in this particular case, in which it faces claims including strict products liability and negligence.

The ruling is particularly significant because it shows one area where courts may find the limits of Section 230 immunity. In July, the Supreme Court issued a ruling in a case known as Moody v. NetChoice, over Texas and Florida’s social media laws. In their decision, the justices provided a guide to how lower courts could determine what kinds of actions by social media platforms could be considered First Amendment-protected speech. The justices included content moderation and curation in that bucket.

But SCOTUS did not weigh in on “algorithms [that] respond solely to how users act online,” and because the Third Circuit concluded that TikTok’s algorithm does not fall into that category in this case, the judges said that its content recommendations to specific users qualify as TikTok’s “own first-party speech.” Section 230 only protects online platforms from liability over how they handle third-party speech, such as hosting their users’ posts (or choosing to remove them).

The Third Circuit’s opinion draws on Moody in its explanation of why TikTok should have to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated FYP. The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

“Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, it follows that doing so amounts to first-party speech under [Section] 230, too,” Third Circuit Judge Patty Shwartz wrote in the opinion of the court. TikTok did not immediately respond to a request for comment.

Had Nylah searched for the blackout challenge on TikTok herself, Shwartz wrote in the court’s opinion, “then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.” The judges said they reached their conclusion “specifically because TikTok’s promotion of a Blackout Challenge video on Nylah’s FYP was not contingent upon any specific user input.”

The judges said that the algorithm that determines what shows up on a user’s FYP decides which third-party speech to include in or exclude from its compilation and then organizes the videos it chooses to show. “Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own ‘expressive activity,’ … and thus its first-party speech,” the opinion says.
