People walk past an advertising billboard for YouTube on September 27, 2019 in Berlin, Germany.
Sean Gallup | Getty Images
The Justice Department has warned the Supreme Court against an overly broad interpretation of a law shielding social media companies from liability for what users post on their platforms, a position that undercuts Google's defense in a case that could reshape the role of content moderation on digital platforms.
In a brief filed Wednesday by acting Justice Department Solicitor General Brian Fletcher, the agency said the Supreme Court should overturn an appeals court ruling that found Section 230 of the Communications Decency Act protected Google from liability under US counterterrorism laws.
Section 230 allows online platforms to engage in good-faith moderation of content while preventing them from being held liable for their users' posts. Tech platforms argue that this is a critical protection, especially for smaller companies that could otherwise face costly legal battles, since the scale of social media makes it difficult to quickly catch every harmful post.
But the law has become a hot-button issue in Congress, with lawmakers on both sides of the aisle arguing that the liability shield should be drastically limited. Many Republicans believe the law's content moderation permissions should be scaled back to reduce what they say is censorship of conservative voices, while many Democrats instead object to how the law can protect platforms that host misinformation and hate speech.
The Supreme Court case, known as Gonzalez v. Google, was brought by family members of American citizen Nohemi Gonzalez, who was killed in a 2015 terrorist attack claimed by ISIS. The suit alleges that Google’s YouTube did not adequately stop ISIS from distributing content on the video-sharing site to aid its propaganda and recruitment efforts.
The plaintiffs brought claims against Google under the Anti-Terrorism Act of 1990, which allows US citizens injured by terrorism to seek compensation. The law was updated in 2016 to extend secondary civil liability to "any person who aids and abets, by knowingly providing substantial assistance," an "act of international terrorism."
Gonzalez's family has argued that YouTube did not do enough to prevent ISIS from using its platform to spread its message. They contend that while YouTube has policies against terrorist content, it failed to adequately monitor the platform or block ISIS from using it.
Both the district and appeals courts agreed that Section 230 protects Google from liability for hosting the content.
While it did not take a position on whether Google should ultimately be found liable, the DOJ recommended that the appeals court’s decision be reversed and sent back to the lower court for further review. The agency argued that while Section 230 would bar plaintiffs’ claims based on YouTube’s alleged failure to block ISIS videos from its site, “the statute does not bar claims based on YouTube’s alleged targeted recommendations of ISIS content.”
The Justice Department argued that the appeals court was correct in finding that Section 230 shielded YouTube from liability for allowing ISIS-affiliated users to post videos, because it was not acting as a publisher by editing or creating the videos. But it said allegations of "YouTube's use of algorithms and related features to recommend ISIS content require a different analysis." The Justice Department said the appeals court did not adequately address whether the plaintiffs' claims could warrant liability under that theory, and as a result, the Supreme Court should send the case back to the appeals court to do so.
“Over the years, YouTube has invested in technology, teams and policies to identify and remove extremist content,” Google spokesman Jose Castaneda said in a statement. “We regularly work with law enforcement, other platforms and civil society to share information and best practices. Undermining Section 230 would make it harder, not easier, to fight harmful content — making the Internet less safe and less useful for all of us.”
The Chamber of Progress, an industry group that counts Google as one of its corporate partners, warned that the DOJ brief sets a dangerous precedent.
“The attorney general’s position would impede the platforms’ ability to recommend facts instead of lies, help instead of hurt, and empathize instead of hate,” Chamber of Progress CEO Adam Kovacevic said in a statement. “If the Supreme Court rules in Gonzalez, platforms will be unable to recommend help for those considering self-harm, reproductive health information for women considering abortion, and accurate election information for people who want to vote. This would unleash a flood of lawsuits from trolls and haters unhappy with the platforms’ efforts to create safe, healthy online communities.”
