The U.S. Supreme Court has agreed to hear a challenge to federal liability protections for internet and social media companies, a move that could significantly shape the future of social media.
Under existing law, social media platforms like Twitter, Facebook, Instagram, and YouTube are protected under Section 230 of the Communications Decency Act, allowing them to escape liability from user-created content hosted on their platforms.
One case, Gonzalez v. Google, will determine whether the tech giant's recommendation algorithms, which platforms use to curate and surface content rather than ban it outright, are protected from lawsuits.
The second case, Twitter v. Taamneh, will determine whether companies can be sued for aiding and abetting acts of terrorism. Tech companies are currently shielded from lawsuits relating to the extremist actions of organizations that use their platforms to organize and promote potentially illegal activities.
The San Jose Mercury News reports:
By taking up Gonzalez, the court opens up fresh risks for platforms including Google, Meta and Twitter. In that case, the court is expected to decide whether Google can cite Section 230 to avoid liability over its YouTube algorithms having recommended videos that were created by supporters of the terrorist group ISIS. An eventual ruling against Google could expose major parts of the tech giant’s business, not to mention other tech companies that use automatic recommendation engines, to new lawsuits.
In the Twitter case, the justices will review whether hosting generally pro-ISIS content — unrelated to a specific terrorist attack by the organization — may constitute “knowing” and “substantial assistance” to the group in violation of a federal anti-terrorism law, particularly in the face of company policies and efforts to block that material.
Both Republican and Democratic lawmakers have raised concerns over the liability shield, with conservatives arguing that the platforms' ability to ban users for their content inhibits their right to free speech.
Democrats, meanwhile, say that platforms aren’t doing enough to clamp down on extremism and have allowed their algorithms to push extremist content to an unsuspecting audience.
The San Francisco-based 9th U.S. Circuit Court of Appeals dismissed the Gonzalez lawsuit in 2021, in a ruling that relied largely on Section 230 of the Communications Decency Act of 1996.
Section 230, enacted before the rise of today's major social media companies, protects "interactive computer services" by ensuring they cannot be treated as the "publisher or speaker" of any information provided by other users.
Depending on how the Supreme Court rules, tech companies could lose the ability to cite Section 230 to fend off lawsuits alleging violations of the U.S. Anti-Terrorism Act.