
Justices skeptical of Tex., Fla. laws that bar platforms from deleting content

A majority of the Supreme Court seemed broadly skeptical Monday that state governments have the power to set rules for how social media platforms curate content, with both liberal and conservative justices inclined to stop Texas and Florida from immediately implementing laws that forbid removing certain controversial posts or political content.

Even as justices expressed concern about the power of social media giants that have become the dominant modern public forum, a majority of the court seemed to think the First Amendment prevents state governments from requiring platforms such as Facebook and YouTube to host certain content.

The high court’s decision in the two cases, likely to come near the end of the term in June, will have a significant impact on the operation of online platforms that are playing an increasingly important role in U.S. elections, democracy and public discussion of the major issues of the day.

The justices were reviewing a challenge from two tech industry associations, whose members include YouTube, Facebook and X, to Texas and Florida laws passed in 2021 in response to concerns from conservatives who said their voices are often censored by the editorial decisions of tech companies.

At issue for the court is whether the First Amendment protects the editorial discretion of large social media platforms or prohibits censorship of unpopular views. Social media posts have the potential to spread extremism and election disinformation, but taking down controversial views can silence discussion of important political issues.

A key question, Chief Justice John G. Roberts Jr. said during almost four hours of argument Monday, is whether the power to decide who can or who cannot speak on a particular platform belongs to the government, or to social media companies.

“The First Amendment restricts what the government can do, and what the government is doing here is saying, you must do this, you must carry these people; you’ve got to explain if you don’t,” said Roberts, a conservative. “That’s not the First Amendment.”

Liberal Justice Sonia Sotomayor also called the Florida and Texas laws problematic: “They are so broad that they stifle speech just on their face,” she said.

But many justices also seemed unpersuaded that the First Amendment protects every function of every type of digital platform. Some suggested that sections of the state laws prohibiting the removal of certain content or users could be constitutional as applied to e-commerce and communications services such as Uber and Gmail.

Justice Clarence Thomas, a critic of liability protections for social media companies, was the most skeptical of the companies’ claims that they are engaging in “editorial discretion” in making choices about content moderation, poking fun at his own longevity on the court in the process.

“I’ve been fortunate or unfortunate to have been here for most of the development of the Internet,” said the justice, who joined the court in 1991 and is its most senior member.

He said the companies’ latest assertion that they were engaged in “editorial discretion” appeared to contradict decades of claims in which they had argued against changes to a provision of the 1996 Communications Decency Act, Section 230, which immunizes the platforms from lawsuits over posts that users share on their services.

In making those arguments, Thomas said, the companies had described their services as “merely a conduit” for those making the posts. On Monday, he continued, they described themselves as engaged in “expressive conduct,” effectively taking on the role of a publisher that would traditionally be liable for the content it hosts.

But NetChoice attorney Paul Clement disputed Thomas’s characterization, focusing instead on the aspect of Section 230 that protects companies from lawsuits over their decisions to remove content from their websites.

He argued that “the whole point” of Section 230 was to allow online platforms to “essentially exercise editorial discretion” in removing harmful content without fear that it would expose them to liability as a publisher of user speech they don’t moderate. If the laws were to take effect, Clement said, platforms would be forced to carry the type of content that Congress was trying to prevent when it drafted Section 230 nearly 30 years ago.

Throughout the marathon arguments, the justices struggled to identify a specific path for resolving the challenges to the state laws. They seemed interested in suggestions from Solicitor General Elizabeth B. Prelogar, representing the Biden administration, who urged the justices to rule narrowly that the laws are unconstitutional insofar as they interfere with content-curation decisions, while leaving open for another day questions about other aspects of the laws.

The Supreme Court decided to take up the issue after two appeals courts issued conflicting rulings. In Florida, a unanimous panel of the U.S. Court of Appeals for the 11th Circuit held that the restrictions of that state’s law probably violate the First Amendment. A divided panel of the U.S. Court of Appeals for the 5th Circuit, however, upheld the Texas law that bars companies from removing posts based on political ideology.

Both opinions were written by appeals court judges nominated by President Donald Trump, underscoring how the issue of free speech online does not always break down along the usual ideological lines.

At its core, the First Amendment protects against government infringement on speech. Courts have also held that the First Amendment protects the right of private companies, including newspapers and broadcasters, to control the speech they publish and disseminate. That protection includes the right of editors not to publish something they do not want to publish.

In the 11th Circuit ruling, Judge Kevin Newsom said social media platforms are distinct from other communications services and utilities that carry data from point A to point B, and their “content-moderation decisions constitute the same sort of editorial judgments” entitled to First Amendment protections when made by a newspaper or other media outlet.

Judge Andrew Oldham of the 5th Circuit ruled the other way, saying social media companies had turned the First Amendment on its head by suggesting that a corporation has an “unenumerated right to muzzle speech” by banning users or removing certain posts.

Oldham distinguished newspapers from social media platforms, which he said were more like “common carriers” such as telephone companies.

Thomas previously made a similar analogy in a 2021 case, suggesting an openness to more regulation of digital platforms, which he said have “concentrated control of so much speech in the hands of a few private parties.”

The tech industry groups, backed by national security officials and researchers, told the Supreme Court that limiting the ability of companies to remove content could allow misinformation and other harmful content to fester online. The Biden administration also sided with the companies, saying the state laws go too far in requiring private companies to present content they deem offensive or objectionable.

“When it comes to disseminating speech, decisions about what messages to include and exclude are for private parties — not the government — to make,” Clement, the attorney for NetChoice and a former solicitor general, told the court in filings ahead of oral argument.

State government officials say regulations are needed to ensure the public has access to diverse sources of information. Florida Attorney General Ashley Moody told the court in filings that the social media companies are abusing their power over an increasingly dominant forum for public discourse used by hundreds of millions of Americans.

Unlike traditional media, the attorney general’s office said, platforms make their money not from speaking themselves, but from attracting billions of users to their platforms to speak, and therefore are more akin to so-called common carriers or utilities.

Social media platforms are a “crazy-quilt mass of material that is worlds apart from what a newspaper does when it develops its own top-down unified speech product and publishes it,” Moody’s office wrote.

“The telephone company, internet service provider, and delivery company can all be prevented from squelching or discriminating against the speech they carry. And so can the platforms.”

This is a developing story. It will be updated.

