Supreme Court Deliberates on Social Media Platforms' First Amendment Rights: A Critical Legal Battle

Misinformation is spreading on social media as some fight to stop what they call censorship. The Supreme Court is now grappling with how the First Amendment applies to the online world.

25 Mar 2024, 03:04 AM

Big Tech and Supreme Court

As big tech firms wrestle with how to keep false and harmful information off their social networks, the Supreme Court is wrestling with whether platforms like Facebook and Twitter, now called X, have the right to decide what users can say on their sites. 

The dispute centers on a pair of laws passed in the red states of Florida and Texas over the question of First Amendment rights on the internet. The Supreme Court is considering whether the platforms are like newspapers, which have free speech rights to make their own editorial decisions, or more like telephone companies, which merely transmit everyone's speech.

If the laws are upheld, the platforms could be forced to carry hate speech and false medical information, the very content most big tech companies have spent years trying to remove through teams of content moderators. Conservatives, however, claim that in the process the companies have engaged in a conspiracy to suppress their speech.

One example: a 2022 tweet from Congresswoman Marjorie Taylor Greene falsely claiming that there were…
"Extremely high amounts of COVID vaccine deaths."

Twitter eventually banned Greene's personal account for "multiple violations" of its COVID policy.

Facebook and YouTube have also removed or labeled posts they considered "misinformation."

Critics, particularly conservatives like Congressman Jim Jordan, have accused social media companies of censoring their viewpoints. Partly in response, and partly for financial reasons, the platforms have begun shrinking their fact-checking teams.

As a consequence, social media today is flooded with misinformation. For instance, posts are circulating that suggest tanks are crossing the Texas-Mexico border, when in reality the footage is from Chile.

Furthermore, there has been a rise in AI-generated images spreading misinformation, creating confusion among users.

As social media moderation teams dwindle in size, a new group under scrutiny is academic researchers focusing on misinformation. These researchers had collaborated closely with the platforms following the discovery of Russian interference in the 2016 election.

During an interview, Lesley Stahl raised the question of whether researchers are now facing obstacles in their work.

Kate Starbird, a professor at the University of Washington and a former professional basketball player, leads a misinformation research group established in preparation for the 2020 election.

Starbird explained that their focus was on identifying misinformation related to election processes, procedures, and results. They would alert the platforms if they came across content that violated their policies, such as a tweet from November 2020 claiming election software in Michigan had manipulated votes from Trump to Biden.

Twitter added a warning label to the flagged tweet after researchers raised concerns about its credibility.

In an interview, Starbird revealed that she had received a death threat as a result of her work. She explained that some threats are meant to intimidate and create fear, making it difficult to distinguish genuine concerns from mere attempts to sow doubt.

Starbird also highlighted the campaign to discredit researchers like her, aimed at undermining their credibility and casting doubt on their findings.

When asked about the spread of misinformation, Starbird confirmed that research indicated a higher prevalence of false information among supporters of Donald Trump and conservatives, particularly during the 2020 election and the events of January 6th.

She pointed out that individuals who stormed the Capitol were influenced by false claims and misinformation, emphasizing the role of such narratives in inciting their actions.

The House Judiciary Committee, chaired by Ohio Republican Congressman Jim Jordan, plays a key role in addressing such issues.

Interview with Rep. Jim Jordan on Misinformation and First Amendment

Lesley Stahl: So how big a problem is mis- and disinformation on the web?

Rep. Jim Jordan: Well, I'm sure there's some. But I think, you know-- our concern is the bigger problem of the attack on First Amendment liberties. 

But Congressman Jordan says Starbird's group unfairly flagged posts like this tweet by Newt Gingrich: 

"Pennsylvania democrats are methodically changing the rules so they can steal the election" 

He complains that government officials put pressure on social media companies directly –

Rep. Jim Jordan: A great example, 36 hours into the Biden administration, the-- the Biden White House sends-- a email to Twitter and says, "We think you should take down this tweet ASAP." 

Government Influence on Social Media Content

Just a call alone from the government, he says, can be unnerving. 

Rep. Jim Jordan: You can't have the government say, "Hey, we want you to do X," government who has the ability to regulate these private companies, government which has the ability to tax these private companies.

He says that White House email to Twitter involved a tweet from…

Rep. Jim Jordan: Robert F. Kennedy Jr. and everything in the tweet was true.

That tweet implied falsely that baseball legend Hank Aaron's death was caused by the COVID vaccine. 

Lesley Stahl: Did they take it down?

Rep. Jim Jordan: Turned out they didn't. Thank goodness. 

And that post is still up.

Kate Starbird says the social media platforms also often ignored the researchers' suggestions.

Kate Starbird: The statistics I've seen are just for the Twitter platform. But I-- my understanding is-- is that they've responded to about 30% of the things that we sent them. And I think the-- on the majority of those, they put labels. 

Lesley Stahl: But just a third.

Kate Starbird: Just a third, yeah.

Lesley Stahl: And do you suspect that Facebook was the same? 

Kate Starbird: Oh, yeah.

Katie Harbath: These platforms have their own First Amendment rights. 

Katie Harbath spent a decade at Facebook, where she helped develop its policies around election misinformation. When she was there, she says, it was not unusual for the government to ask Facebook to remove content, which she considers proper as long as the government isn't coercive.

Katie Harbath and Lesley Stahl Discuss Social Media Content Removal

Harbath refuted claims by conservatives that the platforms were taking down content at the government's request. She emphasized that the platforms made their own decisions and often pushed back against government influence.

In her interview with Lesley Stahl, Harbath was asked about a specific case involving Nancy Pelosi: a doctored video of Pelosi posted on Facebook in 2019, edited so that her speech appeared slurred.

When asked if the video was taken down, Harbath confirmed that it remained on the platform as it did not violate their policies. Stahl questioned whether Pelosi pressured the company to remove the video, to which Harbath acknowledged that Pelosi was displeased and the incident strained the company's relationship with her.

Recently, the conservatives' attempt to limit government influence on social media suffered a setback in the Supreme Court. The justices seemed inclined to reject the effort, raising questions about the platforms' rights to control content on their sites.

Congressman Jordan has argued against the removal of what tech companies label as "misinformation," sparking further debate on content moderation.

The court's decision on whether social media platforms have a First Amendment right similar to news organizations will have significant implications on content moderation and government influence in the digital sphere.

Rep. Jim Jordan: I believe in letting the American people, respecting their common sense, to determine what's accurate and what isn't.

Lesley Stahl: What about the claim that the 2020 election was stolen? Do you think these companies should allow people to express that and let individuals decide for themselves?

Rep. Jim Jordan: I trust that the American people are intelligent. I have not personally made that statement. What I have mentioned are the concerns raised about the 2020 election. I believe Americans share those concerns. 

Lesley Stahl: No, they don't--

Rep. Jim Jordan: You don't think they have concerns about the 2020 election?

Lesley Stahl: Most people accept the result without question. That's all I'm saying. They don't doubt whether--

Rep. Jim Jordan: Fair enough.

Lesley Stahl: Biden won or not. Right? Right? Most people don't doubt

Rep. Jim Jordan: Oh, OK. No--

Lesley Stahl: The outcome.

Rep. Jim Jordan: Right.

X essentially implemented what Jordan suggests. Following Elon Musk's takeover in 2022, a large number of fact-checkers were dismissed, and the platform is now filled with misinformation and falsehoods. One post, purportedly a video from Gaza, is actually footage from a video game; X users eventually added a warning label.

Reframing Reality: The Impact of Misinformation on Democracies

Amidst the chaos of conflicting narratives, the line between truth and falsehood blurs. Darrell West, a senior fellow of technology innovation at the Brookings Institution, highlights the critical need to address the repercussions of misinformation on our societies.

West warns of the global threat posed by disinformation, emphasizing its potential to sway elections and undermine democratic processes. With half of the world's population participating in elections this year, the stakes are high.

Within the U.S., West points to a concerning trend of deliberate misinformation campaigns, particularly from right-wing factions seeking to sow confusion among the public. Moreover, he expresses alarm over attempts to suppress academic researchers who strive to combat misinformation.

Lesley Stahl raises the issue of whether targeting these researchers constitutes harassment, suggesting a broader agenda to stifle valuable research efforts.


Rep. Jim Jordan: I find it interesting that you use the word "chill," because in-- in effect, what they're doing is chilling First Amendment free speech rights. When, when they're working in an effort to censor Americans, that's a chilling impact on speech.

Lesley Stahl: They say what you're doing, they do, is a violation of their First Amendment right.

Rep. Jim Jordan: So us pointing out, us doing our constitutional duty of oversight of the executive branch-- and somehow w-- (LAUGH) we're censoring? That makes no sense.

Lesley Stahl: We Americans, we're looking at the same thing and seeing a different truth.

Rep. Jim Jordan: We might see different things, I don't-- I don't think you can see a different truth, because truth is truth.

Lesley Stahl: Okay. The-- the researchers say they're being chilled. That's their truth.

Rep. Jim Jordan: Yeah.

Lesley Stahl: You're saying they're not. So what's the truth? 

Rep. Jim Jordan: They can do their research. God bless 'em, do all the research you want. Don't say we think this particular tweet is not true-- and-- or-- or--

Lesley Stahl: Well, that's their First Amendment right to say that. 

Rep. Jim Jordan: Well, they can say it, but they can't take it down.

Lesley Stahl: Well, they can't take it down and they don't. They just send their information to the companies. 

Rep. Jim Jordan: But when they're coordinating with government, that's a different animal.

Lesley Stahl: They deny that they're coordinating with the government.

We went back and forth.

Produced by Ayesha Siddiqi. Associate producer, Kate Morris. Broadcast associates, Wren Woodson and Aria Een. Edited by Matthew Lev.