March 06, 2023

High court case centers on algorithms, behavior, expert says

The U.S. Supreme Court recently heard arguments in Gonzalez v. Google, its first case concerning Section 230 of the Communications Decency Act. At the center of the case is whether Section 230 should continue to shield internet platforms such as YouTube, Facebook, Twitter and Instagram from liability for third-party content posted on the platforms, particularly content the providers' algorithms recommend to users.

A Supreme Court ruling on Section 230 of the Communications Decency Act may result in new social media regulation, says professor Joshua Tucker.

Oral arguments offered some insight into the issues that concern the justices, such as whether algorithms themselves create content. However that question is resolved, a ruling that curtails Section 230's immunity could change the kind of content that users see.

An often-overlooked question, though, is whether algorithms are actually effective at changing user behavior, said Joshua Tucker, professor of politics at New York University and a recent guest on National Security Law Today, a podcast sponsored by the ABA Standing Committee on Law and National Security.

Tucker, who has done extensive research on voter influence through social media and on how broader public opinion is formed, also serves as co-director of NYU’s Center for Social Media and Politics and as director of NYU’s Jordan Center for the Advanced Study of Russia.

“There are lots of anecdotes, lots of stories about people starting off on a platform and ending up in particular places,” Tucker said. “We looked at whether or not YouTube’s algorithm pushed people to extreme political content.” He added that since YouTube does not share its algorithm data with researchers, the team recruited subjects and used a web browsing tracker to see exactly what YouTube was recommending to them.

The results showed that some users do start with moderate content and end up drawn into extremist content, but that this happened only about 3% of the time. “It does not seem to be, from our research anyway, the dominant experience of what happens to people on the platform. So I think this is a really important thing when we think about these foreign influence attempts.” Instead, people appear to get their information from varied sources, he said.

How the court decides the Section 230 question could usher in a new era of social media regulation, he added.
