At social media hearing, lawmakers circle algorithm-focused Section 230 reform

Rather than a CEO-slamming sound bite free-for-all, Tuesday’s big tech hearing on algorithms aimed for more of a listening session vibe — and in that sense it mostly succeeded.

The hearing centered on testimony from the policy leads at Facebook, YouTube and Twitter rather than the chief executives of those companies for a change. The resulting few hours didn’t offer any massive revelations but were still probably more productive than squeezing some of the world’s most powerful men for their commitments to “get back to you on that.”

In the hearing, lawmakers bemoaned social media echo chambers and the ways that the algorithms pumping content through platforms are capable of completely reshaping human behavior.

“… This advanced technology is harnessed into algorithms designed to attract our time and attention on social media, and the results can be harmful to our kids’ attention spans, to the quality of our public discourse, to our public health, and even to our democracy itself,” said Sen. Chris Coons (D-DE), chair of the Senate Judiciary’s subcommittee on privacy and tech, which held the hearing.

Coons struck a cooperative note, observing that algorithms drive innovation but that their dark side comes with considerable costs.

None of this is new, of course. But Congress is crawling closer to solutions, one repetitive tech hearing at a time. The Tuesday hearing highlighted some zones of bipartisan agreement that could determine the chances of a tech reform bill passing the Senate, which is narrowly controlled by Democrats. Coons expressed optimism that a “broadly bipartisan solution” could be reached.

What would that look like? Probably changes to Section 230 of the Communications Decency Act, which we’ve written about extensively over the years. That law protects social media companies from liability for user-created content, and it’s been a major nexus of tech regulation talk, both in the newly Democratic Senate under Biden and the previous Republican-led Senate that took its cues from Trump.

Lauren Culbertson, head of U.S. public policy at Twitter Inc., speaks remotely during a Senate Judiciary Subcommittee hearing in Washington, D.C., U.S., on Tuesday, April 27, 2021. Image Credits: Al Drago/Bloomberg / Getty Images

A broken business model

In the hearing, lawmakers pointed to flaws inherent to how major social media companies make money as the heart of the problem. Rather than criticizing companies for specific failings, they mostly focused on the core business model from which social media’s many ills spring forth.

“I think it’s very important for us to push back on the idea that really complicated, qualitative problems have easy quantitative solutions,” Sen. Ben Sasse (R-NE) said. He argued that because social media companies make money by keeping users hooked to their products, any real solution would have to upend that business model altogether.

“The business model of these companies is addiction,” Sen. Josh Hawley (R-MO) echoed, calling social media an “attention treadmill” by design.

Ex-Googler and frequent tech critic Tristan Harris didn’t mince words about how tech companies talk around that central design tenet in his own testimony. “It’s almost like listening to a hostage in a hostage video,” Harris said, likening the engagement-seeking business model to a gun just offstage.

Spotlight on Section 230

One big way lawmakers propose to disrupt those deeply entrenched incentives? Adding algorithm-focused exceptions to the Section 230 protections that social media companies enjoy. A few bills floating around take that approach.

One bill introduced in 2020 from Sen. John Kennedy (R-LA) and Reps. Paul Gosar (R-AZ) and Tulsi Gabbard (D-HI) would require platforms with 10 million or more users to obtain consent before serving users content based on their behavior or demographic data if they want to keep Section 230 protections. The idea is to revoke 230 immunity from platforms that boost engagement by “funneling information to users that polarizes their views” unless a user specifically opts in.

In another bill, the Protecting Americans from Dangerous Algorithms Act, Reps. Anna Eshoo (D-CA) and Tom Malinowski (D-NJ) propose suspending Section 230 protections and making companies liable “if their algorithms amplify misinformation that leads to offline violence.” That bill would amend Section 230 to reference existing civil rights laws.

Section 230’s defenders argue that any insufficiently targeted changes to the law could disrupt the modern internet as we know it, resulting in cascading negative impacts well beyond the intended scope of reform efforts. An outright repeal of the law is almost certainly off the table, but even small tweaks could completely realign internet businesses, for better or worse.

During the hearing, Hawley made a broader suggestion for companies that use algorithms to chase profits. “Why shouldn’t we just remove Section 230 protection from any platform that engages in behavioral advertising or algorithmic amplification?” he asked, adding that he wasn’t opposed to an outright repeal of the law.

Sen. Amy Klobuchar (D-MN), who leads the Senate’s antitrust subcommittee, connected the algorithmic concerns to anti-competitive behavior in the tech industry. “If you have a company that buys out everyone from under them … we’re never going to know if they could have developed the bells and whistles to help us with misinformation because there is no competition,” Klobuchar said.

Subcommittee members Klobuchar and Sen. Mazie Hirono (D-HI) have their own major Section 230 reform bill, the Safe Tech Act, but that legislation is less concerned with algorithms than ads and paid content.

At least one more major bill looking at Section 230 through the lens of algorithms is still on the way. Prominent Big Tech critic House Rep. David Cicilline (D-RI) is due to propose a Section 230 bill soon that could suspend liability protections for companies that rely on algorithms to boost engagement and line their pockets.

“That’s a very complicated algorithm that is designed to maximize engagement to drive up advertising prices to produce greater profits for the company,” Cicilline told Axios last month. “… That’s a set of business decisions for which, it might be quite easy to argue, that a company should be liable for.”
