The states, not Washington, are where tech regulation happens for now, thanks to a deadlocked Congress.
What’s happening: Statehouses are drawing money and attention from tech firms and advocates hoping to influence laws on everything from privacy to digital taxes to driverless cars — and now online speech.
Driving the news: In a new report shared first with Axios, a former Facebook policy executive offers state lawmakers detailed guidance for crafting tech regulations that could effectively reduce harms and withstand legal challenge.
- Matt Perault, former head of global policy development at Facebook’s parent Meta, now director of the University of North Carolina’s Center on Technology Policy, launched the new guide Thursday.
- Co-authored by J. Scott Babwah Brennen, head of online expression at the center, the guide aims to provide state lawmakers with workable ways to regulate online content.
- The authors said they were inspired to write it after talking to Democratic Virginia state delegate Wendy Gooditis, who asked them for feedback on crafting better social media and content moderation bills.
What they’re saying: “The federal government talks a lot about reform, but states are actually doing it,” Perault told Axios. “States have been successful in areas of tech reform like privacy. They have been significantly less successful in content regulation.”
- He added that the report tries to take both Republican and Democratic concerns about online content seriously: “Both sides of the aisle have concerns that are worth trying to deliver on to some extent.”
State of play: Bills introduced in states including Ohio, Alabama and Tennessee have attempted to prohibit companies from removing users’ legal speech. Other states’ bills seek to prohibit algorithmic curation or create transparency requirements.
- There’s been an exponential increase in the scope of state legislative activity on such issues, Mark Brennan, a partner at Hogan Lovells who counsels clients on tech regulation, told Axios.
- But no one is getting it right, Brennan said: “Frankly, states have a pretty bad track record in putting together regulation in this space.”
Context: Section 230 of the Communications Decency Act, which largely shields online platforms from liability for what users post, preempts state laws. And some attempts to regulate content have run afoul of the First Amendment.
- One high-profile attempt from Florida Gov. Ron DeSantis to keep social media platforms from banning users was blocked in federal court.
Details: Perault’s report breaks down 13 recommendations into three categories: understanding online content moderation, strengthening enforcement against problematic content, and increasing investment in local news, public institutions and media literacy.
- Perault said the report is meant to be a menu of options for state legislators and state attorneys general to consider if they hope to improve online discourse.
Yes, but: “It’s not possible for anyone to address content moderation issues in a way that will make everyone happy,” but the report’s principles are a good place to start, Evelyn Douek, senior research fellow at the Knight First Amendment Institute at Columbia University, told Axios.
Between the lines: UNC’s tech policy center, which published the report, gets funding from companies and foundations including Amazon, Apple, Google, the Charles Koch Foundation, Meta, Microsoft, TikTok and Zoom.
- Perault said he spoke to a wide range of companies and content moderation experts for feedback on the report, but that “funding doesn’t buy output.”
The big picture: Tech companies recognize that the states are going to be more active than Congress and have increasingly lobbied statehouses to pass bills that critics have called weak, most recently on privacy.
The other side: Tech critics who have pushed for Section 230 reform at the federal level have argued that updating the law is the only path to real change.
- Different priorities among Republicans and Democrats, along with general congressional dysfunction, have stymied broad federal regulations.
The bottom line: “So much of all content moderation discourse and proposed regulation is rhetoric and grandstanding,” Douek said.
- “But that doesn’t have to be true … there’s meaningful, good-faith steps that could come from state lawmakers that could help advance the ball in our understanding of tech platforms.”