
Section 230: repeal, reform or reprioritize?

Today, the U.S. Congress and big tech companies continued the debate about Section 230 of the 1996 Communications Decency Act.

Put simply, Section 230 gives websites immunity from liability for third-party content. This piece of internet legislation is a double-edged sword. On the one hand, it has allowed misinformation to spread dangerously on social media. On the other hand, it has helped the internet thrive.

If I write something untrue and damaging about you on Facebook, you might be able to sue me, but you can't sue Facebook. As a result, social media companies have little legal incentive to police what is said on their platforms. That immunity is a big reason why fake news, hate speech and misinformation have been able to spread uncontrollably.

At the same time, Section 230 makes it possible for bloggers to host comments from their readers, for Open Source communities to work together online, and for YouTubers to share videos. Section 230 enables people to share, innovate and collaborate. It has empowered a lot of good.

President Biden has suggested revoking Section 230. Other policymakers would like to reform it. Either revoking or modifying Section 230 could have a big impact on any organization that hosts online content.

Not just hosting companies would be affected, but also bloggers and Open Source communities. Having to police all content could quickly become unsustainable, especially for individuals and small organizations. People publish so much new content every day!

As Katie Jordan, Director of Public Policy and Technology for the Internet Society, said: "If cloud providers get wrapped up in this conversation about pulling back intermediary liability protection, then by default, they're going to have to reduce privacy and security practices because they'll have to look at the content they're storing for you, to know if they're breaking the law."

A wholesale repeal of Section 230 seems too far-reaching to me; it could cause more harm than good. A careful reform seems more appropriate.

Instead of focusing so heavily on Section 230, I'd start by regulating search and social media algorithms. Hosting content is one thing; recommending it to millions of people is another. When search and social media companies reach billions of people, their content recommendation algorithms can sway public sentiment, introduce bias or rapidly spread misinformation. We should start there.

I've said in the past that we need an FDA for large-scale algorithms that impact society. Just as the FDA ensures that pharmaceutical companies aren't making false claims about their drugs, a similar regulator should oversee large-scale software algorithms. For example, we need some level of guarantee that companies like Google, Twitter and Facebook won't intentionally (or unintentionally) manipulate search results to shape public opinion.

— Dries Buytaert