Section 230 of the Communications Decency Act shields interactive computer services from liability for inappropriate or illegal content published by their users, and protects those services when they moderate that content in good faith:
(1) Treatment of publisher or speaker
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
Without Section 230 protections, companies like Twitter and Facebook would require an army of lawyers and editors constantly monitoring content. Advertising revenue alone would not be enough to cover the costs. As I proposed in the past, social media companies would have to charge users for publishing.
Since I wrote my original post on the subject in 2018, I have had some time to think, and my views have evolved.
Section 230 does not need to be abolished — it needs to be revised. We need to clarify the distinction between hosts, content-sharing services, content-discovery services, content-consumption services, discussion boards, and publishers.
Hosts are the easiest to define. A host offers an infrastructure for hosting user content. The user has a great degree of control over the content and how it is published. Hosts do not repurpose or modify user content in any way, though they may offer a mechanism to discover information, such as search.
At the crudest level, a service like AWS is a host. They offer the equivalent of running a server in your basement. You can host whatever you want on their hardware.
WordPress.com is also a host. They offer a highly customizable platform for publishing. I would put Tumblr in this category as well.
Hosts have limited ways to earn money from their users. They can charge users directly or make another arrangement, such as asking the user to place an advertising banner amid their content, with the banner relevant only to that content. What hosts do not do is mine user-generated content for purposes other than searching and discovery.
Hosts are not publishers. Users are.
Hosts are like landlords. They let you use their property. They do not get in the way of you decorating your apartment how you see fit. They reserve the right to kick you out for illegal, abusive, or inappropriate activity. Landlords are not liable for renters’ behavior, and neither should hosts be.
Content-sharing platforms allow users to share and discover content, such as images. Similar to hosts, they do not repurpose user-generated content in any shape or form.
Just like hosts, they have limited ways to monetize their services. They can charge users for advanced services (like high-resolution images). They can advertise directly to users, similar to how I described advertising by hosts above — advertising must be relevant only to the surrounding content.
Flickr.com is one example of such a platform. Flickr does not repurpose content.
Vimeo is a video-sharing platform — the users who generate this content have full control over where, when, and how it appears.
YouTube repurposes user content — they modify user videos to insert advertising; the moment they do that, they become publishers.
Search engines are content-discovery services.
As long as search engines do not repurpose content that their users discover, they do not need to be held liable for it. The moment a search engine repurposes the content, they become a publisher.
For example, Google News is a news search engine that also repurposes content. They should be treated as a publisher.
Content-consumption platforms include products like RSS feed readers and news aggregators such as Apple News that do not repurpose aggregated content beyond summarizing and delivering it to the user.
Some aggregators have a curated news section. By curating content, the aggregator app repurposes it. In that case, the curator acts as a publisher and is liable for the content they curate.
Discussion boards are lightly moderated forums where users discuss a related set of interests and topics. Examples include old-school BBS systems, Usenet Newsgroups, email lists, hosted phpBB bulletin boards, Discord, IRC, Telegram, Signal, etc.
Additionally, I would classify the comments section of a newspaper as a discussion board.
In this case, the spirit of the original Section 230 protections should apply. As long as there is good faith moderation, the party responsible for this message board should not be held liable for the users’ content.
A publisher repurposes user-generated content. The user gives up most of their rights to control where, when, and how their content shows up, and what their content is used for.
The New York Times is a publisher — its journalists are users who produce content, whereas its editors repurpose it. The journalists have little say over where and how their content shows up.
Medium is a publisher as well. They limit ways in which users can customize the look and placement of the content they generate. Medium editors pick and choose featured content and control discovery mechanisms.
YouTube, Facebook, and Twitter are most certainly publishers. They do not give users any mechanism to customize when, where, and how their content appears to others. They also repurpose the content for purposes other than displaying it to other users.
A publisher, or any internet service that repurposes user-generated content for motives other than display and discovery, should be held liable for the content it propagates.
Where do social networks fit?
I propose a simple rule:
Does the internet service repurpose user-generated content for motives other than display and discovery?
If the answer is yes, then the service should be considered a publisher and therefore held liable for the content they propagate.
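The rule reduces to a single yes-or-no test, which can be sketched as a small decision function. This is purely illustrative: the `repurposes_content` flag and the example service names are hypothetical, and classifying a real company would require a factual analysis of its terms of service and behavior, not a boolean attribute.

```python
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    # True if the service repurposes user-generated content for motives
    # other than display and discovery (e.g., modifying videos to insert
    # ads, or mining content to track users across sites). Hypothetical
    # attribute for illustration only.
    repurposes_content: bool

def is_publisher(service: Service) -> bool:
    """Apply the proposed rule: a service that repurposes user-generated
    content is a publisher and is liable for the content it propagates."""
    return service.repurposes_content

# Hypothetical examples following the article's categories:
host = Service("generic-host", repurposes_content=False)
social_network = Service("generic-social-network", repurposes_content=True)

print(is_publisher(host))            # False: treated like a landlord
print(is_publisher(social_network))  # True: treated like a publisher
```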
Consider Facebook as a case study.
Facebook offers a free service to users. Users generate content, which Facebook collects. Facebook repurposes user-generated content to track users and show them personalized ads in places other than Facebook itself. Facebook’s terms of service state as much:
Specifically, when you share, post, or upload content that is covered by intellectual property rights on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content (consistent with your privacy and application settings).
Based on the proposed rule above, Facebook should be held liable for the content they host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of.
To avoid liability, Facebook would have to modify how they operate:
- Advertising should only be pertinent to the content near which ads are displayed and should not track a user from place to place,
- User-generated content should not be repurposed (i.e. Facebook may not use, distribute, modify, run, copy, display, translate, or otherwise create any derivative works of user-generated content), and
- Algorithms may not decide when, where, and how user-generated content is displayed.