Intermediary downstream liability protection as a common carrier, w/o Section 230?

Do court rulings from the 1990s actually shield intermediaries from liability for hosting speech they don't moderate?

OK, considering my previous post, and given McConnell's political antics Tuesday (Lauren Feiner et al. on CNBC), we come back to a question, mainly about the first prong of Section 230: do we really need it to protect a user's ability to post lawful speech online, if we consider only hosted sites with no moderation?

Others have suggested, for example, giving webhosts 'common carrier' immunity similar to that of telecom companies, provided they identify their users first, don't moderate, but enforce reasonable AUPs and take action only when others report violations (generally concerning unlawful content) to them.

Do they have this kind of immunity already?

My post here Dec. 3 links to a review of Jeff Kosseff's The Twenty-Six Words That Created the Internet. An early section of the book covers several cases decided before Section 230 was passed that speak to this question.

In 1959, in Smith v. California, the Supreme Court ruled that a state cannot hold a bookseller responsible for unlawful content (in this case, alleged obscenity) in a book it sells if it does not know about it and does not have a reasonable opportunity to know, given the size of its operations. Requiring such knowledge would reduce the amount of material available to the public, and the statute was vague as to how a proprietor could comply (MTSU analysis). The California law was held to violate the First Amendment as applied to the state through the incorporation doctrine.

In 1991, in Cubby v. CompuServe, the service had hosted a rumor bulletin board under a journalism forum. When a rumor led to a lawsuit, the Southern District of New York dismissed the case, finding that CompuServe had no knowledge of the defamatory post. The court held that the service was more like a distributor than a publisher (DMLP analysis).

Still, this is not a complete analogy to telephone service (nor is it a complete analogy to a modern website offering intellectual content as well as goods for sale). The statements on the forum were much more likely to become known to a significant part of the public. On the other hand, defamation law often regards an assertion as 'published' if even one other party understands it (as in a phone call). In the motion picture industry, most content is offered to the public by 'distributors' that are separate from the production companies. The distributor usually 'owns' the copyright, in effect, and typically (as with Netflix) accepts responsibility for the legal risks of the content. Because the volume of offerings is relatively small (with no user-generated content), it can afford to do so. At the time, CompuServe probably did not have the volume of potential readership that even a privately owned website (like mine) would have today. I do recall that when a company I was working for got bought in 1994, an employee found out at work from a CompuServe forum.

In 1994, Prodigy, which offered (somewhat clownish) packaged content to dial-up users similar to AOL's, did host bulletin boards, and was sued over a post accusing someone essentially of potential securities fraud. The case was Stratton Oakmont v. Prodigy. A New York state court held, in 1995, that Prodigy was indeed a publisher because it pre-screened content against its guidelines. The case was settled out of court before appeal (DMLP).

That case helps set up the moderator's dilemma, which is the reason Section 230 protects good-faith moderation and doesn't even demand political neutrality.

The overall impression of these three cases, although the last two don't come from SCOTUS or even an appeals court, is that courts have a concept of 'publication' as separate from 'hosting' or 'distribution' or 'facilitation' (etc.), but do not necessarily consider the scale of the communication, the number of people reached, or the likelihood that bad actors could be incentivized. Without 230, there is pretty much a 'responsibility for your own acts' doctrine until moderation enters the picture.

Generally, webhosts do not become concerned with their users' content until they receive complaints (about illegal content or certain AUP violations), although they will sometimes screen for violations like generating spam or malware; that kind of screening is likely to be perceived as an essential technical operation that does not involve editorial judgment on subject matter. This suddenly became a problem after Charlottesville, when several extreme-right websites were removed and even denied domain name hosting. In one extreme case, the social network Gab was denied hosting for a while in 2018 after one of its members committed an atrocity at a synagogue in Pittsburgh. Gab could not have known about the specific threat, but it was viewed as intentionally attracting radicalized individuals deplatformed by larger services (Twitter). Today Donald Trump is conspicuous on Gab.

There is very little experience with how federal courts would view the 'publisher v. distributor' problem without Section 230, even if the lower court's handling of Cubby sounds reasonable, even convincing. Particularly lacking is guidance on the likelihood of content scaling in its reach to users; in the early 1990s this scale did not exist. The First Amendment does not imply that a private company has to host your speech. But it gets more interesting if you look at a webhost, for example, as an infrastructural utility (like a phone company), which then should be neutral if it does not moderate. And it becomes an open question if the unprecedented scale and effect of even one person's speech is taken into consideration. Defamation law, as written now, does not seem to take scale into account (for example, the likelihood of radicalization or of inciting threats or behavior like doxing).

It's again relevant to recall the idea, circulated in the early 2000s, that individual blogging was a threat to campaign finance and fair elections, until the FEC decided around 2006 not to worry about it. Indeed, part of the recent epidemic of social media censorship has centered around social media companies' well-founded fears of feeding conspiracy theories into the election, or baseless claims following it.

If you accept the idea that the CompuServe case protects intermediaries when they don't moderate, based on one lower court decision, maybe that's all you need for personal sites and chatrooms like Gab to exist. Liability would fall only on the speaker. Apparently McConnell and other conservatives believe this. But it's dangerous to rely on that, because it doesn't consider scale. You need to decide whether you want to protect unmonitored speech in a forum that allows instant self-publication with global reach. We have gotten used to expecting that as a right, but we have never really established it as a matter of law.

(Posted: Wednesday, December 30, 2020 at 12:15 PM EST)