
This is an unpopular opinion here, but I think the whole "immunity for third-party content" thing in Section 230 was a big mistake. If you're a website that exercises editorial control over the content you publish (moderating, manually curating, algorithmically curating, demoting or promoting individual pieces of content, and so on), then you have already shown that you, not your end users, control what gets published. So you should take responsibility for what you publish. You shouldn't be able to hide behind "But it was a third-party end user who gave it to me!" Your moderation practices show that you, not your users, have the final say in what gets posted. So you should stand behind the content you specifically allow.

If a website makes a good-faith effort to moderate away content that could get it in trouble, then it shouldn't get in trouble. And if it has a policy of not moderating or curating, then it should be treated like a dumb pipe, like an ISP. Sites shouldn't be able to have their cake (exercise editorial control) and eat it too (enjoy liability protection for what they publish).


Moderating and ranking content is distinct from editorial control. Editorial control refers to editing the actual contents of posts, and sites that exercise it are liable for their edits. For instance, if a user posts "Joe Smith is not a criminal" and the website operators delete the word "not", then the company can be held liable for defaming Joe Smith. https://en.wikipedia.org/wiki/Section_230#Application_and_li...
I’d go further and say that any content presented to the public should fall outside the protection. If it’s between individuals (like email), then the email provider is a dumb pipe. If it’s a post on a public website, the owner of the site should be ultimately responsible for it. Yes, that means reviewing everything on your site before publishing it. This is what publishers in the age of print always had to do.
I don't want a law that requires a gatekeeper for communication between members of the public.
Then stand up your own website.

If you're using someone else to do it, they should have a say in what is published under their name, and some responsibility for it.

The thing that's missing is the distinction between unsolicited external content (i.e., pay-for-play stuff) and directly user-supplied content.

If you're making editorial decisions, you should be treated like a syndicator. Yep, that means vetting the ads you show and the paid propaganda you agree to publish, and generally carrying legal and financial liability for the outcomes.

User-supplied content needs moderation too, but there you have to apply different standards. Prefiltering what someone else can post on your platform makes you a censor. You have to do some moderation to keep your system from becoming a Nazi bar or an abuse demo reel, but beyond that the users themselves should be allowed to decide what they want to see and in what order of preference. Section 230 needs to protect the latter.

The thing I would have liked to see a long time ago is an obligation for platforms/syndicators to notify any of their users who have been subjected to influence operations. Whether that's political pestering, black propaganda, or even an out-and-out "classic" advertising campaign should make no difference.
