These aren't gotchas, and I'm not arguing for the sake of it; these are pretty common situations.
I always urge people to volunteer as mods for a bit.
At least you may see a different way to approach things, or else you might be better able to articulate the reasons the rule can't be followed.
You get the benefits of striving to warn users, without the downsides of it being abusive, or seen as abusive.
If I were to build this… well, first I would have to block link shorteners, then I would need a list of known tropes and memes, plus a way to add to that list over time.
That should get me about 30% of the way there. Next, even if I ignore adversaries, I would still have to contend with links that have never been seen before.
So for these links, someone would have to be the sacrificial lamb and click through to see what's on the other side. Ideally this would be someone on the mod team, but there can never be enough mods to handle the volume.
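The triage flow sketched above could look something like this. This is a minimal illustration, not a real product: the domain lists and the verdict labels are assumptions I'm making up for the example.

```python
from urllib.parse import urlparse

# Illustrative data, not real blocklists.
KNOWN_SHORTENERS = {"bit.ly", "tinyurl.com", "t.co"}   # rejected outright: destination is opaque
KNOWN_BAD = {"goatse.example", "malware.example"}      # the curated trope/meme list, grown over time

def triage(url: str) -> str:
    """Return 'reject', 'block', or 'review' for a submitted link."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    if host in KNOWN_SHORTENERS:
        return "reject"   # rule 1: no link shorteners
    if host in KNOWN_BAD:
        return "block"    # rule 2: matches the known-bad list
    # Never-seen-before link: a human (the "sacrificial lamb") has to look.
    return "review"
```

Note how the interesting case is the fall-through: everything the lists don't cover lands in a human review queue, which is exactly where the staffing problem starts.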
I guess we're at the mod coverage problem. Take volunteer mods: it's very common for the mods to be asleep when a goat-related link is shared. When you get online 8 hours later, there's a page of reports.
That is IF you get reports. People click a malware link without realizing it, so they don't report. Or they encounter the goats and just quit the site without bothering to report.
I'm actually pulling my punches here, because many issues, e.g. adversarial behavior, simply nullify any action you take. People could decide that you are applying the label incorrectly, and that the label itself is censorship.
This also assumes you can get engineering resources applied, and it's amazing if you can even get their attention. All the grizzled T&S folk I know develop very good mediation and diplomacy skills just to survive.
That's why I really do urge people to join mod teams, so that the work gets understood by normal people. The internet is banging into the hard limits of our older free-speech ideas, and people are constantly taking advantage of blind spots among the citizenry.
When I consider my colleagues in the same department, they have very different preferred work hours (one colleague would even love to work from 11 pm to 7 am, and then sleep, if he were allowed to). If you ensure that you have both larks and night owls on your (volunteer) moderation team, this problem should be mitigated.
But once the network grows large, it requires a lot of moderators, and you start running into problems of moderation quality across large groups of people.
This is a difficult and unsolved problem.
The current problem exists because the content is chosen algorithmically.