
Even in the best case scenario where everyone starts using different instances of Peertube, won't it end like 2012 Reddit full of racist and paedophile instances?

You need a minimum amount of moderation to remove the least desirable groups of people from posting content.



This is a difficult question, which is why this argument is so often offered as a justification for greater authoritarianism.

I would ask, where is the crime here? Is it that you saw something terrible, or that something terrible occurred? Should we be focused on the prevention of immoral acts, or the representation of those immoral acts on video?

Does your not knowing about certain immoral acts make things better or worse in reality for those suffering from them? If the reality is that there are racists and pedophiles, is it better to address the issue openly, or behind closed doors?

In my opinion, it is better to enshrine freedom of speech and to address issues openly. Having things 'managed' privately may mean that it is ok, as long as no one finds out about it. It's definitely a difficult one, but I am not for hiding reality.


Make no mistake, all these alternative platforms do is optimize for a certain type of content and bury/hide some other kind of content. This will always be the case as long as there is an algorithmic feed and search: users will manipulate the system to try to get to the top.

Yes, technically it's not illegal to allow those kinds of fringe statements, but by doing so you implicitly allow that content to rise to the top, and it can make your site unsafe for racial minorities and children.


>won't it end like 2012 Reddit full of racist and paedophile instances?

There is only one Reddit instance. Subreddits may have quasi-independent moderation (Reddit admins can always intervene if not doing so would cause bigger PR problems than they're willing to ignore or deal with in other ways), but there's no real way for one subreddit to disassociate from other subreddits.

PeerTube (or Mastodon or PixelFed or Lemmy) instances can disassociate from other instances that have incompatible moderation practices, though. Either through allowlisting or denylisting depending on the operators' risk models.
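The allowlist/denylist distinction above can be sketched in a few lines. This is a minimal illustration, not the actual PeerTube or Mastodon federation code; the domain names and function name are hypothetical.

```python
# Illustrative sketch of the two federation risk models described above:
# a denylist ("federate with everyone except...") vs. an allowlist
# ("federate with no one except..."). All names here are hypothetical.

DENYLIST = {"toxic.example"}        # domains this instance has blocked
ALLOWLIST = {"friendly.example"}    # domains trusted in allowlist mode

def accepts(domain: str, mode: str = "denylist") -> bool:
    """Return True if activities from `domain` should be accepted."""
    if mode == "allowlist":
        return domain in ALLOWLIST
    return domain not in DENYLIST

print(accepts("toxic.example"))                     # blocked by denylist
print(accepts("unknown.example"))                   # unknown domains pass
print(accepts("unknown.example", mode="allowlist")) # unknown domains fail
```

The operational difference is the default: a denylist instance federates with unknown domains until they misbehave, while an allowlist instance rejects them until vetted.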

Using Mastodon as a case study: Gab tried to pivot to ActivityPub-based software (I think Mastodon) and yet they were denied the ability to federate with most instances because it's a deeply toxic community.


What is the evidence that a small group of companies has the superior wisdom to determine who ought to be allowed to speak to the masses?

We elect government democratically, and through that create laws and police. If the internet is full of criminals flouting laws, then the problem is right there in the lack of laws and police, not in the lack of control by a small group of companies.


> won't it end like 2012 Reddit full of racist and paedophile instances?

Not if the police intervene. At least for the pedos, there are established international relationships between agencies that work efficiently - there is a reason why you won't see pedo content on the clearnet for long.


Moderation and centralization are quite disconnected.

ActivityPub is federated and has moderation pretty much built in by design.
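One concrete example of moderation being built into the protocol: the ActivityStreams vocabulary that ActivityPub uses includes a `Flag` activity type, which implementations such as Mastodon use to carry user reports between instances. A minimal sketch of such a report follows; the domains and description text are hypothetical.

```python
# Illustrative sketch of an ActivityPub "Flag" activity (a report),
# built as a plain dict and serialized to JSON. The ActivityStreams
# "Flag" type is real; the actor/object URLs here are made up.
import json

report = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Flag",
    "actor": "https://instance-a.example/actor",            # who is reporting
    "object": "https://instance-b.example/users/spammer",   # who is reported
    "content": "Spam account posting unsolicited links",    # reason
}

print(json.dumps(report, indent=2))
```

A receiving instance's moderators can then act on the report locally (or ignore it), which is exactly the federated-but-moderated model the comment describes.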



