
You can't post pirated content or child porn online - because you're either directly engaging in or enabling criminal behavior.

If you're promoting armed, violent protests and insurrection - that is also a crime.

And sure, this is happening to a small degree on Twitter and FB - but they make some attempts to stop it, and it's not the main value proposition of the platforms.

The problem with Parler is that this was always where it was headed. It was built to serve people who would use it for this, and a significant portion of the content created and consumed was about this.

There is also legitimate content available on Kick Ass Torrents. But the majority of the consumption is for things that are illegal in the US. So it gets the same treatment as Parler.



> If you're promoting armed, violent protests and insurrection - that is also a crime.

You are conflating Parler with its users.

> this is happening to a small degree on Twitter and FB - but they make some attempts to stop it

From the article:

> And contrary to what many have been led to believe, Parler’s Terms of Service includes a ban on explicit advocacy of violence, and they employ a team of paid, trained moderators who delete such postings. Those deletions do not happen perfectly or instantaneously — which is why one can find postings that violate those rules — but the same is true of every major Silicon Valley platform.


You've made some good arguments throughout this thread, but this one in particular is disingenuous. You can't market yourself to people who were deplatformed specifically for inciting violence and then credibly feign surprise when those people begin inciting violence on your platform. By the time the limited number of moderators get around to deleting or hiding posts, the damage is done, and everybody knows it.


A few things:

1. You either are unaware of the meaning of the word "disingenuous", or you know my own intentions better than I do.

2. Did Parler express surprise that some of its users attempted to incite violence (and in some cases succeeded in inciting it) on their platform?

3. Again, your tendency toward superlatives undermines the discussion: "everybody knows it" and "the damage has been done"? That is a very strong claim indeed — that you know Parler's moderation has been so ineffectual that every user on the platform is able to view all inciting content before it is taken down.


> You either are unaware of the meaning of the word "disingenuous", or you know my own intentions better than I do.

I'm simply crediting you with the intelligence and experience to understand that what someone says publicly is not always in line with their actual goal. Therefore, by pretending that Parler's terms of service represent their actual intentions despite evidence to the contrary, I believe you are being disingenuous.

> Did Parler express surprise that some of its users attempted to incite violence (and in some cases succeeded in inciting it) on their platform?

I have no idea. It doesn't matter. "I didn't think the leopards would eat my face!" is not a credible expression of surprise when you invite a bunch of leopards into your home and set them loose.

> Again your tendency towards superlative undermines the discussion, but "everybody knows it" and "the damage has been done"?

I don't think it's undermining the discussion to assume a certain level of conversational and contextual shorthand. "Everybody" does not mean literally every person on earth, it means "people with interest and experience in these matters". I apologise if English is your second language or similar - I'll try to be clearer in future.

> This is a very strong statement indeed, claiming that you have knowledge that Parler's moderation has been so ineffectual that every user on their platform is able to view all inciting content before it is taken down.

Not every user needs to have viewed content for that content to be damaging. However, the more people who see damaging content, the more damaging it is. Most social media platforms expose more recent content to more users, so damaging content will do most of its potential damage within a short time. Therefore, a platform with genuine intent to reduce damage needs to remove problematic users while also employing a highly effective moderation team to identify new damaging content as quickly as possible.

It stands to reason that a platform that only wanted to look like it was reducing damage could employ an ineffectual moderation team to remove content only after the majority of the damage was done. I suggest that's what happened here: it seems clear that large amounts of inciting content remained available for long periods of time (hours or days).


And, let's note that torrent sites are still widely available. Most torrent sites are simply better constructed for their niche than Parler was.



