
> Even if Parler hosted straight up illegal content, surely the proportional response is to block those accounts, not the entire platform? If no, why aren't we deplatforming Twitter or Facebook next?

I've made this exact argument. It's a blatant double standard that YouTube and Twitter, as you said, aren't held liable for the content their users upload, but Amazon and Google can deplatform whomever they want for whatever reason they want.

I've read a joke that we're officially in a cyberpunk world now that corporations are "going to war" with each other. I've thought for years that our society has been heading toward civil war, but only in the past year, with so many riots and so much corporate overreach, have I started to believe it might actually come sooner rather than later.



Except Twitter and Youtube routinely moderate their content.

It isn't perfect, but they are actively and publicly doing so. Parler publicly seems to take a stance of little to no moderation. I haven't read the hacked data from the site, on principle, given how it was obtained, but based on others' notes in this thread, that data seems to confirm it.


So let's think about this for a minute. What is the goal? To stop violence and to stop the spread of illegal material? Or to make it look like you want to stop violence?

If it's the latter, then let's continue to deplatform people. If it's the former, then let's ban Facebook, Twitter, et al., given their proven history of allowing violence and hate speech and of serving as essential tools for organizing mobs.

The logic here is terrible, and you can't argue against that: no matter how many users Parler gets, it won't come close to having the same reach.


> If its the former, then lets ban facebook, twitter et al given their proven history of allowing violence and hate speech and as an essential tool to organize mobs.

Sounds good to me... They're totally culpable for the domestic terrorism that happened at the capitol. Facebook too. Dogshit companies that track our behavior and make money off of outrage.


I think the goal is largely to remove illegal content, which is obviously a massive problem and one where you want to err on the side of more moderation. In that regard, it is a good thing that Twitter, YouTube, Facebook, etc. aren't perfect yet but are clearly trying to remove that kind of content, even if in some cases they blatantly aren't acting fast, well, or consistently enough.

Compare that to Parler, which built its whole marketing image on "absolute free speech," something that technically isn't legal, and its platform's contents reflect that.

Parler is blatantly NOT trying to stop illegal material, the other platforms are at least trying to...


You're missing the point. This is the real world, not the try Olympics. If I buy a burger from you at a fast food place, and you bring me back just a bun, but you tried really hard to make me that burger -- I don't care. Go get me my burger.

If you're trying really hard not to, but still facilitate 1000x the amount of violence (made-up statistic), it doesn't matter. The real-world results are what matter. At least, that's what should matter.


No, I think you are missing the point by a LOT. Your expectation is that EVERY burger must be made perfectly, and if it's not, it's fixed RIGHT away.

Even at a moderately busy McDonald's there will be errors in your order. You can always go back and tell them to fix it, and then you will wait... because they are busy filling new orders. And even after you've waited, there's a chance they forgot about your fix, or they remade it and it's still wrong, or something else happens.

This kind of thing happens all the time in the real world. In fact, restaurants and food-safety scores are the epitome of the "try Olympics." You can have a rat in your kitchen in San Francisco and still not be immediately shut down. As long as you are making conscious efforts to remedy the problem, you can stay open, and you often have a long time to fix it.


What is an objective measure we can codify into law as to whether a platform is moderated enough?

Parler likely had a much higher percentage of content advocating violence, but YouTube and Twitter probably host hundreds or thousands of times the quantity and reach of violence-advocacy content.


> Except Twitter and Youtube routinely moderate their content.

Counterargument: Twitter let "Hang Mike Pence" trend.


Didn't Twitter block that from trending [1]? Considering that it was moderated, how is this a counter argument to a claim that Twitter moderates content posted to its site?

[1] https://www.newsweek.com/twitter-stops-hang-mike-pence-trend...


YouTube and others try. Parler didn’t, on purpose.

This is really just a lesson about what happens when you squander your reputation and lose the presumption of good faith.


This strikes me as an important consideration.

However, it’s far too late in the day for me to put together coherent thoughts. Perhaps others further west could try. Thanks, and buona notte.



