Hacker News

I've noticed a difference between downvoting and non-downvoting discussion systems.

In downvoting discussions, the top post will represent the consensus of the people engaged in the discussion. Note that this isn't "the truth" or "the consensus of the community" but of the people engaged on the subject. For example, if you see an otherwise-liberal community discussing gun rights, the gun owners within that community will flood in, highly motivated on the subject, and so the consensus will reflect their interest in it.

But either way, consensus.

Without the downvotes, the top post is often misinformation or just trash, because a small, motivated group pushes it up and the rest of the participants can only argue against it, not drive it back down... this creates the "engagement-based content" that aggregators crave.

The problem, of course, is that bots and sockpuppets are treated the same as established community members and can skew the "consensus". Realistically, the leadership of an online community needs to be able to identify who are credible voices on a subject and give them the power to steer the conversation. Yes, it's not egalitarian, but these systems never have been.



> In downvoting discussions, the top post will represent the consensus of the people engaged in the discussion

I wish I saw more consensus - things that are agreed upon by multiple sides - but in practice I tend to see a tyranny of the majority - whoever can win the upvote vs. downvote contest. The winning side rises to the top, while dissenting voices are silenced.

One sensible thing that I thought HN did but I don't see immediately in the guidelines is to suggest that downvoting isn't supposed to be a "dislike" button but is intended for comments that are spam, trolling, etc.

On HN, the first post to gather a lot of replies also seems to rise to the top.


IMO there should be 3 voting options

I agree, I disagree, Noise/bullshit

The problem for me is that I want to read things I don't agree with, but I don't want to read bullshit.

I would especially love to read ideas with an even amount of agree/disagree but a low bullshit amount. Not just summed to zero and hidden. That is terrible.


Actually, Slashdot's system was the best, with Flamebait/Troll/Funny/Informative/Insightful options, with only a limited number of votes (they called it "moderation points") available on a given day (it was more complex than that; you didn't get mod points every day), and a meta-moderation system allowing users to rate the way people voted on any given comment.


Since upvoting presumably promotes a post, and "noise/bullshit" presumably penalizes it, what would "I disagree" actually do?

You might say that HN already has this system, distinguishing between downvoting and "flagging".


It would presumably have a lower weight.

If "Agree" is +1, "Disagree" is -1, "Shit take/bad faith/troll" is -2, and "report" is threats of violence or spam.


I agree. Many times, well-written and relevant comments are downvoted because many people hold a different strong opinion, while noise/meme comments climb to the top because they are funny and don't trigger people.


>downvoting isn't supposed to be a "dislike" button but is intended for comments that are spam, trolling, etc.

That rule actually rigs the system in favor of irresponsible voters.


The solution to tyranny of the majority is NOT to hide down-voted comments and provide sorting options. A big problem with reddit is that it hides comments below a certain threshold, essentially silencing minority opinions.


> below certain threshold essentially silencing minority opinions

These are not minority opinions, but unpopular opinions. Just because something is downvoted to hell and hidden, doesn't mean it's a minority.

A minority opinion is one which few people have, not one which is downvoted by the majority.


If you have a forum with 100 people and your opinion is downvoted by 50, your opinion is (within that forum) the minority opinion.

That doesn't necessarily mean this opinion is a minority opinion in the real world. In fact it doesn't even have to mean those who downvoted don't share your opinion, they might have downvoted you for a heap of reasons (maybe you were rude, or they saw your comment as pointless, maybe they downvoted every comment except their own, etc.)


Your second paragraph invalidates the first. What if all 50 of those people agree with you, but are downvoting you for another reason?


Then you might be in a (possible, but unlikely) edge-case scenario and should applaud the complexity of the reality surrounding us.

The truth is, you can never be sure whether your opinion is in the minority or the majority within any online forum just by looking at the up/downvotes of one post in isolation.


A minority opinion is downvoted by the majority because the majority disagrees with the minority opinion.


>sensible thing that...HN did...is to suggest that downvoting isn't supposed to be a "dislike" button

reddit says that too, as do other discussion forums, to no effect, same as here.


reddit has a "controversial" sort mode that is quite nice for balancing the winner-take-all approach that seems to plague all other systems.

Another thing I have noticed is on SO, which removes "karma" for each downvote you cast (note that I have seen that behaviour documented).


My experience is that on predominantly (left) liberal communities, a gun-rights topic's top post will rarely represent the view of gun owners.

Everything else you said resonated, but that part strongly contradicted my experience.


This is because the moderators will often ban all disagreeing members. I’m not talking about trolls, but genuine and good faith discussion.

Saying something like you don’t agree that investment properties should be illegal is enough to get you banned in many parts of reddit.


I'm specifically thinking of the CanadaPolitics subreddit, which is solidly centre-left, where this happens consistently. Ditto the Ontario subreddit. These are general-issues subs with centre-left politics that shift hard when a subject fixated on by an interest group pops up.

Likewise, threads on covid policy were generally supportive of the scientific consensus... until the subject of keeping gyms closed came up, at which point the bro-science faithful would appear to declare all covid restrictions a crime against humanity.

It's not astroturfing or anything malicious, just the natural result of a subject being of disproportionate interest to a highly motivated subset of the community.


But in a left-wing forum most members wouldn't be gun owners, so of course the consensus would sway toward the non-owners, as they're in much greater numbers. The OP wasn't saying it's a fair portrayal of the entire community, just a majority consensus. I feel the same applies here just as well.

Even non-gun-owners are entitled to an opinion on guns, as guns can be used against them too.


I disagree with designating people as credible, because I think content can suffer much more from heavy-handed moderation than from being "brigaded" by motivated individuals.

Because the latter is easily identifiable. You basically complain that people upvote the wrong opinions, which I would not disagree with.

Credible sources have also spread misinformation. I think the current system is fine, and heavy-handed approaches and information hierarchies are counterproductive. There are great communities without a leadership and with hands-off approaches. Your idea would lead to "influencers" getting more control over content, because those would be the "leaders". Can be good, can be bad. I think the latter would be the more common occurrence.

But people too invested in online communities also tend to do a large amount of damage. An example would be the people at the top of Reddit karma.

If we had "credible" leaders, there would be less room for objections, which I would like to see more often.


But with downvotes, we ensure that only popular opinions get visibility, defeating the entire purpose of sharing ideas within a community.


Unpopular ideas may be ahead of their time, but they may equally be just stupid or deliberately proposed in bad faith.

There are potential benefits to considering any given idea (you learn something useful or get smarter), and there are also costs (time to evaluate, possible bad outcomes).

Unless you are taking the negative possibilities into account, an all-beneficial "marketplace of ideas" does not and will not work.

Consider a farmer's market or flea market. If you go there with a bad product and it fails to sell, or previous customers tell you it turned out to be bad, then you either improve your product or start losing money. But if you could go to a small market with a big sound system and a team of marketing people to loudly insist that your products are the best, you would quickly wreck the whole market.


> Unpopular ideas may be ahead of their time, but they may equally be just stupid or deliberately proposed in bad faith.

It is more than an idea being ahead of its time or stupid.

People need to take a lot of cognitive shortcuts in their model of the world to be able to compute it, so the model is evidently imperfect. But having holes poked in one's model is nonetheless painful. Being told you might be wrong, and then realizing you were wrong, is painful and scary.

Most people do not use up/down votes, blocks, bans this heavy-handedly because of the expected truthiness of the ideas in question; they primarily do it based on the idea's conformity to their sense of identity and to protect themselves from the anxiety of having been wrong.

And this is not their fault. Polarization is ultimately a failure of integration, of having a complex model of the world. But that is work, and it is painful. The more that work is postponed, the more our models drift from reality, making it even more painful to catch up. The tools of discourse we have been given provide the perfect mechanisms for facilitating this avoidance, because they feed on engagement, not truth. It is like saying "I'll keep telling you sweet lies as long as you don't leave me; never mind how much that hurts you in the long run".

> Consider a farmer's market or flea market. If you go there with a bad product and it fails to sell, or previous customers tell you it turned out to be bad

Or they could be diverting you from mere hard truths. When it comes to ideas, "comfort" is the wrong metric to optimize for.


> Unpopular ideas may be ahead of their time, but they may equally be just stupid or deliberately proposed in bad faith.

That's where moderation comes in. Honestly, I think good moderation is better for weeding out trolls and terrible ideas than downvotes. All sorts of idiocy makes it to the top of Reddit, for example.


I would have downvoted here instead of arguing with you as I am doing, lol. I disagree. I strongly dislike moderation (it feels like an unfair boss coming in and using tricky stuff to avoid directly addressing the moderated party: cowardice, injustice, etc.), and such lousy grammatical constructions as "you are posting too fast". Someone inform the HN bot that quickly is an adverb and that fast is an adjective, many incorrectly written street signs notwithstanding.


"Fast" may also be used as an adverb. See: https://dictionary.cambridge.org/us/dictionary/english/fast


I have been accused of bad-faith arguments when that wasn't my intent.

> There are potential benefits to considering any given idea (you learn something useful or get smarter), and there are also costs (time to evaluate, possible bad outcomes).

You don't have to evaluate every comment. You just let it stand and let others that want to invest the time come to their own conclusions. Perhaps you misunderstood the comment, have a different perspective or fail to get the context. This can be true for any comment you read.


In most cases that’s a cue to re-think how a comment is phrased or to pre-emotively address why most people’s knee jerk reaction to it is too hasty. In some cases, yeah the crowd just isn’t receptive to a point of view but most often it’s a matter of phrasing or another issue that needed a second look before posting.


Usually people think in terms of likes and dislikes; "stupid" and "bad faith" are rationalizations that don't even need to be provided, because they can't be requested.


When the unpopular opinions are things like "COVID is just the flu" and "climate change is fake", both of which have a substantial contingent that will support them, those ideas getting downvoted into oblivion is not a bad outcome.


Sure. But it's a double edged sword. "You should wear your mask" is getting downvoted (edit: I'm using the term loosely here, I hope it tracks) across the country and responsible for hundreds of thousands of deaths. We need to engage with ideas we don't like, and be open to persuasion. Downvotes preclude that.


No we don't, and no they don't. If you engage with ideas you don't like and debate them in good faith, and the promoters of said ideas keep just pumping them out (perhaps even using botnets or whatever) while ignoring all your good-faith attempts to refute them, then you are wasting your time.

At some point you have to cut your losses; if you stay open to persuasion by people who consistently exploit your willingness to listen, you're being played. Downvotes do serve the excellent purpose of saying "I disagree with this and am not willing to waste further time on it."

Downvotes can also be abused. But an easy way to get around that is to make the system fully transparent and then do cluster analysis. If a downvoter or group thereof only ever puts out negative votes, or habitually downvotes everything from someone they don't like, that will show up after a while and people can down-weight those negative votes. Absent transparency, the incentives to game the system go way up, and system operators usually wildly overestimate their own ability to guard against that.


I'd love to see a site like Reddit or HN experiment with using relationship graphs to research this. Start with some fixed points of "this person is a credible source of information" vs "this person is a neo-nazi conspiracy theorist" and examine the networks of people who support them and oppose them, and use that to establish credibility-scoring system-wide and weigh people's votes based on that.

I mean, it would be super-biased but at this point I'm all about systems that are biased in favor of scientific consensus and human rights.

Yes, you could reverse the "credibility" of those paragon users and make a system where the people surrounding the neo-nazi conspiracy theorists are the ones who hold all the power, but we already get sites like that without any moderation or voting at all, so that would be redundant anyway.


Make it even simpler. Mark yourself as the only source of karma. Then when you upvote people, they also become sources of karma; and when you downvote them, they become sinks (their upvotes might actually subtract karma.) This effect should attenuate with distance from you (i.e. your upvote is +1, the person you upvoted's upvote is +0.9.)
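A minimal sketch of that propagation, assuming a simple vote graph (all names hypothetical): each user's weight decays per hop from the root user, and downvoted users become sinks that don't extend trust further.

```python
from collections import deque

ATTENUATION = 0.9  # each hop from the root multiplies weight by this

def vote_weights(root, votes):
    """votes: dict mapping voter -> list of (target, +1 or -1) votes.
    Returns each reachable user's vote weight from the root's perspective."""
    weights = {root: 1.0}
    queue = deque([root])
    while queue:
        user = queue.popleft()
        w = weights[user]
        if w <= 0:  # sinks (downvoted users) don't extend trust further
            continue
        for target, sign in votes.get(user, []):
            if target not in weights:  # first assessment wins, BFS order
                weights[target] = sign * w * ATTENUATION
                queue.append(target)
    return weights
```

For example, if you upvote `a`, `a` upvotes `b`, and you downvote `c`, then `a` weighs 0.9, `b` weighs 0.81, and `c` weighs -0.9 (and `c`'s own votes count for nothing).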


That sounds like a recipe for creating filter bubbles.


It is. But diverse bubbles.

I think this is still better than just one bubble, even if the intention is a "pro-science" bubble. Because there are not many people claiming to be opposed to science. Almost everyone claims to be "pro-science", even flat-earthers. So it still comes down to what the specific mod thinks.


I'm very into that (because I spend a great deal of time studying neo-nazi conspiracy theorists and similar people).

I think platform operators to some extent have to eschew moral judgements and let things play out, e.g. if a variety of distinct and persistent clusters emerge naturally, and 98% of people in a community make nazis' lives miserable for being in the "nazi cluster", then it doesn't need any additional help from the platform operator. Of course the nazis will whine "so much for the tolerant left", but honestly, who cares. The key is that if enough people in a community tag individual participants in a particular negative way, those individuals can either accept it (and then be habitually ignored) or change their ways.

This isn't always a moralistic dichotomy. For example, I post a lot of environmental stories to HN, which are uninteresting to some readers who prefer to focus on pure technology. I'm OK with them ignoring such posts or even downvoting/flagging them if I make too many in a row. Likewise there are many technical topics of wide interest on HN which I find deathly boring, but which I am happy to ignore.

Researchers at Stanford have come up with some results on clustering and conflict in online spaces, which you might find interesting: https://snap.stanford.edu/conflict/


> because I spend a great deal of time studying neo-nazi conspiracy theorists and similar people

Really sorry to hear that. I hope it is a hobby and not your job.

I remember when Nazis were a joke on the internet. That they are not quite that anymore is not completely of their own making, as there are some people obsessed with them who even start to mirror them. Or others who want to build a profile by standing against the obvious.

Suddenly there is a significant group that denies civil liberties like free speech or freedom of association. Free speech is interesting here, because it seems to trigger those who fear that Nazis are overtaking the internet. To me this is very indicative of reactionary behavior towards a vastly overestimated threat.


What about clustering them by upvote/downvote patterns (i.e. finding the bots) and then "shadowbanning" their votes? (I.e. making them count as one, or in any case as a fraction of the total.)

Maybe they would be hard to distinguish from regular users clustered because of groupthink, but even in that case I wouldn't consider it a completely bad thing… :-)
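As a toy illustration of the idea (exact-match clustering standing in for real cluster analysis, all names hypothetical): voters with identical voting patterns are collapsed so the whole cluster counts as a single vote.

```python
from collections import defaultdict

def deduplicated_score(ballots):
    """ballots: dict voter -> {post_id: +1 or -1}.
    Returns post scores where identical voting patterns collapse
    into a single vote, so a cluster of lockstep voters counts once."""
    clusters = defaultdict(list)
    for voter, votes in ballots.items():
        clusters[tuple(sorted(votes.items()))].append(voter)
    scores = defaultdict(int)
    for pattern in clusters:  # each distinct pattern counts exactly once
        for post, sign in pattern:
            scores[post] += sign
    return dict(scores)
```

With this, two bots upvoting post `p` in lockstep contribute +1 total, so a single dissenting human cancels them out.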


There could be a collection of more meaningful one-click reactions ("I disagree", "This contains too many errors to address", "I don't like this person", "This is misinformation/abuse/dangerous"). The downvote (and the upvote, too) doesn't communicate enough.


Yeah.

"Upvote only communities" are more prone to brigading by hyper-motivated cliques or sub-groups.

"Upvote+downvote communities" are more resistant to the above, but also more prone to hivemind.

It's not clear to me that one is better than the other. It is clear to me that both have significant flaws.

Ideally what you really want to promote would be well-reasoned/well-sourced posts from a variety of viewpoints. In practice I simply don't think this can be achieved solely by community voting. It is simply too easy to click an upvote or downvote button for random community members that don't have the incentive to take a more thoughtful approach.

Every successful community I've seen has had some level of strong moderator involvement, including HN.

(The HN mods kill a lot of low-value posts. Because they do their job well, and they do it with a low profile, many folks are unaware of how much moderation they do...)


An idea I've been interested in trying is requiring people to downvote for a reason: "unproductive," "inaccurate," "off-topic."

The hope is both to increase friction and to allow each person to decide which downvote reasons they want to respect. It also forces people to foreground why they are downvoting and gets rid of "why the downvotes" and changes it into "why is this unproductive?"
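A rough sketch of how reason-tagged downvotes plus per-reader filtering could work (the reason names are the ones suggested above; everything else is hypothetical):

```python
# Downvotes must carry a reason, and each reader chooses which
# reasons to respect when scoring a thread.
REASONS = {"unproductive", "inaccurate", "off-topic"}

def visible_score(upvotes, downvote_reasons, respected):
    """Score as seen by one reader: upvotes minus only those downvotes
    whose reason the reader has opted to count."""
    assert respected <= REASONS and set(downvote_reasons) <= REASONS
    return upvotes - sum(1 for r in downvote_reasons if r in respected)
```

A reader who only respects "inaccurate" would see a 5-upvote comment with downvotes tagged inaccurate, off-topic, inaccurate at a score of 3, while a reader who respects no downvote reasons would see it at 5.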


Hahahahahaha.

This is exactly what reddit told its users to do (only downvote due to one of these reasons) - and exactly nobody does it.

People will figure out to downvote for whatever "reason" that maximizes the impact of the downvote real fast; groups of people will make sure to weaponize it too even without coordination.

E.g.: "the comment I hate doesn't have enough 'inaccurate' downvotes to sink into the abyss, so I'll do my job and add one".

The only reason why things are downvoted is "I don't like what I feel when I read it", which most of the time is "I disagree with what this says" or "I hate people who say things like that" (the latter more commonly).

You want people to dislike things for the "right" reasons — but the only way to do this is to filter the people, not the content.

(I've been a mod of a decently-sized subreddit, and a user since 2008)


> This is exactly what reddit told its users to do

I'm not on reddit much these days but when I opened it just now I was able to downvote without choosing a reason? So I do not think it is using the system I described.

The ability of each user to choose which downvote categories they want to respect would be an important part of making the system harder to "game" (to the degree that one is "gaming" a voting system by voting).


Reddit says to only downvote if a comment does not contribute to the discussion, not if you disagree.

So downvoting is an implicit choice of only "good" reasons.

Putting a drop-down list with "good" reasons is as good as displaying a pop-up: "Are you sure you are downvoting for <good reasons>?" - which will be promptly ignored.

Users will just be frustrated for a bit until they train themselves to pick the category that maximizes the effect of their downvote.

The only thing you can do with categories is display the distribution / most prevailing category alongside the score (like Slashdot has been doing for decades).

In all of this, you seem to want to control why people downvote.

Instead, examine why people downvote. Because categories won't change that.

It would be like the emoji committee not including an emoji for penis. That worked well, didn't it?

Unless you add categories like "I think the author of this comment is a moron and I hate them", people will shoehorn your categories for this use case.

And if you do that, I'm not convinced the result will be necessarily positive.


> Users will just be frustrated for a bit until they train themselves to pick the category that maximizes the effect of their downvote.

If each user can choose which downvote categories they want reflected in their view of the site, this dynamic doesn't work the way you are describing.

> examine why people downvote. Because categories won't change that.

It seems to me like forcing people to label why they are downvoting would be very helpful to examining why people downvote.

> It would be like the emoji committee not including an emoji for penis. That worked well, didn't it?

I think it worked really well? Like, I wouldn't mind a penis emoji, but it turns out we've got an informal penis emoji and it works great? What do you feel did not work well about it?

It seems like you are thinking that I am thinking that this system would...prevent bad downvotes or make people think twice or...I dunno, be better? I was not thinking that.

The idea is more about forcing people to contextualize their downvotes in a way that can be observed and reacted to by the community at large. It would be a way of talking about why people downvote without making a particular person defend themselves. It would, hopefully, add another layer to the communal sense of why people are engaging in the way they are.


I don't think that the suggestion was to tell people what to do, but to replace the downvote button with multiple buttons that represent different reasons one would downvote. One of the buttons could be "this is wrong and stupid."

If you're intentionally trying to restrict expression by eliminating options you don't like, you might as well just remove downvotes. If I'm reading it correctly, this suggestion would just add more signal to downvotes (even if irt karma they were all the same.)


I didn't get the impression that the parent comment meant the same, but I fully agree with what you are saying.

>If you're intentionally trying to restrict expression by eliminating options you don't like, you might as well just remove downvotes.

My point essentially.


No idea if they still do, but Slashdot used this back in the day. You were given karma points occasionally to pass out, and you'd have to select what you were rewarding/punishing.


They still do, and the quality of discourse on Slashdot is about as bad as it has ever been.


No, it was pretty good in the late 90s and very early 00s. Then it got competition from digg and reddit, and lost readership. Only the trolls persevered, and here's the key: without quality conversation, dedicated readers couldn't earn enough meta-karma to moderate. The "never delete a comment" policy remained for about a decade, during which every story was quickly brigaded by racist, sexist and homophobic slurs, but mostly the n word; only a handful of diehard commenters remained.

To this day, I prefer Slashdot's summaries over headline-only aggregators, and I wish other sites would adopt features like mod-with-reason and karma caps per comment. But they designed the system to limit the power of moderators, and those limits didn't scale with the imbalance between trolls and good-faith posters.


After I stopped reading /. every day sometime around 2010 or so, I've started browsing it again about once a week. There's less discussion than there used to be, but it's no better or worse than it used to be; there's certainly far, far less of the pointless noise that took over before I left.

Considering the amount of fuckery people engaged in during the site's heyday, its moderation system handled things amazingly well. If you read at a threshold of +1, you'd barely see any of the garbage posted there.


It sounds like you both agree that mod-with-reason alone didn't save Slashdot's level of discourse :)


You got downvoted for this, which I find ironic. IMO, sharing an opinion also shouldn't equate to a downvote. Adding a comment that contributes nothing to a discussion, on the other hand (maybe something like "I agree with that." with nothing more of substance added), should equate to a downvote, since that sort of post doesn't further discussion; the thread ends there.


I like your explanation. It makes a case for premium paid Twitter tweets featuring downvotes in their threads, to allow only consensus-based discussion.


> Without the downvotes, the top post is often misinformation or just trash. Because a small, motivated group pushes it up

Nonsense.

A small group has a small impact, so as soon as there is more than one post, a bigger group will quickly "win".


This comment should have been the article. It is clearer, better reasoned, and more interesting. I mean: thumbs up!


So post more quality content.



