What other less-toxic system could work instead of karma?
Hey! Thanks to the whole Reddit mess, I’ve discovered the fediverse and its incredible wonders and I’m lovin’ it :D
I’ve seen another post about karma, and after reading the comments, I can see there is a strong opinion against it (which I do share).
I’d love to hear your opinions, what other method/s would you guys implement? If any ofc
Posts should just be upvoted and downvoted with no credit given to the person who posted. Same goes for comments. In my opinion, upvoting and downvoting should just help the user find the most relevant information. Content that people upvote is the most seen. Content that people downvote is the least seen. Posters and commenters stay on an equal footing with no points system.
It makes sense to rank content, but ranking users just begs abuse of the system. There are always people who will try to farm the system, resulting in lower-quality content. It's also an attack vector for bots.
I don't miss the "karma" aspect one bit here. Rate my post quality, not me. On the other hand, tools for ranking users privately could be helpful. In other words a personal ranking for your eyes only would be fine.
It’s a shame, but any sort of number-based system will most likely end up with the same problems as karma. Not having the numbers add up is a good start though, since upvotes and downvotes are only really useful as ‘in-the-moment’ indicators of good vs bad content.
Let’s keep it how it is, so that we don’t have another social credits system that doubles as a dopamine factory.
We should keep it as is. Having an account score just amplifies a big issue with social media. The content should be in focus, not the people posting. A relevant comment should be elevated because it itself is good. In the same way, we shouldn't judge something because the user has low karma, but because the content is bad.
The idea behind something keeping a score on a profile is good, but it doesn't work as intended in practice. People will farm in whatever way they need to get a moral highground. Not having such a scoring system will be a good way to reduce the incentive to copy/paste content from others.
Karma and votes should stay but be hidden from other users. Karma is a good way to detect bots and trolls, but only admins and moderators should see it, to act on them if needed. Up/down votes should be hidden too because of the hive-mind phenomenon they produce (experienced on Reddit): often the funny or sassy or apparently clever comment gets upvoted, and sometimes the comment with real knowledge about the post gets downvoted because the first joke was funny. Many people may not have an opinion about the issue but upvote the funny guy and downvote the real answer, just following the hive. With votes hidden, each person reading must decide for themselves whether to upvote or downvote a comment.
Web of trust. The biggest thing missing from most attempts to build social networks so far. A few sites did very weak versions, like Slashdot's friend/foe/fan/freak rating system.
Let me subscribe, upvote, downvote, filter, etc specific content.
Let me trust (or negative-trust) other users (think of it like "friend" or "block", in simple terms)
Then, and this is the key... let me apply filters based on the sub/up/down/filter/etc actions of the people I trust, and the people they trust, etc, with diminishing returns as it gets farther away and based on how much people trust each other.
Finally, when I see problematic content, let me see the chain of trust that exposed me to it. If I trust you and you trust a Nazi, I may or may not spend time trying to convince you to un-trust that person, but if you fail or refuse then I can un-trust you to get Nazi(s) out of my feed.
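The trust-propagation idea above could be sketched roughly like this. This is a toy breadth-first walk in Python; the graph shape, the multiplicative decay rule, and the depth cap are all assumptions for illustration, not any existing Lemmy feature:

```python
from collections import defaultdict

def effective_trust(trust_edges, me, max_depth=3):
    """Propagate trust outward from `me` through the trust graph.

    trust_edges: {user: {other_user: trust in [-1.0, 1.0]}}
    Trust diminishes multiplicatively with each hop, so distant
    endorsements count for less than direct ones, and negative
    trust is never propagated further.
    """
    scores = defaultdict(float)
    frontier = [(me, 1.0)]
    visited = {me}
    for _ in range(max_depth):
        next_frontier = []
        for user, weight in frontier:
            for other, t in trust_edges.get(user, {}).items():
                if other in visited:
                    continue
                visited.add(other)
                scores[other] += weight * t
                # only walk onward through people trusted positively
                if t > 0:
                    next_frontier.append((other, weight * t))
        frontier = next_frontier
    return dict(scores)
```

With `{"me": {"alice": 0.8}, "alice": {"bob": 0.5}}`, Bob ends up at 0.4: trusted, but less than Alice. Keeping the per-hop path around would also give you the "chain of trust" view described above.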
Subs should be able to force sort by controversial for comments and/or posts.
Any damn fool can come up with comments that are universally approved of, or universally hated. They aren't interesting.
The phrase 'trivially true' applies - "This crime was a bad thing, and the people responsible shouldn't have done it! I am very angry at them!" may be emotionally satisfying to say or to cheer on, but it doesn't add a damn thing to the conversation, any more than "hur hur suck it libruls" does.
There isn't a term for the inverse of ragebait, but there needs to be. All the le reddit moments - the tedious meme-chains, forced in-jokes, etc.
For subs where you want interesting discussion, you want to sort both to the bottom. It's the posts that divide opinions that are worth talking about, almost by definition. If a post has a thousand votes but the total is close to zero, well hey, that's probably worth seeing and engaging with.
Let people vote with their heart, use upvotes/downvotes however the fuck they want to instead of constantly nagging and whining about it - and then use that to detect and de-prioritise mediocrity.
It wouldn't be appropriate for all subs, but for some places, I think it'd be a huge improvement.
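As a sketch, "divides opinion" could be scored like this. The formula is hypothetical, invented for this example; it is not how Lemmy or Reddit actually implement their controversial sort:

```python
def controversy(up, down):
    """Hypothetical controversy score: lots of votes, near-even split.

    Returns 0 for unanimous or unvoted posts; grows with total
    engagement when opinion is split down the middle."""
    total = up + down
    if total == 0:
        return 0.0
    balance = min(up, down) / max(up, down)  # 1.0 when perfectly split
    return total * balance
```

A 500/500 post scores 1000, a 1000/0 post scores 0, so universally loved (or hated) content sinks and genuinely divisive content rises.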
my take: up only, no down, per-post only, no account. if someone is repeatedly a problem mods can show them the door.
karma systems have been around forever, allegedly to decrease mod/admin workload managing users by having them "self moderate", and that has NEVER been the actual effect. They've only ever been an engagement metric for advertising, and it didn't matter whether the engagement was positive or negative: if people were angrily downvoting, they were still engaged. I've witnessed site after site add these systems and then the userbase turn into a toxic cesspool afterwards. In almost 30 years I've only seen one roll back the change, even partially. Their culture never fully recovered, and it's still dominated by people agitating for attention and trying to one-up their perceived rivals.
Let reddit things die with reddit. Long live Lemmy.
Not really sure what you think is wrong with karma? Most of reddit's problems IMO come down to bad moderation.
But for comment scoring, there are really just 3 methods I've seen:
Generic Up/Downvote - Reddit
Categorized Up/Downvote - Slashdot - This worked on a technical forum to keep technical knowledge near the top, while still allowing stupid/funny comments further down the page, plus it made ignoring stupid/funny threads easy
Personalized Up/Downvote - Facebook/Twitter/etc - basically build a profile of users you agree/interact with, and then weight their interactions accordingly to predict what content you'll like/hate.
I believe TikTok takes this to the next level: because 90% of users don't up/downvote, TikTok logs the passive act of continuing to watch content as a partial upvote, making their algorithms train on the average user's likes/dislikes faster.
You could probably combine Personalized & Categorized, but I've AFAIK not seen it done.
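The passive-vote idea could be sketched like this. The shape of the function and the 0.8 cutoff are invented for illustration; this is not TikTok's actual (unpublished) scoring:

```python
def implicit_vote(watch_seconds, duration_seconds, full_vote_at=0.8):
    """Treat passive watch time as a fractional upvote, in the spirit
    of the TikTok description above.

    Returns a value in [0, 1]: watching most of a clip counts as a
    full upvote, bailing out early counts proportionally less."""
    if duration_seconds <= 0:
        return 0.0
    fraction = watch_seconds / duration_seconds
    return min(fraction / full_vote_at, 1.0)
```

So a viewer who watches 8 seconds of a 10-second clip registers a full vote, while one who bails at 4 seconds registers half a vote, without either ever touching a button.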
I think the problems with moderation are harder to solve, because you have both bad-faith moderators & good-faith but easily played moderators as problems, and you also want different dynamics as forums grow.
I think lemmy could really experiment with good moderation & meta-moderation and if the developers are interested anyway, be a far better forum as a result.
Peer review of moderator decisions is something Slashdot did that went quite well. Once you'd been an active user with good "karma" for a while, you would occasionally be asked to review other users' votes. I think a similar thing could be done for moderation decisions.
Elected mods. For subs above a certain size, having moderation essentially boil down to whatever the guy who created the sub decides is bad. I don't know exactly how it would work to prevent abuse, but as subs grow, at some point it would be good if the community chose the mods.
Even short of full-fledged democracy, community approval of mod appointments would certainly reduce the amount of mod drama where one bad head mod purges the other mods and replaces them all with sock puppets.
User-led replacement of bad mods: similar to electing mods, it would be good for users to be able to "recall" a bad mod.
Transparency over mod actions: I understand that with the number of Nazis and other assorted trolls online, reddit chose to let mods moderate anonymously, but it really means you have no idea who is doing a good or bad job in many subreddits. Some level of transparency for all but the worst content is key.
Moving subs: as lemmy instances have some control over the content of the subs that reside on them, it would make sense for there to be some method for the users and mods of a sub to decide to move it to another instance. This not only prevents admin abuse, but also encourages competition between instances for technical administration and content administration.
Splitting communities: sometimes subs grow "too big" and have different subcommunities that end up fighting for control. It would be good if there were a way for these communities to split in two rather than fighting over the original name. Not sure how it would work, but think about how r/trees and r/cannabis split, or something similar. Maybe /r/cannabis could become a combo of /r/cannabisnews and /r/cannabismemes, where users can just unsub from the half of the content they don't want.
Letting users weight or filter how much of each sub they see: sometimes I've unsubbed from a high-volume sub just because, while I liked the content, it was overpowering the rest of my feed. It would be nice to let users configure how much of a sub they see (especially if combined with Categorized Up/Downvote). Rather than complaining about "bad moderation", I can just personally choose to see less of what I don't want.
Anyway thank you for reading/not-reading my ted talk, but I suspect this will come up again so now I can copy/pasta it.
Unfortunately, anything you replace karma with will have the same problems that karma has. Any indicator of comment or user quality will be readily gamed by anyone with any skills whatsoever in automation.
I like the system as it is here at the moment.
Up-/downvotes per post/comment to show popularity (and express (dis)approval), but nothing accumulated per account, so no one gets encouraged to post just for the karma.
Karma does well in my opinion; however, it should display the number of upvotes and downvotes, not just one number. Also add an option to sort by the number of downvotes.
I think it is fine now. I don't really want karma totals or a wave of different colored reactions. Upvotes or downvotes. I can't even downvote here and haven't missed it.
Absolutely nothing. Reducing people to a number and ranking their value based on that is inherently wrong.
Keep it simple, the current Lemmy system works fine. Spambots and particularly disruptive people should just be banned anyways, a gamification system would not solve any issue on that front.
This may be a dumb idea, so please go ahead and tell me if it is.
It seems people use it to know who to block in advance. What if you got a red name or some kind of info on the profile once the account has been blocked by other users more than 10 or 20 times? Would that help? That would suck for that account because it would forever be the asshole account. But at least no one would really want to farm that, except the trolls who want people to know they are trolls.
I like how it is now tho. It is good when the mods are responsible first and foremost instead of a system.
I don't think having a rating system that could be farmed or abused is a good solution.
There should be no incentive to generate content just for the publicity of the account.
All the content ends up being reposts of low-effort things that are just more relatable, which, in all honesty, I find really lame.
I loved forums, and we didn't have anything, except for 'total posts' and 'total replies' for users. I like that.
I do like upvoting, but I think karma should be hidden. Maybe you'd have to go to a user's profile and click a button to see the value. It should not show when you hover over the user in a discussion, like on Reddit. That's too much incentive for karma farming.
I don't like downvotes, and that's the reason I'm on lemmy.one - no downvotes here at all
I agree that it might get toxic at one point, but I'd much rather see extra preventive measures to stop repost bots, karma farming and so on rather than removing points altogether. Maybe it also helps to see karma breakdown by community: say you see someone answering a technical question on a specialised community - it would be of little to no relevance there that they might have 10000 points on r/funny or r/aww - I'd much rather see their points on that specific community.
Here's a crazy idea. What if downvoting a comment/post resulted in a weighted random float between 0 and 1, while upvoting resulted in a weighted random float between 1 and 2? If you virulently hate a comment or post, ignoring it is the surest way to bury it completely. Posts and comments that garner attention become the most visible, but gaming the system for visibility could become difficult if the weighting algorithm was tuned appropriately.
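A minimal sketch of that scheme, assuming plain uniform weights (the "appropriately tuned" weighting the commenter mentions is left out; the seeding is only there to make the sketch reproducible):

```python
import random

def vote_value(is_upvote, rng=random):
    """Downvotes still contribute a little visibility (0..1),
    upvotes contribute more (1..2); silence contributes nothing,
    so ignoring a post buries it hardest."""
    lo, hi = (1.0, 2.0) if is_upvote else (0.0, 1.0)
    return rng.uniform(lo, hi)

def visibility(votes, seed=0):
    """Sum of random vote weights; `votes` is a list of booleans
    (True = upvote)."""
    rng = random.Random(seed)
    return sum(vote_value(v, rng) for v in votes)
```

The interesting property is that even a downvote adds some visibility, which is exactly why ignoring content would be the strongest form of disapproval under this system.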
I don't think you can have anything in the same spirit that isn't toxic and doesn't encourage brigading by minority groups who want to cancel opinions they don't like. The whole concept is simply glorified ad hominem.
Personally, I enjoy how 4chan does it. Posts are anonymous and seen in chronological order. There are no upvotes/downvotes and no karma or e-peen linked to accounts. Also, everything is temporary and gets deleted off the site completely eventually.
The pros are that you have to judge posts based on their content and nothing else, and you can't look through someone's post history and stalk them across the site.
The cons are that you have to sift through tons of shitposts just to find decent content, which will eventually get deleted. You can't find the top posts of all time, because they're gone. The system also makes it a great place for trolls. 4chan has so many of them that threads entirely made up of trolls trolling trolls are a thing.
If you're looking for a site that ranks the best content then a voting system is needed. If you're looking for discussion then a voting system and permanent accounts only get in the way.
I would have optional, per-magazine karma. Mods can decide if they want to enable it and what rules it should follow. Personally, I would cap it at some low number, like 100; above that you're an upstanding member of society and that's it.
I think a reputation system is important. Though reddit's current karma implementation is bad, there needs to be a method of identifying bad actors and forum shifters.
One refinement over karma could be that the score is kept only per community, and should reflect that user's contribution to that community.
Simple upvotes and downvotes also don't allow for nuance. Replace them with a Buzzfeed-like tag system (yes, I know we all hate the site for its content, but its tag system, if used properly, could be pretty powerful).
So instead of 'up' and 'down', you have a clickable emoji-menu like list of tags like 'interesting', 'boring', 'funny', 'WTF!?', 'Quality', 'Trash', 'Educational', 'CAT', etc...
So the reputation score for the community isn't just a flat number, rather it will tell you the kind of content a person posts over time, and doesn't carry just flat positive or negative connotation.
I mean, the king of catposting may have massive reputation in meme subs with high ranks in tags for 'Funny', 'Cute', and 'CAT', though that might not be the case if they participate in, say, a chemistry Q&A community.
These scores are built up over time from each user's contributions to the sub (post AND comment reputation are the same thing), as scored by other people's tag selections on that user's posts. The more those align with the community, the greater their contribution score.
Does this mean that toxic communities can form that exclude people based on reputation tags that the toxic community detests?
Unfortunately yes, that is one of the flaws of the system.
THOUGH
The fact that it is contained per community means that a high-rep person in an anti-trans community will not have any carryover reputation when joining a community they wish to brigade or degrade, and their tag history will make it easy to determine whether their engagement is genuine.
The whole point of them was to create a flow of content with minimum human intervention. That’s a huge goal and The Dream if you’re making money off social media. If you’re not making money off social media then it’s not doing you any good.
What if we had a community standing metric that flips only between "good" and "bad."
You get "bad standing" if the majority of your contributions in the last 6 months have more downvotes than upvotes, and it resets after 6 months.
Everyone defaults to "good standing".
This serves the purpose of a metric to filter out trolls or bad-faith actors, whilst making "karma farming" pointless.
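The good/bad flag above is simple enough to sketch directly. The 182-day window and the "majority of net-negative contributions" rule are taken from the description; everything else (data shapes, names) is invented for illustration:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(days=182)  # roughly six months

def standing(contributions, now=None):
    """contributions: list of (timestamp, upvotes, downvotes) tuples.

    Returns 'bad' only if a majority of contributions inside the
    window are net-negative; anything older is forgotten, so the
    flag naturally resets after six months."""
    now = now or datetime.utcnow()
    recent = [(u, d) for ts, u, d in contributions if now - ts <= WINDOW]
    if not recent:
        return "good"  # everyone defaults to good standing
    negative = sum(1 for u, d in recent if d > u)
    return "bad" if negative > len(recent) / 2 else "good"
```

Because the output is binary rather than a number, there is nothing to accumulate or show off, which is what makes farming it pointless.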
The worst part about using reddit when I first signed up was having to deal with celebrity redditors with bajillions of karma sucking all the air out of any thread they visited. Thankfully, it seems like over time people calmed down a bit with that, or maybe I just started browsing non-defaults with more tight-knit communities, but you still have dumb novelty accounts that kind of ruin the experience (if you've ever been got by /u/shittymorph, you know what I'm talking about).
How about expanding the votes into multiple kinds of karma. Make it possible to place several votes: "on topic/off topic", "funny/boring", "shit post", "solution to the question", "agree/disagree", "political", "Interesting", "Spam", "Original Content", etc.
Communities could create whatever rating is suitable for their forum.
Sort of like tags, but votable.
It would basically reward everyone for what they do (being a level-7 funny shitposter is something), while at the same time making it possible for others to filter out anything they don't care about.
So instead of clicking downvote because of disagreement or upvote because it's funny, there would be an outlet for that in its own vote.
I feel that would make it easier to find quality content whether you're looking for serious debate or the hottest memes.
We'd need better comment filtering on individual communities, and it could/would be abused, but overall it would facilitate having different kinds of conversations on the same topic.
Sometimes I want to read funny stuff in serious topics and sometimes there is serious stuff in funny threads, and sometimes people write clever stuff that I disagree with and so on. One vote is just not enough.
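A minimal sketch of votable tags plus per-reader filtering, in the spirit of the suggestion above. The tag names, the majority threshold, and the data shapes are all made up for illustration:

```python
from collections import Counter

def tally_tags(votes):
    """votes: iterable of tag strings chosen by readers for one post."""
    return Counter(votes)

def filter_posts(posts, hide_tags, threshold=0.5):
    """posts: list of (post_id, tag_votes). Hide a post if any
    unwanted tag accounts for more than `threshold` of its votes,
    so one stray 'funny' vote doesn't hide a serious answer."""
    kept = []
    for post, votes in posts:
        tags = tally_tags(votes)
        total = sum(tags.values()) or 1
        if not any(tags[t] / total > threshold for t in hide_tags):
            kept.append(post)
    return kept
```

So a reader after serious debate hides 'funny'-dominated posts, while a meme browser hides 'political' ones, both from the same thread.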
Something that is based on user activity and active users after a specific time, i.e. the more users it has, the more highlighted it gets.
(And before someone brings up bots, etc.: that can be easily handled with filters and the like.)
I was thinking maybe some kind of ranking system like Street Fighter 6?
I know everyone seems to hate karma, but I do like that dopamine release. Ofc it will get abused... but what if there were just tiers, rather than seeing a number go up.
And at the highest tier, it doesn't matter anymore. That way you can see who is most active, and it gives just a bit of prestige. Furthermore, you won't see a number going up forever, so after a while it's not like you want to keep gaming the system to watch the number climb. But at the same time you can feel some sense of progression.
Anyways, it's just a random thought I had as I am grinding on SF6 today haha. I could easily do without karma but it's just a thought.
I still firmly believe one of the worst things to happen to the internet, besides pop-up ads, is up and down votes. Nothing exposes a misanthrope quicker than forcing them to comment instead of passively downvoting everything they see. Which makes it easier to remove them from the party.
Honestly, I find the entire system annoying and counter to fostering real discussions.
If you go to a party, it's not like people in the room have tags over their heads which say "trustworthy," "troll," "crazy," or whatever else. You have to make up your own mind based on your interactions and (hopefully) use of critical thinking to decide if someone you are talking with is worth your time.
If I don't want to take the time to read anything which might offend me, put me off, make me uncomfortable, challenge me, or just in some way be contrary to my world view then frankly, online forums would not be the spaces in which I would be reading things.
I believe that everyone has a point of view that can have value in some way, if only to illustrate that "negative" or "contrary to me" view and people exist around me. They have voices to contribute. Deciding if their contributions are valuable enough to award them a positive or negative "Reputation" is not an abstract thing. A true reputation takes time to build in the real world. It is earned for better or for worse, by actions people take over time not by some arbitrary number farmed by a bot posting cat memes 24/7 or whatever, or posting viewpoints sure to garner upvotes because like minded people are the only ones replying.
I couldn't care less about karma; what I really want is a way to see what I've upvoted, because otherwise my upvotes feel meaningless to me (I do still like boosting content I enjoy, though).
Every system that can be thought of (and has been suggested here) might sound great but when implemented at scale will no doubt prove to be open to abuse and require an army of mods to oversee. Otherwise every multi-million dollar social media company would have implemented it already.
Upvotes and downvotes and cumulative scores kind of do the job well enough that that’s what we keep ending up with.
That being said though, I would be interested in seeing a system where each downvote you make also counts against your own karma to discourage profligate use of the downvote to mean “I have a different opinion but can’t express it here”.
One feature I liked a lot from RES was the ability to tag users, which I used to flag users and remember who they are and what their general opinions are on issues. It made arguing more interesting since I didn't have to rerun over the same old conflicts with people who will never budge on certain issues.
I asked ChatGPT, just to see what it would come up with...
I would like some advice on designing content-sorting and content-filtering mechanisms for discussion forums, in order to avoid or mitigate some of the problems that current systems are prone to.
One of the social problems with discussion forums results from the concept of upvoting or downvoting content.
It can be argued that voting content up or down serves to sort content by quality, allowing high-quality content to be seen and engaged with, while discouraging low-quality submissions.
However, in practice, Goodhart's law ends up applying: when a measure becomes a goal, it stops being a useful measure.
By using popularity as a proxy metric to determine quality, and by rewarding popularity with positive attention, this ends up selecting for content with superficial emotional appeal - ragebait, memes, facile/obvious comments pandering to common sentiment, puns, etc. - and not 'useful' content that is thoughtful, incisive, analytical, or important-yet-unpalatable. Ironically, content of this nature is also low-quality in a different way.
Worse, this ends up training users to produce content of this nature at the expense of thoughtful, interesting, incisive discussion, reducing both the production of quality content, and the quality of the collaborative sorting that users perform by way of voting.
A cumulative 'karma' score for users, being a sum total of their upvotes and downvotes across all of their submissions, while ostensibly encouraging engagement and admirable behaviour... can again end up being gamified, and lead to users making large numbers of popular but low-effort submissions in order to maximise their score.
This can also produce 'filter bubbles' or 'hive minds': subcultures or communities that reject and discourage disagreement or criticism, and end up with a positive feedback loop leading to increasing orthodoxy with little grounding in reality. In some cases, the communities involved can become radicalised or toxic as a result, as more-moderate voices are suppressed from the discussion, and increasingly-extreme views become required in order to gain attention. Of course, it's reasonable and useful to let groups based around common ideals preserve an individual identity separate from the mainstream of discussion, but perhaps a way to regulate excessive self-selection would be useful.
Doing away with any kind of quality-sorting mechanic altogether is not a very useful alternative; a forum where spam, antisocial, or irrelevant submissions are given equal weight and visibility as interesting or thought-provoking posts tends to discourage users from participating or even reading.
As a further complication, users do not generally follow instructions or policy set out for voting on posts, and will tend to use them to indicate approval/agreement regardless.
Human moderation is an important part of any discussion forum and essential to the guidance and shaping of any community, but the workload does not scale well; it should be considered a limited and intensive resource.
It would seem there are a few challenges to overcome:
1: Better-define the concept of 'quality' for discussion-forum posts beyond immediate, superficial emotional appeal - what is it that makes discussions feel satisfying, and drives deeper engagement?
2: Find a way to programmatically ascribe a metric based on this concept, via the engagement of other users, in any form including upvotes, downvotes, tagging, commenting - and statistical or derived data such as the number, depth and overall score of replies to the post, and perhaps even analysis of the text of replies.
3: Ensure that any such system is difficult for individual users to exploit or game; for instance a naive reply-count metric could be trivially exploited by a user posting a large number of replies to a single comment.
4: Limit the effect of self-selection feedback loops, to prevent radicalisation while still allowing common ideals and identity to flourish in their own spaces.
One idea I've had would be to sort all content by 'controversialness' - that is to say, posts with a high number of total votes, but a low-magnitude score once downvotes are subtracted from upvotes. I believe this would tend to deprioritize low-effort posts such as ragebait or pandering to common sentiment, along with junk, spam and boring or irrelevant content, and thus bring 'interesting' content to the top. I believe it would be difficult to game for attention, or for other users to have undue effect on its visibility. However, I'm not sure whether controversialness would be a good metric to drive engagement, or what large-scale effects sorting on this metric might have on a community.
Can you offer any advice or commentary on the situation, or identify any important flaws in my analysis? Can you suggest any alternatives that might work better?
=====================================================================================
ChatGPT
You've outlined several important challenges and considerations when it comes to designing content-sorting and filtering mechanisms for discussion forums. The issues you've raised are indeed prevalent in many online communities, and finding effective solutions can be complex. I'll provide some advice and commentary on your analysis, as well as suggest alternative approaches that might address the problems you've highlighted.
Better defining quality: You're right in emphasizing the need to define quality beyond superficial emotional appeal. One way to approach this is by incorporating multiple dimensions of quality, such as insightful analysis, thought-provoking content, well-reasoned arguments, and valuable information. Encouraging constructive and meaningful discussions should be a primary goal.
Ascribing metrics based on user engagement: Rather than relying solely on upvotes and downvotes, consider implementing a more comprehensive metric that takes into account various aspects of user engagement. This could include factors such as the number of replies, the depth of discussion, the length of time spent reading the post, and even sentiment analysis of the comments. By incorporating a broader set of metrics, you can obtain a more nuanced understanding of user engagement and use it to evaluate content quality.
Avoiding gaming and exploitation: To mitigate the risk of individuals exploiting the system, it's crucial to design mechanisms that discourage or penalize excessive or low-quality contributions. For example, you could implement rate-limiting measures that prevent users from making an excessive number of submissions within a certain time frame. Additionally, considering the quality of the engagement (e.g., the relevance, thoughtfulness, and contribution to the discussion) rather than just the quantity (e.g., sheer number of replies) can help minimize gaming attempts.
Addressing self-selection feedback loops: To prevent echo chambers and radicalization, it's important to strike a balance between allowing communities with common ideals to flourish and promoting diverse perspectives. You could introduce mechanisms that encourage exposure to alternative viewpoints, such as algorithms that prioritize showing posts to users with different preferences or even implementing mandatory or randomized exposure to different topics or communities. Additionally, fostering a culture of respectful and open-minded discussion can help create an environment where diverse opinions are valued.
Regarding your idea of sorting content by controversialness, it could be an interesting approach to promoting engaging discussions. However, it's important to carefully consider the potential effects of such sorting on the community. Controversial topics and discussions can be divisive, and if not managed well, they may lead to increased conflict and hostility. Implementing safeguards, such as active moderation and community guidelines, would be essential in maintaining a healthy and constructive environment.
In conclusion, tackling the challenges of content sorting and filtering in discussion forums requires a multifaceted approach. By defining quality beyond superficial appeal, incorporating comprehensive engagement metrics, preventing gaming and exploitation, and addressing self-selection feedback loops, you can design a system that promotes meaningful discussions while minimizing the negative effects associated with current mechanisms. It's crucial to iterate and refine these systems based on user feedback and community dynamics to ensure the best possible experience for forum participants.
I think it would be better to have something like a currency system where posts that are kept alive the longest generate points, not just however many people upvote them. But then you should be able to use those points to do something, instead of hoarding them like a dragon's treasure, or maybe turn them into awards. If OTHER people give you awards, that's what you should have on display, not just how many upvotes you had. This would also give you more points for helping smaller communities create meaningful content instead of chasing what's popular.
What about hidden karma?
Like there is still karma used internally to decide which posts to promote and how to weight votes, but the numbers are kept internal so people don't get obsessed with the number next to their (and others') profile?
In my opinion the best alternative is a -1 to +1 scale. Members who contribute little are near 0, members who contribute a lot in a positive way move towards +1, and users who contribute a lot in a negative way have their score go towards -1.
There are lots of different ways to implement this that aren't up- vs. downvote counts: communities created, moderation activity, post count, engagement per post, positive reporting rate, false reporting rate, number of reports against the user, number of communities banned from, etc.
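One way to squash several such signals into a single -1 to +1 score is a weighted sum passed through tanh. The signal names and weights below are purely illustrative, not a real Lemmy metric:

```python
import math

def contribution_score(signals, weights):
    """Combine raw activity counts into a score in (-1, 1).

    signals / weights: dicts keyed by signal name. Negative weights
    model bad signals (e.g. upheld reports against the user); tanh
    keeps the result bounded so sheer volume can't push anyone past
    the -1..+1 range."""
    raw = sum(weights.get(name, 0.0) * value
              for name, value in signals.items())
    return math.tanh(raw)
```

The bounded range is the point of the design: a prolific poster saturates near +1 rather than accumulating an ever-growing number worth farming.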
People like big numbers. Karma systems exist because they encourage posting and engagement. Stifling growth because Karma is toxic is bad for everyone in the long run. What matters is growth.
I had a twinge of regret the first time I realized that my Lemmy account didn't have a cumulative tally. Then I realized I didn't actually want one. I am better off without the gamification of everything - especially social interaction. It doesn't really serve a purpose outside of gatekeeping, and if we put it in for the purpose of gatekeeping I think we'd all agree (at least those of us who were bot-modded back on reddit) that it's a poor substitute for human intervention in keeping bots and bad actors out.
I don't think there needs to be another system, just a lower emphasis on said system. Like others have said, just post karma.
Someone pointed out that if you make your own instance, you can probably just manipulate your reputation anyway, so it won't be in any way a judge of a person. Even then it was really only useful in extremes, aka if the person had a massively negative rep they were probably a troll, and a near-0 was a new account. Anything other than that was more a metric of time than anything else.
Sure, even post karma is an abusable system, but if you have nothing then you'll be wading through people ignoring others or using circular reasoning to make bad faith arguments. The system is good for at least dealing with extremes.
Personally I think there shouldn't be anything like it at all, that stuff should only be visible to you and nobody else. It didn't stop reddit from becoming a toxic cesspit. But once it's implemented it's hard to remove w/o serious consequences. Just look at youtube dislikes.
Worst thing about the karma system: r/assistance has a minimum karma requirement, which I think is shitty to people who need help.
If there has to be something, then probably karma that is strictly for the past month of activity. It's not permanent, so there's no point in accumulating it to sell or karma-whoring for it. At the same time it still gives an indication that could help people identify trolls at a glance.
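The rolling-month idea above is simple to sketch: only votes cast inside a sliding window count, so hoarded karma decays away on its own. A minimal example, assuming votes are stored as (timestamp, ±1) pairs:

```python
from datetime import datetime, timedelta

def rolling_karma(votes, now, window_days=30):
    """Sum only the votes cast within the last `window_days`.

    `votes` is a list of (timestamp, value) pairs where value is
    +1 for an upvote and -1 for a downvote. Older votes simply
    stop counting, instead of needing to be deleted.
    """
    cutoff = now - timedelta(days=window_days)
    return sum(v for ts, v in votes if ts >= cutoff)

now = datetime(2023, 7, 1)
votes = [
    (datetime(2023, 6, 25), +1),  # counts: inside the 30-day window
    (datetime(2023, 6, 10), +1),  # counts
    (datetime(2023, 4, 1), +1),   # ignored: older than 30 days
    (datetime(2023, 6, 28), -1),  # counts
]
print(rolling_karma(votes, now))  # 1
```

A real implementation would do this aggregation in the database rather than in application code, but the effect is the same: the number next to an account only ever reflects recent behavior.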
Any shortcut method of mimicking reputation can be and thus will be abused, so they're all toxic.
The only sure way to do it is the good old-fashioned way - by name recognition - actual, earned "reputation."
The way it used to work on all forums and still does on some smaller ones is that people just read posts and write their own posts and over time they come to recognize each other's names and associate them with some impression of each individual's value as a poster.
And yes - that's not very effective in gigantic forums, and it's not accessible to newcomers. You need a relatively small group of posters and new people have to pay attention in order to figure out who are the better or worse posters. That's just the way it is, and is one of the problems with gigantic forums.
We need the karma-equivalent of PageRank. Every vote should not be treated the same, just as Google doesn't weight every link equally. The "one user one vote" system is the equivalent of pre-Google search engines that would rank pages by how many times they contained the search term. But it can't be as simple as "votes from higher-karma users are worth more" because the easiest way to build insane karma is to build a bot or spam low-effort replies to every rising post. Still, the system needs to be able to extract the wisdom of the crowd from the stupidity of the crowd, and the only way to do that is to apply a weighting gradient to users and their votes.
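The mutual-reinforcement idea above can be sketched as an alternating iteration: score posts from weighted votes, then reweight voters by how well they agree with the weighted consensus. This is analogous in spirit to PageRank, not Google's actual algorithm, and the data and update rule here are purely illustrative:

```python
# Hypothetical vote matrix: votes[user][post] = +1 or -1.
votes = {
    "alice": {"p1": +1, "p2": +1},
    "bob":   {"p1": +1, "p2": -1},
    "spam":  {"p1": -1, "p2": -1},
}

def weighted_scores(votes, iterations=10):
    """Alternate between scoring posts and reweighting voters.

    A voter's weight grows when their votes agree with the current
    weighted consensus and shrinks when they consistently dissent,
    so a bot farm of contrarian accounts loses influence instead of
    gaining it one-vote-at-a-time.
    """
    weight = {u: 1.0 for u in votes}
    posts = {p for uv in votes.values() for p in uv}
    score = {}
    for _ in range(iterations):
        # Post score = weighted sum of votes on it.
        score = {p: sum(weight[u] * uv.get(p, 0) for u, uv in votes.items())
                 for p in posts}
        # Voter weight = agreement with the weighted consensus,
        # floored so no one is silenced entirely.
        for u, uv in votes.items():
            agreement = sum(v * score[p] for p, v in uv.items())
            weight[u] = max(0.1, agreement)
        # Normalize so weights keep summing to the number of users.
        total = sum(weight.values())
        for u in weight:
            weight[u] *= len(weight) / total
    return score, weight
```

With this toy data, "bob" agrees with the consensus on both posts and ends up with the most voting power, while the dissenting "spam" account is pinned at the floor. The floor and the normalization are the knobs that keep the feedback loop from running away, which is exactly the gradient-tuning problem the comment above points at.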
I think the awards system from Reddit could work, just without it being monetized. The awards let you see how people feel about the comment, and it’s more than just good/bad, like/dislike.
From this and other posts on the topic, plus the comments and discussions that made me think about it, here's my suggestion.
Upvotes and downvotes, but lemmy allows people to see only upvotes in their client if they wish (be it because they don't like the "negativity" of downvotes or because they're not very good at emotionally dealing with seeing their own comments downvoted).
Some kind of summary of the upvotes/downvotes a user got on his or her posts, per forum and only if enabled in that forum. The objective being to avoid as much as possible the gamification side of karma and its side effects (i.e. people taking it as a "score", which leads to things like karma farming) whilst preserving its positive side as a measure of domain expertise, or at least of willingness to positively participate in domain-specific forums.
It's very easily abused. Does karma affect article and comment visibility on Reddit? I don't know the details, but if so I'd suggest that it not do so here. Maybe just have it be a number calculated from boosts, upvotes and downvotes that you can see on the profile if you are a mod trying to determine if someone tends to troll, but not something that has any effect on whether or not your stuff is displayed.