

Comments on Hobbling of users who consistently post low-quality content

Parent

Hobbling of users who consistently post low-quality content

+17
−3

There are, unfortunately, a few users on Codidact who relatively consistently make low-quality contributions. These posts often come in bursts and tend to be downvoted fairly quickly, but that doesn't slow them down.

I propose that Codidact should implement some mechanism to slow such users down. They shouldn't be prevented from posting entirely, but there should be limits in place to ensure that their posts don't drown out other content.

I am posting a self-answer with a suggestion for how this can be done, but alternative suggestions (or arguments why this is a bad idea) are certainly welcome!


4 comment threads

It's not possible to meaningfully hobble bad users. You can only hobble new users. (1 comment)
Like the idea. Added some extra steps to your basic proposal to allow for "strictness configuration";... (2 comments)
Perfect is the enemy of good (3 comments)
Downvote!!! it's not because my answers/questions are poorly written (2 comments)
Post
+10
−5

This is my suggestion for how to implement hobbling of users who consistently post low-quality content.

For a final decision, when a user saves a post:

  • First, look at their most recent post of the same type (question, answer, article, ...). If that post has a score >0.5 (that is, is positively received by the community), then allow the new post. (Note that this means that if their most recent post has not yet been voted on at all, the user does not get a free pass.)
  • If their most recent post has a score of 0.5 or lower (negatively or neutrally received by the community, including not yet having been voted on at all), then look at the user's sum of received votes. If that value is 0 or higher, then allow the new post.
  • If the user's sum of received votes is negative, then block creation of the new post unless, since the user's most recent post of the same type, the count of posts by other users is at least the absolute value of the user's sum of received votes. (So if a user's sum of received votes is, say, -3, then at least 3 posts must have been posted by other users.)
    • As an escape hatch to allow users to redeem themselves in a reasonable amount of time, cap the number of posts that must have been posted by other users in the interim at 10.
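The decision procedure above can be sketched in code. This is a minimal illustration only, not Codidact's actual implementation; the `LastPost` type and the way vote sums and interim post counts are obtained are assumptions made for the sake of a self-contained example.

```python
from dataclasses import dataclass
from typing import Optional

SCORE_THRESHOLD = 0.5   # last-post score above this counts as "positively received"
VOTE_SUM_THRESHOLD = 0  # non-negative overall vote sum allows posting
INTERIM_CAP = 10        # escape hatch: at most 10 interim posts by others required


@dataclass
class LastPost:
    """The user's most recent post of the same type (hypothetical shape)."""
    score: float                 # the post's score
    posts_by_others_since: int   # posts by other users since this post


def may_post(last: Optional[LastPost], vote_sum: int) -> bool:
    """Apply the proposed criteria top to bottom until a decision is reached."""
    if last is None:
        return True  # no prior post of this type: nothing to hold against the user
    if last.score > SCORE_THRESHOLD:
        return True  # criterion 1: last same-type post was well received
    if vote_sum >= VOTE_SUM_THRESHOLD:
        return True  # criterion 2: overall vote record is neutral or positive
    # criterion 3: require |vote_sum| interim posts by others, capped at 10
    required = min(abs(vote_sum), INTERIM_CAP)
    return last.posts_by_others_since >= required
```

For example, a user with a vote sum of -3 whose last question scored 0.3 must wait for 3 posts by other users, while a vote sum of -25 still requires only the capped 10.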

For clarity's sake, because it seems I wasn't clear enough on this part: these criteria are meant to be applied from top to bottom, as listed above, until an actual decision is reached. If one criterion doesn't result in an "allow the post" decision, that doesn't mean the final outcome will necessarily be a "reject the post" decision; it means nothing more and nothing less than that this particular criterion didn't produce an "allow" decision. (And similarly for a "reject" decision.) Only when a criterion results in an actual "allow the post" or "reject the post" decision is that the outcome, without considering any later criteria. There is no falling through the end of the list (if there is, that's a bug in this suggestion, so please point it out in a comment); every situation results in either an "allow the post" or a "reject the post" decision no later than when the last bullet point has been evaluated.

Looking at the user's most recent previous post as a shortcut means that if a user posts, say, a well-received question, then they can post another question fairly quickly even if they have a previous history of low-quality content, but that after doing so they may (depending on their updated sum of received votes) have to wait for one of their existing posts to be voted on before they can post a third question. This allows a user who has a history of previously having posted low-quality content to, if they put in the effort to write good posts, relatively quickly start working their way back up. Once the user's sum of received votes indicates that their contributions are overall positively received, they will be able to post repeatedly based instead on that record.

Restricting to posts of the same type ensures that for example self-answers aren't hobbled just because they are self-answers to a not-yet-voted-on question.

We might want to look only at the sum of received votes for posts of the same type, or for posts over a limited time period, or for some number of the user's most recent posts, but I'm not sure how reasonable any of that would be from a performance point of view. Also, the total sum of received votes is already readily available in the UI, so it's something the user can look at and, knowing the criteria, use to predict the effect. If considering all posts by a user turns out to have unintended side effects, it may be worth considering some variation along these lines.

For UX reasons it might well make sense to also evaluate these criteria early, but the ultimate decision needs to be made by the time the post is saved because we can't know that the client is well-behaved.

Ideally, the thresholds (vote score threshold 0.5, sum of received votes threshold 0, interim post count cap 10, possible number of past posts considered) should be configurable per community and possibly even per category (it makes sense to allow more controversial content in the Meta category, for example, since hashing out policies will by its very nature involve making suggestions that some people disagree with), but the values I mention above seem to me like a reasonable starting point that should work reasonably well for low-traffic and higher-traffic communities alike.
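The per-community (or per-category) configurability described above could be captured in a small settings object. This is purely illustrative; the `HobbleConfig` name and its fields are hypothetical, chosen to mirror the thresholds named in the paragraph above.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class HobbleConfig:
    """Hypothetical per-community (or per-category) threshold settings."""
    score_threshold: float = 0.5   # last-post score above this allows posting
    vote_sum_threshold: int = 0    # minimum overall vote sum that allows posting
    interim_post_cap: int = 10     # cap on required interim posts by others


# A Meta category could be configured more leniently, for example:
meta_config = HobbleConfig(vote_sum_threshold=-5)
```

Making the object frozen reflects that these are static settings read at decision time, not per-user state.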


4 comment threads

Disagree on several accounts (9 comments)
“because we can't know that the client is well-behaved.” In other words, guilty until proven innocent... (3 comments)
Should look at more than just most recent post. (2 comments)
I agree (1 comment)
Disagree on several accounts
celtschk‭ wrote over 2 years ago

I disagree on several accounts. First, I disagree on the “when the user saves the post”. So you let the user work on the post without any warnings, and then when all the work is done, you tell them “too bad, your work was for nothing.” If the user can't post for whatever reason, that should be communicated before starting the post, ideally by not presenting the user with an edit box to begin with (with an explanation why there's no edit box).

Second, I disagree on the sum of received votes. There at least should be a threshold on how new the votes are; votes from years ago (in either direction) should not be relevant at all.

Third, I don't think any automatic criterion should trigger immediately on a negative score. Imagine the first post gets a single downvote; that would instantly prevent the user from posting again, although a single vote is clearly not significant. A lower (more negative) threshold is mandatory.

And I honestly don't understand your last bullet point at all.

Canina‭ wrote over 2 years ago

celtschk‭ As a basic design principle, validation will always need to happen at the point of saving. Validation prior to that is a UX optimization only; worthwhile, but can't be relied upon.

Exactly which votes to count for the purposes of this decision is certainly something that can be considered. I already allowed for that to some extent in the paragraph beginning with "We might want to look only at the sum of received votes for posts of the same type". As I said to Olin, I was aiming for predictability for the individual user, and the sum of received votes is already readily available.

A user wouldn't immediately be entirely blocked from posting by a single downvote; they would just have to let another post of the same type be posted first. But yes, the thresholds can be tweaked; that's probably the easiest part to all this.

I'll admit that the last bullet point was rather confusingly phrased; does it make sense now?

celtschk‭ wrote over 2 years ago

If they have to wait for something outside their control to happen, then they are entirely blocked. The duration of that block may be short, or not so short if the site currently has little traffic (e.g. on mathematics.CD the most recent post is currently 5 days old).

And yes, now the last bullet point makes sense.

On the point of validation: The web site certainly can re-validate on submission, but the re-validation has to be conditioned on the state when the page was opened. That is, downvotes in between should not count. The only acceptable deviation would be a (sufficiently long, say an hour) timeout, so that one cannot open a page far in advance in order to circumvent a possible future block.

r~~‭ wrote over 2 years ago

celtschk‭, why is it critical to you that the current state of the world not be taken into account when the post is submitted? Wouldn't that leave this scheme wide open to, say, opening ten browser tabs for ten questions and copy-pasting pre-composed questions in quick succession? Why would the additional moderation burden of finding and manually dealing with cases like that be worth whatever benefit you see in using stale state?

celtschk‭ wrote over 2 years ago

Somewhere else I happened to compose answers to questions that others considered close-worthy. The result was that while I was working on my answer, the question was closed without my noticing anything, and only when I submitted the answer did I discover that all my work was for nothing. That sucks hard.

Now you might ask what the relation to the issue at hand is. Well, the issue is that especially for users with little activity yet, the measures as described might trigger because of a few bad votes that are not their fault. Even if there are few such cases, any reduction of their number is worthwhile.

What is not worthwhile is increasing that chance in order to prevent things that are very unlikely to happen anyway. Also, the specific possibility you note could easily be handled with a rate limit that kicks in as one approaches the threshold.

Generally, I believe presuming maliciousness is wrong.

r~~‭ wrote over 2 years ago · edited over 2 years ago

I think this is more like preventing double spending than presuming maliciousness, but we all see things our own way.

Anyway, what if the throttling only takes effect after the first post which would be subject to throttling? Basically, give each user a bit, and if the bit is unset, allow the post. If additionally the quality test (as described in this answer, say) is failed, set the bit and show a notice to the user that subsequent posts are disallowed until the community sees more activity. Set bits ‘expire’ back to unset when the conditions for throttling no longer apply. While the bit is set, disable the posting UI for that user. That prevents abuses of the sort I envisioned, since the first tab would set the bit and the other tabs would not be allowed to post; it also lets through posts by users who've worked for a while on a single post. And it's less arbitrary and indirect than rate limiting or warning users who are ‘approaching’ the threshold.
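The bit-based scheme in this comment can be sketched as follows. This is a reading of the comment, not an agreed design; the names `ThrottleState` and `try_post`, and the boolean inputs, are assumptions introduced for illustration.

```python
from dataclasses import dataclass


@dataclass
class ThrottleState:
    """Per-user flag as sketched in the comment: unset means posting is free."""
    throttled: bool = False


def try_post(state: ThrottleState, passes_quality_test: bool,
             throttle_conditions_still_apply: bool) -> bool:
    """Return True if the post is accepted; update the flag as a side effect."""
    # A set bit 'expires' back to unset once the throttling conditions no
    # longer apply (e.g. the user's vote record has recovered).
    if state.throttled and not throttle_conditions_still_apply:
        state.throttled = False
    if state.throttled:
        return False  # posting UI is disabled; reject server-side as well
    if not passes_quality_test:
        # The first failing post still goes through, but sets the bit,
        # so further tabs or rapid submissions are refused.
        state.throttled = True
    return True
```

This matches the described behavior: in the multi-tab scenario, the first submission sets the bit and the remaining tabs are refused, while a user who spent a long time on a single post is unaffected.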

celtschk‭ wrote over 2 years ago

That would be much better.

Canina‭ wrote over 2 years ago

r~~‭ I might well be missing something obvious, but what then happens if, say, the user closes the wrong browser tab, or their browser crashes, or they have a power outage, or anything else that causes their session (and possibly also their cookies) to disappear? Wouldn't that leave such a flag dangling?

Considering that this comment thread seems like either an argument against the whole concept of hobbling, or at the very least a distinct proposal for how to do it, I would really encourage either you or celtschk‭ to post a separate answer and let the community vote on it. As I said already in the question, I welcome alternative proposals (including arguments why the idea is a bad idea), but can we please keep them where they will be discoverable and votable?

r~~‭ wrote over 2 years ago

Right on, I'll write up an answer shortly. I wanted first to determine if I understood @celtschk's objection well enough to offer an acceptable compromise.