
NSFW Content Filter


According to the Terms of Service,

You agree not to use the Service to [...] violate any applicable law or regulation; [...] post content that is hateful, discriminatory, harmful, illegal, or otherwise objectionable [...]

For some definition of "objectionable" which does not necessarily include anything hateful, discriminatory, harmful, or illegal, a lot of content colloquially referred to as "not safe for work" (which I use here in its narrower sense in reference to sexual material, whether pictorial or merely textual) is technically not in violation of the ToS. Throughout this post I take for granted that any post in which such content appears is not otherwise in violation of any other policy, whether platform-wide or site-specific, and that each site can individually decide to locally ban NSFW content.

In the next few sections I address the various issues this request is attempting to solve; you can scroll to the bottom if you're only interested in the proposal itself.


Minors and Explicit Content

As a preface to my feature request below, I'd like to highlight a different section of the ToS, which reads:

If you wish to sign up for an account, you must be at least 13 years old, or 16 years old if you reside in the European Union.

I don't know what the laws are in other countries, but in the United States so-called "adult websites" may only be used by people aged 18 and older, and I imagine there are plenty of other places where the age of freedom online is much younger than the age of adulthood. I'm not clear on the legal ramifications for us here, but I suspect there could be a legal issue with providing such content to minors. The easy solution would be to ban any explicit content; the alternative, an age requirement to view certain posts, is fairly pointless when it's so easy to lie about your age.


Textual NSFW Content

Then you have the other demographic of legal adults who simply don't want to see such posts, whether graphic or textual (not to mention the ones who don't care, but their bosses might). Consider the following (theoretical AFAIK) questions one might ask on our current sites:

  • Tips for writing a graphic scene in erotic literature?
  • How does intimacy work between a human and an alien species which evolved in radically different circumstances from anything on Earth? [For Scientific Speculation]
  • What does Judaism have to say about watching pornography?

Even without any visual aids, I can imagine a large portion of Codidact's user base would be very uncomfortable with these types of questions; yet under the status quo there's no reason they're off-limits.


NSFW Filter

My goal with this proposal is that those who want to post these types of questions should still be able to do so, within the confines of the ToS, while at the same time allowing users the freedom to not see them if they don't want to.

The trouble with a filter that acts automatically, without any human oversight, is the high risk of false positives. Consider a post on Electrical Engineering which refers to connecting male and female connectors. I don't want to know how genuinely NSFW content could come up on Electrical Engineering, but whatever site-wide filter is in place needs human oversight, or at least an aggressiveness threshold which can be set on a per-site basis.
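To make the per-site threshold idea concrete, here's a minimal sketch of a keyword filter whose aggressiveness each site could tune. All function names, keywords, and threshold values are illustrative, not a real implementation:

```python
import re

def nsfw_score(text, keywords):
    """Count how many distinct flagged keywords appear as whole words."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sum(1 for kw in keywords if kw in words)

def should_flag(text, keywords, threshold):
    """Flag only when the number of distinct hits meets the site's threshold."""
    return nsfw_score(text, keywords) >= threshold

# A strict site might flag on a single hit; Electrical Engineering could
# demand several distinct hits before sending a post to human review.
post = "Connect the male connector to the female socket."
print(should_flag(post, {"male", "female"}, threshold=1))  # True on a strict site
print(should_flag(post, {"male", "female"}, threshold=3))  # False with a laxer threshold
```

Whole-word matching alone won't catch every false positive, which is exactly why the human review step below matters.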


Proposal

Part 1: The NSFW Tag

(I describe this feature here as an additional tag, but given some of its properties it might be more easily implemented as a checkbox next to the "submit" button. For instance, the five-tag limit may conflict with automatically tagging posts NSFW.)

  1. Create a special tag called "NSFW."
  2. Any post with the NSFW tag added will have the title altered to display [NSFW] in front.
  3. Create a list of keywords used exclusively or almost exclusively in conjunction with adult themes. If these words are found in the title of a post tagged NSFW, then (perhaps only on the post listing) censor those keywords. This list can be edited on a per-site basis.
  4. If a solution to the minors issue is found in which explicit images are allowed, blur/cover all images found in an NSFW post or any answers thereon.
  5. Add a pop-up warning upon clicking to view an NSFW post.
  6. As with all other tags, the NSFW tag can be added at post creation or later.
  7. If the tag is added by a moderator, only a moderator may remove it.
  8. Add an option to the user profile to prevent any NSFW posts from showing up in the question listing; defaults to "allow" but can be changed to "block." If images are allowed, there would be an additional option of "allow NSFW posts but block any containing images."
  9. Users who are not logged in may not view NSFW posts. (Not sure if this would be required if we ban graphic images, but it certainly must be implemented if we allow them.)
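Steps 2 and 3 above (the title prefix and keyword censoring in the listing) could look something like this sketch; the function names and the keyword list are placeholders I made up for illustration:

```python
import re

def censor_keyword(word):
    """Replace all but the first letter with asterisks, e.g. 'explicit' -> 'e*******'."""
    return word[0] + "*" * (len(word) - 1)

def listing_title(title, is_nsfw, site_keywords):
    """Render a post title for the question listing (steps 2-3)."""
    if not is_nsfw:
        return title
    for kw in site_keywords:
        # Whole-word, case-insensitive match so substrings aren't mangled.
        pattern = re.compile(rf"\b{re.escape(kw)}\b", re.IGNORECASE)
        title = pattern.sub(lambda m: censor_keyword(m.group(0)), title)
    return "[NSFW] " + title

print(listing_title("Tips for writing an explicit scene?", True, ["explicit"]))
# -> "[NSFW] Tips for writing an e******* scene?"
```

Censoring only in the listing, as step 3 suggests, keeps the full title visible once a user has clicked through the warning in step 5.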

Part 2: Moderated Queue, with a Filter List for Support

  1. Create two keyword lists: one with words always or nearly always used in explicit contexts (perhaps more restrictive than the list from Part 1, perhaps identical), and one with words sometimes used in explicit contexts but not always. These lists can be edited on a per-site basis.
  2. If a post contains a word on the "nearly/always explicit" list, automatically tag it NSFW if it isn't already.
  3. Create a moderator tool to see all posts auto-assigned NSFW status, so they can confirm whether each is a true or false positive.
  4. If a moderator confirms a post as being a false positive, remove the NSFW tag.
  5. If a moderator confirms a post as being a true positive, lock the tag in place: only a moderator will be able to remove the tag. If someone feels the post was tagged wrongly, they can flag it for review.
  6. If a post already confirmed as a true positive is later edited, the post will be sent through the queue once again. A moderator can then opt to remove the tag or to keep it there; in the latter case it's treated no differently than any other true positive.
  7. If a post contains a word on the "sometimes explicit" list, send it to this moderator queue for handling. Once again, a moderator can opt to leave it alone or mark it as NSFW and thereby lock the NSFW tag in place.
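The triage in steps 2 and 7 can be sketched as a single function returning two decisions: whether to auto-tag and whether to queue. The keyword lists here are placeholder words chosen only for illustration:

```python
import re

def triage_post(text, always_explicit, sometimes_explicit):
    """Return (auto_tag_nsfw, send_to_queue) for a new or edited post."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    if words & set(always_explicit):
        return True, True    # steps 2-3: tag NSFW and queue for confirmation
    if words & set(sometimes_explicit):
        return False, True   # step 7: queue only; a moderator decides the tag
    return False, False      # no hits; leave the post alone

always = {"graphic"}       # placeholder "always explicit" list
sometimes = {"intimacy"}   # placeholder "sometimes explicit" list
print(triage_post("How does intimacy work between species?", always, sometimes))
# -> (False, True): queued for a moderator, but not auto-tagged
```

Running edited posts back through this same function would cover step 6 as well, since a confirmed true positive keeps its locked tag until a moderator removes it.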

7 comments

We'll need to consider edits, too. Consider a post that was originally fine and then was edited to add NSFW content. Also consider a post that was suspect, reviewed and approved (not NSFW) by moderators, and then edited. (I don't know if those two cases are the same.) Monica Cellio 14 days ago

To be honest, I'm a little surprised to see this. I have always thought that NSFW content is restricted to explicit images of sexual acts and gore. I don't view your examples as NSFW but rather as educational and especially good questions. Of course, nudes and images of extreme violence should be (and are) forbidden. Out of curiosity: Is there a significant portion of users who view these possible questions as NSFW? (I don't want to downplay this, I just want to understand it.) Zerotime 14 days ago

@Zerotime If "nudes and images of extreme violence should be (and are) forbidden" I'd like to see that reflected in either the ToS or CoC, if only to prevent any potential issues down the road. DonielF 14 days ago

Are the downvotes reflecting a non-consensus that this type of content should be filtered, or that not all of this type of content should be filtered? DonielF 14 days ago

Yes, you are right about the passage in the ToS or CoC; right now neither mentions it. As for the downvotes, I can't tell you anything. I haven't voted yet, but it would be interesting to know whether they are against the functionality altogether or just against the form you're proposing. Zerotime 14 days ago


0 answers