Discussion of standard for marking explicit tags (e.g. NSFW) on a post so that frontends can appropriately auto-hide images of such posts.
This is to start off a discussion of how to handle posts with NSFW (and other explicit) content. The goal is to avoid auto-showing images that the user presumably would not want displayed while browsing Steem, either because they are inappropriate for the context in which they are using their computer (e.g. work) or because they simply don't want to see them (especially true for NSFL content), while still not making the user experience annoying by requiring all posts to be click-to-reveal. As usual, it works on the principle of posters self-reporting the appropriate tags, which frontends can then act on, with community downvoting power used to enforce accurate self-reporting.
This is what I posted in the #proposals channel of the Steem slack:
A post/comment may have an "explicit" field in the JSON with an array of strings as its value. If a post/comment has an "nsfw" string in that array, it is considered NSFW and there would always be a visible NSFW tag displayed next to the post/comment.
If a post is marked NSFW, its pictures will not be shown in any recent/trending view, other than views of categories whose names are on a client-side list of NSFW categories. Even in those views, however, the pictures will not be shown if the user has set the "hide_nsfw_pictures" option in their settings.
Clicking on an NSFW post to see the full detail will show the pictures in the post as long as the user has not set the "hide_nsfw_pictures" option in their settings; if they have, it will replace the image with a placeholder box saying "Possible NSFW content" which can be clicked to reveal the image. An NSFW comment will by default hide the images in the comment with placeholders (again clicking on the placeholder box reveals the image) in a discussion thread unless the top-level post parent in the discussion thread is marked NSFW and the user has not set the "hide_nsfw_pictures" option in their settings.
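The display rules above can be condensed into a single predicate. Here is a rough TypeScript sketch of that logic; all names (`ViewContext`, `showPlaceholder`, the field names) are hypothetical and not actual steemit.com code:

```typescript
// Sketch of the detail-view visibility rule described above.
// All names are hypothetical; this is not actual steemit.com code.

interface ViewContext {
  isNsfw: boolean;            // post/comment carries "nsfw" in its "explicit" array
  isComment: boolean;         // comment vs. top-level post
  rootPostIsNsfw: boolean;    // top-level parent of the discussion thread is marked NSFW
  hideNsfwPictures: boolean;  // user's "hide_nsfw_pictures" setting
}

// Returns true if an image should be replaced by a click-to-reveal placeholder.
function showPlaceholder(ctx: ViewContext): boolean {
  if (!ctx.isNsfw) return false;                 // non-NSFW content is never hidden
  if (ctx.hideNsfwPictures) return true;         // user opted out of NSFW images everywhere
  if (ctx.isComment) return !ctx.rootPostIsNsfw; // comments hide unless the whole thread is NSFW
  return false;                                  // NSFW post opened directly: show images
}
```

Clicking the placeholder would then flip a local reveal flag for that image only, leaving the setting itself untouched.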
When submitting a post or comment, there would be a checkbox to indicate whether the post/comment is NSFW. If this checkbox is checked, the "nsfw" string will be included in the array value of the "explicit" field within the JSON of the post/comment. Normally, it is by default unchecked (meaning it is not NSFW). However, if this is a comment in a discussion thread where the top-level post parent is marked NSFW, or if this is a post in a category whose name either has the word "nsfw" in it or is on a client-side list of NSFW categories, then the default state of the checkbox will be checked (meaning it is NSFW).
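As a sketch, the default state of that checkbox could be computed along these lines (all names are hypothetical, and the category list values are illustrative only):

```typescript
// Sketch of the default-checked heuristic described above (hypothetical names).

// Illustrative client-side list of known NSFW categories.
const NSFW_CATEGORIES = new Set(["nsfw", "adult"]);

function defaultNsfwChecked(opts: {
  isComment: boolean;
  rootPostIsNsfw: boolean; // top-level parent of the thread is marked NSFW (comments)
  category?: string;       // category the post is being submitted to (posts)
}): boolean {
  if (opts.isComment) return opts.rootPostIsNsfw;
  const cat = (opts.category ?? "").toLowerCase();
  return cat.includes("nsfw") || NSFW_CATEGORIES.has(cat);
}
```

If the user submits with the box checked, the client would add "nsfw" to the "explicit" array in the post's JSON metadata.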
Again, this is just my initial ideas that we can use as a starting point for a discussion to work out a more formal standard.
In this GitHub issue I opened, I briefly mentioned that I do not think overloading the tag system to handle explicit content is the smart solution long term. That said, I think there is a need to expedite the handling of NSFW images on steemit.com, so I am in favor, for now, of using the #nsfw tag to hide post summaries (which is not yet implemented on steemit.com as of the time I am writing this comment) and to mark the post with the NSFW label.
That is an easy enough fix that should buy us enough time to work through the details of a better metadata solution. Perhaps something like the one in the OP. However, I do need to modify the OP a little bit to work nicely with the new tag system instead of the old category-only system. I think a modified version of the explicit standard in the OP would be a much more elegant solution than the stop-gap solution of filtering based on the #nsfw tag.
@pharesim brought up a valid concern in the slack about wasted UI space from requiring a checkbox to mark a post as NSFW. My response is that it would first require a single extra line (which is really an expandable div) asking the user whether they wish to customize the metadata of the post/comment they are writing. For users who have not been defaulted by the client into certain metadata (set explicit, language, etc. fields in the metadata) due to the client's heuristics (e.g. names of tags included in the post/comment or in the top-level post of that comment's discussion thread, explicit or language metadata settings of the post that a comment is replying to, etc.), this would be the only additional UI element added to the post draft UI. For posts that are defaulted into some of these metadata fields (which won't be the case for most users), they will additionally see some of those defaulted settings above the Post button, so that they know what will be submitted to the blockchain and have the opportunity to change it if they think it is necessary. I think this can be a very elegant solution to handle not just NSFW content, but all kinds of explicit content ratings, as well as other standards discussed in steem-standards, such as the language standard.
In summary, the idea is that if you don't want to customize or fill in extra metadata yourself and you aren't defaulted into any metadata, the only wasted line is that single line (which is an expanding div) asking if you want to customize metadata for the post. Also, you can always at a glance tell what metadata has been set for the post (either by you or by default by the client due to heuristics) in a location that is close to the Post button so that you can confirm it is good before clicking the Post button and actually submitting the post (+ metadata) to the blockchain.
I second this proposal. I don't consider NSFW a tag like any other.
The checkbox for posting should also be hidden when hide_nsfw_pictures is active.
This is along the same lines, but one step further: what is Steemit planning to do when the porn industry sets up camp here? I'm almost sure it will, so what are Steem's plans for when this does happen?
Stella, I agree with you.
BUT
I think that steemit just might serve them with a serious dish of cold cold vengeance: I think steemit could kill the porn industry by changing its financial topology.
Oh, Disintermediation.....
Are you proposing something like this in the metadata?
{ "tags": ["nsfw"] }
{"explicit": ["nsfw"]}
Other values could then be "nsfl" or an age rating "r16", "r18"
Correct. Like this.
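A client reading the proposed field would need to parse it defensively, since the JSON metadata is freeform. A hedged sketch (the helper name `explicitTags` and the error handling are my own assumptions, not part of the proposal):

```typescript
// Sketch: extracting the proposed "explicit" array from a post's JSON metadata.
// The field name "explicit" follows the proposal above; everything else is illustrative.

function explicitTags(jsonMetadata: string): string[] {
  try {
    const meta = JSON.parse(jsonMetadata);
    return Array.isArray(meta.explicit)
      ? meta.explicit.filter((t: unknown): t is string => typeof t === "string")
      : [];
  } catch {
    return []; // malformed metadata: treat as carrying no explicit tags
  }
}

const isNsfw = explicitTags('{"explicit": ["nsfw", "r18"]}').includes("nsfw"); // true
```

Treating malformed or missing metadata as "no explicit tags" matches the self-reporting model: the post simply shows as unmarked, and inaccurate self-reporting is handled by downvotes rather than by the parser.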
100% agree w/@arhag. I love looking, but I've been downvoting the titties. We need some NSFW management.
this is great. i agree
Has any solution for this been developed? I'm trying to figure out if there's a setting switch I need to flip or what. But I'd love to get NSFW pictures (both in posts AND in profile pictures) blurred out or something.