User talk:Pundit/Image advisory label proposal
Over simple
Hi Pundit, there are a number of drawbacks to a simplistic advisory or "Not Safe For Work" (NSFW) system for Wikimedia.
- We are global, and standards for NSFW vary hugely around the globe. So either you allow anyone to put anything into the filter and accept that only the most prudish will be happy, or you define one standard of NSFW and try to impose that set of cultural values on the globe. Neither is a workable solution for Wikimedia. Some will rightly object if you set the NSFW bar at a level that allows images that are not safe where they work; others will laugh when they discover that bare female legs are NSFW in some countries, and will be outraged if you start putting "bare female faces" in the NSFW filter.
- "Somebody" has to decide whether each image should or should not be classified NSFW, if you fudge the issue of who would have that responsibility then opponents will conclude that ultimately responsibility will fall on either the community or the uploaders. If you put the responsibility elsewhere then in a category based system you are likely to be considered unrealistic. If you put the responsibility on the uploaders then you have embarked on a course that will end in good faith uploaders being blocked because their NSFW standards are different to others.
You may be interested in the discussion at Talk:Controversial content/Brainstorming/personal private filters, especially the bit where Sue Gardner concedes that much of the opposition to the previous category-based proposal was indeed that it was based on the Commons categories. WereSpielChequers (talk) 11:03, 21 July 2012 (UTC)
- Hi, thanks for this comment. I think you're absolutely right that standards vary culturally. Depending on how big this problem is (and I'm not sure the issue is major, since there are probably images that count as NSFW in most cultures, and we could address only these clear cases), we could also allow tagging by projects, just like interwikis (e.g. en-NSFW, etc.). Regarding responsibility, I would assume that images are uploaded in good faith, but anyone could tag a picture as NSFW, and whenever a tag is opposed, a discussion could be started to establish consensus (pretty much like PROD). Pundit (talk) 18:21, 21 July 2012 (UTC)
- Hi Pundit, yes, I'm sure there are a bunch of images that are NSFW to pretty much anyone, and that if we had a filtering system they would not be contentious. The problem will come with the much larger group of images that are NSFW to some but not others. If we were to go to one extreme and only treat as NSFW the images that "most" cultures consider NSFW, then we would not be serving a large minority or group of minorities; in fact we might all find that we were in the minority on some cultural concerns. At the other extreme, we could treat as NSFW any image that a significant number of readers is likely to find offensive, but that would inevitably be seen as over-prudish by many. I for one would not want to use a filter that labelled any woman showing more than her eyes as "offensive". There is also significant opposition to any proposal that involves some editors imposing their cultural values on others. So in my view any viable filter needs to be tunable at the level of the individual user, not the project. WereSpielChequers (talk) 01:00, 22 July 2012 (UTC)
- I think you are right. Ultimately, it will be difficult to develop a global solution. But working only on an individual solution (a personal filter) ignores the fact that 99% of our users don't log in. Of course, a filter COULD be an incentive for some of them to register, but in practice most people won't (and we do not want to aggressively advertise the possibility of filtering). That's why I think a reasonable intermediate solution is to rely on independent projects, per this proposal. cheers Pundit (talk) 01:03, 22 July 2012 (UTC)
- Well, it's possible to have a two-pronged approach. We already make editorial decisions as to which images each project uses in which articles, and a couple of language versions of Wikipedia already have an arrangement to hide the images in a few specific articles. My understanding is that demand for an image filter comes from people who don't want to see some of the images that there is consensus to show in the language version or versions of Wikipedia that they use. So in my view an image filter that meets the unmet demand needs to be one that allows people to decide that they personally don't want to see certain images, even if those images have consensus to be on particular pages on WMF wikis. WereSpielChequers (talk) 16:36, 25 July 2012 (UTC)
- It is also about projects being able to introduce independent filtering/warning systems. As of now, we can't do this. Any solution we're looking for should take into account that 99% of our users don't log in. Pundit (talk) 23:13, 26 July 2012 (UTC)
- Projects can and occasionally do offer a warning system, and they also have long, convoluted arguments as to which image to use and where in the article to put it. But we still have demand for an image filter, and we will continue to do so. Currently you need to log in if you want to move pages or create articles. I suspect it helps if you are an editor, and it does enable people to upgrade their skin from Vector to Monobook. But for the vast majority of people there is no need to log in, and they don't. However, if you offer them a free image filter, then those who really want a filter will create an account. WereSpielChequers (talk) 10:23, 29 July 2012 (UTC)
- Absolutely, and I'm not disputing the need for a filter. You're making a good case for how it can be useful. However, since most users do not log in, the essential part of any system addressing readers' needs is some way of allowing projects to use independent labels. It is an illusion to assume that people are, on average, tech-savvy enough to realize they will be able to use filters after logging in, and on the other hand we won't be heavily advertising the possibility of filtering all the time either. Pundit (talk) 14:20, 29 July 2012 (UTC)