Talk:Image filter referendum/Archives/2011-08-23
Please do not post any new comments on this page. This is a discussion archive first created in August 2011, although the comments contained were likely posted before and after this date. See current discussion or the archives index.
Categories
Categories - it doesn't work
"We believe the designation of controversial should be placed on categories of images, not individual images, and the categories deemed controversial would receive this designation when there is verifiable evidence that they have received this designation by the communities the projects serve." - from the report - suggesting how contentious items would be identified.
We categorise pictures based (primarily) on what they are of - or sometimes who they are by. Offensiveness does not follow content category lines. For example en:File:Dali Temple image.jpg belongs to no categories that would suggest it is problematic. Similarly I don't believe we have "Category:Images containing XXX" where XXX is merely incidental to the image, but not necessarily to the offensiveness. Rich Farmbrough 04:34 17 August 2011 (GMT).
- It doesn't follow current category lines. This seems a minor problem (I imagine the Board envisions e.g. "Category:Violent images") that the Foundation could presumably overcome by decree. The more blatant problem is defining the precise boundaries of "violent", "religiously offensive", "pornographic", [insert other "censor-worthy" characteristics here] in a somehow objective and culturally neutral manner (IMO, it's an impossible fool's errand) so as to allow for the consistent application of such categories. --Cybercobra 06:26, 18 August 2011 (UTC)
- Current category lines was the idea - "categories deemed controversial would receive this designation", not "categories deemed controversial would be created". I agree, though, that defining categories would be needed, whether using the category system or a POWDER-like syntax, which might actually avoid some problems (like more recent versions of files being given the same categorisation as previous versions without checking). Nevertheless the system would be de facto horribly resource-intensive, contentious and fragile. Rich Farmbrough 10:56 18 August 2011 (GMT).
Then why not try to personalize categories? Put a personalization option on the user's computer and allow people to make up categories by sorting pictures. Not 100% foolproof, but better than nothing. Rickyrab 18:07, 19 August 2011 (UTC)
We don't have to use categories at all. The board resolution itself allows us to work with the devs to implement this feature if we like, and in any form we want. (And if we don't like any of the options we can come up with, we can report that to the board too, at which point they can change their minds and tell the paid devs to do something else instead). --Kim Bruning 23:39, 19 August 2011 (UTC)
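As a purely illustrative aside on the two comments above: Rickyrab's idea of personalisation living entirely on the reader's side, and Kim's point that the feature could take any form we want, could in principle be approximated today with a small user script that keeps a personal blocklist in the browser and hides matching thumbnails behind a click-to-reveal placeholder. The sketch below is only a hedged example of that idea; the storage key and blocklist terms are hypothetical, and `mw-content-text` is assumed to be the standard MediaWiki content container. It is not an existing feature or a proposed implementation.

```typescript
// Hedged sketch only: a client-side "personal filter" kept in the reader's own
// browser via localStorage, roughly in the spirit of the suggestion above.
// The storage key and terms are hypothetical; this is not an existing feature.
const STORAGE_KEY = "personalImageFilterTerms";

// The reader curates this list themselves (e.g. by "sorting pictures").
function loadBlockedTerms(): string[] {
  const raw = window.localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as string[]) : [];
}

// Hide any thumbnail whose file URL matches one of the reader's own terms,
// replacing it with a click-to-reveal placeholder. Nothing is deleted.
function applyPersonalFilter(): void {
  const blocked = loadBlockedTerms().map((t) => t.toLowerCase());
  const images = document.querySelectorAll<HTMLImageElement>("#mw-content-text img");
  images.forEach((img) => {
    const src = decodeURIComponent(img.src).toLowerCase();
    if (blocked.some((term) => src.includes(term))) {
      const placeholder = document.createElement("button");
      placeholder.textContent = "Image hidden by your personal filter - click to show";
      placeholder.addEventListener("click", () => placeholder.replaceWith(img));
      img.replaceWith(placeholder);
    }
  });
}

applyPersonalFilter();
```

Whether something like this should live in the reader's own browser, in an opt-in gadget, or in MediaWiki itself is precisely the open design question in this thread.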
Children's Games Category?
I don't understand something. What kind of categories will be set for filtering?
If I'm only in favor of allowing the filtering of extreme violence (gore, bodies, medical) or extreme sexual content, but not anything else - does it mean I need to vote against this feature? Yellowblood 21:40, 19 August 2011 (UTC)
- Violent images, nudity, etc. are the categories that we're intending the referendum for. The children's game category is only used in the mock-up. The developer chose something neutral like Pokémon and Children's Games to show how the feature worked, but that's not what the filters would be in the future. It wouldn't really be a good idea to use a controversial image in the mockup, since we're trying to say that those images shouldn't surprise people.
:-) Cbrown1023 talk 01:54, 20 August 2011 (UTC)
- The "etc" is one of the big issues here. "Images of Mohammed" is explicitly mentioned in the report, I believe, and was one of the causes for having the report. Salient examples of other religious prohibitions have been given too, and while we have not been given figures for the various groups, it is hard to see how we would implement one religious filter and refuse to implement another. We are told that some sects have a prohibition on images of women; such a filter would sit uneasily with me, and I think most Wikimedians, but so would not providing it while pandering to the requirements of other groups and claiming "cultural neutrality". Again, "images you wouldn't want your children to see" might include (as some filtering software does) images pertaining to drug use - showing how crack is made. Images you might be "squicked" by could well include modern art (fetus earrings, for example) just as much as "violence" - we may assume that cartoon violence is OK, but for someone who chose the filter because they suffer from PTSD this might not be so. And so it goes on - by offering the filters we would be making undertakings that we couldn't keep anything like 100% of the time. We would be creating a categorization nightmare. We would be opening the door to various forms of censorship and suppression. There would be disagreements about categorisation that would make certain "neologism" discussions look like a picnic in the park. And this would all be meat and drink to media outlets on a quiet day - "Wikipedia censors art", "Wikipedia says these pictures aren't nude", etc.
- So unless we actually have a concrete proposal "These 6 categories will be supported" that you can agree with, then yes, Yellowblood, you should vote against (insofar as you can vote "0" for statement 1). Almost certainly "Nudity" will, as Cbrown1023 says, be a category - although that is a far from simple category, including as it does the Venus de Milo. Rich Farmbrough 02:47 20 August 2011 (GMT).
- It's going to be impossible to filter only sex or gore without also filtering other things people find offensive. Westerners are the people who need this feature least-- these cultures are used to information freedom, they just don't get that upset over any images, even gore or sex. All major classes, groups, and cultures will want their own filters... We're Wikimedia-- we're up to that challenge. --AlecMeta 03:12, 20 August 2011 (UTC)
GUI mock-up is misleading, has far fewer categories than is realistic (and reflected in discussion on this page)
Meanwhile, the seductive screenshots presented for this project were clearly created without conducting any sort of survey of how many categories would be required, and "Other controversial content" is almost certainly a non-viable category that no two people would be able to agree on. That kind of superficiality has the potential to make a very negative impression, because you look set to introduce something that won't fulfil its advertised purpose, and it will upset people (case study: parents allow their children to use Wikipedia when they didn't before, trusting the filters, and then find that their children are viewing content to which filters haven't been applied yet - human vetting of all new images is not viable, either in terms of effort required or in terms of justified confidence in the review; just try to think of some regular contributors whom you wouldn't trust to review content in this way, and there you go - kids only need to read the Santa Claus article once to get the picture (NPI)). Samsara 22:02, 19 August 2011 (UTC)
- I agree. Of course the Pokemon image isn't meant to be a serious example. But I've run across people who seriously believe "Pokemon is borne of Satan". So should we throw that into "Other controversial content"? Evil saltine 23:08, 19 August 2011 (UTC)
- I think a filter will have to have a catch-all category for "anything that has ever been objected to by any human being". This category may grow arbitrarily large, to encompass every single image we have if they want it to. If someone wants an imageless wiki, we can serve one up for them. --AlecMeta 00:14, 20 August 2011 (UTC)
Usage of the filter
Pictures of Muhammad
How will the Wikimedia Foundation prevent people from voting for the option to remove pictures of en:Muhammad?
- The recent version of the important article ar:محمد already seems to contain no picture of Muhammad, so it is effectively censored.
- Some people may argue that they feel hurt by depictions of Muhammad and that, for them, these are images of violence.
- The voting system is very vulnerable to abuse, e.g. by voting with multiple accounts, or by external organizations orchestrating votes.
- You want to ask users to give their view on how important it is "that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial)". What will happen if the average outcome for this question turns out to be less than 10? What will happen if the outcome turns out to be below 5?
--Rosenkohl 11:03, 16 August 2011 (UTC)
- Won't it be the users themselves who choose which type of images they want to hide, and which to view? Depictions of Muhammad may be tagged as such (instead of 'violence', etc.), so that readers who may be offended by such images may filter them out. --Joshua Issac 12:04, 16 August 2011 (UTC)
- How about any image of humans? We are all God's image and therefore not to be portrayed. --Eingangskontrolle 12:22, 16 August 2011 (UTC)
- With a well-implemented system, (the few) people who are offended by pictures of humans could choose not to view any pictures of humans. Those who are offended by pictures of Muhammad can choose not to view any pictures of Muhammad. And the rest of us can go merrily on our way, unaffected. What other people choose to view or not view will have no effect on the rest of us. B Fizz 18:39, 16 August 2011 (UTC)
- The rest of us ("we" the authors of Wikimedia projects) will have to do the task of categorizing the pictures of humans or categorizing the pictures of Muhammad for those who feel offended by these pictures to be able to turn them off, and will be the authors of the resulting categories. I don't see how this effect is negligible for us or the encyclopaedia, --Rosenkohl 20:59, 16 August 2011 (UTC)
- I've said it before, and I'll say it again: if we make a filter, the VERY FIRST images that need to be filtered are those of Muhammad. No images are causing more people more anguish than those. Christians and porn isn't even in the same ballpark; it ain't the same league. It ain't even the same sport.
- It would be a colossal double-error to have a filter that didn't get those images but got less offensive ones. --AlecMeta 18:02, 18 August 2011 (UTC)
- Which goes towards being a very good reason for NOT having such a filter, except for letting everybody individually opt out of specific pictures, if they want to not see them again. If you never show those Taliban a picture of him-who-must-not-be-seen, they will never realize they're sort of very stupid about it. There is some difference between hanging Stalin's portrait on your bedroom wall and seeing a picture of him here. --Maxus96 17:30, 19 August 2011 (UTC)
Will this image be deleted?
Along with similar images?
And is it gonna affect the Spanish and Portuguese Wikipedias as well? Und by its out come 13:49, 18 August 2011 (UTC)
- No. No images will be deleted. However, the filter is universal, so readers at those projects will be able to self-select what images they'd like to see, as well. --Mdennis (WMF) 14:00, 18 August 2011 (UTC)
- To be fair, this particular image is meant to be entertaining (pornographic), not encyclopedic or educational. commons:File:Bruna_Ferraz_with_Photographer.jpg and commons:File:Bruna_Ferraz_with_Photographer_crop.jpg are entertaining and not educational as well. --Michaeldsuarez 14:06, 18 August 2011 (UTC)
- They can select a filter type, but not which image will be shown and which will not. The decision for single images will be made by unknown users and their POV, which is the opposite of a fundamental rule called NPOV, which the WMF seems not to know anymore. I wonder why it wasn't deleted by now. --Niabot 14:23, 18 August 2011 (UTC)
- They can choose what images will be shown and which will not in the same way that all of us can: by participating in categorizing. Too, if an image is hidden that they want to see, all they have to do is click on it, and it will show. Speaking from my volunteer perspective, POV is rampant through the projects. For instance, whether or not something is notable enough for inclusion is a POV; how much attention should be paid to a topic and where is a POV; whether or not a reference is reliable is a POV. We make editorial decisions all the time. :) It's part of our model. --Mdennis (WMF) 14:43, 18 August 2011 (UTC)
- The first sentence sounds a bit ironic, doesn't it? If you don't want to look at images, you invest the effort to view them and to categorize them. Right? Of course you wouldn't do this for yourself anymore, but you would decide for others what they shouldn't look at. Also, the concept behind this decision would be POV, only POV. No sources, only personal opinions. That's definitely not part of our model. --Niabot 00:00, 19 August 2011 (UTC)
- Hmm. Can you explain to me what sources go into making up the notability guidelines? I haven't seen these. --Mdennis (WMF) 00:06, 19 August 2011 (UTC)
- Notability is way different from hiding things that are notable or not, but "shouldn't be seen". Please don't throw apples-and-pears questions at me. Anyway: most inclusions, because of notability, are based on sources. That means that articles without sources will be deleted. Articles that are sourced (a requirement) will be kept, since sources guarantee notability. That's at least the way the German "notability" system works. The "rules" are inclusive, not exclusive, and to be understood as guidelines. You may ask: what is the difference between including something and hiding something? If a topic is "notable" enough to be part of an encyclopedia, then we must assume that illustrations for that topic are also notable enough to be part of it. The filtering is the opposite. This is like saying: the topic is worthy of being described, but the illustrations aren't, because someone doesn't like what he has to admit. --Niabot 00:16, 19 August 2011 (UTC)
- I wrote this in response to your first question but edit conflicted; I believe I've amended. Determining what constitutes notability (what makes a good source, what makes enough sources) is very much a matter of POV--the perspective of what is "encyclopedic" --and it is the first thing I mentioned above as part of the encyclopedic model. Notability is not only used to include, but to exclude (German Wikipedia has a speedy deletion criterion for "Zweifelsfreie Irrelevanz", I know, and conducts deletion debates for more borderline material.) Determining what images belong in which articles is also part of the encyclopedic model, and one which every article creator engages in as a matter of routine. Even this conversation is largely a matter of "point of view"--without reading through the entire page, I can say that I'm seeing a lot of people weighing in not from sources, but merely their own perspectives. In other words, they wish their POV to affect what others see--whether their POV is that others should see whatever they choose to upload and allow on the projects or their POV is that others should have the option to hide from themselves images they do not wish to look at. Point of view is central to the formulation of most of our policies and practices--what we agree, as a community, is appropriate for our objectives. "No sources, only personal opinions" are the underpinnings of much of this.
- In terms of your note above, fortunately, people don't have to look at every image to help determine categorization. I could weigh in on the proper categorization of images without looking at them all in the same way that I can weigh in on notability without examining every subject. We form guiding principles to which content may be compared. --Mdennis (WMF) 00:43, 19 August 2011 (UTC)
Things that don't have to be viewed?
- Things that don't have to be viewed: Public debt
- Things that don't have to be viewed: Hunger in Africa
- Things that don't have to be viewed: Unemployment
- Things that don't have to be viewed: Executions in USA
- Things that don't have to be viewed: Holocaust
- Things that don't have to be viewed: Child Soldiers
- Things that don't have to be viewed: War Crime #1
- Things that don't have to be viewed: War Crime #2
- Things that don't have to be viewed: Torture
- Things that don't have to be viewed: Homosexuality
- Things that don't have to be viewed: Armenian Genocide
- Things that don't have to be viewed: Climate Change Attribution
- Things that don't have to be viewed: Drug use
—The preceding unsigned comment was added by Widescreen (talk • contribs) 10:25, 19 August 2011.
Does this mean you're going to stop censoring Goatse?
Or are you going to be even more dishonest and shitty when it suits you? —The preceding unsigned comment was added by Radar snake (talk • contribs) 21:06, 18 August 2011.
- I'm in favor of restoring the image. An image would provide a concise and straightforward portrayal of what the subject is. If a visitor doesn't wish to view the image, she or he could take advantage of this new tool while allowing those who wish to view it to view it unmolested by censors. --Michaeldsuarez 22:17, 18 August 2011 (UTC)
- Without getting into any specific cases, mostly because i don't want to personally look at said specific cases to investigate them, I absolutely believe a filter will go hand in hand with chiseling on the wall in marble for time eternal "Wikimedia is not censored". The argument of "Delete this content because it's controversial" should become laughable and embarrassing in the filter age. That's the idea, anyway. --AlecMeta 01:33, 19 August 2011 (UTC)
- There's a special trick the deletionists have learned to help them achieve such results, which works like this:
- Wikipedia must not carry content if it is uncertain whether it violates copyright.
- All that is needed to maintain this uncertainty is for the deletionist never to stop claiming it's so.
- For further information see s:Industrial Society and Its Future. Note that Wikisource:Possible copyright violations is an oubliette that goes back two years. Wnt 06:39, 19 August 2011 (UTC)
Bad proposal, badly worded
The proposal is focused on the hiding functionality, which is easy enough to add I suppose, but as others have pointed out, the real problem is the labeling of images. The proposal completely avoids any discussion or feedback gathering of how labeling or tagging of images would work. Would consensus drive this? Would admins make the final call? How will this all work?
Can you imagine how difficult it is going to be to gain consensus on some images? Even within the U.S., standards on what is pornography and what isn't vary widely. Are images of fine art going to be labeled as sexual in nature? That's going to anger both sides of the argument and create a lot of wasted time, effort, and energy.
This proposal is counter to the mission of the foundation, which focuses, rightly, on unrestricted content availability:
- The mission of the Wikimedia Foundation is to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally.
I would like to see this proposal withdrawn and reworked at least to the point that these vagaries are addressed.
Ideally this proposal should be shelved and content filtering left to third parties; there are freely available ones as well as commercial ones. This allows individuals to select the filter that meets their needs and not rely on the wiki community to decide (poorly) for them.--RadioFan 13:28, 19 August 2011 (UTC)
Breasts
This reminds me of all the people who got banned by filters when they discussed "breast cancer." Aren't we dumbed down enough? Stormbear
- I don't think they're going to that benighted extreme here, but indeed the treatment of breasts would expose some of the problems with this idea. Does Wikipedia allow a category for images with naked female breasts? And if so does it allow a category for images with naked male breasts? It is recognized in more enlightened jurisdictions - even, I've heard, in the state of New York - that prohibiting female skin but not male skin is indeed sexual discrimination. Should Wikipedia then put resources into a sexist categorization scheme? If we want an international perspective, should we have a category for women with unveiled faces? Should we have one for men with unveiled faces to balance it out? In short, does helping people avoid offense mean that we specifically designate our resources to maintaining sexually discriminatory beliefs?
The reason why this is so impossible is that we are probing the endless theorems and principles of False. 75.97.33.173 16:54, 19 August 2011 (UTC)
Criterion creep
As pointed out in a posting above enquiring about imagery of Muhammad, there is a huge problem with deciding what filters to implement (eligibility = arbitrary?), who gets control of the filters (do we need more quibbling admins, maybe?), or how to decide what to filter. For instance, let's assume we implement a "gross" filter. I may find tripe gross, others may find it delicious. Some people find hairy spiders cute, and what about cartoon pictures? Who draws the line for what would trigger an arachnophobic response? Do we throw out political cartoons depicting individuals with eight legs? And has anything been said about whether previously excluded Featured Pictures would see the light of day under the proposed provision?
Just try listing all the cultural groups around the planet that may have one hang-up or another about what is valid to depict. Forget about modern sects, and consider *just* all the different ethnic groups that exist, their indigenous cultures, and their diverse perceptions of the world. Are we going to cater for all of them? Or just the more populous ones, and, if so, what justifies this? Do cultures lose their vote by going extinct? And whose fault is it, anyway? (Just throwing that one in there to pre-empt handwaving.) Shall we invent another political device? Another process to decide which filters to implement and which to leave out? Another battle-ground for the various lobbying groups that are already interfering with the true, original purpose of Wikipedia? A honey-pot for them, perhaps? (Now, that would be clever!) Samsara 22:02, 19 August 2011 (UTC)
Suggesting stricter limits and narrower, more straightforward categories
The filter should be limited to sex, nudity, blood, and guts. There's substance and form to those words, and everyone knows what sex, nudity, blood, and guts are when they see them. Cultural concerns, controversial content, and violence are too vague and too open to interpretation. There'll never be consensus on which images are controversial enough to be filtered. It would be best to drop controversial content and violence from the categorization scheme. --Michaeldsuarez 13:42, 19 August 2011 (UTC)
- In other words, you believe that the system should address only content widely regarded as objectionable in your culture. —David Levy 13:59, 19 August 2011 (UTC)
- No, I'm just saying that it's easier to come to an agreement on which images contain sex, nudity, blood, and guts than it is to agree on which images contain controversial material. --Michaeldsuarez 14:11, 19 August 2011 (UTC)
- It's even easier to come to an agreement on which images contain trees, bicycles, fish and doughnuts. The only material distinction between those categories and yours is that the latter are commonly regarded as objectionable in your culture. Someone from another culture might have a dramatically different idea of what content is easily identifiable as problematic. —David Levy 15:59, 19 August 2011 (UTC)
- At which point is something sex(ual) or nude, in your view? Is a very tight dress/swimsuit already nude? Is a fantasy picture showing angels nude? Does a nude photograph or drawing fall under sex if genitals are shown?
- Now, who wants to decide what's nude and what's not nude? Volunteers are welcome. Preferably one from the US, one from Europe, one from Asia, one from Latin America and so on. --Niabot 14:20, 19 August 2011 (UTC)
- I have suggested one thing in my vote: that there should be at least two levels for each category, mild and serious. For the sexual/nudity category, the mild level may contain images with revealing wardrobe without showing female nipples, male genitals or a bare butt (4th image). -- Sameboat (talk) 14:42, 19 August 2011 (UTC)
- Doesn't this make the decision even harder? Now you have to choose whether something is nude or not. Additionally you will have to choose in which category to put it. How do you distinguish between mild nudity and hard nudity? --Niabot 14:59, 19 August 2011 (UTC)
- No. The level must be as specific (nipples, genitals...) as possible for easier categorization. This also allows easier search results for images with specific characteristics. -- Sameboat (talk) 15:01, 19 August 2011 (UTC)
- That (nipples, genitals...) means you decide just on the basis of what is shown. An artistic and well-known artwork would fall in the same category as the latest porn from some unknown amateur? A drawing or photograph of the human body, describing the body parts, showing nipples and genitals, would fall in the "hard" category? Is that what you want to achieve? --Niabot 15:16, 19 August 2011 (UTC)
- Artistic or porn (or even medical), conservatives care not. They feel any exposure of these body parts to be offensive, whatever the purpose is. -- Sameboat (talk) 15:20, 19 August 2011 (UTC)
- How about normal people who don't want to look at porn but are very interested in art or medicine? In school art classes, depictions of Greek and Roman statues are shown. They contain genitals as well, since depicting the human body is part of the art. Do you indirectly admit that such a filter would tend to remove much more than needed and that the educational purpose would suffer? --Niabot 15:26, 19 August 2011 (UTC)
- I think we should avoid the concept of the "purpose" of the image, because what comes next will be a derailed debate. But I do think we should add subcategories under nudity and violence, such as historical artwork, for users to untick from their filtering. -- Sameboat (talk) 15:33, 19 August 2011 (UTC)
- You would need to do this for all other categories as well. As a result we have exploding categories, or we draw an insufficient line, which makes the filtering very doubtful. It wouldn't satisfy conservative and/or liberal people. --Niabot 16:12, 19 August 2011 (UTC)
- Something (exploding categories) has to be done. There's no way any solution could satisfy the whole world. Liberals should be aware that things they're comfortable with are constantly labelled by conservatives. This fact shouldn't be avoided, even on Wikipedia. The filter categories (or labels) may upset the liberals, but they have to accept it. The very least thing they could do is to disable the filter in the first place. The detailed categories should be as objective as possible, which means subjective categories like "artistic", "educational" and "pornographic" should be avoided because they're gonna be abused. -- Sameboat (talk) 23:54, 19 August 2011 (UTC)
- The status quo requires no value judgements on the part of the Wikimedia Foundation or its projects' volunteers. The planned filter system, conversely, will force us to either specially categorize practically everything or draw arbitrary lines when determining what content to label "potentially objectionable."
- "Liberals" and "conservatives" alike stand to feel discriminated against, depending on where those lines are drawn. For example, will this photograph be included in the optional filtering? If not, many "conservatives" will be outraged. If so, will this photograph be included as well? Otherwise, we'll be taking a formal stance that homosexual kissing is "potentially objectionable" and heterosexual kissing is not.
- If we include both in the optional filtering, will we place them in separate categories (thereby affirming that a material distinction exists)? If so, what other distinctions will we affirm? Monoracial kissing and multiracial kissing? White people kissing and black people kissing?
- I see no feasible means of maintaining neutrality. —David Levy 01:50/02:03, 20 August 2011 (UTC)
- Literally just create all the categories you have mentioned: homosexual kissing, heterosexual kissing, etc., even transgender kissing if that is stated in the file description. Let the users customize their own filter combination themselves. The filter label really does no actual harm to the liberals. -- Sameboat (talk) 02:28, 20 August 2011 (UTC)
- That falls under the "[categorizing] practically everything" scenario mentioned above.
- Do you honestly believe that it's realistic to maintain filter categories for "homosexual kissing," "heterosexual kissing," "monoracial kissing," "multiracial kissing," "white kissing," "black kissing," et cetera? Do you realize that this barely scratches the surface (even for kiss breakdowns)?
- I find it strange and disconcerting that you've continually singled out "the liberals" (as though no one else conceivably could find fault with the system's implementation). —David Levy 03:38, 20 August 2011 (UTC)
- Yes I know, but it is still required to allow users to better customize their own list of filters. But a category should be added if it is reasonably requested (the number of users who ask for it may be a consideration, but not the highest weight). And in order for filter categories not to be added indiscriminately, this shouldn't be open to normal users. Apart from moderators and higher user rights, a new user group, image filter maintainer, may be added to handle the categorization for the image filter and ease the workload of other sysops. -- Sameboat (talk) 03:57, 20 August 2011 (UTC)
- So we need to decide which filter requests are "reasonable." And you believe that this can be carried out in a neutral, non-drama-inciting manner?
- No matter what decision is made in response to a given request, it will amount to a formal stamp of approval or rejection of a particular belief's validity. In such a scenario, I see no possible means of maintaining neutrality and collegiality. —David Levy 04:33, 20 August 2011 (UTC)
┌─────────────────────────────────────────────────┘
This needs to be discussed on better ground. But on second thought, tagging the race of people inside the image is not quite necessary. -- Sameboat (talk) 05:09, 20 August 2011 (UTC)
- You realize, I presume, that it's common for people to strongly oppose (and be offended by) romantic/sexual relations between persons of different racial backgrounds. Many disapprove of racial integration in general, and some even object to the very sight of humans belonging to x race or religion (particularly in certain contexts). If we set up a system intended to enable users to filter "potentially objectionable" images without tagging relevant images in a manner accommodating such individuals, aren't we effectively condemning their beliefs? —David Levy 05:24, 20 August 2011 (UTC)
- The filter labels do do harm. To suggest that someone can filter out "White people kissing" "Jews kissing" "Mixed races kissing" is to promote racism. Rich Farmbrough 03:45 20 August 2011 (GMT).
- You're avoiding the reality by calling my suggestion racism. All we need is to provide visual clues about the image, not conjecture about its purpose like "intimacy", "hatred", etc. -- Sameboat (talk) 04:04, 20 August 2011 (UTC)
- I don't think that Rich is attributing your comments to racism; he's saying that the proposed setup would promote racism.
- Do you disagree that a desire to filter images depicting people of certain racial backgrounds kissing likely reflects racism? No matter how pragmatic you believe such a setup would be (and I disagree on that point), do you dispute the likelihood that it would be perceived as a promotion of bigotry (and objected to) by many?
- I don't believe that it's appropriate for the Wikimedia Foundation to condone or condemn such beliefs, which is part of why I see no tenable approach apart from taking no stance (the status quo). —David Levy 04:33, 20 August 2011 (UTC)
- ┌───────────────────────────────────────┘
- I misspoke, but whether the categories are gonna promote hate or not depends very much on the people themselves. It is like calling someone homosexual: some may find it inappropriate because it may constitute promoting homophobia, but others may not think so. It's just a sexual preference; homosexual people can live with it carefreely and acknowledge there is a considerable group of people who do not like it. That's why we provide means for them to avoid those scenes. -- Sameboat (talk) 04:55, 20 August 2011 (UTC)
- ┌───────────────────────────────────────┘
- Such a system inherently requires us to make non-neutral, contentious decisions regarding which types of content should/shouldn't be assigned optional filters (effectively deeming beliefs "valid" or "invalid"). It simply isn't realistic to assign filter categories to everything. Conversely, Wikimedia Foundation wikis have thrived under a system assigning them to nothing (the only neutral approach). —David Levy 05:09, 20 August 2011 (UTC)
- The filter should be limited to sex, nudity, blood, and guts. -- Whoa. This is exactly the direction this whole discussion shouldn't even take. What you are suggesting is implementing exactly the kind of culture-centric bias and ideology which we should struggle to defend Wikipedia against. Opt-in filter, sure, fine by me. But there can be no default filtering (i.e. by default, all images should be shown to all readers), and sure as heck there can be no "politically correct" prescription as to what "should" be included in the filter system and what shouldn't. Who are we to decide and tell people what to regard as "potentially filter-worthy" and "not filter-able"? Just to give you one first-hand example: a category-based opt-in filter would allow me to filter all spider images (I have arachnophobia). According to you, the filter should not give me the option to filter the images I want filtered.
Also according to you, we should default-filter that which you, for whatever reason, deem "easily identifiable/negotiable" sensitive material. Ridiculous. No offense, but it really is. --87.78.46.49 14:54, 19 August 2011 (UTC)
- Who in this section said anything about filtering by default? --Michaeldsuarez 15:43, 19 August 2011 (UTC)
- Sorry, I must have carried that over from another section. Struck accordingly. However, even leaving that out, there still is the huge problem that any set of predefined filter options embody an intrusion on the user's decision what they want filtered and what they don't. We can't do that. --87.78.46.49 15:50, 19 August 2011 (UTC)
I think the arachnophobia example above is perfect, because it removes the "morality" aspect and allows us to focus on the real issue. Having been arachnophobic myself, I have frequently browsed very quickly past pages with pictures of spiders in books, and wished I could avoid them altogether. But just as I wouldn't expect a publisher of books to address my personal sensitivities in this regard, I don't think it is a worthwhile cause for Wikipedia to implement any such features. If I really wanted to avoid pictures of spiders, I would set up a proxy which did some image recognition to identify spider-like images and replace them. I believe such filters are available commercially, at least. This type of filtering is simply not something Wikipedia should concern itself with. The possibility that someone might be offended by something on Wikipedia is not a good enough reason to implement such a feature - and the issues that implementing such a feature raises, about how to make categories etc., are very, very good reasons NOT to implement such a feature. --Lasse Hillerøe Petersen --87.104.100.229 16:47, 19 August 2011 (UTC)
- I agree. I honestly don't see any potential advantages to including this into MediaWiki when there are perfectly viable other options. See e.g. my posting below (How is this necessary when there are userscripts?). There already is a userscript that allows people to replace all images with text that can in turn be clicked to reveal the image. --87.78.46.49 16:54, 19 August 2011 (UTC)
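For readers unfamiliar with the approach mentioned above, a minimal "hide every image, click to reveal" user script needs only a few lines of DOM manipulation. The sketch below is a hedged illustration of that general technique, not the specific script being referred to; `mw-content-text` is assumed to be the standard MediaWiki content container.

```typescript
// Hedged sketch of an "all images hidden by default, click to reveal" user
// script of the kind mentioned above; plain DOM APIs, no categories involved.
document.querySelectorAll<HTMLImageElement>("#mw-content-text img").forEach((img) => {
  const reveal = document.createElement("a");
  reveal.href = "#";
  reveal.textContent = "[show image]";
  reveal.addEventListener("click", (event) => {
    event.preventDefault();
    reveal.replaceWith(img); // put the original image back on demand
  });
  img.replaceWith(reveal);
});
```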
Severely Negative Effects of This Feature
Within a week of an IP-optional filter going public, variations of the following will be in hundreds of schools, libraries, and churches worldwide.
All computer users on
Institutions in extremely conservative or religious communities will have authority figures enforcing this policy first. Being from the United States, I can use Bob Jones University as an example where filter use will be enforced instantly upon the school realizing it exists. A multitude of less conservative institutions may follow suit, especially if their computers are regularly in the presence of children and protective parents. Here's the problem: readers without access to alternative public computers will face de facto censorship, and small, poor towns could be especially susceptible.
I apologize for the crass and overblown headline and fonts, but I felt this needed attention. I would love to have a feature where the individual user can decide just for themselves how much "ick" they want in their Wikipedia experience. But we can't forget that in so many places, internet usage can be dictated by figures of authority, via policies, physical supervision, and social pressure. Now I personally don't mind parents controlling what their kids can see. The worry is the hundreds of thousands of people that only have access to Wikipedia in one place (usually a school or library, which already have a predilection for dictating filter use). And if this idea is implemented, it won't just be on the English Wikipedia. Small rural towns in countries all across the world may be subjected to de facto censorship on the whim of the schoolmaster, or town librarian, or even town officials.
I think the forum and the referendum's wording dangerously downplay this feature's impact.
- Question for the masterminds
Before we go forward, I have a question for the individuals spearheading the feature's creation. It's not an attempt at snark, as I do have great faith that they have the best of intentions. That gives me hope that they'll thoughtfully consider this problem.
- Wikipedia operated perfectly fine without this feature for years. Why the sudden interest in implementing it? Also, who originally lobbied for its creation, and what was their rationale? Was it a group of Wikipedia editors, administrators, or outside influences such as parent groups, religious groups, educators, or general readers?
- IP or Account-based?
Perhaps just applying the feature to named accounts, not IP addresses, would greatly diminish the prevalence of de facto censorship. I can see it being difficult for any authority figure to supervise each public computer and make sure the user is logged in to an account with the censor feature activated. Just instructing IP users to switch the filter on would be easier for them to enforce. Let's talk! Rodgerrodger415 02:55, 20 August 2011 (UTC)
Discussion of problem
Thoughts, hopefully? Rodgerrodger415 02:55, 20 August 2011 (UTC)
- Even before this image filter, tons of people from heavily censoring nations were already blocked from accessing WP in the first place. And even if they could access it, their computers must have other kinds of content filters installed. Our image filter does not make those readers suffer more. -- Sameboat (talk) 03:28, 20 August 2011 (UTC)
- I more or less agree with the big dictatorships, but from an American perspective, it's more the small towns in Kentucky and West Virginia I'm worried about, among so many other places. Rodgerrodger415 05:14, 20 August 2011 (UTC)
- How soon can we get the content filter implemented?! I'm afraid my children might find out the Holocaust happened, or that the moon landing wasn't faked. Pothed 03:32, 20 August 2011 (UTC)
- "Filter, or it didn't happen"? Rich Farmbrough 03:39 20 August 2011 (GMT).
- What's the difference? Pothed 04:37, 20 August 2011 (UTC)
- Rich Farmbrough 15:42 20 August 2011 (GMT).
- One-- there won't be any "one" indecency filter-- there is no one concept of indecency. So their signs are going to have to be more complex.
- Two-- any organizations technically sophisticated enough to control your computer can already enforce such a policy without our help.
- Three-- organizations that aren't technically sophisticated won't be able to enforce such a policy anyway.
- As for why now-- I know they've been worried about the issue for a very long time. Some very bright people considered it a strategic priority to get this done-- I personally don't see it as important-- but the time of our brightest people IS important, so if only to get them off this topic, there is a strategic resource at stake.
- Also-- we do have an editor trends/demographics/openness problem, and there's this theory floating around that maybe a filter could help. I personally don't think the filter alone can even dent that problem, but maybe I'm wrong.
- Another reason to build the filter is, no joke, to show the world how to build a filter properly. A lot of organizations are facing this question, and "no filters" isn't an option for most of them. They've spent 2yr+ thinking about how to do this right, in a way that is very clearly intended only for responsible, intentional self-use. There's a certain virtue in, having solved the problem, showing what the right answer looks like. That's not a good reason, but it's a half-good reason.
- I'll let people more savvy than I give you better answers, but those are some of the parts I can shout out.
- --AlecMeta 03:44, 20 August 2011 (UTC)
- The Harris report was delivered on or around 21 September 2010, less than 11 months ago. The report is well written and thoughtful, but is not authored by Wikimedians (this is clearly both a drawback and an advantage). The authors specifically addressed their recommendations to the community, rather than the board, showing that they realised the solutions to the problems lie in the community's hands, both in terms of understanding the issues and implementing them, for very sound reasons.
- The board resolution (a request, as it is characterised) was put forward 29 May 2011 (not June as the referendum page says), and work almost immediately began on planning this survey. Therefore the maximum time the board had to consider the issue was about 8 months, not 2 years, and the amount of time spent considering "how to do it right" is, I think, a lot less. No one has been giving this much thought until now, and I mean that not unkindly - it was not something anyone needed to do. At first blush the idea is great, simple, doable and a quick win. (In fact it is so much all those things that a number of naive versions have already been created.) It's only the close examination, with many experienced Wikimedians, information scientists, computer scientists and other assorted experts and generalists, that has pulled out a series of extreme problems, all of which need solving if we are not to run into difficulties. It may be, as some argue, that the difficulties are worth the benefits; on the other hand, it may not. The crux, though, is that a proper evaluation of those difficulties needs to be made - if there were a project plan, this would be a mammoth exception report, and the project would need to be re-planned and re-costed, and its viability re-determined. Simply because the formal process hasn't been put into place, it does not mean that the viability of the proposal necessarily still stands. Rich Farmbrough 04:10 20 August 2011 (GMT).
- I agree that this is going to have to be kind of a large undertaking, and I hope the strategists realize that. It is a very big can of worms and no other organization like ours has ever attempted to tackle it, because we're one-of-a-kind. Half-measures may lead to worse community effects than no measures at all; or maybe half-measures will just slowly evolve into full measures. In terms of the total amount of time-- foundation, staff, developer, admin, and editor time that will ultimately be poured into tagging and debating, this could likely become a very major undertaking. I have no clue if it's the best use of resources, but if anyone can do it right, it's the people we've got, and I believe it can be done and done right if we really want to do it. --AlecMeta 05:16, 20 August 2011 (UTC)
- "no other organization like ours has ever attempted to tackle it" - I would say that's because no organization like this should concern themselves with this sort of thing. It's possible the other ones realize that. Maybe they didn't endeavor because they figured that even self-imposed filtering of content based on a digitally divined morality shouldn't be the concern of the content provider, but the concern of the end user. I don't believe question they should be asking is 'can it be done?' Of course it can be done. As a computer scientist and web developer myself, I have no reason to believe they couldn't do it and make it work exactly as described. Probably better. The question, rather, should be 'should this be Wikipedia's responsibility in the first place?' To that I say: Absolutely not. Pothed 05:45, 20 August 2011 (UTC)
- This is no more an issue of censorship than Google Safe Search is. The boffins around here screaming censorship are ill-informed twats at best. To be honest, a LOT of organisations completely block Wikipedia due to the kind of content this filter is addressing and the lack of an optional filter, particularly educational institutions. I feel this is actually a better alternative; some Wikipedia is better than no Wikipedia at all, and keep in mind this is just targeting potentially offensive images, not text. Promethean 06:53, 20 August 2011 (UTC)
- Ill-informed twats? Have you ever spent time in a small-town library, the inky-dinky ones with small computer labs closely monitored by librarians? They don't need "technology" to be able to enforce censorship on their users, as they already have frequent visual surveillance going on to catch people circumventing the porn firewalls. A lot of communities depend on these computer labs, and Wikipedia may be one of their few outlets for frank pictures, especially if the rest of their surfing is controlled for "sanitation" as well. (And most people are eager to please or intimidated by authority figures, and so will obey the new "policy" just so they don't get caught looking at what are now classified as naughty pictures.) Your argument is, "well, the crazy places already ban Wikipedia, so it can't get worse." We don't know that. What about the places that grudgingly allow the photos to be seen, as they are "considered encyclopedic"? Once there is a setting, they can stamp down on that. Basically we aren't sure how many venues are on the brink. We're all making sweeping generalizations about what "might happen," but we need to seriously think about the consequences from all demographic viewpoints. And again, I think this situation might be more easily solved with an account-only based feature, not a super easy, accessible IP-based one. Rodgerrodger415 15:49, 20 August 2011 (UTC)
- Also, with this comment: "a LOT of organisations completely block Wikipedia due to the kind of content this filter is addressing and the lack of an optional filter, particularly educational institutions. I feel this is actually a better alternative; some Wikipedia is better than no Wikipedia", you just admitted that allowing schools to censor their students is one of the feature's main motivations. Please confirm or deny. Again, who was behind the lobbying for this feature? Rodgerrodger415 16:12, 20 August 2011 (UTC)
- "...some wikipedia is better than no wikipedia at all..." - I wholeheartedly disagree. If you aren't giving me the whole truth, then you can keep it to yourself. I'd also probably not come to you for any credible information in the future either. So yes, 'no Wikipedia' is absolutely better than 'filtered, compromised integrity, politically correct Wikipedia'. In fact, it's not even close. Pothed 17:40, 20 August 2011 (UTC)
Relevance
So does this make it so that I can do whatever I want with images, like posting an orange on an article about tomatoes? UserBassistcello
Legal/liability issues
Does this mean that Wikimedia Foundation is taking a position on what constitutes objectionable material?
It seems to me that by creating categories of possibly objectionable material, the Wikimedia Foundation would be taking a position on what can be considered objectionable. That would be unfortunate.--Wikimedes 07:15, 17 August 2011 (UTC)
- And it would contradict the main principle of NPOV. Is a picture of the pope offensive enough to be categorized? Is a religious symbol offensive enough or not? Perhaps a picture of Friedrich Nietzsche is too offensive for Catholics? Is a caricature relating to a religious topic offensive enough or not? Who can decide this without strongly infringing NPOV? Personally I wish that every picture including bananas should be categorized because I feel offended by those and it would be nice to filter them. Is there any chance that this can be realized? ;-) --78.94.169.111 00:36, 18 August 2011 (UTC)
- I don't think it would mean the WMF is deciding anything, actually, any more than the WMF decides the content of articles (we don't). This is about making something available for editors to use, or not. -- phoebe | talk 14:14, 18 August 2011 (UTC)
- Does the WMF care about anything other than donations? I wonder which donor had this great idea. --Niabot 00:50, 18 August 2011 (UTC)
The foundation board itself wrote a very even-handed resolution: http://wikimediafoundation.org/wiki/Resolution:Controversial_content . The discussion on foundation-l was also fairly even-handed. The issue is that one of the bullet points in the resolution required foundation employee action (the result of which you see here, so far). There's actually a lot of room within the scope of the resolution to shoot down some of the more obviously harebrained concepts people might want to implement. Even so, I've posted on foundation-l to request a clearer resolution, so that employees can better engage with volunteers, without a need for one side to shout the other down. --Kim Bruning 17:46, 19 August 2011 (UTC) Did that sound diplomatic enough? ;-)
Matters of principles
Neutrality
The FAQ for this referendum says that one of the principles behind the creation of the filtering tool is that "The feature is to be culturally neutral, and all-inclusive". However, the mock-up designs displayed show completely non-neutral filtering options. The IFR page says that "These are subject to change based on the outcome of the referendum and the realities of feature development, but it's likely that the final product would look very similar to these mock-ups". I think it would be very helpful if this were made clearer, explaining that these are just images showing what the interface style might look like, and that the actual feature will be neutral, giving no possible filtering setting any higher availability than any other conceivable filter, and not at all including any options like those given in the images displayed, assuming that this is the case, of course. If this is not the case, the FAQ must be corrected before the start of the referendum, in order to allow people to understand whether this feature would abolish NPOV before they vote. --Yair rand 22:29, 26 July 2011 (UTC)
- Hi Yair. Thanks for your comment, but I think you're likely misreading the intention behind these (admittedly vague) statements. The board specifically noted neutrality in design as a core principle -- we were thinking of things like category names. "Nudity" is pretty objective and neutral, for instance, as a descriptive category; "bad content" is not. This is important. That doesn't mean, however, that we can't pick certain categories of images to make hideable, such as those that are commonly controversial across many cultures. I don't quite know what you mean about interface design, but I disagree quite strongly that the feature would "abolish NPOV" -- remember, in this proposal nothing goes away permanently, and editing standards remain the same. -- phoebe | talk 18:18, 31 July 2011 (UTC)
- How is that relevant? The reason (well, one of them) behind rejecting repeated proposals for temporary/permanent hiding of certain "objectionable" content was that "objectionable" means something different for everyone and hiding certain content would be giving preference to certain groups in violation of NPOV, which, as you helpfully pointed out, is exactly what this image filter will do, and not, as SJ suggested above might be the case, give options to hide any such images in any neutrally selected categories not giving preference to select groups. --Yair rand 20:32, 31 July 2011 (UTC)
- There will be no permanent hiding of anything. Every single reader will be able to see every single image he wants to see.
- This is not a one-size-fits-all filter system. You decide what categories of material you find objectionable (if anything). It doesn't matter what everyone else wants to hide. Consequently, it doesn't matter if "objectionable" means something different for every reader.
- Putting a 'click here to see the image' button on your screen doesn't impair NPOV at all. Failing to load the picture of the man jumping off the Golden Gate Bridge when the page loads the first time does not "give preference to certain groups" or "violate NPOV". WhatamIdoing 19:37, 4 August 2011 (UTC)
- It is a one-size-fits-all filter system, though, because the filtering categories with which the end user is presented, as well as the content of those categories, will be decided by the community. That will inevitably mean edit wars and non-neutrality. I too would much prefer the system that SJ suggested, but that is not what we will be voting on here.--Danaman5 00:57, 7 August 2011 (UTC)
- The fact that multiple options will be presented means that it's not a one-size-fits-all system. You could choose to see everything except one type of images, and I could choose to see everything except one different type of images, and the next person could choose to see everything except four types of images. The ability to configure it means that it's not one-size-fits-all. It might well be a 5040-sizes-fit-most system (that's the number of combinations available in the sample above), but it is not a one-size-fits-all system. WhatamIdoing 22:49, 14 August 2011 (UTC)
- You are wrong. The user can decide that he does not want to see "extreme politicians". But who fits this description? Some will say Hitler, Stalin, Goebbels. Others will include Mao, Lenin and Mubarak. And still others will include Palin, Obama or George W. Bush. I predict a censors' war. --Eingangskontrolle 09:10, 16 August 2011 (UTC)
- Contrary to what was said above, nudity is not an all-or-none concept. Are pictures that deliberately leave that fact ambiguous nudity? Is an image with the genitals covered by a large X nudity? Do we propose to differentiate between male and female nudity? And similarly with all the other categories. Anyone who thinks this is hair-splitting should see current discussions of almost anything controversial on Wikipedia. There are no world-wide standards, and any attempt to write one is likely to drift either towards the current US view of people like most of our editors, or towards the most inclusive position. Or possibly to the least inclusive--certainly at least some people opposed to the general concept will try to move towards that one. DGG 16:25, 16 August 2011 (UTC)
- The use of categories is a problem, since there would be inevitable bias in what is or is not included in the categories. For instance, Mormons may not want to view pictures of certain underwear; how do you classify that? The only unbiased system would be one that works on a per-image basis, where you can select specific images to hide. // Liftarn 12:34, 19 August 2011 (UTC)
- The filter system is POV. What is obscene to someone may not be obscene to someone else. A filtering system, in my opinion, is all about controlling content in an adult educational format. I do not believe that a fundamental change on Wikipedia should be dictated by a few persons who are offended by some photo. I have yet to find an article on Wikipedia that is explicitly gory or contains graphic sexual content. There is currently no need for a filtering service on Wikipedia. A filtering system would destroy any neutrality in an article, since, by its very nature, photos would be hidden from an article if considered obscene by some unknown persons who could be offended by them. There are articles that, in my opinion, children need adult supervision to view. Again, let the choice be for the parents to make, rather than a biased filtering system. Cmguy777 02:35, 17 August 2011 (UTC)
1984
welcome to censorpedia Bunnyfrosch 16:53, 16 August 2011 (UTC)
- Welcome to 2011. It's opt in. B Fizz 17:53, 16 August 2011 (UTC)
- Calling this censorship is an insult to the many millions of people who have suffered under actual censorship. Real censorship does not have a "thanks, but I want to see it" button. Rd232 18:05, 16 August 2011 (UTC)
- This censorship is an insult to all contributors who made the Wikimedia projects in their spare time, asking for no money. They believed in principles that will be thrown away once this censorship is introduced. The next step will be censorship of articles, and slowly, the project will be taken over by politicians and religious fanatics - Quistnix 18:08, 16 August 2011 (UTC)
- You appear to have a non-standard definition of the word censorship. From Wiktionary: "Censorship: The use of state or group power to control freedom of expression, such as passing laws to prevent media from being published or propagated." How is anyone preventing anything from being published or propagated here? Philippe (WMF) 18:16, 16 August 2011 (UTC)
- So, your response to my saying "calling this censorship is an insult" is to repeat the insult? And then to throw in the slippery slope claim. Except that this is not censorship so there is no slope, slippery or otherwise. No matter how often or how hysterically the claim is made, it will still not be true that this is censorship and it will still not be true that this is a slippery slope to actual censorship. Rd232 18:18, 16 August 2011 (UTC)
- @Philippe We already have the first calls to make the censored version the default and to hide anything that someone could find controversial, until readers "opt out" to enable the uncensored view. Even if we make it "opt-in", it's very likely that it will sooner or later change to "opt-out", which meets the criteria for censorship. It's the drawing of a pink bunny jumping over yellow flowers. After the discussions on EN about what images might be suitable (featurable) for the main page, I'm fully convinced that it would not stop at this point, and many authors and their ideals (uncensored, neutral knowledge, accessible for everyone) would be betrayed. --Niabot 19:16, 16 August 2011 (UTC)
- @Rd232:Time will prove you're wrong. One day you'll wake up, and that day will be too late to change things - Quistnix 20:00, 16 August 2011 (UTC)
- @Philippe: Nothing wrong with that definition. We just differ in our interpretation of it - Quistnix 20:04, 16 August 2011 (UTC)
- Your honour, it's not theft, I just disagree with your interpretation of "ownership". Is this what passes for reasoned debate round here? Meh. Count me out. Rd232 20:06, 16 August 2011 (UTC)
- This hypothetical discussion could actually conceivably occur if you (accidentally) steal from a lien-holder. The existence-or-not of a lien complicates matters considerably. ;-) --Kim Bruning 11:21, 21 August 2011 (UTC) moral of the story: don't use metaphor and simile around wikipedians. ;-) I haven't actually checked the sources to see if the exact quote occurred in court history... yet
It's the first step to enabling censorship by third parties: school districts, governments, internet providers, employers... And many fundamentalists will remove tagged images from articles, just because they are tagged. --Bahnmoeller 09:01, 17 August 2011 (UTC)
For anyone who doubts the slippery-slope argument in this case, there has already been, on this page, a proposal to turn the filter on by default - for all values of "offensive". And indeed such a call would be hard to resist - we would be seen to be either assuming that readers are prurient/ungodly/voyeuristic, or deliberately exposing them to "unpleasant" images when we can flip a bit to "protect" them. Rich Farmbrough 14:05 17 August 2011 (GMT).
My own, belated, dismay
I suppose I should have been paying more attention. I knew a referendum was coming, and I naively expected it to be on the substance of this monstrosity, and not on implementation details — it had not occurred to me that the Foundation would even consider committing a "feature" such as this without overwhelming community support given that it goes exactly opposite everything it is supposed to stand for.
There is no such thing as "offensive" or "controversial" information, only offended points of view. It's destroying our educative mission for the sake of — well, I'm not even sure what rational objective this is supposed to serve. "Morality" as defined by a minority in some random theocracy? Some vague and ill-defined standard of "respectability" handwaved into being without thought?
Encouraging readers to blind themselves to the parts of reality they dislike is the opposite of our educative goal. It is stealth POV forking, causing people to view different articles that match their preconceptions (or, in the real world, somebody else's preconceptions as imposed on them by social pressure or outright force). There is no real conceptual difference between allowing image POV filtering and allowing text POV filtering. When does the "exclude liberal text" toggle come, then? Or the "I don't want to see 'evolutionist' articles" checkmark?
You may think I'm abusing a slippery slope argument; I don't believe there is a slope: being able to exclude images of "sex" (however you want to define it), or those of "the prophet", or whatever, is exactly the same thing as being able to exclude articles on biology, or on religions you disapprove of. The false distinction between images and articles is just a fallacy to rationalize this "feature".
That's not even getting into how easy such categorization of content into arbitrary pigeonholes of "offensiveness" would be to abuse by third parties — others have expounded on that aspect elsewhere and in more detail.
Yes, people are allowed to use the blinders the foundation will provide for them; I believe in freedom to self-harm. I have no desire to be a party to it. We should be the beacon of knowledge, allowing people to expand their limited vision — not giving them a comfortable experience of strictly limited points of view. That's TV's job. — Coren (talk) / (en-wiki) 20:57, 16 August 2011 (UTC)
- I couldn't agree more. Kusma 22:32, 16 August 2011 (UTC)
- Wikipedia. The free Encyclopedia. If those filters come Wikipedia isn't free anymore. --Matthiasb 23:30, 16 August 2011 (UTC)
- I completely agree with these points. --Niabot 00:28, 17 August 2011 (UTC)
- You put it better than I could. Now if we can only overturn this mess of an idea... --Hibernian 06:59, 18 August 2011 (UTC)
We always have the right to fork, and/or to set up a different foundation if we happen not to like this one anymore. Some Wikipedias have done so in the past: http://en.wikipedia.org/wiki/Enciclopedia_Libre . Of course, it might be somewhat easier to just talk with the board instead, and have them modify or vacate the current position. Most of the board are pretty clueful! :-) --Kim Bruning 16:00, 19 August 2011 (UTC)
- An option for reaching people is to use the foundation-l mailing list. --Kim Bruning 17:41, 19 August 2011 (UTC)
Best thing ever
Finally a feature that will allow me to hide the pornographic images on Wikipedia. Thanks, Wikipedia. 84.198.50.96 02:03, 19 August 2011 (UTC)
- So then what were you doing viewing the porn articles if you didn't want to see porn? --Cybercobra 06:31, 19 August 2011 (UTC)
- Now Cybercobra, that is FUNNY and true! Great observation. I haven't yet accidentally seen "racy" images on WP. I am in fact disappointed to see some of them disappear. Jack B108 19:02, 19 August 2011 (UTC)
- Some people would like to look up a term without seeing pornography. Britannica doesn't have graphic sexual pictures, and some people would be able to safely research things in schools or at work. Ottava Rima (talk) 22:58, 19 August 2011 (UTC)
Chinese government's dream
It appears that internet censorship in China functions better if websites aren't totally blocked but just have certain content filtered. Of course, this is made much easier if Wikipedia itself tags such content. The Chinese government is actively looking for more ways to filter out sexual content from web sites, so this proposal is a perfect match :)
--187.40.208.90 04:41, 19 August 2011 (UTC)
Great, nothing else than censorship
Really sick, and next week you will have the article filter referendum... 84.119.1.200 06:09, 19 August 2011 (UTC)
"Principle of Least Astonishment" usage
At the end of this sentence:
- Often, within the Wikimedia world, this is referred to as the principle of least astonishment, or least surprise.
please place a {{citation needed}} Mfwitten 17:07, 19 August 2011 (UTC)
- See Wikipedia:Principle of least astonishment and Principle of least astonishment. SilkTork 21:40, 19 August 2011 (UTC)
- As I understood it, "least surprise" is the benign editorial practice of not placing the most potentially disturbing images that would be relevant to the article at the very top of the article. If someone clicks, for example, on a link to "torture" in the English Wikipedia, the top image is, appropriately, a display of medieval torture instruments, not of a person being tortured. (None of the other Wikipedias I checked have a really horrific picture at the top, either.) It conveys the idea without being really frightening. Even children have seen such images. Should someone reach this page, a little reading will let them quickly decide how much further they want to go. Even on a 1920 x 1200 display, the first human example is not on the first screen, and even so, it's not a contemporary photograph, but an ancient monument. Almost all people who might potentially be disturbed would use much smaller screens than this--it's not perfect, but it helps considerably. I support this method. Doing anything else is, in my opinion, exploiting shock value, not conveying information. (Some subjects may not lend themselves to this, but they are generally subjects of very obvious NSFW content.) I am totally opposed to any compromise with censorship, but careful arrangement of material is not censorship. DGG 22:15, 19 August 2011 (UTC)
Images are editorial choices
In Wikipedia, images are editorial choices of the contributors; I see no reason to propose a self-censoring tool for only this part of the editorial choices. Moreover, an image illustrates a topic but should (more realistically, may) also be commented on and described in the article itself; removing the illustration would make this part of the article useless or really weird.
IMO it's not our job as a community to implement a parental control solution. It's up to the parents. Does anyone want those kinds of censorship tools in a museum or castle, so as not to display statues of naked men, women and children? To sum up, a waste of time and money IMO. --PierreSelim 12:34, 16 August 2011 (UTC)
- The proposal isn't to be automatic; it is instead for people who don't want to see such things to be able to choose to not see them. Images aren't permanently removed; even if one person hides the picture for themselves, it will still be there for the rest of us -- and it will still be available even for the person who has hidden it. Nothing editorial is being interfered with. I just want to make sure the proposal is clear :) -- phoebe | talk 15:57, 16 August 2011 (UTC)
- If people don't like to see certain images, they should not use the internet. --Eingangskontrolle 16:32, 16 August 2011 (UTC)
- That is the most asinine argument I've heard yet. The purpose of this feature is to allow more people to comfortably use Wikipedia. Your counterargument is that these people shouldn't be using the internet at all? How does that, in any way, help Wikipedia's goal to reach as many people as possible? That should be a major goal of Wikipedia, anyways. Pierre said "it's not our job ... to implement a parental control system". Sure, it's not our job, but why not provide one anyways? It will be convenient for parents, and will leave everyone else unaffected. You don't have to opt in if you don't want to. B Fizz 18:19, 16 August 2011 (UTC)
The system is not a parental control system; it doesn't take power away from the reader and put it somewhere else. The power remains with the reader, and that's as it should be. I'm not even sure, off-hand, whether it would even be possible for Wikimedia to create a parental control system. How would it identify children and parents, given that the overwhelming majority of readers are anonymous, and accounts are just screen-names with an optional email address? Parental controls have to be on the relevant computer to work. The only link with parental controls is that parental control software might eventually use WP classification of images to enforce controls locally (but they can already do this using categories of images and pages, if they really want). Rd232 18:31, 16 August 2011 (UTC)
- Our proposed system is innocent, I get it, that's fine. However, I think an obvious design requirement must be that the metadata used to feed the system must also be innocent, and not be usable for evil. Since the metadata would ideally be curated by the community, I think the best way to move forward is to let community members help with the design process (also in part to help decide whether this design requirement can actually be met). --Kim Bruning 12:39, 20 August 2011 (UTC)
Next Step: Filters for Objectionable Ideas
At first I thought: Why not? Everybody gets the Wikipedia they want.
Then I thought harder. If you want an article to be the way you want it, write and illustrate your own article, and publish it somewhere. If it makes sense to start tagging images, the next step is to start tagging words, phrases and ideas, so that no one need come to a consensus by participating in those bothersome editing discussions about what is appropriate, notable or reliably sourced. Rather, one can simply avail oneself of a checklist of one's current prejudices (or, to be gentler, beliefs) and let the cadres of censors (or Filter Editors, as they will probably call themselves) for those beliefs protect you from any challenges to your current point of view.
I don't know if the whole idea is insidious or merely ridiculous. It would be insidious if it could actually be implemented without causing a chaos of claims and counterclaims about what interests and prejudices may be covered by tagging, with no "reliable sources" to settle the matter. Just consider for a moment the prospect of removing all articles, images and sentences relating to Black persons, so that White supremacists may get the "culturally neutral" Wikipedia they want for their children. It would make the current Discussion pages on controversial articles look like flower arranging. Given that I think it could never be implemented without destroying what is best about Wikipedia, I judge this proposal to be ridiculous. —Blanchette 21:35, 16 August 2011 (UTC)
- Are you aware that there are already filters in place at en.Wikipedia which disallow certain word and phrases? Many of those filters are secret (i.e., you cannot see which words and phrases are disallowed). Where were the defenders of free speech when those were implemented? Delicious carbuncle 22:38, 16 August 2011 (UTC)
- They do not disallow the words and phrases, they stop them being added in certain circumstances without a little extra effort. And Wikipedia is not about free speech, in that sense - Wikipedia is very censored, but only (or only intentionally) by the removal of un-encyclopaedic or illegal material. Rich Farmbrough 23:45 16 August 2011 (GMT).
- I opposed them; I still oppose them. People say "spam" but then that turns out to mean "copyright" and from there it turns into "things they don't like". But this is still a community capable of coming together and sternly rejecting all such things. Wnt 05:49, 17 August 2011 (UTC)
- If I'm reading this right, you seem to be talking about systems in place designed to prevent vandalism; it's like saying you support censorship if you support laws against having things vandalized.
- Back on the topic of discussion, however: I agree with the OP; the real danger of this sort of system is that once it's implemented, there's no reason not to also do the same with articles and their content. Before you call this position 'absurd', I strongly suggest you go read up on things like Conservapedia, a 'wiki' started because Wikipedia was too 'liberal' (which is to say, Wikipedia was full of un-Christian ideas like evolution). --142.177.87.169 21:18, 18 August 2011 (UTC)
- What's next, Personalapedia, the Encyclopedia that Displays Only What You Want to Hear? Rickyrab 18:23, 19 August 2011 (UTC)
Hear, hear, in my opinion this proposal is both ridiculous and insidious. --Hibernian 06:58, 18 August 2011 (UTC)
- I see clearly the threat of censoring words and customizing one's experience to the point one never has to deal with challenging ideas. On the other hand, images are much more immediately powerful than words, and have more power to be deceptive (think of how awkward snapshots of yourself fail to capture who you are—what if that were the source of the primary idea people had of you?). There is a big difference, and filtering one hardly necessitates filtering the other.
- Still, that Wikipedia should be universal is a goal to be striven for. It's one of the last places where we aren't wrapped in comforting self-constructed cocoons of egotism (cf. Daniel Boorstin's The Image). Even if it means heightened strife over what images should and should not be posted, the struggle itself is a healthy exercise for the unity of Wikipedia and ultimately the unity of our culture. JKeck 15:33, 20 August 2011 (UTC)
VOTE NO
- This can of worms is SO, SO BAD.
Subjectively categorizing images based on how "controversial" they are? This is a place we don't want to go. Mormons will NEVER agree with nudists, even though both viewpoints are valid in their own right. So who decides which viewpoint should be used to categorize, say, a picture of a naked butt? The most vocal supporter? The viewpoint with the most articles written about it? The point of view with the most supporters? Without non-controversial guidelines (and all guidelines in this matter will be controversial), discussions will devolve into screaming matches, bad calls by administrators trying to quell the fires, personal attacks, bullying, and mob rule, over, and over again. Couldn't those volunteer hours be spent on more productive activities, like, say, improving content?
For upper management to even consider doing this to the Wikipedia community is absurd. It's like them saying "here's a nice forum for you to fight over personal opinions. We don't care that it will make our good-faith users so utterly distraught, they're liable to quit editing in exasperation."
Censoring images should be left up to commercial browsers, not Wikipedia itself. If they want to allow screening by choice, they should be working with browser programmers, not putting this bomb of a burden on our community.
It's also just begging outside interest groups to try and bully Wikipedia into further censorship; you give an inch, they take a mile. Rodgerrodger415 20:51, 19 August 2011 (UTC)
- My boyfriend just pointed something out; the upper management probably has no qualms about "kicking the Wikipedia beehive," so to speak. While untold numbers of Wikipedia editors may lose faith in the screaming matches and quit, it will also get thousands more people angry enough to participate in Wikipedia arguments and edit wars. More hits. More hits, more money. I just feel so bad for the administrators who will be responsible for policing all this unnecessary chaos. Rodgerrodger415 21:28, 19 August 2011 (UTC)
- Your boyfriend may not be aware that Wikipedia is run by a non-profit foundation, and carries no advertising so makes no money from extra page hits. Thparkth 21:45, 19 August 2011 (UTC)
- The owners of the Wikipedia program make money selling out the code for Wikis in general - I believe it's a leasing system, though I'd need to check. This makes Wikipedia, in its own way, their main marketing horse for the system. Rodgerrodger415 21:57, 19 August 2011 (UTC)
- MediaWiki is open source - there's nothing to sell. Evil saltine 22:15, 19 August 2011 (UTC)
- If you're right, that's fantastically awesome - I'll be quite happy to know I was wrong. Unfortunately doesn't change my opinion about the problems with this new referendum, though, which is seriously bumming me out. Rodgerrodger415 22:23, 19 August 2011 (UTC)
- There are no "owners" Of the Wikipedia program. Copyright is held by the contributors, in the case of content, or the developers, in case of the code. We give the software away for free to anyone who wants it. There's no lease, there's no money changing hands. Philippe (WMF) 06:15, 20 August 2011 (UTC)
This is SO, SO BAD. Terrible, really terrible.190.51.159.142 13:32, 20 August 2011 (UTC)
Concerns
This would be a waste of time
Another waste of time by the Foundation. Wikimedia projects are not censored, except projects like :ar. They think they should not have pics of their prophet. Some Christians might like to censor the Piss Christ. Ultra-Orthodox Hasidic Jews don't like pictures of women. On :nl some don't like pictures with blood. Some classification scheme has to be in place, some review process, and people have to invest time doing that. This time is wasted and cannot be spent spreading free knowledge. And who will make these decisions? Zanaq 09:18, 1 July 2011 (UTC)
- As I understand the concept, the 'classification scheme' would be categories for media, which already exist. The 'review process' for creating and updating categories also exists - it happens every day when media are uploaded. Category information is generally considered useful metadata and free knowledge in its own right. I could imagine people trying to create "useless" categories that have meaning only to them, but that seems like a rare case [and, again - is something that already happens today with category creation]. –SJ talk | translate
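As a concrete illustration of what reusing existing category metadata might look like on the reader side, here is a rough sketch. The API call is the standard MediaWiki query API that Commons already exposes; the category names, the hiddenCategories list and the collapse-or-not logic are purely hypothetical assumptions for illustration, not part of any announced design.
<syntaxhighlight lang="typescript">
// Rough sketch only. Assumes (a) the existing Commons API, which really does
// expose per-file categories, and (b) a hypothetical, reader-chosen list of
// categories to hide. Nothing is ever deleted; "hide" would just mean
// collapsing the image behind a "show image" button.

const hiddenCategories = new Set([
  "Category:Nudity",           // illustrative names only
  "Category:Graphic violence",
]);

async function shouldCollapse(fileTitle: string): Promise<boolean> {
  const url =
    "https://commons.wikimedia.org/w/api.php?action=query&format=json&origin=*" +
    "&prop=categories&cllimit=max&titles=" + encodeURIComponent(fileTitle);
  const data = await (await fetch(url)).json();
  const pages = data?.query?.pages ?? {};
  for (const id of Object.keys(pages)) {
    for (const cat of pages[id].categories ?? []) {
      if (hiddenCategories.has(cat.title)) {
        return true; // collapse for this reader only
      }
    }
  }
  return false;
}

// Example: shouldCollapse("File:Example.jpg").then(collapse => console.log(collapse));
</syntaxhighlight>
The sketch only shows that the classification data already exists as category metadata; whether reusing it this way is wise is exactly what this thread disputes.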
- Free content, or free information, is any kind of functional work, artwork, or other creative content that meets the definition of a free cultural work. A free cultural work is one which has no significant legal restriction on people's freedom. Where do I get a refund? aleichem 09:35, 1 July 2011 (UTC)
- I agree with this definition, but I'm not sure what point you mean to make. Could you describe it in more detail? For instance, if you are concerned about excluding free content from the projects, the filter proposals would not do any of that. (Notability guidelines, in contrast, exclude the vast majority of all free content from the projects - in what could more accurately be named 'censorship'.) –SJ talk | translate 03:53, 3 July 2011 (UTC)
- Great Wall of China within Wikimedia? That attempt deserves a thumbs-down button. --Matthiasb 09:33, 1 July 2011 (UTC)
- User side filtration is NOT the same as censorship. Bulwersator 09:38, 1 July 2011 (UTC)
- Just to be clear: I do not think this is exactly censorship. I do think it is unworkable and contrary to the goals of most (if not all) WikiMedia Projects. Zanaq 09:53, 1 July 2011 (UTC)
- When issuing the referendum, hopefully it will contain details on how it will be done, what committees will be responsible for what, how a user can get protection (against themselves), use cases (I don't want to see pictures of abused animals, Bin Laden and Scientology), default values, uploader and user training, ... --Foroa 16:53, 1 July 2011 (UTC)
- This always ends up with one group telling the other group what it is allowed to see. We accuse foreign governments of doing that. Choosing always ends up in culture and politics, two things Wikipedia should stay far away from. Edoderoo 13:15, 2 July 2011 (UTC)
- This is a user side opt-in filter. I cannot understand how this becomes censorship. --Bencmq 14:47, 2 July 2011 (UTC)
- I have actually seen very little info so far. I may hope you are right. An opt-in filter is self-censorship; I won't care. Any other option is an area I'm not even ready to discuss. I can't tell what others can see or not... No one can... Edoderoo 16:25, 2 July 2011 (UTC)
- Although I do agree that there will be inevitable problems - if the filter is based on categories, there may be disputes over whether certain images should be put into this or that category. But let's wait and see.--Bencmq 16:51, 2 July 2011 (UTC)
- This would be an opt-in filter. –SJ talk | translate 03:53, 3 July 2011 (UTC)
- Or go on doing useful things, instead of organizing censorship on Wikipedia - Quistnix 19:07, 2 July 2011 (UTC)
- This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory at the request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Koektrommel 07:54, 3 July 2011 (UTC)
- I'm not aware of any examples where censorship has come in via that gradual way. I suspect the typical pattern is more that if organisations do things that upset people and refuse to compromise by allowing some sort of opt-out or rating system, then eventually there is an overreaction, censorship ensues and people regret not agreeing a reasonable compromise. As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti. WereSpielChequers 17:12, 3 July 2011 (UTC)
- Then you need a system that does not require an account. As far as the slippery slope goes, suppose we start with an opt-in system and discover that it isn't all that effective, as the vast majority of readers do not register and log in. So what then? Do we default to "reasonable" censoring for IP readers, encouraging them to register if they want to opt out? And straight down the rabbit hole we go.
- Before we even consider this referendum, WMF needs to understand the requirements. And the requirements are a system that (a) does not require registration (so has to be based on cookies rather than on-wiki preferences) (b) is easy to use and (c) can easily be advertised without running afoul of the "No disclaimers in articles" prohibition on en and likely other projects. And even then, it will be completely ineffective in many areas. Muhammad pictures for one, as many Muslims won't be satisfied unless we censor them for everyone. The "you have the capability to disable them for yourself" response has gone in one ear and out the other for years.
- For all the battles it will cause, a simple categorization system for images does have value. But I have to say that I'm in with the group who thinks the actual tools for censoring images should be left to third parties. Resolute 14:24, 4 July 2011 (UTC)
- I'd say that on a global level censorship usually comes "via that gradual way". If you read the OpenNet initiative reports, you will see that governments often use opt-in software from Western companies as basis for their obligatory censorship infrastructure. Without the existing professional opt-in solutions, they would have needed to invest much more resources into censorship, possibly making the whole attempt less attractive --Mcleinn 13:40, 19 August 2011 (UTC)
- The current proposal would work for logged-out users; they are the primary reading audience. –SJ talk | translate 18:59, 4 July 2011 (UTC)
- Now that's really enlightening. Above we hear the system should be opt-in, now we hear the system is proposed to work for logged-out users. How, could one explain, should a logged-out user opt in, or anyway, how could that user even opt out, if he isn't a registered user? --Matthiasb 16:44, 13 July 2011 (UTC)
- Using cookies for preferences? –SJ talk | translate 21:27, 13 July 2011 (UTC)
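To make "cookies for preferences" slightly less abstract, a minimal sketch of how a logged-out reader's opt-in choice could be remembered follows. The cookie name "imgFilterPrefs", the value format and both helper functions are invented for illustration; the real feature, if it is ever built, might work quite differently.
<syntaxhighlight lang="typescript">
// Hypothetical sketch: remember an opt-in filter choice without an account.
// "imgFilterPrefs" is a made-up cookie name, not an actual MediaWiki preference.

function saveFilterPreference(hidden: string[]): void {
  // Stores the reader's chosen filter labels as a single cookie value.
  document.cookie =
    "imgFilterPrefs=" + encodeURIComponent(hidden.join("|")) + "; path=/";
}

function readFilterPreference(): string[] {
  // Reads the value back; returns an empty list if the reader never opted in.
  const match = document.cookie.match(/(?:^|;\s*)imgFilterPrefs=([^;]*)/);
  return match ? decodeURIComponent(match[1]).split("|").filter(Boolean) : [];
}

// Usage: saveFilterPreference(["violence", "nudity"]);
//        readFilterPreference(); // -> ["violence", "nudity"]
</syntaxhighlight>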
There are more slippery slopes and rabbit holes in this completely unworkable proposal than you'll ever find at a water park. As mentioned earlier, the proposal is contrary to Wikipedia's core values, which include no censorship. As a reminder, just go back a few years to the censorship that the UK Government tried to impose on Wikipedia when it decided that an image of a CD cover on one webpage constituted porn.
Worst of all will be the unending subjective image 'tag wars' as dozens or hundreds of editors argue over whether some image is or is not vulgar, is or is not sexually explicit, is or is not offensive to women or Uzbekistanians, etc.... 'Offensive to Uzbekistanians and women' is not part of the proposed criteria today, but there will be no guarantee that the defined criteria won't be subject to mission creep if right-wing fundamentalists are able to establish the precedent. Such censorship tools have no business on Wikipedia – those who want to censor everything that can be viewed as objectionable should leave their computers off and stare at their walls.
What starts off as voluntary self-censorship today will likely morph in the hands of state legislators and tin-pot dictators. So where exactly did that rabbit go now? Harryzilber 21:27, 17 August 2011 (UTC)
- I agree, and I'd like to add that if the system is too liberal instead of conservative, we have another set of POV problems. Telling users that they shouldn't reasonably be offended by something when they are is a POV. Inevitably, there will be users who will put their rage faces on and screech "why [is/isn't] this tagged as offensive?". Every offended user is a user who won't donate, and I encourage everybody to vote 0 on the first question not only for Wikimedia's reputation but also for common decency. — Internoob (Wikt. | Talk | Cont.) 20:33, 19 August 2011 (UTC)
This would be a distraction from Foundation principles
The Wikimedia Foundation, Inc. is a nonprofit charitable organization dedicated to encouraging the growth, development and distribution of free, multilingual content, and to providing the full content of these wiki-based projects to the public free of charge. The Wikimedia Foundation operates some of the largest collaboratively edited reference projects in the world, including Wikipedia, a top-ten internet property.
Any type of filter runs contrary, even if opt-in. These projects were designed to take the sum of human knowledge and make it freely available. The donated money that the Foundation uses to further that goal, if any is used here at all, should be put to better use. NonvocalScream 19:57, 4 July 2011 (UTC)
- That would depend on whether the content was standing in the way of growth, development or distribution - which it seems that it is at the current time. The content would remain completely freely accessible to everyone even with this option in place. As such, I don't believe that it runs contrary to the Foundation's principles. Mike Peel 20:17, 4 July 2011 (UTC)
- What is standing in the way of growth, dev, and dist at the current time? Filters should be provided at the client side, not the server side. The projects or the Foundation have no business wasting resources in such an elegant way. NonvocalScream 20:27, 4 July 2011 (UTC)
- Let's see ... how about contributions from schools, universities and organisations where Wikipedia is currently blocked due to its controversial images? People in countries where their national sentimentality is offended by Wikipedia's content sufficiently that they don't edit (e.g. pictures of Muhammad)? As a very controversial statement, what is worse: Wikipedia not being editable in China due to an image of Tiananmen Square, or not having much more content about China that is freely accessible to everyone on the internet?
- Given that most large internet sites provide filtering on the server side, not just leaving it up to the client side, it's clear that there is demand for a server-side solution. Simply based on that, I would argue that it is worth the Foundation running a poll/referendum of both its editors and readers to see whether Wikimedia should operate a similar server-side solution. Mike Peel 20:37, 4 July 2011 (UTC)
- The Foundation can run the poll/ref. I think that conversation here, now, before the poll is healthy.
- I maintain that the Foundation, and its projects really have no business expanding into this type of area. Regardless of what country won't display because of "this" image, it is not in the ambit of the WMF to do this thing. It is totally out of scope. NonvocalScream 20:42, 4 July 2011 (UTC)
As long as readers can choose to see the image with one click, I don't believe Wikipedia loses any freedom with this image filter. --NaBUru38 18:43, 15 August 2011 (UTC)
- I agree, but unfortunately it won't work this way. Filtering will almost inevitably be based on cookies, and there is nothing easier than to inject a cookie at the firewall level, pretending that it was the user who decided not to see the image. Ipsign 07:39, 16 August 2011 (UTC)
- If a malicious party controls the firewall, the game's already over. They can block all of Wikipedia, or all images, or pages in some categories, or pages containing particular words. Bypassing those firewalls is out of scope. 59.167.227.70 09:30, 16 August 2011 (UTC)
- If a 'malicious party' (they'll see themselves as good and wholesome, of course!), controls the firewall, they can't really selectively block things right now. They need some sort of data to work with. At the moment decent data to help such filters does not exist or is (apparently) insufficient. Part of this proposal has the side effect that we create that data, and hand the bad guys the tools they need to make their evil plans actually work. --Kim Bruning 13:03, 21 August 2011 (UTC)
I hardly understand why it would be up to Wikimedia itself to offer a filter option. If a filter tool is seen as useful, there are plenty of technical means run by third parties. We can suggest some to people who need or want them. If you don't want to be offended, you'll probably want that for other websites too. Then you don't need a website-specific preference, you need a tool that will work with loads of websites. I don't think this filter is a legitimate goal in regard to Wikimedia principles.
Plus, it could be counter-productive and dangerous. Wikimedia is, in many ways, one of the Internet's White Knights, one of those you can count on, because you know they are promoting things that almost everybody will, in the end, call "good" or "valuable". Were Wikimedia to promote a system that some will call "censorship", the whiteness would be gone. And Grey Knights of uncertain morals may not deserve the time and money that lots of people agree to give to White ones. This filter is the best way to ruin Wikimedia's reputation. Think about how good we'll look in press articles. Cherry 14:32, 16 August 2011 (UTC)
Technical limitations
If this feature is implemented, some technical difficulties may arise. For instance, if the feature relies on cookies, what will we do about people whose browsers autodelete cookies on closing? This isn't just a personal choice: in many public places, browsers are set to "autodelete cookies after closing browser". Cherry 14:53, 16 August 2011 (UTC)
- If they consistently run into issues like that and usually delete their cookies, then they should log in and have their preferences saved with their account. Cbrown1023 talk 15:13, 16 August 2011 (UTC)
- This also goes the other way: many internet cafes or other public access venues will not turn off browsers between different users. So one user choosing to turn on a filter will put the next user in an opt-out mode, instead of opt-in. Cafes could presumably also hardwire some cookies to be non-deletable. Non-techie users will try clicking to opt out, but fail to know why they cannot opt out or find a method. Boud 21:41, 16 August 2011 (UTC)
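Both concerns in this section come down to one design choice: how long such a preference cookie lives. A hedged sketch of the trade-off, reusing the same invented cookie name as above (nothing here is a committed design):
<syntaxhighlight lang="typescript">
// Illustration of the trade-off only; "imgFilterPrefs" remains a made-up name.

function setFilterCookie(value: string, persistent: boolean): void {
  let cookie = "imgFilterPrefs=" + encodeURIComponent(value) + "; path=/";
  if (persistent) {
    // Survives closing the browser (about 30 days here): convenient on a
    // personal machine, but on a shared terminal the next reader inherits
    // the previous reader's filter settings.
    cookie += "; max-age=" + 30 * 24 * 60 * 60;
  }
  // With no max-age/expires this is a session cookie: public terminals reset
  // when the browser closes, but so does the reader's own choice (the
  // "autodelete cookies" problem raised at the top of this section).
  document.cookie = cookie;
}
</syntaxhighlight>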
- We're not here to tell people what to do. Saying "then they should X or Y" is dubious. On the one hand, you seem to want to adapt to people (by adding a personal filter feature), and on the other hand you seem to ask them to adapt to us. If we think that people should adapt themselves to Wikimedia's content, then we don't need any filter: we need a section somewhere saying "if you don't like what's on the sites, get yourself a solution or get off". If we think we have to adapt, then we have to provide a solid proposal, one that will raise as few problems as possible. Cherry 18:31, 18 August 2011 (UTC)
Additional maintenance overhead
Who is going to set up the huge infrastructure that is required for this, and more importantly, who is going to keep it running in an unbiased manner? All the users who think this is stupid won't want to spend hours arguing with religious types (Danish cartoons, 15th century engravings of Muhammad: blasphemy or encyclopaedic?), people who oppose homosexuality (men holding hands: OK or not?), people who think that a picture of a medical condition will scar a child for life ("Think of the Children!" will be a very common refrain), people who have certain political ideologies (a map with Taiwan and Tibet as distinct from China offends User:CCP), people who don't like pictures of any weapons, people who wish to use the system to further wiki-battles (User:Alice has her images systematically tagged by User:Malloy), people who have nothing better to do than annoy others (much more time-consuming than vandalism, and much more fun to watch unfold). We have enough of that with the existing CV and PROD set-ups. We will need a clearing-house, individual discussions for each image, and a process to watch for abuse. The backlog will grow since no-one will agree on anything even vaguely controversial (buttocks from the side: nudity or not?). The users who are capable of running a neutral system will probably get bored and move on, letting the crazies run the show.
This proposal will generate huge amounts of discussion for every borderline image. There are thousands out there who love nothing more than a pointless and time consuming fight that will need to be arbitrated and mediated by people who really have better stuff to do. It will suck up thousands of user-hours (more than it has already) on useless bickering. Inductiveload 17:44, 16 August 2011 (UTC)
Filtering
Wikipedia is an open encyclopedia. Much of the content, if not all, is made for adult educational purposes. Filtering may cause editors to become overly careful and tone down controversial subjects. I am against using obscenity or explicitly gory or sexual content. However, let the editors be the filterers. This is an educational site for the world, not an elementary school. Let parents filter for their own children. The answer is not a filtering system, which would only interfere with content searchability, in my opinion. A filtering system would give POV to many articles. For example, an article discussing Sally Hemings and Thomas Jefferson could be viewed as obscene rather than as historically accurate research. An open statement on the Wikipedia front page that Wikipedia contains adult language and content would be appropriate, rather than a burdensome filtering system. Cmguy777 01:40, 17 August 2011 (UTC)
Public reaction
We're not sure that this filter idea will be welcomed by the readers. There is a real risk of public backlash. We could lose the confidence of many readers in order to satisfy only a few of them. We could also be mocked in the general media. This filter is already misunderstood within the Wikimedia community and called "censorship" by some. I have some doubts about the *subtle* treatment this would get in a press article. Cherry 10:07, 18 August 2011 (UTC)
Other
Other Projects
Commons must be sensitive to the need of every other project in the Wikimedia universe, as well as users around the globe
Can we translate that into: If a project elects or is forced to elect to ban images of ( ), the foundation will be happy to assist? --Eingangskontrolle 17:33, 16 August 2011 (UTC)
- That seems to be a very different statement from what was written. Philippe (WMF) 17:35, 16 August 2011 (UTC)
Or can a project choose not to use this system at all? --Eingangskontrolle 17:37, 16 August 2011 (UTC)
- No, the Board of Trustees has directed the Foundation to proceed with this. It will be integrated into the software, as I understand it. There will not be an opt-out. Philippe (WMF) 18:17, 16 August 2011 (UTC)
- I'm not clear on why the WMF would ask users to comment on something that they already intend to implement. Can you explain? Delicious carbuncle 20:07, 16 August 2011 (UTC)
- From the Content page: The feature will be developed for, and implemented on, all projects. It will not permanently remove any images: it will only hide them from view on request. For its development, we have created a number of guiding principles, but trade-offs will need to be made throughout the development process. To aid the developers in making those trade-offs, we are asking you to help us assess the importance of each by taking part in this referendum. Philippe (WMF) 20:09, 16 August 2011 (UTC)
- So, to be clear, an image filter is going to be implemented, regardless of the "result" of this "referendum"? Delicious carbuncle 21:02, 16 August 2011 (UTC)
Why is it called a referendum? Usually that would be a "yes/no" on whether we want this, not a request for comment on what is important and what is not. (Building consensus is important, building a self-censorship infrastructure is not, especially when that could be done by third party add-ons without compromising the integrity of the Foundation's mission of disseminating all human knowledge). Kusma 20:35, 16 August 2011 (UTC)
- As far as I understood it is decided that a filter will be implemented, and that it will be opt-in-only (= no filtering by default). All other details of its configuration shall be developed as far as possible within the community's ideas and wishes. The referendum tries to find out which filter features are wanted or rejected by the community. --Martina Nolte 22:33, 16 August 2011 (UTC)
- I find it already strange that such a "feature" was decided on without asking the community. On top of that: why do we vote anyway if it's already decided? It's like: "Hey guys, you can censor yourself now. If you don't like it, we will do it for you". --Niabot 23:31, 16 August 2011 (UTC)
I think that the technical side of things on the wiki is not a sticking point. However, the categorization system required to make it work is not as innocent. Depending on the implementation, I guess we could accept the plugin, but (imo rightly) refuse to generate the categorization scheme. --Kim Bruning 15:56, 19 August 2011 (UTC)
- we could [...] (imo rightly) refuse to generate the categorization scheme -- Imo we should strictly refuse to create, maintain and even to host any arbitrary categories based on filter-specific criteria like "controversial material" and such. Just the descriptive categories we have should be fine to allow users to choose specific filtering according to their liking. If some people want something like e.g. a "controversial" or "NSFW" category, those categories should not be hosted and maintained on any Wikimedia project site. --87.78.46.49 16:07, 19 August 2011 (UTC)
More editors will be leaving
Pictures are an essential part of the editorial content of an article. If this is implemented, I fear many good editors will leave the project. This proposal is not necessary. Every modern browser has the option to hide pictures. I am disgusted that this is even being considered.--LexCorp 19:10, 18 August 2011 (UTC)
- Browsers can only shut off all images or no images. What is proposed here would allow me to selectively shut off images belonging to a particular category. That is a vastly different thing. If I'm an arachnophobe, then under the present scheme, I'd have to shut off all images from the entire encyclopedia (possibly from the entire Internet) - under this new scheme, I can selectively hide images in Category:Spiders - and I'm done. SteveBaker 19:32, 19 August 2011 (UTC)
- There are programs you can install to filter your entire internet connection, if that's what you want to do. Wouldn't it be more effective to block spiders once on your own computer, instead of waiting for every site on the web to implement their own filtering system, which you then need to configure individually? 24.61.174.18
- If editors leave because children will now have the ability to not be exposed to graphic sexual pictures then those editors aren't the ones that should be editing a WMF project to begin with. Ottava Rima (talk) 22:56, 19 August 2011 (UTC)
Bowdlerizing Wikipedia
Although this particular idea is not dangerous by itself, bowdlerization is a threat to Wikipedia's future, because it opens a door that will be hard to close. If we start by hiding information that some people find offensive, where do we draw the line?
- Indeed, if we admit that some people are offended by sexual or violent images, why do we not care about their feelings in other matters? Shouldn't we offer a feature to hide Nazi-related information from those whose families suffered in the Holocaust? But then, what about those who suffered under the Soviet regime? And those who are offended by the separatist/independence claims of Taiwan or Tibet?
- Why only images, if we could mark some passages in text, too, with special markers? Why shouldn't we offer a parental control feature that would allow someone to hide some things from everyone on a certain computer or, possibly, local network?
- Also, if we start to teach our editors to think in these terms, paying attention not only to the correctness and verifiability of information but also to the possible subjective feelings of readers, how can we say this large cultural change in Wikipedia will not influence our content as a whole, so that in the end editors start to censor their own text? After all, self-censorship is the most widespread form of censorship there is, and the most powerful.
- These are subtle changes that could be brought about, step by step, by conscientious people with high ethical standards. Indeed, what's interesting is that while most of these have been mentioned in the current discussion as concerns, some have also been gladly welcomed. Yet in the end, we would be quite far from free knowledge. Then again, maybe we shouldn't even start down that road? --Oop 05:26, 19 August 2011 (UTC)
They don't care / they don't know
[10:07am] sgardner: I would start, Ariel, by saying that I'm not super-knowledgeable about that proposal. People have talked with me about it, but I haven't read the page discussion, so I am not up to speed on all the thinking that's been done.
[10:12am] Tango42: I'll have a go at rephrasing Seth's question (although it didn't look particularly loaded to me): Sue, what is the WMF's goal for the image filter referendum? What do you hope to gain by holding it?
... several lines cut ...
[10:16am] sgardner: I missed Tango42, thanks Beria.
[10:17am] sgardner: One is, we want to know to what extent, in general, the editing community is favourably inclined towards the idea of the feature. I believe that is the first question in the referendum.
[10:17am] sebmol: it is
[10:17am] Tango42: No, sue, that isn't the first question
[10:18am] sebmol: it's the first question in the referendum
[10:18am] Tango42: The question is about how important the feature is, not whether people are in favour of it
[10:18am] Beria: it is Tango42
[10:18am] Tango42: those are very different things
[10:18am] sebmol: only to mathmaticians
[10:18am] sgardner: And second, we want to get a sense of which attributes/characteristics of the feature people feel are most important. So that as the feature is being built, the developers have information that can help them make tradeoffs in its design/functionality.
IRC_office_hours/Office_hours_2011-08-18
That's the voice of somebody directly responsible. They started an avalanche and have no knowledge about snow. --Eingangskontrolle 14:32, 19 August 2011 (UTC)
- You rather selectively cut [some back and forth discussion] from a much larger answer about the referendum here. -- phoebe | talk 15:23, 19 August 2011 (UTC)
- Where can I find the bigger answer? The chat doesn't contain it. There are so many open questions. The only question that has been answered so far is whether the filter will be integrated. That is a more or less definitive yes. But everything else is in the dark. --Niabot 15:30, 19 August 2011 (UTC)
- I was talking about the answers in the chat. What I mean by selective cutting: the proposal Sue said she wasn't super-knowledgeable about wasn't this one, it was something else altogether (enwiki autoconfirmed article creation, I believe?). These office hours are open to any topic :) -- phoebe | talk 15:55, 19 August 2011 (UTC)
- Maybe, but you're (collectively) the pebbles. As in "Kosh: The avalanche has already started. It is too late for the pebbles to vote." -- Seth Finkelstein 14:52, 19 August 2011 (UTC)
I could not find an answer - the whole chat seems to cover several topics at the same time. But it is very clear that some people asked questions, and others who are elected (or paid?) to have the answers ready were not prepared to answer. And they have not read what is published in their name. Please extract the relevant answers for us to read. --Bahnmoeller 18:36, 19 August 2011 (UTC)
Well meaning, but unworkable
My comment on the "referendum" —
At first I thought this was a not-so-important but not completely unreasonable idea, but the more I think about it the less I like it.
The idea of being culturally neutral is certainly appealing but I can't help feeling it will be rather difficult to achieve. For instance, I know quite a few people who find some religious images offensive but will their culture be taken into account? There are other people who would find quite innocent images of gay people to be offensive or sexual. How will they be classified? Who decides? This is inviting a whole bunch of fresh controversies about what to censor (& where to draw the lines) into the organisation. It may be a "personal choice" for the reader, but that ignores the inevitable choices the Wiki community will have to make. The very choice of categories is a loaded act of deciding whose sensibilities are important and whose aren't. It can't possibly be neutral since it's an act of discrimination (in the basic sense of the word).
This is the first can of worms I see here; the other is whose time is going to be wasted by dealing with flagged images? That seems like an invitation to the censorious on one hand and the prankster on the other. If it's implemented, perhaps that at least should be reserved for account holders so there is some chance of judging the worth of the input.
Although I understand the motivation for this idea, I can't help feeling that it has some very unpleasant ramifications and simply cannot be done in the suggested neutral way.
It's also rather offensive to be asked to participate in a "referendum" when it seems clear that the decision has already been made and the questions assume that you agree with the idea. Even giving an answer to some of the questions could be construed as support.
I also can't help feeling there are more productive improvements people could be working on. ☸ Moilleadóir ☎ 16:12, 19 August 2011 (UTC)
- I certainly won't be wasting my time flagging images, and I urge everyone to do the same if this proposal is implemented: without volunteers it just won't work. Zanaq 17:49, 19 August 2011 (UTC)
Still not anonymising accidentally included citizens: money better spent?
At a time when news organisations and websites are under criticism for allowing "rogue" governments and vigilante groups to track the identities and whereabouts of discontented individuals through published image media, the WMF would do well to finally follow the call for "ordinary", accidentally included individuals to be anonymised (blurred) by default. Sadly, in a very hawkish manner, this proposal is still widely dismissed in the community, but it would actually be a way for the WMF to take a stance on an issue rather than just give in to popular demand. The point at which a project does nothing more than give in to popular demand is the demarcation of having lost its vision, and its momentum for the future. Wikipedia didn't become a phenomenon by placating people. Samsara 22:02, 19 August 2011 (UTC)
Definitions
"servers hosted by a neutral third party"
From the content page: "The referendum [...] will be conducted on servers hosted by a neutral third party." Um. Well. What? Will users' login information be shared with this "neutral" third party? If not, how can the vote be held on external servers? Why would a vote like this be held not only off wiki, but off Wikimedia? How could a third party outside Wikimedia possibly be neutral? --Yair rand 13:38, 3 July 2011 (UTC)
- This means simply 'conducted as the Board election was' - by a third party specialized in overseeing fair votes. There may not be a need for such precautions in this case [I would prefer an open-ballot vote on Meta, myself], but it is often seen as an unbiased choice for running the technical process of voting. –SJ talk | translate 15:09, 3 July 2011 (UTC)
- Sj is correct. The election is held using the Securepoll extension, which passes a user who's already logged in to a WMF wiki (and verified as meeting voting requirements) to the servers of the third party for voting. Philippe (WMF) 00:40, 4 July 2011 (UTC)
- Thank you for the explanation. --Yair rand 02:53, 4 July 2011 (UTC)
- Is there some sort of technical explanation of the mechanism, apart from the source code? If IP addresses and user names are not connected on the third party's servers, then I suppose some cryptographic key is included in the URL or body of the request to the third party server. A brief explanation of how this is done and what information is exposed to the third party would be important. --LPfi 16:39, 5 July 2011 (UTC)
Interesting that we are not asked yes or no. The decision is already made, and we are asked about minor details. --Eingangskontrolle 08:54, 16 August 2011 (UTC)
No. The questions are proposed as statements in favour of something. If you are neutral about a statement, you put 5; if you strongly support it (the equivalent of a very strong yes) you put 10; if you endorse it but not too strongly (a minor yes) you put something between 5 and 10. If you oppose it (i.e. you would vote no) you put less than 5, with 0 being an absolute no. If they asked a question with just yes and no as the answer, it would leave out a scale and a way to determine what is most important; it would also leave out a neutral option. If you do not think this is good enough, then you are insane. The very first question is how important it is for the Wikimedia projects to offer this feature to readers. If you do not want the feature, you put 0 here. If you do want it, you put 10. If you are neutral, you put 5, or anywhere else you please. Black.jeff 09:00, 20 August 2011 (UTC)
Process
I'm so glad I can't vote!
As a wikipedia editor with several hundred edits from various IP addresses I've connected with in the past but no registered account, I'm ecstatic to have no say in the community-wide opposition to this abysmal "referendum"! Long live wiki freedoms! 98.217.75.153 04:29, 18 August 2011 (UTC)
- Glad to hear you're more than content with your own decision to decline to have a vote through refusal to register. Have a nice day! Infrogmation 20:20, 20 August 2011 (UTC)
Voting (where?)
Which project should I vote from? I am eligible on BOTH Commons and EN wiki. Can I vote twice, or if I do, will my votes be canceled? --tyw7 19:31, 19 August 2011 (UTC)
- If you are eligible to vote from two accounts, it doesn't really matter which one you cast your vote from (as long as you aren't blocked on the project you're voting from). As for voting, I believe you can only vote once. It doesn't actually spell it out here, but in the past you have been allowed to vote only once total from all projects and all accounts (that is, one vote per person, not per account/project). --Philosopher Let us reason together. 01:40, 20 August 2011 (UTC)
Technically speaking, there is one vote per account. But of course one person should only vote with their primary account. There are good reasons to edit under separate names, such as splitting professional edits from general-interest edits. --Bahnmoeller 15:54, 20 August 2011 (UTC)
I also got an email about a referendum but I cannot vote!
The email below invited me to vote but when I tried to do so I got a message saying that I am not eligible to vote!!!!????
Dear Renato Costa,
You are eligible to vote in the image filter referendum, a referendum to gather more input into the development and usage of an opt-in personal image hiding feature. This feature will allow readers to voluntarily screen particular types of images strictly for their own accounts.
Its purpose is to enable readers to easily hide images on the Wikimedia projects that they do not wish to view, either when first viewing the image or ahead of time through individual preference settings. The feature is intended to benefit readers by offering them more choice, and to that end it will be made as user-friendly and simple as possible. We will also make it as easy as possible for editors to support. For its development, we have created a number of guiding principles, but trade-offs will need to be made throughout the development process. In order to aid the developers in making those trade-offs, we need your help to assess the importance of each by taking part in this referendum.
For more information, please see http://meta.wikimedia.org/wiki/Image_filter_referendum/en. To remove yourself from future notifications, please add your user name at http://meta.wikimedia.org/wiki/Wikimedia_nomail_list.
What's a referendum and what's this
Importance vs. should/shouldn't
"It is important for the Wikimedia projects to offer this feature to readers." - again, there are two separate questions here that have been rolled into one. (a) should the Wikimedia projects offer this feature to readers?, and (b) how important is it to offer this feature? Although answering (b) implies support of (a), asking about importance is a very separate question from whether the feature should or should not be enabled. If the majority of the community believe that Wikimedia shouldn't allow this feature, then the importance doesn't particularly matter. (in contrast to: if the feature is rated as low-importance, then that could mean that the WMF still funds its development and implementation but doesn't rate its development as highly as if it were high importance). Mike Peel 21:15, 23 July 2011 (UTC)
- Note that I don't believe that this should be subject to Wikimedia's standard rule of consensus: if e.g. 10% of voters believe that this feature should be enabled, then that's probably a sufficient amount of people (i.e. representing sufficient demand) to make this worthwhile implementing - particularly if those voters come from under-represented parts of the world. Mike Peel 21:29, 23 July 2011 (UTC)
- We are not asked if we want this feature or not. It will come anyway, as there is no way to stop it through this farce. --Eingangskontrolle 09:00, 16 August 2011 (UTC)
- This referendum is disgusting. The misleading formulation of the questions is ridiculous. Starting from Jimbo "benevolent dictator" Wales (with his vulgar unilateral deletion of illustrations and artwork guilty of "sexual content"), there has long been a worrying pressure to censor wikipedia according to family values. It's time to respond with a big Fuck you. We've had it, morons, we are going to revolt and bring a little bit of Egypt to Wikipedia.--Sum 19 August 2011
- That would be true if it were a scale from 0 to 10, with 0 being not important and 10 being very important. With the actual setup, 5 is neutral (not important). If you do not want it, then you put 0: strongly oppose. There is no reason to break it apart. Your proposed first question is a yes or no, with the second being a degree of yes or no. With the actual voting system it is combined into one: the degree and direction of yes or no. (Perhaps they should have used negative numbers; would that have been better for you?) Black.jeff 09:15, 20 August 2011 (UTC)
Survey format is limited, and alien to our culture
At the footservant level, a survey like this would be strongly criticised and discouraged. But the Foundation ignores our culture and goes ahead with one anyway. The survey does not allow for structured, public discussion (it has to happen here and is apparently intended to be inconsequential), or even for indicating make-or-break features that people may actually feel make the idea perfectly tenable or completely untenable. There is also no binding statement on how the results will be announced, i.e. format, level of detail, and cross-linking. All of these add up to a very disappointing showing. Samsara 22:02, 19 August 2011 (UTC)
- I agree with your thoughts. Nicely put. I think the Wikipedia people up top have already decided it will be implemented, though. The part that honestly has me most worried, besides the eruption of internal and unnecessary drama over each and every picture, is schools and institutions being given the reins for turning filters on and off for large groups of users. It may not happen this year, or the next, but with the tools in place, it's ridiculous to think it won't happen eventually. And just for starters, students could get in trouble for using school computers without a mandated filter on, just like people get kicked out of libraries for looking at porn. I can just see the little computer lab warning right now: "Students who look at Wikipedia without using proper indecency filters (directions below) will be reported to the principal and lose lab privileges." Rodgerrodger415 22:18, 19 August 2011 (UTC)
- It's really okay. You can see this as a group that's made up its mind already, but you don't have to.
- They want discussion, they're reading the discussions. They want a survey because they get so many more respondents.
- They can't tell us how the results will be interpreted because they don't know. When the results come out, we can all interpret them together and see if it was useful to have a survey or not.
- If 95% of people who speak a given language all say they want a filter, I want us to be able to help them get it. That's a survey question, not a discussion question.
- FUTURE consensus can be affected by the present survey. It's not evil!! It's just a babystep. --AlecMeta 22:30, 19 August 2011 (UTC)
- I do not believe the designers were idiots, and I therefore conclude they want to be able to report x% said we should do it this way, and 100-x%, the other way, from which the reader who did not know the history would conclude that 100% wanted it one way or the other. I note that 95% of the discussion on this page is not about the feature, but pro or con on the entire overall question. It seems clear from the questionnaire that the board either assumed they represented the community in this, or they didn't care what the community thought. Fortunately, Wikipedia is not yet so consistently bureaucratic as to be able to avoid fundamental discussion. DGG 22:37, 19 August 2011 (UTC)
- My impression is they already know some readers want this-- mobile users for example, at-work users, non-western cultures. They also know how to do it in a way that doesn't upset nearly as many people as the other ways. The things they don't know is whether it's better to build a very basic filter that approximates our values or a more-complex filter that perfectly models our values.
- But they are also open to being surprised if consensus lies somewhere other than where they think it does. The discussion matters, the survey matters, and if the survey is confusing, that's because it's a very complex issue and their first time surveying.
- If they wanted to just ram this down our throats, they could have just voted to do so. They really do want to know what we think. --AlecMeta 00:27, 20 August 2011 (UTC)
- The problem here is not that the board are not listening, nor that they intend to ram the feature through, but that they have created that impression - the board resolution and the referendum page both suggest that the feature is a done deal - and Philippe has also said as much on this talk page. Of course, the board being sensible, they will take into account the new information and ideas, and be at liberty to revoke their previous resolution if that's appropriate. However, the only communication indicating that this open-minded and pragmatic approach is the case is the comments from Phoebe on this page; therefore I think editors who assume that the board is "dead set" on this need not be criticised for reaching that conclusion. Rich Farmbrough 03:38 20 August 2011 (GMT).
- I understand what you're saying. Realistically, they are probably "dead set" in their thinking right now. But if examination reveals a truly bad proposal that generates a surprising amount of controversy, that absolutely does "reset" their thinking. I got to see this first hand, and it really is kind of mindblowing. Consensus isn't an illusion or a scam-- WMF means it. --AlecMeta 04:50, 20 August 2011 (UTC)
- I find the notion that this is alien to our culture hilarious; you must think people are really stupid, or you are at risk of appearing stupid yourself. ArbCom elections, BoT elections, CheckUser elections - any major decision such as this is done via a secret vote. In this case it is a referendum. The reason it is done this way is that the local village people method of discussion and consensus more often than not results in 1000 different people putting forward 1000 marginally different points of view and arguing like a rabble for 3 months before the discussion is declared stale. The survey method gets a clear, concise overall feeling about an idea without there being 3000 pages of patent bullshit to sift through. Promethean 07:05, 20 August 2011 (UTC)
Alternative ideas
How is this necessary when there are userscripts?
I've found this userscript (to be used with Greasemonkey or Scriptish) and although this one default-filters all images on Wikipedia (except SVGs), I'm fairly certain it would be perfectly possible to create a script which employs Wikipedia and Commons categories to define and apply customized filters.
Such a userscript could contain options such as whether to filter images completely or to make them unhideable via mouseclick, whether to filter based on Wikipedia article categories, on Wikipedia/Commons file categories, or both, and even whether to filter based on some yet-to-be-created, third-party-hosted independent categories (e.g. some independently maintained general "NSFW" category, which would be thoroughly unsuitable for Wikimedia projects).
How then is adding this plug-in into MediaWiki even technically necessary or potentially advantageous in any way? If someone just wrote that script, people who prefer to filter images could simply be pointed to the userscript. --87.78.46.49 16:19, 19 August 2011 (UTC)
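For what it's worth, here is a rough sketch of the article-category variant of such a script, written in TypeScript (a real Greasemonkey userscript would ship the compiled JavaScript). The blocked-category list is something each reader would edit to their own liking; the only page structure it relies on is the category box (#mw-normal-catlinks) rendered at the foot of articles. Per-file filtering via Commons categories would additionally need API lookups for each image, which is left out here.
 // Sketch of a personal, reader-side filter; the blocked-category list is
 // the reader's own choice and purely illustrative.
 const myBlockedCategories: string[] = ["Spiders", "Medical images"];
 
 // Collect the categories listed in the article's category box.
 // (The first link in the box is the "Categories" label itself, which is
 // harmless for this sketch.)
 function pageCategories(): string[] {
   const box = document.querySelector("#mw-normal-catlinks");
   if (!box) {
     return [];
   }
   return Array.from(box.querySelectorAll("a"))
     .map(link => link.textContent ?? "")
     .filter(text => text.length > 0);
 }
 
 // If the page is in any blocked category, hide its images; click-to-show
 // placeholders would be a refinement on top of this.
 function hideImagesIfBlocked(): void {
   const blocked = pageCategories().some(cat =>
     myBlockedCategories.some(needle => cat.includes(needle)));
   if (!blocked) {
     return;
   }
   document.querySelectorAll("img").forEach(img => {
     img.style.visibility = "hidden";
   });
 }
 
 hideImagesIfBlocked();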
- The people who want filtered images, in my experience, are often new to information technology in general. Installing a Greasemonkey userscript is well beyond their realistic capabilities. Remember, we desperately need to capture people who can't even edit raw wikitext without machine assistance; that's a goal we have. Userscripts are the most ideologically correct answer, but they're not feasible for the audience in question.--AlecMeta 19:39, 19 August 2011 (UTC)
- So we should cater to such people's technical inability and help them censor themselves? Come on, you're not even being serious, are you. People who are kept away because we don't help them suppress content to their liking and who are (as you claim) unable to follow a set of instructions consisting of exactly three largely self-explanatory hyperlinks ([1], [2], and the userscript link) are people who we definitely want to keep away. This is ridiculous. I knew something rotten was at the core of this proposal, and now I get it. It is to draw even more inept people into contributing to Wikipedia, which will in the end accomplish one thing and one thing only: drive away even more sensible high-level contributors. Fantastic. --213.196.208.244 21:08, 19 August 2011 (UTC)
- Your view is shifted way too far towards readers and away from editors. It's editors we're short of, not readers (increasingly, because editors have been leaving in droves for years and we've apparently been unable to do anything sensible or constructive about it). And it's not because the job is done - look at any sensible benchmark such as progress in Vital Articles, and you'll find most of them are "start" or "C-level" quality. Or look at the decline in ACID and similar collaborations. We're not short on articles to improve, just short of editors who aren't completely fed up with how we do things. And I think the point about Userscripts is a brilliant insight that should be given some attention. At least one version of the proposal is clearly trying to re-invent the wheel vis-a-vis this. And some are telling us that people come to Wikipedia to learn, but please don't teach me anything tech? Oh dearie me, Aunt Gladys! Samsara 22:29, 19 August 2011 (UTC)
- The simple fact is that editors comprise less than 1/10th of 1/10th of 1 percent of our users. Our users are readers, not editors.
- I agree that we should spend a lot of time recruiting editors (in fact, that is my primary task, as an employee of the Foundation) - but the first step towards recruiting new editors is to provide them a space where they feel welcome and not uncomfortable.
- To speak about userscripts: seriously? 95+% of our userbase doesn't even know that things like "talk pages" exist, let alone "super user" technologies like userscripts or browser plugins.--Jorm (WMF) 22:36, 19 August 2011 (UTC)
- The point is that the technology to block images from all Wikimedia pages already exists. So the MediaWiki image filter will accomplish exactly nothing that cannot also be accomplished using (freely available, no less) third-party software. Bottom line, anyone who wants to filter images can do so already, and if they can't yet, then they can easily acquire the little necessary knowledge required to do so. People who are too lazy to learn how to install Firefox, Greasemonkey and one userscript cannot be too upset about those images. People who are too inept to follow the very simple instructions (Firefox, Greasemonkey and one userscript) shouldn't command the project's attention and resources to create and implement an extra plugin that, again, does nothing beyond helping them do something which can be done using existing technology. --213.196.208.244 22:56, 19 August 2011 (UTC)
- " Remember, we desperately need to capture people who can't even edit raw wikitext without machine assistance, that's a goal we have. "
- This has always worried me slightly. Wikitext is basically "click edit, type some text, click save" - advanced wikitext is using
headers and links, maybe bold and italics. All the rest is detail; you can leave it to someone else until or unless you learn it. Want a URL hyperlink? Just type it in, or cut and paste if you know how to do that. If you can't categorise your article/stub, someone will do it for you - they do it for me, for sure, and better than I could. How adding an editor with buttons containing icons whose meaning is obscure anyway is going to help is a mystery. Oh well. Rich Farmbrough 02:19 20 August 2011 (GMT).
- Wikitext from scratch is, in fact, pretty easy. But if you dive into pre-existing wikitext without knowing anything about markup, you just see hieroglyphics. Strange to imagine, but very real. --AlecMeta 02:53, 20 August 2011 (UTC)
@User:Jorm (WMF): I, and I think many other editors, feel very uncomfortable with a Wikipedia that is using software intended to filter content. --Bahnmoeller 15:49, 20 August 2011 (UTC)
Design
Opt out of seeing the "hide content" tabs (or opt-in)
If we were to implement this, I would like to see a user-configuration option (which obviously applies only to registered users) that allows me to turn OFF the "hide content" tabs so that I don't see them. That way my limited screen space is not cluttered with things that I will never click. Of course the default for that option can be "show tabs", because I'll only have to turn them off once. Mitch Ames 10:07, 16 August 2011 (UTC)
- If we do decide to do this proposal, I too think this would be necessary. I wouldn't want to be reminded every time I see an illustration that Wikipedia is implementing self-censorship. DGG 16:13, 16 August 2011 (UTC)
- Even better: the "hide content" tabs could be made opt-in too. The opt-in for the hiding-content system could be located at the bottom together with the * Privacy policy * About Wikipedia * Disclaimer * Mobile view links. Otherwise, we constantly get the idea that we should feel guilty about looking at Commons images shown on the Wikipedia. "Edit" and "hide content" are very different: one is about correcting errors (or completing, clarifying, etc.), the other makes us think about censorship. Boud 22:29, 16 August 2011 (UTC)
- Yes, placing the word "hide" under every single image on every WP page would have a subtle but significant negative effect on the overall "feel" of the WP site. It would tend to distract the reader from the educational purpose of each image and subtly orient the reader on the image's morality (good/bad, appropriate/inappropriate). As Boud said, the thought of censorship would be pervasive. The word "hide" has a newer meaning in computer-ese, but the original meaning is strongly associated with danger and/or shame. Similarly, the symbol of a circle with a diagonal line through it (next to "hide" in the mock-ups) has a strong negative connotation of "banned" or "opposed" or simply "no!" Instead of a "hide" link there could be a button with an "X" icon (universally understood to mean "close" or "dismiss"), or a symbol representing a collapsing action, or an on/off switch, or a gear-shaped icon representing "image settings", all of which have a more neutral feel and would be less obtrusive. --KnowWell 23:13, 20 August 2011 (UTC)
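If the tabs did ship, a registered user's personal script could simply strip them out client-side. A tiny sketch follows; the class name ".image-filter-toggle" is invented for illustration, since no real selector exists yet, and whatever selector the actual feature used would be substituted (a one-line rule in the user's personal CSS would work just as well).
 // Sketch: remove the proposed "hide" controls from view for this user.
 // ".image-filter-toggle" is a hypothetical class name, not a real one.
 function removeFilterToggles(): void {
   document.querySelectorAll(".image-filter-toggle").forEach(el => el.remove());
 }
 
 removeFilterToggles();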