
2010 Wikimedia Study of Controversial Content: Part Two


The following is part two of the Draft Study on Wikimedia Controversial Content by Robert Harris and Dory Carr-Harris. Comments on the draft or questions are highly encouraged, and should be placed on the talk page, but for now please refrain from revising the draft directly. The preliminary statement to the recommendations on this page is located at 2010 Wikimedia Study of Controversial Content, and it should be read for full context before commenting.

Introduction

The principles we outlined in Part One can be summarized in this way. Wikimedia projects are currently animated by two goals – first, devotion to openness of information, and second, respect and service to users. Most of the time, these two goals are in harmony with each other, and respect for users is demonstrated most effectively by openness of information. In a small minority of cases, they conflict, and pursuing openness seems to reduce our service to the public, or some part of it. We draw a circle, in effect, around these instances, and label the content within that circle “controversial” or “potentially-objectionable.” The content in the circle confuses us, because it pulls us in seemingly opposite directions. Openness, our first principle, would have us deal with it in one way; service and respect, our second principle, in another. This study was commissioned, in effect, to figure out what to do about these conflicts.

It probably goes without saying, but needs to be said nonetheless, that this debate is informed by the question of the legality of content, especially within the jurisdiction of the United States, where Wikimedia servers are housed. Content that is illegal to host in this jurisdiction is not permitted on Wikimedia servers, and this prohibition supersedes all other policies and guidelines about the inclusion of content on the projects. In particular, images of child pornography as defined by US law are not permitted on Wikimedia sites, and to the knowledge of the Foundation (and ourselves) no such images exist within the projects. They would be speedily removed if encountered, and their presence reported to the appropriate authorities.

Ground Rules and Definitions

The law sets some basic frameworks within which the Foundation’s projects operate, but there are many questions that remain about controversial content within the space the law provides us. Before we explain how we have answered those questions, a few basic observations and definitions. Although much of the current debate around controversial content within Wikimedia projects focuses on images, we were commissioned by the terms of reference of the study to look at all content, text and image, and we have done so. However, one of our axiomatic starting points in this study was that text and image are quite different forms of communication. The brain processes each differently; text is much more amenable to change and manipulation than images are; and although there are certain words that provoke strong, immediate emotional reactions, many more images do, almost as a matter of course. To us, text and image are different.

One of the first ramifications of this difference is that we have crafted different definitions and tests of the concept “controversial” for text and image (We explained in Part 1 why we prefer the term “controversial” over “potentially-objectionable;” we believe “potentially-objectionable” is far too broad and open a term).

“Controversial” text within WMF projects

Definition of “controversial” for text

For text, definitions of and tests for the concept “controversial” already exist within Wikimedia projects, and we see no reason to alter them. English Wikipedia, for example, has both an essay/quasi-guideline on this issue (dating back, in one form or another, to 2003) and a list of controversial articles.[1][2]

Objective criteria have been established and explained in both these articles to define and test for the concept “controversial”, and they seem perfectly adequate to us. Two things should be noted in regard to these definitions and articles. The first is that the list of articles deemed “controversial” in Wikipedia is relatively long. As might be expected, it contains a healthy number of articles dealing with sexual issues, but it also includes the article on “smooth jazz”. Because text and image are different, the list of “controversial” text articles is much larger than the similar list for images (unless you object to a photo of Kenny G). Secondly, by and large, the admonition to editors dealing with controversial articles is to apply existing policy (NPOV, verifiability, etc.) as carefully and assiduously as possible. Except for policies surrounding biographies of living people and policies around “pending changes,” there are no special policies in Wikipedia projects to deal with controversial articles. The policy is to apply current rules with care.

Recommendations: Controversial Text

Because of the considerations outlined above, our recommendations surrounding text in WMF projects are the following:

It is recommended:

1. That no changes be made to current editing and/or filtering regimes surrounding text in Wikimedia projects. Current editorial procedures, including the procedure to modify those regimes if needed, are adequate, in our opinion, to balance the twin Wikimedia principles of access to information and service to readers.

2. That, therefore, no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children.

3. That, however, the Foundation investigate the creation of a “WikiJunior” version of the Wikipedias, aimed at children under the age of 12, either as a stand-alone project or in partnership with existing and appropriate educational institutions.

Explanations:

Recommendation 1

We believe the current regime surrounding the handling of controversial text, and the mediation of goals that it implies, is working well. We believe so for two reasons: the internal procedures in place to mediate conflicts work, and the methods for improving those procedures where needed are also effective. For these reasons, we are recommending that the status quo for dealing with controversial text within the projects be maintained. This means, for us, that no internal tagging of text content be initiated that might lead to third-party filtering – whether by keyword, topic, or any other regime.

Recommendations 2 and 3

Up to now, we have not specifically mentioned the needs of children in this study. We noted in our first communication with you that, to us, the question of controversial content and the question of the relationship of children and their parents to the projects were linked, but ultimately separate, issues. That is not to say we did not take seriously the issues surrounding children and their exposure to “controversial” content on the projects. We will deal in more detail in Part 3 with our suggested approach to these issues.

However, we will note here that we do not believe it is either effective or desirable to restrict children's access to current text articles on the projects. We believe that such a regime would prove difficult, if not impossible, to administer, and might result in unintended uses by censorious third parties that would not be to the Foundation's benefit. Much more successful, in our opinion, would be a project specifically targeted to children, and to the quite different needs of children in different age groups. Some projects of this nature have already begun in the WikiJunior section of WikiBooks,[3] but it is our feeling that the scope of such a venture might necessitate the formation of partnerships with institutions that already have experience and resources devoted to this area.

“Controversial” Images within WMF Projects

Images within WMF projects

As mentioned above, we believe text and images to be significantly different forms of communication, and so demand different regimes around the question of what is, and how to deal with, content defined as “controversial.” Images are also treated in quite a different manner than text within the projects. Each Wikipedia, for example, has its own guidelines and practices for dealing with “controversial images” within its own ambit.[4]

By and large, these policies are narrowly defined to mesh with policies designed for “controversial” content on specific articles and are discussed within the talk pages of those articles. However, perhaps because these projects are older, there are other regimes surrounding controversial images within the Wikipedias. English Wikipedia, for example, has a “Bad Image” List, an anti-vandalism mechanism that locks certain images (the majority of them sexual in nature), restricting their use to specifically enumerated pages. Administrator approval is needed to unlock them for other use. However, this is an en:WP procedure only – it does not apply to any other project.[5]

However, the Foundation also has a separate project devoted completely and only to images: Wikimedia Commons, founded in 2004. With over 8.5 million images in its data banks, and adding to that number at a rate of approximately 7,000 per day,[6] Commons is, after the Wikipedias, the next most heavily used project within the WMF family.[7]

It is unique in many ways, not the least of which is that it is intended to serve as an image bank for all other projects, and thus has a mandate wider than any other project in the WMF family – Commons must be sensitive to the needs of every other project in the Wikimedia universe, as well as of users around the globe. Because of the significance of Commons as a repository of images within Wikimedia, most of the recommendations surrounding images in this study have been made with Commons specifically in mind.

Definition of “controversial” for Images

We noted above the distinction we have chosen to make between text and image within this study. One consequence of this distinction is that our definitions of “controversial” for these two forms of communication are different. We noted above that there is currently a form of working definition of “controversial” within Wikipedia, which we see no need to alter. Commons has no such definition at present, and we believe that, either formally or informally, such a definition and categorization within Commons is needed.

We realize that there is a great deal of anxiety in the Commons community that too loose or sloppy a definition of potentially-objectionable or controversial imagery could result in a tsunami of restrictions on Wikimedia sites, with every imaginable kind of content falling under this regime. We do not believe this fear is justified. Definitions of “controversial” for images must be as clear, objective, and verifiable as current definitions are for text, and we believe such definitions are possible, although they differ from those for text. Where text uses primarily internal indicators of “controversial” (edit wars, incidences of reversion), we believe the definition of controversial for images should use equally objective and verifiable external tests. We believe the designation of controversial should be placed on categories of images, not individual images, and that a category would receive this designation only when there is verifiable evidence that the communities the projects serve regard it as such. This evidence would include: existing legislative and legal frameworks around the distribution of such images; wide public debates in responsible forums concerning these types of images; articles and editorial opinion expressed about these types of images in notable journals and newspapers; etc. In our view, these tests will ensure that the bar to be admitted to the category of controversial image will be high (much higher than it is for the Wikipedias), and should remain so.

In effect, our surmise is that only three categories of images can currently pass this test – sexual images, violent images, and certain images considered sacred by one spiritual tradition or another. Note that, as with text, a designation of “controversial” for Commons images does nothing more than suggest that a more rigorous application of existing rules be applied to these images regarding appropriateness and inclusion on the project – it does not take a stand one way or another about their eventual status on Commons.

Scope on Commons

The reason for that is that there is already a perfectly adequate policy to determine inclusion on Commons – the test for educational scope – basically, that images are permitted on Commons if they are “realistically useful for an educational purpose”.[8] As with all policies defined by language, there is room for interpretation of the Commons scope policy, but our feeling is that the policy does not need to be changed, and we are not recommending that it be changed.

However, unlike the situation with controversial text on Wikimedia projects, where we approve of both the overall policy directions of the projects, and the manner in which those policies are developed and carried out in practice, we are recommending that the current application of the policy on educational scope be refined on Commons, especially in regard to certain types of sexual images.

The reason we believe educational scope on Commons is legitimately more difficult to administer than on other projects is basically a combination of three factors. First, Commons has to serve every conceivable existing WMF project, so its bias is towards inclusion, to prevent unintentional deletion of a potentially valuable image. Second, Commons has to anticipate future demands as well, and, unlike the deletion of text, the deletion of an image in Wikimedia projects is permanent, which also induces a bias towards inclusion. Finally, Commons serves two distinct purposes – to serve the WMF projects and to provide a freely-available image bank to the world – which also induces a bias towards inclusion.

There are many thousands of sexual images currently on Commons by even the most conservative definition of sexual content, and perhaps over 10,000 by a more liberal definition (we will attempt to publish a rough catalogue in Part 3). Most of these, we believe, pass the test for scope, even though they may be difficult and demanding for some viewers. Included in such categories are images of sexual organs, images of sexual practices, images of bondage, etc. Most of these individual images, we believe, pass the test of “realistic use for an educational purpose” (although the number of repetitive images in some categories might be questioned).

However, there is at least one class of sexual images that we feel does not pass the test for scope, even though such images currently exist on the project. These images, we believe, exist merely for the purpose of sexual display, and serve no other purpose. It is with these images that we begin our recommendations on “controversial content” within Commons.

Recommendations on “Controversial” Images

It is recommended:

4. That Commons editors and administrators review the application of the existing Commons policy on educational scope to images of nudity in Commons, where breasts, genital areas (pubis, vulva, penis) and/or buttocks are clearly visible, and the intent of the image, to a reasonable person, is merely to arouse, not educate, with a view to deleting such images from Commons.

5. That historical, ethnographic and art images be excluded from such a review, and be considered in virtually all cases to be in scope.

6. That consideration be given by Commons editors and administrators to adopting policies of active curation within categories of images deemed controversial (sexual, violent, sacred), which would allow for restricting the number of images in a category and for actively commissioning images deemed needed but absent from the category (line drawings of sexual positions, e.g.), such policies not necessarily to be applied to images not deemed controversial.

Explanations

Recommendation 4

We expect this recommendation to engender a fair bit of reaction, and while we cannot anticipate it all, let us say three things.

First, the kind of image we are talking about is among those found in the over 3,000 images on Commons (by our count) in various categories and sub-categories around “Female toplessness” and “Nude women” (although there are some images of men that also fit into this category). Some of these images are clearly in scope (to demonstrate aspects of pornography, for example), but the vast majority of them are not.

Secondly, we are not suggesting these images be deleted because they are immoral, or “potentially-offensive”, or even “controversial”. We are suggesting they be deleted because they are out of scope – they serve no “reasonable educational purpose”, in our view, and the reason for that is, as stated, that their intent is to arouse, not inform. (This definition of these images based on intent, not a common North American approach, is inspired by current definitions of pornography used in United Kingdom jurisprudence. Other sites also use intent as a cataloguing principle for sexual images.)

Thirdly, it is our belief that the presence of these out-of-scope images on Commons is potentially dangerous for the Foundation and the community, because they reduce the overall credibility of the projects as responsible educational endeavors, and thus call into question the legitimacy of the many images of sexual content and “controversial” sexual content that must remain on Commons for the projects to fulfill their mission. And, although not our primary motivation in making this recommendation, it must be noted that they are offensive to many people, men and women alike, and that their inclusion represents a very clear bias and point of view – that of woman as sexual object. They are far from neutral. We would never allow this point of view an untrammeled and unreflective presence on any Wikipedia site, as a clear violation of NPOV – we should not allow it on Commons either.

Recommendation 5

Basically, as stated, we believe artistic and historical sexual images to be in scope (perhaps with the occasional exception), almost by definition, as inherently educational. They should not be subject to restrictions of any kind, beyond those legal prohibitions of content which provide the basic framework within which Wikimedia projects operate.

Recommendation 6

We believe that one of the purposes of designating certain kinds of content “controversial” would be to legitimize certain practices within those categories of images that, by and large, would not be applied to other categories within Commons (the equivalent of some of the practices that have been put in place on individual article pages within the Wikipedias). Designating certain procedures as applicable to clearly-defined subsets of content provides, in our view, two advantages. In the first place, it would allow certain practices to be applied to “controversial” content, because of the acknowledged extra sensitivity in the viewing community to these images. At the same time, it ensures that these procedures will not be applied to other content, for which they are unnecessary. The commissioning of images might especially be a means by which seemingly intractable difficulties around sexual imagery could be resolved (e.g., to illustrate sexual practices for which no photograph is available or desirable). This concept will be expanded on in Part 3.

User-Controlled Viewing Options

Although not widely publicized, a few procedures exist at the moment in English Wikipedia for opting out of viewing certain images within enwiki articles, although these are quite clunky and user-unfriendly.[9]

The question of whether to introduce these sorts of options on a more general basis has been discussed within the communities on several occasions, with a variety of opinion being expressed. By and large, Wikimedians believe in freedom, and increased freedom of choice in the viewing of Wikimedia content seems in harmony with that belief. However, caution is often expressed that these options, if implemented, be designed so as not to be misused by censorious third parties.

It is our feeling that it is a good idea, and appropriate, for Wikimedia projects to offer some user-controlled viewing options as a means of managing the contradictions we have noted are inherent in the realm of controversial content – the contradictions between our desire to keep our content open and the conflicting desire to accommodate those of our users who wish not to see some of it. Our reasons for recommending these options are partly in harmony with the principle of increased freedom for viewers to manage their own experience, but also to correct what we believe is an anomaly between Wikimedia sites and others of similar influence. Part of our mandate was to explore the content management practices of other sites with some similarities to Wikimedia sites – sites like Google, Flickr, and YouTube – sites which, like Wikipedia, are heavily used, powerful and influential in the world, attracting a diverse and culturally rich array of users – essentially the other Top Ten Internet sites. While we realize that Wikimedia sites, as non-commercial, public-service oriented projects, are quite different from all of these, their systems of content management have many similarities with those on Wikimedia sites. For example, all of these sites, as WMF projects do, have internally generated policies that determine what content is permitted on their sites at all.

However, every one of these sites also employs a series of user-controlled options (options designed by the site) that allow users to tailor their viewing experiences to their individual needs. Uniquely among these sites, at the moment, Wikimedia projects employ no such options. By and large, any content that makes it onto a project page is immediately available for viewing, without any restriction, to anyone and everyone using the sites, of whatever age or cultural background, whether they wish access to this content or not. We believe this situation can be modified without serious harm to the projects' openness, and in a way that could increase service to users.

Recommendations on User-Controlled Viewing Options

It is recommended:

7. That a user-selected regime be established within all WMF projects, available to registered and non-registered users alike, that would place all in-scope sexual and violent images (organized using the current Commons category system) into a collapsible or other form of shuttered gallery with the selection of a single clearly-marked command (“under 12 button” or “NSFW” button).

8. That no image be permanently denied to any user by this regime, but merely have its appearance delayed, to prevent unexpected or unintentional exposure to images.

9. That a series of additional user-selected options be created for other images deemed controversial to allow registered users the easy ability to manage this content by setting individual viewing preferences.

10. That, by and large, Wikimedians make the decisions about what is filterable or not on Wikimedia sites, and consequently, that tagging regimes that would allow third parties to filter Wikimedia content be restricted in their use.

11. That the “principle of least astonishment,” the notion that content on Wikimedia projects be presented to readers in such a way as to respect their expectations of what any page or feature might contain, be elevated to policy status within the projects as a fundamental principle governing relationships with readers.

Explanations

Recommendations 7 and 8

What we are recommending is that there be an option prominently visible on all WMF pages (like the Search Settings options on Google), available to registered and non-registered users alike, that, when selected, will place all images in Commons categories defined as sexual (penises, vulvas, masturbation, etc.) or violent (images of massacres, lynchings, etc.) into collapsible or other forms of shuttered viewing, wherever these images might appear on WMF sites. The rationales for this are several. Images of sexuality and violence are necessary components of Wikimedia projects if they are to fulfill their mandates to be open, free and educational. However, these images – of genital areas and sexual practices on the one hand, or mass graves and mutilated corpses on the other – will inevitably still have the power to disturb some viewers, especially if they are children, or if the images are happened upon unintentionally. The point of the “button” we are proposing is to help alleviate that surprise and dismay by making the images unavailable for viewing without a second command. Within the Wikimedia world, this is often referred to as the principle of least astonishment, or least surprise. In our view, it needs to be strengthened.

On the other hand, we believe that this command should only delay the presentation of these images, not prevent it. Consistent with principle 7, which we enumerated in our first part, we believe access to information on WMF sites should be compromised as little as possible while still satisfying our responsibilities to respect and serve all our audiences. In our view, a shuttered, rather than a deleted, image satisfies those responsibilities. We selected the category approach to this shuttering because we thought it would be the simplest to administer.
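To make the mechanism concrete, here is a minimal sketch of how such category-based shuttering might work in a reader-facing script. Everything in it is an assumption for illustration: the data-categories attribute, the placeholder wording, and the sample category names are hypothetical, and the actual list of shuttered categories would be decided by editors, not hard-coded.

```typescript
// Hypothetical sketch only: assumes each thumbnail exposes its Commons
// categories in a data-categories attribute, which MediaWiki does not do today.

// Categories the single "shutter" button would collapse by default
// (sample entries; the real list would be maintained by editors).
const SHUTTERED_CATEGORIES = new Set<string>(["Sex positions", "Lynchings"]);

function isShuttered(imageCategories: string[]): boolean {
  return imageCategories.some((c) => SHUTTERED_CATEGORIES.has(c));
}

// Swap each matching thumbnail for a click-to-reveal placeholder:
// the image is delayed, never denied (Recommendation 8).
function applyShutter(root: Document): void {
  root.querySelectorAll<HTMLImageElement>("img[data-categories]").forEach((img) => {
    const categories = (img.dataset.categories ?? "").split("|");
    if (!isShuttered(categories)) return;

    const placeholder = root.createElement("button");
    placeholder.textContent = "Image hidden by your viewing setting – click to show";
    placeholder.addEventListener("click", () => placeholder.replaceWith(img));
    img.replaceWith(placeholder);
  });
}
```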

Recommendation 9

For other images deemed “controversial” we are proposing a slightly different regime. In this case, the options for control would be available only to registered users, who would be able to select, on a category-by-category or image-by-image basis, which content they would like to shutter (in effect, an expansion of the current regime of image suppression on Wikipedia). We are suggesting that these options not be made available for categories outside of those deemed “controversial”, not because we are against choice, but because we believe that the bias on the projects must always be towards openness of information. Development of this regime could also allow registered users to be more selective within categories covered by the one-button approach (i.e., to decide to shutter images of bondage, but not images of bare breasts, etc.).
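By way of illustration, the following is a minimal sketch of the kind of per-user preference model this recommendation implies. The field names, the precedence rules, and the idea of per-image overrides are all assumptions made for the sketch, not an existing MediaWiki feature or a settled design.

```typescript
// Hypothetical preference model for registered users (Recommendation 9).
interface ShutterPreferences {
  shutterAllControversial: boolean;    // the single button of Recommendation 7
  shutteredCategories: Set<string>;    // e.g. keep "Bondage" shuttered ...
  unshutteredCategories: Set<string>;  // ... while leaving "Topless women" visible
  shutteredImages: Set<string>;        // individual files, e.g. "File:Example.jpg"
}

// Decide whether a given image should appear shuttered for this user.
// controversialCategories is the editor-maintained list of categories
// designated "controversial" (see Recommendation 6).
function shouldShutter(
  prefs: ShutterPreferences,
  fileName: string,
  imageCategories: string[],
  controversialCategories: Set<string>,
): boolean {
  if (prefs.shutteredImages.has(fileName)) return true;
  // Per-category choices override the blanket setting.
  if (imageCategories.some((c) => prefs.unshutteredCategories.has(c))) return false;
  if (imageCategories.some((c) => prefs.shutteredCategories.has(c))) return true;
  return (
    prefs.shutterAllControversial &&
    imageCategories.some((c) => controversialCategories.has(c))
  );
}
```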

Recommendation 10

The possibility that third parties will create filtering regimes to manage Wikimedia content is to some extent outside of the Foundation's control: all WMF content is available in freely-licensed form. However, it is possible for us to make that job easier or more difficult, by adopting, or not adopting, regimes to tag our content in accordance with increasingly widely-used systems of content management (e.g., RDF tags). We are adopting the position that Wikimedia projects stand against censorship, by and large, no matter who is doing the censoring, and that it is Wikimedia editors who should decide what restrictions, if any, should be placed on Wikimedia content, not people outside the organization (especially since entry to the organization is so easy). Consequently, without defining exact limits, we are recommending that our approach to enabling potential third-party filtering be conservative.

Recommendation 11

At the moment, a guideline with this title exists as a section on the en:wp page “Writing better articles”.[10]

However, it is our feeling that the significance of the sentiment expressed there should be amplified beyond that specific place in the array of Wikimedia procedures. The principle of least astonishment should be one of the key policies used to address those service principles of Wikimedia projects that we enunciated back in Part 1 of this study: our commitment to respect the needs of our users. Sometimes the principle is called into play when we ensure that images are catalogued properly, so that someone looking up the category “bus” does not find themselves confronted by political cartoons once presented on bus ads. Sometimes the principle reminds us that featured articles on home pages need to be selected with a certain care. Overall, it is a principle that stresses commitment to clear service and honesty, that reminds us that it is possible to present our core mission to our audiences in a variety of ways, and that choosing the one that offers maximum respect for their expectations is the preferred route.

Conclusion

There’s a lot to absorb here, and we’re sorry if the presentation is a bit dense, but we thought you needed to know why we were making the recommendations we made, not just what they were. We’ll be expanding on our thinking, and adding a few other observations, in Part 3.


Notes