Wikiquality

This page accompanies the Wikiquality Portal as an openly editable wiki brainstorming space for existing and future quality initiatives related to Wikimedia projects.

wikiquality-l

For e-mail discussion of ideas for recognizing and increasing quality content in wikis, please join the wikiquality-l mailing list. Traffic can get high at times, so you may want to filter it into a dedicated e-mail folder.

Revision tagging

One initiative that has the support of the Wikimedia Foundation and Wikimedia Deutschland e.V. is the development and testing of the FlaggedRevs extension, jointly developed by Aaron Schulz and Jörg Baach. The goal of the revision tagging tool is to let a subset of editors identify the most recent version of an article that has been checked for vandalism, or that has even gone through an in-depth review process.
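
To make the mechanism concrete, here is a minimal sketch in Python (the extension itself is written in PHP) of the core idea: given an article's revision history and the set of revisions a reviewer has tagged, show the most recent tagged revision when one exists. The function and variable names are invented for illustration and are not part of the FlaggedRevs API.

```python
# Illustrative sketch only: FlaggedRevs itself is a PHP MediaWiki
# extension, and none of these names come from its real API.

def default_revision(history, reviewed_ids):
    """Return the revision readers would see by default.

    history      -- list of revision IDs, oldest first
    reviewed_ids -- set of revision IDs tagged as checked
    """
    # Walk backwards from the newest revision and stop at the first
    # one that a reviewer has tagged.
    for rev in reversed(history):
        if rev in reviewed_ids:
            return rev
    # No reviewed revision yet: fall back to the current version.
    return history[-1]

history = [101, 102, 103, 104]      # 104 is the latest edit
reviewed = {101, 103}               # 103 was the last checked revision
print(default_revision(history, reviewed))  # -> 103
```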

The extension is currently undergoing a security review. Our deployment strategy is to bring a public beta site online later this month (September 2007), in at least two different configurations:

  • very basic configuration: This configuration will focus on reviewing changes for vandalism. There will be no difference in default view between unregistered and signed-in users. It will use dummy content from the Simple English Wikipedia (due to its relatively small size).
  • "the German proposal": This configuration will implement as closely as possible a two-level system proposed by a group of German Wikipedians: "gesichtete Versionen" (sighted revisions) and "geprüfte Versionen" (validated revisions). It will use dummy content from German Wikibooks.

Based on user requests, we may add more test configurations in the future.

The beta test period will close in November. After that, if no security or scalability concerns remain, all Wikimedia Foundation communities (where a "community" is the combination of project and language) will be given the option to request that the extension be activated. This process will be managed through our bug-tracking system, Bugzilla. Communities will have to point to a consensus (a vote result is acceptable) for a particular configuration of the extension.

The big questions

Communities should consider the following big questions:

  • Should unregistered users see different versions by default than registered users?
  • Should the default revision view be changed for some high risk pages, for all pages, or not at all?
  • Should there be multiple "tags" for other aspects of quality, such as readability, copyright status, or neutrality?
  • Who will be able to set which flags?

Please read the documentation of the FlaggedRevs extension carefully to get a sense of the flexibility of this feature. Many of these aspects are already fully configurable, and many others can be made so.
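
As a hedged illustration of how one community's answers to the questions above might translate into a configuration, here is a sketch in Python, chosen purely for readability; the option names are invented, and the real extension is configured in PHP with different names and semantics, so read this as a map of the decision space rather than actual syntax.

```python
# Hypothetical knobs, one per question above. All names are invented;
# the real options live in the extension's PHP configuration and
# differ in naming and detail.

example_community_config = {
    # Q1: do unregistered readers see the tagged version by default?
    "anonymous_see_stable": True,
    "registered_see_stable": False,
    # Q2: which pages get a changed default view?
    "stable_default_scope": "high_risk_pages",  # or "all" / "none"
    # Q3: which aspects of quality can be tagged?
    "tags": ["accuracy", "readability", "copyright", "neutrality"],
    # Q4: which user groups may set which flags?
    "tag_permissions": {
        "accuracy": ["editor", "reviewer"],
        "readability": ["reviewer"],
    },
}
```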

Article trust

Luca de Alfaro is an Associate Professor of Computer Engineering at the University of California, Santa Cruz. He leads a research team studying patterns in Wikipedia article histories. His team has created a demonstration that colorizes Wikipedia articles according to a trust value computed from the reputation of the authors who contributed and edited the text.
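
As a rough illustration of the general idea (and emphatically not de Alfaro's actual algorithm, which is considerably more sophisticated and, among other things, feeds the survival of text across later edits back into author reputation), one could assign each word the reputation of the author who last touched it and bucket that value into a background shade:

```python
# Toy illustration only; reputations here are made-up numbers in
# [0, 1], and the bucketing thresholds are arbitrary.

author_reputation = {"alice": 0.9, "bob": 0.3}

def colorize(words_with_authors):
    """Map each word to a trust bucket from its author's reputation."""
    def bucket(rep):
        if rep >= 0.7:
            return "white"    # high trust: no highlight
        if rep >= 0.4:
            return "light"    # medium trust: light shading
        return "dark"         # low trust: dark shading

    return [(word, bucket(author_reputation.get(author, 0.0)))
            for word, author in words_with_authors]

text = [("Paris", "alice"), ("is", "alice"), ("tiny", "bob")]
print(colorize(text))
# [('Paris', 'white'), ('is', 'white'), ('tiny', 'dark')]
```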

The Wikimedia Foundation is supporting Luca's research by providing him with real-time data for analysis. Our goal is to be able to show a "trust overlay". Deployment could happen in multiple stages (these are not yet final):

  1. Optional trust colorization tab, based on a slightly out-of-date en.wikipedia.org dump, integrated into the English Wikipedia
  2. Trust ratings and colorization computed in real-time for at least one Wikipedia
  3. Trust ratings and colorization computed in real-time for all WMF projects
  4. Trust ratings and colorization combined with the "revision tagging" work above (e.g. allowing editors to click untrusted pieces of text to transfer their reputation to the text, or providing an algorithmically selected "stable version" when no manually tagged one is available; see the sketch after this list)
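
Here is a minimal sketch of that stage-4 fallback, using invented names: prefer the newest manually tagged revision, and only when none exists fall back to the revision the trust analysis scores highest.

```python
# Sketch of the stage-4 fallback; trust_score is a stand-in for
# whatever per-revision score the real analysis would produce.

def stable_revision(history, reviewed_ids, trust_score):
    """Prefer the newest manually tagged revision; otherwise fall
    back to the algorithmically most trusted one."""
    for rev in reversed(history):
        if rev in reviewed_ids:
            return rev
    return max(history, key=trust_score)

history = [10, 11, 12]
scores = {10: 0.8, 11: 0.95, 12: 0.6}
print(stable_revision(history, set(), scores.get))  # -> 11
```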

In any case, human article review and algorithmic article evaluation are not mutually exclusive.

Other ideas and research

Ever since the inception of Wikipedia, ideas for analyzing and systematically improving wiki content have filled many hundreds of pages of electronic paper. The page Article validation is an index to some of these past ideas, but feel free to add your own by editing this page. :-)

Accountability of admins

We lack funding to verify the identities of admins, but having Wikimedia privately verify their identities, to some level of certainty determined by funding, and possibly including information relevant to conflicts of interest, would increase accountability and add to the trustworthiness of Wikipedia. I propose that a request for proposals be publicly offered for a donation contractually earmarked toward such an end, with the details and amount left open for the submitting organization to suggest. 4.250.138.73 22:31, 17 September 2007 (UTC) (WAS 4.250)

Content Arbcom

The "Apartheid in *" POINT problem has yet again shown the need for a content Arbcom. I suggest that Universities be contacted for named highly credentialed and respected volunteers to man an English-language Wikipedia content arbcom in which our regular arbcom passes them issues for deciding once and for all (or maybe only a year or two?) content decisions on highly limited but significant questions of fact that can not be resolved though consensus except by wearing out one side or the other. I see this as starting small and limited and becoming larger and more important and useful over time, especially with flagged versions. Using named people, limiting their time involvement, and limiting the issues to be decided can make this a post people will feel is worth their time and possibly useful in their career. 4.250.138.73 22:31, 17 September 2007 (UTC) (WAS 4.250)[reply]

Independent evaluation of issues concerning bias

I propose that a request for proposals be publicly issued for a grant or donation to the Wikimedia Foundation, contractually earmarked for an independent evaluation of issues concerning bias, where the amount and the details are part of the submitted proposal. Let complainers put their money where their mouths are. 4.250.138.73 22:43, 17 September 2007 (UTC) (WAS 4.250)