
Grants:IdeaLab/Controversy Monitoring Engine

Prevent controversies from escalating to editing wars and intimidation through active monitoring.

Status: not selected
Amount: 7,800 USD
Grantee: Radfordj
Idea creator: Radfordj
This project needs: volunteer, developer, designer, community organizer, advisor, grantee, researcher
Created on: 13:26, 17 March 2015 (UTC)


Project idea

What is the problem you're trying to solve?

Editing wars, inter-editor intimidation, and hostility to new editors have all been shown to perpetuate the gender gap. These practices very often stem from controversial edits to controversial articles that escalate into interpersonal attacks and intimidation. Such events drive away editors of all kinds, but they especially affect female-identified editors, many of whom feel these events are driven by hostility towards women. They have also generated negative press for the Wikipedia community, branding its body of editors as generally hostile and unwelcoming. This controversy detection engine is meant to prevent controversies from reaching that point through monitoring and early intervention.

What is your solution?

The controversy monitoring engine would listen to the live stream of edits and rate the controversiality of articles in real time. These ratings would be published on a page like stats.wikimedia.org for anyone to observe. Senior admins and editors could then watch the results to see which articles are becoming controversial and step in to guide editors to a peaceful and productive resolution.
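
As one concrete way to "listen to the live stream of edits", the sketch below consumes Wikimedia's public EventStreams recentchange feed. The endpoint and the sseclient usage follow that service's documented pattern, but this is illustrative plumbing, not the implementation proposed here.

```python
# Minimal sketch: consume the live stream of edits from Wikimedia's
# public EventStreams recentchange feed. The endpoint is real; it is
# used here only to illustrate how the engine might ingest edits.
import json

from sseclient import SSEClient as EventSource  # pip install sseclient

STREAM_URL = "https://stream.wikimedia.org/v2/stream/recentchange"

for event in EventSource(STREAM_URL):
    if event.event != "message" or not event.data:
        continue
    try:
        change = json.loads(event.data)
    except ValueError:
        continue  # skip malformed events
    # Each change includes the wiki, page title, user, and edit-size
    # details from which the controversy features would be computed.
    if change.get("wiki") == "enwiki":
        print(change["title"], change["user"], change.get("type"))
```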

The engine works by implementing a set of controversy-detecting features, including the WikiWar Monitor Project’s (WWP) formula for controversiality alongside features such as intimidating words, series of reversions, and vandalism detection, to quantify how “controversial” an article or set of articles is becoming. Using a historical analysis of previous editing wars, a scale of controversiality will be created to give admins a sense of how controversial the current editing behavior is. The controversy scores will be updated every 15 seconds, with descriptors for which features (the WWP score, words, vandalism, etc.) are driving a high controversiality score.
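
The combination step itself is left open in the proposal; the following sketch shows one plausible shape, with entirely hypothetical feature names and weights, returning both a score and the feature descriptors mentioned above.

```python
# Illustrative scoring sketch. Feature names and weights are hypothetical
# placeholders; the real engine would include the WikiWar Monitor
# Project's (WWP) controversiality formula as one input among several.
from typing import Dict, List, Tuple

WEIGHTS: Dict[str, float] = {
    "wwp_score": 0.40,           # WWP formula, assumed normalized to [0, 1]
    "intimidating_words": 0.25,  # share of recent edits containing flagged words
    "revert_chain": 0.25,        # normalized length of the current reversion series
    "vandalism": 0.10,           # vandalism-classifier output in [0, 1]
}

def controversy_score(features: Dict[str, float]) -> Tuple[float, List[str]]:
    """Combine per-article features into a score in [0, 1] and report which
    features contribute most -- the descriptors shown alongside the score."""
    contributions = {
        name: WEIGHTS[name] * min(max(value, 0.0), 1.0)
        for name, value in features.items()
        if name in WEIGHTS
    }
    score = sum(contributions.values())
    descriptors = sorted(contributions, key=contributions.get, reverse=True)
    return score, descriptors

score, drivers = controversy_score(
    {"wwp_score": 0.8, "intimidating_words": 0.5, "revert_chain": 0.9, "vandalism": 0.1}
)
print(f"score={score:.2f}, driven by {drivers[:2]}")
```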

Anyone will be able to watch the controversy page and receive notifications when controversy scores exceed a chosen threshold or arise in a domain they pre-select. This will enable senior admins and editors to monitor current controversies on Wikipedia and step in to cool editors off and promote civil consensus-building.
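
A sketch of what the watch-and-notify loop could look like, using the 15-second update interval described above; the Watch fields (threshold, pre-selected domains) are assumptions, since the proposal does not specify a subscription format.

```python
# Hypothetical watch loop: every 15 seconds, refresh scores and alert
# subscribers whose threshold is exceeded in a domain they pre-selected.
import time
from dataclasses import dataclass, field
from typing import Callable, Dict, Set

@dataclass
class Watch:
    threshold: float                                # alert above this score
    domains: Set[str] = field(default_factory=set)  # empty set = watch everything
    notify: Callable[[str, float], None] = print

def run_watch_cycle(scores: Dict[str, float],
                    article_domains: Dict[str, Set[str]],
                    watches: Dict[str, Watch]) -> None:
    for article, score in scores.items():
        for watch in watches.values():
            in_domain = not watch.domains or watch.domains & article_domains.get(article, set())
            if score > watch.threshold and in_domain:
                watch.notify(article, score)

# Toy usage: one admin watching gender-related articles scoring above 0.7.
watches = {"admin1": Watch(threshold=0.7, domains={"gender"})}
while True:
    scores = {"Example article": 0.82}                      # from the engine
    domains = {"Example article": {"gender", "biography"}}  # e.g. from categories
    run_watch_cycle(scores, domains, watches)
    time.sleep(15)  # the 15-second update interval from the proposal
```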

Goals
  1. Develop code to measure controversiality.
  2. Train the code on the past year’s worth of edits to generate a scale of controversiality (a calibration sketch follows this list).
  3. Implement the code on a page like stats.wikimedia.org.
  4. Create the watch functionality and visualizations.
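
Goal 2 amounts to ranking raw scores against history. Below is a minimal calibration sketch; the use of percentile ranks, and the simulated input, are assumptions for illustration rather than the project's chosen method.

```python
# Sketch of goal 2: calibrate raw controversy scores against a year of
# historical scores by mapping each raw score to a 0-100 percentile rank.
import numpy as np

def build_scale(historical_scores: np.ndarray):
    """Return a function mapping a raw score to its percentile rank."""
    sorted_scores = np.sort(historical_scores)

    def to_percentile(raw: float) -> float:
        rank = np.searchsorted(sorted_scores, raw, side="right")
        return 100.0 * rank / len(sorted_scores)

    return to_percentile

# Toy usage with simulated history; the real input would be scores
# computed over the past year's worth of edits.
history = np.random.beta(2, 8, size=100_000)  # most articles stay calm
scale = build_scale(history)
print(f"A raw score of 0.6 sits at the {scale(0.6):.0f}th percentile")
```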

Get Involved

Participants

Endorsements
  • I think this would be a helpful tool. I'd like to know more about it (esp, re: which words will be considered).--Mssemantics (talk) 21:02, 18 March 2015 (UTC)
  • Great idea I've thought about myself. If it could be incorporated into a log like the block log that would not only list various "infringements" but also the severity of them, that would be great. Of course, this only could work if the issue of abusive admins who don't like women is dealt with, since it would be easy for a group of males to gang up on women to criticize their slight infractions and tip the balance of the "monitoring engine" and it would have to be manually corrected. A quorum of, say 1/3 verified women admins might be one solution. Carolmooredc (talk) 19:20, 28 March 2015 (UTC)
  • Wikipedians around the globe try to agree on objective facts about the matter of things. Nevertheless may there exist different versions of the truth, depending on origin, language, ethnicity, political position, etc. Wikidata tries to reflect this situation by allowing for several, potentially differently sourced statements. By having a Controversy Monitoring Engine, the level and quality of this controversy could be monitored and in consequence a deeper understanding of its nature be gained. Tomayac (talk) 12:01, 9 April 2015 (UTC)
  • I like the idea. It will be key how to define/learn what controversies are actually "worth looking at" in terms of damaging effects, maybe you could elaborate a bit more on that. Also talk pages should be interesting, they could show an intimidating atmosphere without having ns=0 disputes. Anyhow, would be great to have a unified platform for live Wikipedia conflict monitoring. Also have a look at Contropedia.net faflo (talk) 14:55, 20 April 2016 (UTC)


Project plan

Activities

The controversy monitoring engine will be developed in stages. The first stage will involve three steps: building the code to measure controversy from existing measures of controversy in Wikipedia, testing that code on recent editing patterns, and calibrating the detection system. The second stage will involve developing visualizations that capture controversiality and present useful information to viewers. The third stage will involve integrating the visualizations and other useful features into a page like stats.wikimedia.org. The fourth stage will involve gathering feedback from users through the engine’s page and using it to update the visualizations and to add new features, measures, and other visualizations.

Budget

Stage 1: Build, test, and calibrate controversy detection code.

  • Researcher (40 hours * $15/hour) = $600

Stage 2: Visualization.

  • Visualizations Engineer (100 hours * $15/hour) = $1,500

Stage 3: Design web interface for engine and visualizations.

  • Web Designer (100 hours * $15/hour) = $1,500

Stage 4: Feedback and development.

  • Researcher (80 hours * $15/hour) = $1,200
  • Web Designer (80 hours * $15/hour) = $1,200
  • Visualizations Engineer (80 hours * $15/hour) = $1,200
  • Administration (40 hours * $15/hour) = $600

Total: $7,800

Community engagement

Feedback will be integrated into the visualization website itself via a submission process where users can propose improvements. In addition, in stage four we will reach out to administrators and senior editors, both those who have used the system and those who have not, for their input on its design. This feedback will focus not only on usability but also on how to use the engine’s information to intervene in and de-escalate conflicts.

Sustainability

The project should require minimal updating to maintain its operation after the stage 4 developments. Further development of the controversy detection code, visualizations, or interface will be undertaken on an as-needed basis by the researcher, the developers, or members of the broader Wikipedia community who become interested in the project.

Measures of success

This project has four measures of success. Preliminary success is a working version of the WikiWar Monitor Project’s code integrated with the live stream of edits and visualized on a webpage. The second measure is an interactive visualization that enables viewers to investigate controversies and identify the kinds of controversies they want to engage with as moderators. A third measure is engagement from admins and editors with the controversy page, both in usage and in the feedback given on the monitoring engine itself. A final measure is case studies: reports from editors and administrators that the engine works to moderate controversies and catch them before they escalate.

Project team

Radfordj: I am a Ph.D. student who performs research using computational methods with online data. I am currently the manager of a research project (VolunteerScience.com) overseeing four employees and coordinating with seven research teams. As such, I work with people in my day job who have the skills and interest to design the web interface and visualize the results if savvy Wikimedians are not immediately interested.

Community notification

Done

  • Gendergap mailing list
  • Research mailing list
  • Analytics mailing list
  • Editor Engagement mailing list

To do

  • WikiProject: New Pages Patrol
  • WikiProject: Countering Systemic Bias
  • WikiProject: Discrimination
  • Currently sorting through Article Improvement and Grading projects
