
Community health initiative


Helping the Wikimedia volunteer communities reduce harassment and disruptive behavior on our projects.

The Wikimedia Foundation's Audiences and Community Engagement departments are undertaking a multi-year effort, encompassing research, product development, and policy growth, to help the Wikimedia volunteer communities reduce harassment and disruptive behavior on our projects.

This initiative addresses the major forms of harassment reported in the Wikimedia Foundation's 2015 Harassment Survey, which covers a wide range of behaviors: content vandalism, stalking, name-calling, trolling, doxxing, discrimination, and anything else that targets individuals for unfair and harmful attention.

This will result in improvements both to the tools in the MediaWiki software (see Anti-Harassment Tools) and to the conduct policies on the communities suffering most from disruptive behavior (see Policy Growth & Enforcement). For our efforts to be successful, these improvements need to be made with the participation and support of the volunteers who will be using the tools (see Community input).

Projects

Current

The team is currently building new blocking tools. We want to provide more accurate and sophisticated tools to allow administrators to more appropriately respond to harassers. This project includes page and namespace blocking.

Upcoming

We've also begun research on improved tools and workflows for reporting users to moderators, which will be a major effort in 2019. Join the User reporting system consultation 2019 to take part in the decision making!

Previous

Background

Harassment on Wikimedia projects

On Wikipedia and other Wikimedia projects, harassment typically occurs on talk pages (article, project, and user), noticeboards, user pages, and edit summaries. Edit warring and wiki-hounding can also be forms of harassment. Conduct disputes typically originate from content disputes, such as disagreements about the reliability of a source, neutrality of a point-of-view, or article formatting and content hierarchy. These disputes can become harassment at the point when an editor stops thinking of this as a discussion about ideas, and starts associating the opponent with the idea — turning the other editor into an enemy, who needs to be driven off the site. This unhealthy turn is more likely to happen when the content is closely associated with identity — gender, race, sexual orientation, national origin — because it's easy for the harasser to think of the target as a living representation of the opposing idea. This is especially true when the target is a member of a historically disadvantaged group, and has disclosed information about their identity during the course of their time on the projects.

The English-language Wikipedia community (and most other projects) has drafted conduct policies for contributors to follow, including policies on civility, harassment, personal attacks, and dispute resolution. The spirit of these policies is well-intentioned, but enforcement is difficult given deficiencies in the MediaWiki software and the ratio of contributors to active administrators.[1] The dispute resolution processes encourage users to attempt to resolve issues between themselves before bringing the situation to the attention of administrators on the Administrators' Noticeboard, and eventually ArbCom for extreme situations.[2]

Results from the 2015 Harassment Survey

Online harassment is a problem on virtually every web property where users interact. In 2017, the Pew Research Center concluded that 41% of all internet users have been the victim of online harassment.[3] In 2015 the Wikimedia Foundation conducted a Harassment Survey with 3,845 Wikimedia user participants to gain a deeper understanding of harassment occurring on Wikimedia projects. 38% of respondents confidently recognized that they had been harassed, while 51% of respondents had witnessed others being harassed.

In Wikimedia's 2017 Community Engagement Insights Report, it was found that 31% of all 4,500 survey respondents felt unsafe in any Wikimedia online or offline space at any time during their tenure, 49% of 400 users avoided Wikimedia because they felt uncomfortable, and 47% of 370 users indicated that in the past 12 months they had been bullied or harassed on Wikipedia. Furthermore, 60% of people who reported a dispute to functionaries said their issue was "not at all resolved" and 54% called their interaction with functionaries "not at all useful." Results from the 2018 Insights Report show similar findings, with a slight decrease in the number of respondents reporting feeling unsafe.

This research is illuminating and was one of the driving forces behind the Community health initiative, but it is only the beginning of the research we will need to conduct for this effort to succeed.

Community requests for new tools

The Wikimedia community has long struggled with how to protect its members from bad-faith or harmful users. The administrative toolset that project administrators can use to block disruptive users from their projects has not changed since the early days of the MediaWiki software. Volunteers have asked the Wikimedia Foundation to improve the blocking tools on a number of occasions.

In preparing for this initiative, we've been discussing issues with the current tools and processes with active administrators and functionaries. These discussions have resulted in requested improvements in several key areas where admins and functionaries see immediate needs — better reporting systems for volunteers, smarter ways to detect and address problems early, and improved tools and workflows related to the blocking process. These conversations will be ongoing throughout the entire process. Community input and participation will be vital to our success.

External grants

Anti-Harassment Tools for Wikimedia Projects grant, 2017

In January 2017, the Wikimedia Foundation received initial funding of US$500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support this initiative.[4] The two seed grants, each US$250,000, will support the development of tools for volunteer editors and staff to reduce harassment on Wikipedia and block harassers.

The grant proposal is available for review at Wikimedia Commons.

Goals

  • Reduce the amount of harassing behavior that occurs on Wikimedia projects.
  • Fairly resolve a higher percentage of incidents of harassment that do occur on Wikimedia projects.

Potential measurements of success

Measuring harassment is challenging, but we still want to ensure that our work has a positive impact on the communities. Current ideas include:

  • A decrease in the percentage of identifiable personal attack comments on English Wikipedia;
  • A decrease in the percentage of non-administrator users who report experiencing harassment on Wikipedia, as measured by survey;
  • An increase in administrators' confidence in their ability to make accurate decisions in conduct disputes, as measured by survey;
  • An increase in the retention of new users.

Quarterly and annual goals

For commentary on our quarterly progress, see Community health initiative/Quarterly updates.

Community input

Gathering, incorporating, and discussing community input is vital to the success of this initiative. We are building features for our communities to use — if we design in a vacuum our decisions will assuredly fail.

The plans presented in the grant, on this page, and elsewhere will certainly change over time as we gather input from our community (including victims of harassment, contributors, and administrators), learn from our research, and learn from the software we build. Community input includes, but is not limited to:

  • Socializing our goals
  • Generating, refining, validating, and finalizing ideas with community stakeholders
  • Conversations about freedom of expression vs. political correctness. It’s very important that this project is seen as addressing the kinds of abuse that everyone agrees about (obvious sockpuppet vandalism, death threats) and the kinds of abuse that people will differ over (gender, culture, etc.). The project will not succeed if it’s seen as only a “social justice” power play.[5]

Over the course of this initiative we plan to communicate with the community via regular wiki communication (talk pages, email, IRC) in addition to live-stream workshops, in-person workshops at hack-a-thons and Wikimanias, and online community consultations. At the moment, the best place to discuss the Community health initiative is on Talk:Community health initiative.

Anti-Harassment Tools

In short, we want to build software that enables contributors and administrators to make timely, informed decisions when harassment occurs. We have broadly identified four focus areas for how new tools can help address and resolve harassment:

Detection

We want editors to be able to identify and flag harassing behavior more easily and effectively. We are currently exploring how to prevent harassment before it starts, and how to address incidents before small problems escalate into larger civility issues.

Potential features:

  • AbuseFilter performance management, usability, and functionality improvements
  • Reliability and accuracy improvements to ProcseeBot
  • Anti-spoof improvements to pertinent tools
  • Features that surface content vandalism, edit warring, stalking, and harassing language to wiki administrators and staff (a simple flagging heuristic is sketched below)
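
As a rough illustration of the last item in this list, the sketch below polls recent changes through the standard MediaWiki API and flags edits whose summaries contain phrases from a small, hard-coded word list. The wiki URL and phrase list are placeholder assumptions for illustration only; the real detection features (such as AbuseFilter) run server-side with far richer rules and signals.

```python
# Minimal detection sketch: flag recent edit summaries that contain
# hard-coded attack phrases. Illustrative only; not the actual AbuseFilter logic.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"     # any MediaWiki wiki works
ATTACK_PHRASES = ["idiot", "moron", "get lost"]    # placeholder word list

def fetch_recent_changes(limit=50):
    """Return recent edits with user, title, timestamp, and edit summary."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "user|title|comment|ids|timestamp",
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["query"]["recentchanges"]

def flag_suspect_edits(changes):
    """Yield edits whose summary contains a listed phrase."""
    for change in changes:
        summary = change.get("comment", "").lower()
        if any(phrase in summary for phrase in ATTACK_PHRASES):
            yield change

if __name__ == "__main__":
    for edit in flag_suspect_edits(fetch_recent_changes()):
        print(f"{edit['timestamp']} {edit['user']} -> {edit['title']}: {edit.get('comment', '')}")
```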

Reporting

No victim of harassment should abandon editing because they feel powerless to report abuse. We want to provide victims with improved ways to report incidents that are more respectful of their privacy, less chaotic, and less stressful than the current workflow. Currently the burden of proof is on the victim to demonstrate their own innocence and the harasser's fault; we believe the MediaWiki software should perform the heavy lifting.
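
To make the idea of a more structured reporting workflow concrete, here is a hypothetical sketch of the information such a report might capture so that the software, rather than the victim, assembles and protects the evidence. All of the field names are illustrative assumptions; they do not describe an existing MediaWiki feature.

```python
# Hypothetical report record for a future reporting workflow.
# Field names are illustrative assumptions, not an existing schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class HarassmentReport:
    reporter: str                    # username of the person filing the report
    reported_user: str               # username of the alleged harasser
    diff_ids: List[int] = field(default_factory=list)  # revision IDs cited as evidence
    description: str = ""            # free-text account from the reporter
    keep_private: bool = True        # visible only to reviewing functionaries
    status: str = "open"             # open / under review / resolved

# Example: a report built from two diffs, kept private by default.
report = HarassmentReport(
    reporter="ExampleReporter",
    reported_user="ExampleUser",
    diff_ids=[123456789, 123456790],
    description="Repeated personal attacks on my user talk page.",
)
```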

Potential features:

Evaluation

To analyze and evaluate the true sequence of events in a conduct dispute, administrators must be fluent with MediaWiki diffs, page histories, and special pages. Volunteer-written tools (such as the Editor Interaction Analyzer and WikiBlame) can help with evaluation, but the current process is still very time consuming. We want to build tools that help volunteers understand and evaluate harassment cases and determine the best way to respond.
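
As a rough sketch of the kind of analysis such tools would automate, the example below pulls the contribution histories of two editors through the standard MediaWiki API (list=usercontribs), interleaves them by timestamp, and marks the pages both have edited. The usernames are placeholders, and a real interaction timeline tool would do considerably more (diff links, filtering, pagination).

```python
# Sketch of an interaction timeline: interleave two users' contributions
# by timestamp and highlight pages they have both edited.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_contribs(username, limit=100):
    """Return recent contributions (timestamp, title, summary) for one user."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "ucprop": "timestamp|title|comment",
        "uclimit": limit,
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["query"]["usercontribs"]

def build_timeline(user_a, user_b):
    """Merge both users' edits into one chronological list of (time, user, page)."""
    edits = [(c["timestamp"], user_a, c["title"]) for c in fetch_contribs(user_a)]
    edits += [(c["timestamp"], user_b, c["title"]) for c in fetch_contribs(user_b)]
    return sorted(edits)

if __name__ == "__main__":
    timeline = build_timeline("ExampleUserA", "ExampleUserB")  # placeholder names
    pages_a = {title for _, user, title in timeline if user == "ExampleUserA"}
    pages_b = {title for _, user, title in timeline if user == "ExampleUserB"}
    shared_pages = pages_a & pages_b
    for timestamp, user, title in timeline:
        marker = "*" if title in shared_pages else " "   # '*' marks overlapping pages
        print(f"{marker} {timestamp} {user}: {title}")
```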

Potential features:

  • A robust interaction timeline tool, which will allow wiki administrators to understand the interaction between two users over time, and make informed decisions in harassment cases.
  • A private system for wiki administrators to collect information on users’ history with harassment and abuse cases, including user restrictions and arbitration decisions.
  • A dashboard system for wiki administrators to help them manage current investigations and disciplinary actions.
  • Cross-wiki tools that allow wiki administrators to manage harassment cases across wiki projects and languages.

Blocking

We want to improve existing tools, and create new tools where appropriate, to remove troublesome actors from communities or from certain areas within them, and to make it more difficult for someone who is blocked from the site to return.

Some of these improvements are already being productized as part of the 2016 Community Wishlist. See Community Tech/Blocking tools for more information.

Potential features:

  • Per-page and per-category blocking tools to enforce topic bans, which will help wiki administrators to redirect users who are being disruptive without completely blocking them from contributing to the project; this will make wiki admins more comfortable with taking decisive action in the early stages of a problem (a minimal sketch of such a per-page block request appears after this list).
  • Tools that allow individual users to control who can communicate with them via Echo notifications, email, and user spaces.
  • Make global CheckUser tools work across projects, improving tools that match usernames with IP addresses and user agents so that they can check contributions on all Wikimedia projects in one query.
  • Improved blocking tools, including sockpuppet blocking tools.
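
As a concrete illustration of the first item above, here is a minimal sketch of what a per-page ("partial") block request could look like through the MediaWiki action API. The `partial` and `pagerestrictions` parameters are assumptions about how such a tool might be exposed rather than a statement of the final design, and issuing the request requires administrator rights and a valid CSRF token.

```python
# Hypothetical sketch of a per-page ("partial") block via the action API.
# The 'partial' and 'pagerestrictions' parameters are assumptions about how
# the new blocking tools could be exposed; action=block itself requires
# administrator rights and a CSRF token.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def partial_block(session, csrf_token, username, page,
                  expiry="1 week", reason=""):
    """Ask the wiki to block `username` from editing only `page`."""
    data = {
        "action": "block",
        "user": username,
        "expiry": expiry,
        "reason": reason,
        "partial": 1,                # assumed flag: restrict, don't site-block
        "pagerestrictions": page,    # assumed parameter: page(s) the block covers
        "token": csrf_token,
        "format": "json",
    }
    response = session.post(API_URL, data=data, timeout=30)
    response.raise_for_status()
    return response.json()

# Usage (placeholders): an authenticated administrator session and CSRF token
# would be obtained beforehand via action=login and action=query&meta=tokens.
# result = partial_block(session, token, "ExampleUser", "Example article",
#                        expiry="2 weeks", reason="Topic ban enforcement")
```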

Prioritization of work

Our projects are currently prioritized on the Anti-Harassment Phabricator workboard in the 'Epic backlog' column. We invite everyone to share their thoughts on our prioritization on Phabricator tickets, on this page's talk page, or by sending us an email.

Projects are prioritized by the product manager, taking into account:

  • Readiness — What is designed, defined, and ready for development? Are there any blockers?
  • Value — What will provide the most value to our users? What will solve the biggest problems, first? Has our research identified any exciting opportunities? Have our previous features identified any new opportunities or problems?
  • Feasibility — What can we accomplish given our time frame and developer capacity? Are we technically prepared? Is there external developer support that will accelerate our work?
  • Support — What has received support from the users who participate in the current workflows? What ideas have momentum from people currently affected by harassment on Wikipedia?

Policy growth and enforcement

In addition to building new tools, we want to work with our largest communities to ensure their user conduct policies are clear and effective and the administrators responsible for enforcing the policies are well-prepared.

Beginning with English Wikipedia, a large community that offers a wealth of data, we will provide contributors with research and analysis of how behavioral issues are a) covered in policy and b) enforced in the community, particularly on the noticeboards where problems are discussed and actioned. We will also research alternative ways of addressing specific issues, evaluate their effectiveness, and identify approaches that have found success on other Wikimedia projects. This will help our communities make informed changes to existing practices.

See also

List of subpages

Pages with the prefix 'Community health initiative' in the 'default' and 'Talk' namespaces:


References

  1. "Wikipedia:List of administrators/Active". 2017-02-13. 
  2. "Wikipedia:Harassment § Dealing with harassment.". 2017-02-12. Retrieved 2017-02-12. 
  3. Duggan, Maeve (2017-07-11). "Online Harassment 2017". Pew Research Center: Internet, Science & Tech. Archived from the original on 2019-12-18. Retrieved 2017-07-25. 
  4. "Wikimedia Foundation receives $500,000 from the Craig Newmark Foundation and craigslist Charitable Fund to support a healthy and inclusive Wikimedia community – Wikimedia Blog". Archived from the original on 2019-12-18. Retrieved 2017-02-13. 
  5. "File:Wikimedia Foundation grant proposal - Anti-Harassment Tools For Wikimedia Projects - 2017.pdf - Meta" (PDF). meta.wikimedia.org. Retrieved 2017-02-14.