Wikimedia Foundation/Legal/Community Resilience and Sustainability/Trust and Safety/Overview

An overview of Trust and Safety ("T&S") at the Wikimedia Foundation and of Wikimedia's approach relative to other platforms.

Profile

The first job of T&S is to keep the online communities on our platform safe. As a nonprofit organization with constrained resources, the Foundation traditionally relies on peer-elected volunteers ("functionaries") to do most of the trust and safety work on behalf of their own project communities, work for which other platforms hire hundreds of staff and contractors. This can include fighting spam and vandalism, resolving local community conflicts, and content moderation such as editorial issues and the deletion of copyright violations. Wikimedia relies on volunteer functionaries and staff with T&S-relevant roles and user rights: more than 6,500 volunteer admins; 12 Arbitration Committees ("ArbComs") with three to ten members each; specialist roles building on adminship (189 local checkusers, 105 local oversighters, 39 stewards); and the staff team (ten full-time members). However, 337 of our 889 projects (38%) have no local admins and depend entirely on steward, global admin, and Foundation services. ~85% of Wikipedia wikis (representing <4% of Wikipedia editors) have few or no functional local conduct policies, which are essential to self-governance.

Wikimedia Foundation T&S Operations: T&S routinely handles workflows tied to the Foundation's role as an internet service provider (ISP), including close collaboration with Legal on statutory child protection and DMCA obligations and on personality rights violations involving the subjects of Wikipedia articles. It also leads the Foundation's collaboration with law enforcement, covering issues such as threats to life, threats against Wikimedia offices, threats against the President of the United States, school shooting threats, and terrorism threats published on our platform, as well as persecution of volunteers by hostile governments. Other core work tied to the ISP role includes legal case support, partnering with ArbComs and the stewards, and administration of sensitive volunteer functionary and staff access rights on the platform.

The team handles four types of investigations: (a) investigations into long-term harassment and conduct issues involving volunteers that are beyond the capacity of volunteer functionaries or affiliates, a category that also includes mediation support for the Affiliations Committee (AffCom); (b) HR investigations into online workplace harassment targeting staff; (c) sustained security and privacy challenges, handled in close collaboration with Legal and Security; and (d) support for staff investigations and the vetting of all Foundation hires and appointments, in collaboration with Talent & Culture and Legal.

Wikimedia Foundation T&S Policy: T&S maintains a standing subteam focused on Foundation initiatives to structurally improve community health under the three investment priorities the Board outlined in November 2017 when adopting the Wikimedia 2030 strategy. These programs include the Universal Code of Conduct (UCoC), digital security training, preventive anti-harassment resources and event safety material for offline spaces, and functionary support for the volunteer Technical Code of Conduct Committee, which covers Wikimedia's technical on- and offline spaces.

Wikimedia Foundation T&S Disinformation: T&S maintains a dedicated team focused on the Foundation's goal of protecting the projects against disinformation. The team supports communities in their fight against disinformation campaigns across the platform, with a focus on key strategic language communities under sustained state pressure. The team employs three workflows: (a) assisting communities in identifying new and emerging disinformation threats; (b) sharing intelligence on disinformation threats with community volunteer functionaries and, on request, investigating such threats; and (c) working with community volunteers on cooperation projects such as election integrity, including through a Disinformation Response Team (DRT).

Wikimedia Foundation T&S Product: T&S routinely partners on key organizational initiatives such as user privacy, security, and data management; provides technical program management for the Anti-Harassment Program; and advocates for volunteer functionaries' technology needs.

Comparison

Wikimedia traditionally falls under the community-reliant branch of the three standard approaches to trust and safety. This contrasts with the industrial approach taken by platforms such as Facebook, Twitter, and YouTube, which outsource trust and safety work to thousands of contractors or employees who assess content against a minutely defined checklist, and with the artisanal approach taken by companies such as Medium and Vimeo, where all such work is done by employees with little automation and more latitude for interpretation.

The community-reliant approach can be compared to federal systems of government such as that of the United States, with central constitutional rules (such as our Terms of Use and supplemental Board resolutions) that may be adapted and expanded by local governance as long as the core rule book is adequately implemented. Local governance in this case is a consensus model, in which volunteers discuss and collectively decide what makes sense for their projects within the framework provided by the Terms of Use and other Foundation policies. The bulk of trust and safety work, both moderating harmful content and addressing harmful behavior, is undertaken by volunteers, who often also have many localized policies determining what content and behaviors are unacceptable to them, above and beyond the core requirements.

In community-reliant trust and safety models, including Wikimedia's and Reddit's, a central body also supplements community efforts to enforce trust and safety rules across the platform. Wikimedia differs from Reddit in that every user can take trust and safety actions, including removing harmful content, cautioning users for bad behavior, or reporting them to community-selected functionaries for sanction; it is not possible to provide a ratio of paid staff to volunteers doing this work. Reddit's users, for example, may report to moderators who have special tools. While Wikimedia's functionaries have special tools as well, every user has the ability to directly modify content, including removing personal attacks, disinformation, and illegal material. Thousands of volunteers across the world self-appoint to patrol the recent changes feeds of our sites, watching for issues that need to be addressed. This is a unique aspect of the wiki structure.

There are challenges and advantages within the model that could be explored at length, but here we focus on several worth highlighting:

  1. Our model recognizes and respects the ability and right of volunteers to shape projects in ways that work best in their context.
  2. We are partners in a complex ecosystem. Staff must protect the ecosystem carefully, both by taking action where necessary and by leaving matters to the local level when volunteers can handle them in a way that adequately protects them, other users, and our platform.
  3. For our volunteers to exercise self-governance, they need robust product and engineering support. Wikimedia must balance its resources to enhance the platform for readers and content contributors while also ensuring that functionaries and T&S staff have effective, efficient tools, as mandated by the Board's statement from May 2020.
  4. For our volunteers to exercise self-governance, they need to understand the framework of Foundation policies that enables it; these policies are often not issued in their language and are frequently translated by volunteers. Wikimedia is not yet where it needs to be in terms of online training infrastructure and modules, and we have not yet found an effective system for measuring how policies are interpreted and applied across our hundreds of language projects. These are gaps we continue to work to address.
  5. To protect the ecosystem, all roles and responsibilities must be understood, and our volunteer communities need to know what the staff T&S team is doing and why. However, transparency must be weighed against privacy and security, both in terms of user safety and the security of the platform itself.