Wikimedia Foundation Annual Plan/2025-2026/Goals/Volunteer Support
This content is currently in development as part of the 2025–2026 Annual Plan drafting process. We are seeking feedback on the talk page until May 31st, 2025 to shape our priorities for the next fiscal year.
The Volunteer Support goal focuses on the Foundation's work to:
- Protect volunteers so they can safely contribute to the sum of all knowledge
- Support Wikimedia projects to increase their impact in the world, and
- Strengthen the movement, including volunteers, affiliates, movement partners, and others, through connections, resourcing, and effective governance for impact, sustainability, and resilience.
This goal will respond in particular to the following trends:
- Neutrality – We are observing rising risks connected to contributing to free knowledge online, ranging from increased politicization of controversial topics and fragmentation around concepts like “facts” and “neutrality” to long-standing threats to free expression.
- Users with extended rights – As a result, the successful community-governed model of Wikimedia projects puts certain volunteers – particularly those with “extended rights,” such as administrators and members of ArbCom – at the front line of sustaining the Wikimedia projects amidst increasing threats. These users are also essential to our projects becoming multigenerational, and they are a key focus of this goal.
- Contributors – There is an increasing need to ensure that the contribution experience is rewarding and appealing, supported by a sense of belonging and connection.
These trends make the work of supporting volunteers, contributors and affiliates more important than ever. We want volunteers to feel safe, connected, and empowered to contribute to the projects. We want readers around the world to be able to easily, freely, and safely access the projects. And we want allies from a diverse range of sectors, geographies, and ideologies to understand the Wikimedia projects so they can help defend and promote them.
Protect volunteers
1.1 Trust and safety: We will protect volunteers from growing threats
Wikimedia's crowdsourced self-governance is a key mechanism for trust and safety on our projects, with volunteers playing a central role in crafting and enforcing governance policies that reflect the principles of Wikimedia projects. Our role is to support the work of these community groups as they protect people, uphold the integrity of content, and foster a safer environment for sharing, receiving, and responsibly deploying information on Wikimedia projects. Examples of this work include:
- providing training and other resources for enforcement committees (Ombuds Commission, ArbComs, Stewards, U4C, and others),
- collaborating with communities on Wikimedia projects to improve adherence to the Universal Code of Conduct,
- publishing material related to community safety,
- building relationships with digital security helplines and other organizations focused on the protection of media and civil society actors,
- implementing a Digital Wellness Check program with an initial set of affiliate partners.
Specific goals include:
- Improve awareness among users with extended rights of the changing international landscape of legal concerns
- Improve enforcement and capacity-building for safety and community health within the Wikimedia movement
- Strengthen communication around risks to community safety
- Enhance support for users who report crises through the Foundation's emergency, humanrights, and ca reporting channels
- Increase protection for volunteers facing threats due to their good faith contributions to the projects.
1.2 Scaled abuse: We will protect communities and systems from scaled abuse by improving our infrastructure, tools, and processes
People and platforms are facing increased threats around the world when sharing reliable, unbiased information. This makes it essential to strengthen our abuse prevention and mitigation capabilities, defend our projects against widespread and targeted threats, and strengthen privacy and safety protections for our users.
Specific goals include:
- Ensure an easily discoverable, in-context incident reporting system is deployed on all our projects
- Improve the precision and efficacy of anti-abuse tooling, deploying improvements and new capabilities in anti-abuse workflows
- Reduce the number of large-scale attacks that require SRE human intervention by 50% compared to last year
- Deploy Temporary Accounts to all Wikimedia projects, so that the personally identifiable information of unregistered editors is visible to fewer than 0.1% of registered users
- Conduct and publish an AI risk and opportunity assessment for Trust & Safety, identifying potential threats and opportunities, along with recommended mitigation or adoption strategies for Wikimedia projects.
Support our projects
2.1 Legal Defense: We will protect the projects from legal threats
As the host of the Wikimedia projects, the Wikimedia Foundation responds to legal demands to defend volunteer contributions and, where necessary, engages in litigation to protect our principles and projects. Given our global context, this includes weighing multiple factors related to the applicability of laws for defense and compliance: which law is at issue, the location of users and contributors who could be impacted, and the potential effect of a case on other activities that are part of our global mission. While the Foundation remains a US-based non-profit organization, our strategy has always extended far beyond where the Foundation is legally incorporated and considers where Wikimedia operates to fulfill the mission and provide access to free knowledge.
Specific goals include:
- Defend the projects against lawsuits that try to remove content, as well as other forms of bad-faith litigation
- Ensure our Privacy Policy is up-to-date for future initiatives, reflects changing legal requirements, and is easy for users to understand
- Protect the Wikimedia movement from bad actors who try to use the Wikimedia logos and branding to impersonate the movement, solicit fraudulent business, or otherwise confuse the public
- Expand our impact litigation program to protect the Wikimedia projects, contributors, and model in light of our changing regulatory environment
- Ensure that global audiences can safely and easily access the projects
2.2 Compliance: We will ensure Wikimedia projects comply with appropriate platform frameworks
To maintain access to our projects, we need to ensure that Wikimedia projects comply with the legal frameworks governing online platforms and uphold the commitments in our Human Rights Policy. Examples include meeting annual requirements under the EU Digital Services Act and publishing a global transparency report twice a year.
Specific goals include:
- Respond to changing legal compliance requirements in a manner consistent with human rights standards, including complying with applicable laws governing online platforms based on relevant factors such as applicable law, jurisdictional risk, community safety, and human rights considerations
- Work with volunteers and affiliates to identify and mitigate global human rights risks linked to the Wikimedia projects, and implement the changes necessary, consistent with Wikimedia principles, to limit these risks
2.3 Knowledge integrity: We will support volunteers to strengthen knowledge integrity on our projects
Shared public consensus around what information is considered true and trusted is fragmenting, leading to struggles over neutral and verifiable information. This makes supporting volunteers to improve information integrity on our projects a priority. Examples include providing training and resources, building closer relationships with volunteers who engage in this work at community events and conversations, and updating the Anti-Disinformation Repository, a collection of resources from across the Wikimedia movement about how the Wikimedia projects work and about efforts to strengthen key principles like reliable sources and verifiability.
Specific goals include:
- Improving Wikipedia communities’ ability to counter disinformation on our projects
- Improving the Wikimedia Foundation’s ability to identify and counter threats to information integrity, including disinformation targeting Wikipedia and the Foundation, by improving understanding of the Wikimedia projects
- Ensuring that Wikipedia is widely understood by media, policymakers, policy influencers, and industry as a key trusted source of verified information for AI and valued as a digital public good
- Improving support for integrity and enforcement committees, with a focus on users with extended rights
- Supporting Wikipedia in establishing consistent and effective global standards related to the Neutral Point of View (NPOV) policy.
2.4 Addressing Knowledge Gaps: We will support communities to address knowledge gaps on our projects
Building trustworthy encyclopedic content across more than 300 languages is a massive undertaking. Our role is to support communities with insights, tools, research and organizing to make the best use of volunteer time and energy.
Specific goals include:
- Help communities prioritize and deliver the breadth and depth of topics needed for a usable Wikipedia language project, with reference to notability, relevance, predicted readership and connections between articles.
- Build on high-impact product features like suggested tasks, media search, and content translation; facilitate newer contribution options alongside onboarding new contributors and developing smaller language Wikipedias.
- Support the Wikimedia organizers who recruit, train, and support contributors to work on shared content goals through collaborative setups like WikiProjects and campaigns.
- Develop relationships with the most relevant publishers to remove barriers to source materials through The Wikipedia Library.
- Ensure our interventions have a positive impact on vital knowledge by measuring the increase in community-prioritized content and its quality, including reversion rates and the presence of citations and images.
2.5 Policy advocacy: We will advocate for internet policy that advances free knowledge
We will seek to make the legal and regulatory environment more favorable for safe and inclusive contributions. This includes educating policymakers and the stakeholders that influence them (media, academics, and civil society groups) about how Wikimedia's model works, and how the projects contribute positively to society.
Specific goals include:
- Build awareness and trust amongst the public, policy stakeholders and businesses to help advance our interests
- Ensure policy stakeholders understand how regulation can harm or protect Wikipedia, and provide them with guidance before they propose or implement future laws
- Support volunteers and affiliates to lead public policy advocacy in their regions or areas of expertise
Strengthen the movement
3.1 Connecting the movement: We will develop belonging and connections across regions
There is joy in belonging to a movement, and feeling connected is what makes that joy come alive. We will develop and facilitate these points of connection globally and across regions to create a more seamless and joyful experience of working together.
Specific goals include:
- Support regional connections through co-created spaces like Afrika Baraza, WikiCauserie, CEE Catch Up, South Asia Open Community Call, MENA Connect and more.
- Host movement spaces like Diff, invest more in our presence on Meta-Wiki, and make it easier to follow the Foundation's work in a single place through the Foundation Bulletin.
- Convene collaboration and co-creation at events like Wikimania, Wikimedia Hackathon, and regional and thematic conferences.
- Make every contribution count through WikiCelebrate, Wikimedian of the Year, and more.
- Celebrate various Wikipedias turning 25 – an important opportunity to highlight Wikipedia's role as the backbone of reliable information on the internet. Create a sustained global communications campaign that makes Wikipedia visible and vital to billions of internet users.
A particular focus throughout this work will be on supporting users with extended rights.
3.2 Grantmaking and Resource Allocation
Wikimedia affiliates, content campaigns, technical contributors, partners, and collaboration structures are an essential organizing layer of our movement.
Specific goals include:
- Continue to improve and streamline grant administration, impact reporting, and participatory decision-making processes; enable more organizations to develop multi-year strategies aligned with Foundation advocacy, safety, and technical approaches.
- Work across regions to support strategic partnerships to close content gaps, enhance organizational capacity, and unlock additional resourcing.
- Support and evolve the ecosystem of movement organizations, including supporting experimentation with hubs.
- Continue to experiment with decentralized decision making around resource distribution in line with the Movement Strategy principle of Subsidiarity & Self-Management, through initiatives such as the Global Resource Distribution Pilot.
3.3 Governance & Decision-making
Effective collective action across the movement is enabled by clear responsibilities and the ability to make timely decisions.
Specific goals include:
- Continue working with key stakeholders in the Movement Organizations and Global Resource Distribution pilot; answer key questions regarding funds distribution for movement organizations, with changes to be implemented in the following fiscal year (FY26-27).
- Support key governance committees, including the Foundation’s Board of Trustees, Affiliations Committee, and Elections Committee.
Links and Resources
- Protect volunteers
- Universal Code of Conduct Coordinating Committee
- Mental Health Resource Center for volunteers
- Increasing privacy protections through temporary accounts
- Support our projects
- Strengthen the movement