Annual Planning Community Workshop (March)

As part of planning for fiscal year 2025-2026, the Wikimedia Foundation has been holding many on-wiki discussions and community workshops. This community workshop focused on global trends affecting our movement, and is one of many meetings and conversations the Foundation is holding around annual planning.
During the workshop we looked at the trends we see happening both in the world and in the Wikimedia context. We discussed what we as a movement can do to address these trends, and how we can all support each other in this work. The workshop gathered online contributors, users with extended rights, movement organizers, and affiliate and Foundation staff and trustees. This format was an experiment, intended to bring more direct and varied experiences into the conversations. Meeting in person also allowed for more difficult and direct conversations.
The workshop was structured around two topics:
- Trends: Understand the big trends impacting our mission and work, both in the world and on our projects. Share research, data, and learning about the changing landscape around us and about what’s happening on Wikipedia and our other projects, in areas like readers and consumers, contributors, AI, regulatory threats, and more.
- Roles & support: Given these trends, what does the world – and our movement – need from us now? What is the work to be done? What should the Foundation and others in the movement focus on, and where do people need support? How can we feed these insights into next steps, such as the Foundation’s annual plan?
Discussion summary
The discussion and resulting takeaways were structured around key trends, which we have summarized as the five trends below:
Trend 1: Trust in information online is declining and it is increasingly difficult to find shared consensus around what sources can be trusted. The trust people have in traditional institutions of knowledge continues to decline, and instead readers turn in growing numbers to online personalities, who are having a bigger impact on what people believe and trust.
Among people who know about Wikipedia globally, trust is very high relative to other tech platforms. However, we are seeing early indicators of a possible generational shift away from Wikipedia and toward other sources for information (social video, AI).
- What is the Wikimedia movement doing in relation to this trend?
- Expanding trustworthy, encyclopedic content through different initiatives, including GlamWiki, Wikimedians in Residence, supporting academic work
- Improving reader retention, through working on Wikipedia related apps and general reader experience
- Engaging young people through social media or influencer engagement and programs for education entities
- Reaching new audiences through partnerships, affiliate presence at public events, solutions like wiki-in-a-box or QR-pedia
- Continuing to build on our strong reputation through events, working with the media, brand collaborations, outreach campaigns (like the 25th Birthday or WikiMinute), major donor events, fundraisers, and affiliate outreach work
Discussions about what is needed to increase trust and grow readership and awareness of the Wikimedia projects included: outreach to new partners who share our mission of reliable information; developing a better picture of how Wikimedia content is reused around the internet; expanding our definition of who counts as a reader; and affiliate-created trainings that support volunteers in acting as ambassadors for the Wikimedia projects and raising awareness.
Trend 2: People participate eagerly in online spaces that provide rewarding connections.
We see a decline in the number of new people registering as editors on the Wikimedia projects, which may be related to the growing popularity and appeal of these other, more rewarding online spaces.
Active editors continue to skew more male (80%). The average age of contributors decreased post-pandemic, with a larger share of young people participating. Younger editors are more likely to be motivated by potential self-improvement and career benefits of contribution than older editors. Total active editors decreased, driven primarily by a decrease in new users registering accounts, as well as more gradual declines in returning active editors.
- Discussion:
- High standards for editing and unwelcoming rules limit participation of newcomers. Mobile editing barriers further limit casual contribution.
- Unlike other open source communities, Wikipedia lacks career and social rewards for participation. Social connection is a reward we could focus on.
- New editors should be able to start with small tasks, and then be helped to transition to tasks that are more impactful and rewarding.
- What is the Wikimedia movement doing in relation to this trend?
- Improving engagement and resilience of contributing communities (through tools like newcomer homepage, EditCheck, and through community building)
- Creating stronger human connections through events, networks, projects targeted at different demographic groups
- Outreach and partnerships to reach new audiences, through initiatives like WikiWomen, EduWiki, GlamWiki or through off wiki community spaces (like Discord)
- Creating sharing and learning spaces, like hubs, mentorship programs, and targeted trainings.
- Enhancing safety in Wikimedia projects through organizations dedicated to representation, targeted actions, and user protection resources - including solutions like IP block exemptions.
- Welcoming newcomers through educational initiatives and technical solutions that support and motivate new editors, such as the Newcomer Homepage and gamification strategies.
- Raising visibility and appeal of contribution (for example through making editing a way to gain new skills and recognition)
- Capacity building initiatives designed to attract and retain new users — for example, through programs like Train the Trainer or other skill development trainings
- Recent PTAC recommendation to focus on mobile contributions
- What do we need to respond to this trend?
Discussions about what is needed to attract and encourage contributors to the Wikimedia projects centered on better identification and documentation of the issues contributors face, ways for editors to connect and have trusted allies to help with their work, and more spaces for peer learning.
Trend 3: Digital information that is created and verified by humans is the most valuable asset in the AI tech platform wars. Last year we predicted that AI would be weaponized in creating and spreading online disinformation. This year, we are seeing that low-quality AI content is being churned out not just to spread false information, but as a get-rich-quick scheme, and is overwhelming the internet. High-quality information that is reliably human-produced has become a dwindling and precious commodity that technology platforms are racing to scrape from the web and distribute through new search experiences (both AI and traditional search) on their platforms.
- What is the Wikimedia movement doing in relation to this trend?
- Initiatives to harness AI for good (for example developing project policies around AI, trainings and presentations on using AI to support the work of contributors or affiliates, using translation tools)
- Creating AI-enabled interfaces to knowledge and creation (for example, editor tools like EditCheck or Image Suggestions, and AI-powered summaries of existing articles)
- Addressing the tension between open access to knowledge and controlling access to knowledge (example: Wikimedia Enterprise)
- Ensuring that stakeholders are involved in AI decisions by staying active in AI policy discussions
Discussions about what is needed to leverage AI for good on the Wikimedia projects and harness the value of human-generated knowledge included developing resources to increase understanding of how AI is being used to support online contributors, and explaining to the public how Wikipedia content is being used (to increase mission-aligned re-use).
Trend 4: Disagreements about neutral and verifiable information threaten access to knowledge projects and their contributors. Public consensus around the meaning of concepts like “facts” and “neutrality” is increasingly fragmented and politicized, and the public does not have an in-depth understanding of Wikimedia’s neutral point of view (NPOV) policy. Special interest groups, influencers, and some governments are undermining the credibility of online sources that they disagree with. Others also try to silence sources of information through vexatious litigation.
- Discussion:
- Not all Wikipedia projects have a similarly robust set of neutrality-supporting policies
- The greatest protection of NPOV on-wiki is a large editing community and many readers
- Some things we as Wikimedians may not be adequately addressing in relation to neutrality:
- The public's decreasing interest/care for neutrality
- Decreasing societal agreement on “facts”
- The challenges faced by journalism, on which Wikipedia relies for source material
- What is the Wikimedia movement doing in relation to this trend?
- Creating clear policies that uphold neutrality (e.g., NPOV, paid editing)
- Supporting online contributors, including more resources for users with extended rights and Arbitration Committees, addressing harmful behavior, including paid editing, and, when necessary, legal assistance for volunteers.
- Supporting content quality through citations, policies around reliable sources, and campaigns like #1Lib1ref
- Advocating for the Wikimedia projects in the public sphere, to increase the public's understanding of how the projects work and to promote our unique community-led models
What can we do to support our projects in addressing these trends? Better Wikimedia policies that support the neutrality and quality of our content. During the discussion, participants suggested global standards around neutral point of view/neutrality, as well as better cross-wiki learning about policies, such as spaces for policy-focused bilateral conversations between wikis and a volunteer-led policy exchange that allows the projects to learn from each other.
Trend 5: Wikipedia's long-term sustainability relies on a steady influx of new editors who contribute quality content and remain engaged. However, recent trends indicate a decline in editors with extended rights, posing challenges to the growth and health of the editing community.
According to the Research Team’s report “Administrator recruitment, retention and attrition”, the number of monthly active administrators on large Wikipedias has been declining since 2018, with some exceptions. The current patterns of admin inflow (recruitment) are insufficient at replacing admin outflow (attrition) on many large Wikipedias.
- What is the Wikimedia movement doing in relation to this trend?
- Building a sense of belonging through connections and celebrations, like Wikimania, Wikimedian of the Year, Wikicelebrate, swag giveaways, barnstars, certificates, and peer-to-peer connections.
- Building and improving technical solutions that support online contributors, like the Newcomer Homepage, Community Wishlist, Admin Dashboard.
- Trainings supporting good communication and conflict resolution.
- Knowledge exchange among admins and across wikis through a dedicated day at Wikimania, including additional scholarships for users with extended rights.
Discussions around what is needed to increase the number of editors with extended rights included building more understanding of these key roles across wikis through increased education, peer learning across wikis, and building a sense of belonging and connection.
If you feel like something is missing from this summary, please share on the discussion page!
If you want to participate in the ongoing discussion about the global trends and the Foundation's Annual Plan, join one of the upcoming meetings or on-wiki discussions, or discuss on Meta.