Organizational effectiveness/Tool/Results/2014/December
Organizational effectiveness resources
- See the organizational effectiveness landing page for more information about OE work.
- Visit the user guide to learn about the tool.
- Please visit the learning center for resources related to organizational effectiveness.
- You can find a copy of the questionnaire here: OE questionnaire.
Analysis by the TCC Group
I. Introduction and Methodology
II. Aggregate Analysis of Questionnaire Findings: Headline Findings and Interesting Differences
III. Capacity Building Recommendations and Next Steps
Introduction and methodology
What do we mean by organizational effectiveness?
- When we say organizational effectiveness, we are looking at how well organizations are achieving impact for the movement. This means that organizational effectiveness includes all the things that make an organization good at what it does, from strong leadership and systems, to how an organization chooses and does programs that lead to results. In this work, we are looking specifically at groups and organizations rather than individual volunteers, so we can understand how volunteers (and staff) work together when they are part of a group or organization.
Why launch this conversation?
- This work can launch a broader conversation in our movement around how organizations and groups can achieve the best results. This is a channel for organizations and groups to work together and learn from one another, regardless of organization type, organization focus, budget size, or funding stream. We can gain a better understanding of what organizations and groups bring to the movement and how. We can also understand what makes some organizations and groups particularly effective, and learn from those organizations and groups. Finally, we can gain a better understanding about where we need to work together to become more effective.
Project Phases
- Phase I: Benchmarking research to put our work in context.
- Phase II: An organizational effectiveness tool for groups and organizations in the Wikimedia movement, to help identify strengths and gaps in capacity and to provide a structured process for building capacity.
- Phase III: Launch a discussion about the future of organizational effectiveness in the movement.
Phase II: OE tool
Ways of working
- Impact survey
- How do organizations understand impact?
- What strategies lead organizations to impact?
- Case studies
- How do specific organizations achieve impact in their contexts, with the resources they have?
- Tool
- Assessing capacity across different strategies
- Learning center
- Resources for working with specific strategies
- Report
- What needs are identified across organizations?
- How can we work together to build capacity?
- Discussions
- What are the next steps for the movement?
An inclusive, impact-focused process
- Understand --> Assess --> Roadmap
Understand
- How do Wikimedia movement actors define “effectiveness” and “impact”?
- Elements:
- Benchmarking
- Interviews
- Surveys
- Wikimania consultation
Assess
- What capacities are tied to impact? Where are there strengths and gaps in capacities?
- How can we work together on organizational effectiveness?
- Elements:
- Conversations with stakeholders
- Case studies
- OE questionnaire
- OE learning center
Roadmap
- Elements:
- Analysis
- Discussions
- Define next steps
Case studies on impact
We interviewed three organizations (a chapter, user group, and thematic organization) to learn more about how they are achieving results.
- Wikimedia Österreich
- Wikimedians of Nepal User Group
- Amical Wikimedia
Case studies: what we learned
- Establish informal ways to collect information about how your organization is doing that are accessible, and encourage group contributions.
- Openly discuss success and failure with stakeholders; “lower the stakes” by launching programs as small pilots, and document and discuss what works and what doesn’t.
- Engage volunteers in different ways, according to their strengths and interests.
- Develop criteria for saying “no” to some opportunities that are not likely to lead to impact (for example, if they are unlikely to be sustained by volunteers in the long term).
- Establish partnerships with institutions that already have an existing infrastructure to initiate, support, host, or expand Wikimedia projects.
- Be proactive and direct when you need help, and reach out to other open knowledge organizations for tips on how to deal with local challenges or context.
- Create a “blueprint” for successful local programs specific to your organization’s context, so they can be repeated or sustained in the future.
OE tool: structure and relationship of elements
OE Tool
- Questionnaire. The questionnaire includes questions about strategies; organizations choose only the optional strategies relevant to them.
- User Guide. The user guide explains the project and how to use the tool (including the questionnaire, results report, and learning center).
- Learning Center. The learning center includes resources for strategies from the questionnaire, and a space for organizations to learn and share around OE topics.
Aggregate analysis of questionnaire findings
Assumptions
- This tool is designed first and foremost as a self-assessment tool rather than a survey. The tool is intended to be used by specific organizations who want to build a capacity-building plan. Learning from these aggregate results is an added benefit, but the analysis of the aggregate results is not the primary use case for this tool or the questionnaire.
- The first iteration of this tool is an experiment. This is one of the reasons why we brought in an external firm with expertise in creating self-assessments to help us with this work. Improving this tool is an ongoing process, and your feedback is needed as part of this discussion.
Aggregate findings
- Organizations and groups were invited to participate without sharing their specific results with the Wikimedia Foundation or others in the movement. This was done to give organizations space to conduct an accurate self-assessment.
- This means that we can only share aggregate findings with the movement. This section provides an overview of who participated, without identifying specific organizations.
Who took the questionnaire?
- 103 individuals responded
- 36 different organizations participated
- Organizations ranged in age from less than 1 to more than 7 years
- 39% had budgets of less than $50K, 25% had budgets between $50K and $200K, and 39% had budgets of more than $200K
Categories used in analysis
- Four group characteristics are analyzed for differences in scores: age, budget, geography, organization type.
- Most individual respondents were from chapters that have large budgets, are based in the global north, and are more than 7 years old.
By age
- 23 were created from 2008-2010
- 5 were created from 2011-2013
- 8 were created in 2014
By budget size
- 14 have small budgets
- 9 have medium budgets
- 35 have large budgets
By geography
- 22 are from the global north
- 12 are from the global South
- 2 do not have a geographic focus
By type
- 26 are chapters
- 6 are user groups
- 1 is a thematic organization
- 2 are not recognized
Questions scale
Most questions use a Likert scale, which asks individuals to read a statement, then select the response that best reflects their agreement with that statement. The responses are numbered 1-5, from “Strongly Disagree” to “Strongly Agree.”
- TCC Group uses the following ranges to present the findings in this presentation (a short illustrative sketch of this mapping follows the list):
- Disagree: 1.00 – 2.50
- Neither Agree nor Disagree: 2.51 - 3.50
- Agree: 3.51 – 5.00
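To make the banding concrete, here is a minimal illustrative sketch (not part of the TCC tool itself) of how a mean Likert score maps onto these three ranges; the function name and the example values are ours, taken from the overall scores reported later in this document.

```python
# Minimal illustrative sketch (not from the TCC tool): mapping a mean
# Likert score (1-5) onto the three agreement bands used in this report.

def agreement_band(mean_score: float) -> str:
    """Return the agreement band for a 1.00-5.00 mean Likert score."""
    if not 1.0 <= mean_score <= 5.0:
        raise ValueError("mean Likert scores must fall between 1.00 and 5.00")
    if mean_score <= 2.50:
        return "Disagree"
    if mean_score <= 3.50:
        return "Neither Agree nor Disagree"
    return "Agree"

# Example using two overall core-strategy scores reported below:
print(agreement_band(3.8))  # Resources  -> "Agree"
print(agreement_band(3.2))  # Volunteers -> "Neither Agree nor Disagree"
```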
What these scores mean
- A high average score for a given strategy may indicate that organizations rate this area as important and feel they have significant capacity in this area. A low average score for a given strategy may indicate that organizations do not rate this area as important, or that they have significant gaps in capacity.
- Scores in specific areas will give a more specific idea about how organizations responded
- As is typical in self-assessments, respondents tended to rate their organizations highly. This means that most scores will be on the higher end of the Likert scale (above a 3.5).
Core strategies used by all Wikimedia organizations
Core strategies used by all organizations
- Volunteers
- Diversity
- Online Contributors
- Learning
- Resources
High and low scores across core strategies
- Overall scores may indicate how important a strategy is to organizations, as well as indicating how organizations assess their own capacities.
- The highest scoring core strategy across all types of organizations is resource mobilization (includes fundraising).
- The lowest scoring core strategy across all types of organizations is working with volunteers.
Core strategies for all organizations: overall scores (N=36; 1 = Strongly Disagree, 5 = Strongly Agree)
- Resources 3.8
- Diversity 3.6
- Online contributors 3.6
- Learning 3.4
- Volunteers 3.2
Strengths and challenges in core strategies – by budget size: Small (0-50,000 USD), Medium (50,000-200,000 USD), Large (>200,000 USD)
Strengths
- Partnerships
- Finances
- Planning
- Collecting data and feedback
- Volunteer input
- Partnerships
- Finances
- Planning
- Prioritizing diversity
Challenges
- In-kind resources
- Collecting data and feedback
- Acknowledging volunteers
- Volunteer input
- Effective trainings
- Resources for community-building
- Partnerships & resources for partnerships
- Planning
- Acknowledging volunteers
- Resources for community-building
- Resources for partnerships
- Collecting data and feedback
- Effective trainings
- Resources for community-building
- Resources for partnerships
Learning behaviors – strengths and challenges
- The highest scores are for understanding open knowledge context and refining strategy; lowest scores are for collecting data systematically.
- Strengths
- We understand open knowledge changes in our region or focus area: 3.96
- We understand global changes in open knowledge: 3.75
- We apply what we learn about our activities to have more impact in the future: 3.72
- Our strategies are informed by new information about open knowledge: 3.72
- Challenges
- We collect data from every activity we do to understand its impact: 2.97
- We collect information about how organizational processes are working and can be improved: 2.85
- We have people whose role it is to understand how we can improve organizational functioning: 2.82
Learning behaviors – by budget size
- Organizations with larger budgets say they are able to collect data systematically. This may be because they have more resources to do this, or their boards or funders ask them to.
- Our organization systematically collects and documents data from every activity to understand its impact. 64% of larger budget chapters agree; 44% of medium budget chapters agree; 7% of smaller budget chapters agree
- Our organization collects feedback from everyone involved to understand what is and isn’t working. 73% of larger budget chapters agree; 44% of medium budget chapters agree; 13% of smaller budget chapters agree
Learning Behaviors – by Organization Type
- No User Groups report collecting data from all of their stakeholders; less than half of chapters report doing so.
- Our organization collects feedback from everyone involved to understand what is and is not working about our organization from their perspective. 44% of chapters agree and no user groups agree.
- Our organization has effective ways to get feedback from volunteers and other key stakeholders. 36% of chapters agree and no user groups agree.
Working with Volunteers – Strengths and Challenges
- Organizations understand volunteer skills and interests, but don’t provide much formal structure to support them.
- Strengths
- Volunteers always work in the areas they are most interested in: 3.839
- Volunteers have opportunities to shift roles within our organization as their skills and interests develop: 3.813
- Our organization considers how volunteers would be affected when it makes decisions: 3.708
- People in our organization (leadership, volunteers, staff) understand what the strengths, capabilities, and skills of volunteers are: 3.598
- Challenges
- We have a written (printed or digital) handbook or orientation guide for new volunteers: 3.829
- Our organization always has as many volunteers as we need to achieve impact: 2.855
- When a volunteer leaves, we have an exit interview or other discussion to understand why: 1.998
Resource Mobilization – Strengths and Challenges
- Organizations believe getting more resources is important, and are actively looking for resources.
- Raising resources is critical: 3.98
- Proactively seeking in-kind resources: 3.98
- Diversifying funding is important: 3.89
- Using in-kind resources: 3.68
Resource Mobilization – by organization type
- Chapters provide grants and equipment, while user groups believe meeting spaces for volunteers are important.
- We have the resources we need to effectively do programs that support online contributors. 0% of user groups agree; 24% of chapters agree
- People in our org believe providing places for volunteers leads to online impact. 100% of user groups agree; 76% of chapters agree
- People in our org believe providing equipment leads to impact. 67% of user groups agree; 91% of chapters agree
- We provide clear information in our local language about how to borrow equipment. 0% of user groups agree; 52% of chapters agree
- We provide equipment to volunteers and contributors. 0% of user groups agree; 62% of chapters agree
- We provide grants to online contributors and volunteers. 33% of user groups agree; 81% of chapters agree
Work with Online Contributors – by Organization Age
- Not surprisingly, older organizations that are often more formal have more experience giving grants to online contributors.
- We provide information in our local language on how to apply for grants. 80% of older organizations agree; 68% of middle organizations agree; 22% of newer organizations agree
- We provide grants to volunteers or contributors. 85% of older organizations agree; 68% of middle organizations agree; 20% of newer organizations agree
Working with Volunteers – by Organization Age
- Older organizations are more likely to acknowledge the contributions of volunteers; this may result from working with a large or core contingent of volunteers for a longer period than newer organizations.
- We acknowledge the contributions of volunteers. 61% of older organizations agree; 25% of middle organizations agree; 14% of newer organizations agree
Work with Online Contributors – by Organization Type
- Chapters are slightly more likely than other organization types to feel they have enough resources for community-building, but still only a minority of chapters agree with this statement.
- We have enough resources to achieve impact through community building. No user groups agree; 16% of chapters agree.
Optional strategies
Altogether, we asked about nine optional strategies. Organizations could skip any optional strategy they were not using. The list below gives an indication of which optional strategies organizations indicated they were using.
Optional strategies by % using
- Edit-a-thons, workshops, trainings 88%
- Partnerships, including GLAM and education 86%
- Governance 84%
- Planning 84%
- Online contests 81%
- Finances 80%
- Supporting online contributors by providing resources 76%
- Advocacy work 66%
- Developing software or tools 35%
Average scores across optional strategies: overall scores by strategy (N=15-29)
- Finances 3.7
- Edit-a-thons, workshops, trainings 3.6
- Online contests 3.6
- Supporting contributors with resources 3.6
- Advocacy work 3.6
- Governance 3.5
- Planning 3.4
- Partnerships 3.2
- Software/tech 3.2
Many optional strategies had high scores. These included finances, edit-a-thons, online contests, supporting contributors with resources, advocacy work, and governance. This means organizations rated them as important and/or they have capacity in these areas.
Planning, partnerships, and software/technology had significantly lower scores. This means organizations did not rate them as important and/or may have gaps in capacity in these areas.
Support contributors with resources – strengths and challenges
- Organizations believed providing equipment and places to meet leads to online impact, but did not have enough capacity to effectively support contributors with resources.
- Strengths
- People in our organization (volunteers, leadership, staff) believe providing equipment for volunteers and contributors leads to online impact: 4.02
- Our organization provides grants, reimbursements, or travel scholarships to online contributors and volunteers: 3.94
- People in our organization (volunteers, leadership, staff) believe providing places for volunteers to gather leads to online impact: 3.87
- Challenges
- We have the resources we need (for example, time, space, funding, supplies) to effectively do programs that support online contributors: 2.88
Partnerships – by organization age
- Despite a similar feeling across organization ages that they do not have enough resources, older organizations know how to establish partnerships with GLAMs.
- We have enough resources for partnerships. 20% of older organizations agree; 0% of middle organizations agree; 0% of newer organizations agree
- We know the steps to establish partnerships with GLAMs. 90% of older organizations agree; 40% of middle organizations agree; 33% of newer organizations agree
Partnerships – by Organization Type
- Chapters seem to know how to form partnerships more than User Groups but are also more likely to report lack of resources to achieve impact through partnerships. Perhaps chapters more actively pursue this strategy than others, but also have higher expectations for impact and for the resources it takes to do partnerships well.
- We have enough resources for partnerships. 20% of user groups agree; 8% of chapters agree
- We know the steps to establish partnerships with GLAMs. 20% of user groups agree; 78% of chapters agree
Advocacy Work – Strengths and Challenges
- Organizations agree outreach, awareness, and policy work can lead to impact, and are working toward keeping stakeholders more informed.
- Strengths
- People involved with our organization would agree that changing laws and policies is an important way that our organization can achieve impact: 4.17
- We understand open knowledge changes in our region or focus area: 3.96
- Our organization keeps key stakeholders (for example, volunteers, contributors, other Wikimedia Organizations, policymakers) informed about policies in our country that could affect the Wikimedia movement: 3.73
Advocacy – by organization age
- Newer organizations are very active in advocacy work (more active than older organizations). This category also includes awareness work and outreach (e.g. keeping the local public informed).
- We are active in global advocacy efforts. 47% of older organizations agree; 25% of middle organizations agree; 100% of newer organizations agree.
Advocacy – by organization type
- Chapters are much more likely than user groups to report access to local policymakers. This may be because their comparatively formal status enables them to lobby, or it may have to do with chapters more actively pursuing advocacy as a priority strategy.
- We have connections to and communicate directly with local policymakers: 0% of user groups agree; 63% of chapters agree
Edit-a-thons, Workshops, Trainings – by Geography
- Organizations in the global north are significantly more likely to hold event series, have outreach events, and to see improvement in volunteer skill and motivation after trainings.
- There is noticeable improvement in volunteer skill or motivation after workshops and trainings. 27% of global south organizations agree; 79% of global north organizations agree
- We have in-person events to introduce Wikimedia to new people. 36% of global south organizations agree; 85% of global north organizations agree
- We hold event series. 46% of global south organizations agree; 81% of global north organizations agree
Finances – Strengths and Challenges
- Organizations score highly in managing finances, but are challenged to raise resources.
- Strengths
- We are aware of and comply with local laws and best practices for fund management: 4.41
- At least two people in our organization have access to our organization’s funds: 4.36
- Challenges
- Our organization is able to raise the resources we need to be effective: 3.09
Finances – by Organization Type
- User groups see the amount of money they spend as in proportion to the impact they achieve, while chapters are less likely to agree.
- The amount of money we spend corresponds to the impact we achieve: 100% of user groups agree; 35% of chapters agree
Finances – by Geography
- Global north organizations have resources and comply with best practices more than global south organizations.
- We have resources. 14% of global south organizations agree; 40% of global north organizations agree
- We understand how resources lead to impact. 57% of global south organizations agree; 80% of global north organizations agree
- Compliance with laws and best practices. 71% of global south organizations agree; 95% of global north organizations agree
Governance – by Budget Size
- Larger organizations are more likely to have boards delegate tasks to subcommittees.
- Our board delegates some tasks to committees. 82% of larger organizations agree; 57% of middle organizations agree; 14% of smaller organizations agree
Governance – by Organization Age
- Newer organizations have clarity on the board’s role, but need more clarity on who is responsible for decisions.
- Our board understands its role governing our organization. 45% of older organizations agree; 40% of middle organizations agree; 100% of newer organizations agree
- Key stakeholders are engaged with decision-making processes when needed. 70% of older organizations agree; 40% of middle organizations agree; 0% of newer organizations agree
- Everyone understands who's responsible for making decisions. 55% of older organizations agree; 40% of middle organizations agree; 0% of newer organizations agree
Qualitative analysis
A few organizations were unclear about the phrase “collecting data”.
- For example, when asked what data or information they collect on resource mobilization, some organizations said:
- “We get financial support as we can, in no organized way.”
- “Each time we need a resource, we ask our members for help.”
- This indicates that for some groups, the idea of collecting data is unclear, or the phrase or language may be unfamiliar to them.
Organizations think about data collection in three ways.
- They collect data required by funders or internal policies, but it’s meaningless to their learning. “We…are spending a frustrating amount of time as it is tracking every trivial receipt and name. It is nice to be able to leave our attendees alone…and not obsess over [demographic data collection].”
- They collect a lot of rigorous data and use it internally for learning. “[We use] case study reports to include lessons learnt, FDC reports to include reflections on what worked/what has not, reports from site visits at other chapters…programme review projects…including recommendations for the future.”
- They collect virtually no data. “[We collect] almost nothing, or perhaps, whatever anyone cares to remember. Nothing is written or recorded.”
Recommendations for movement-wide capacity building
Capacity building roadmap
- TCC Group Recommendations
- Best processes for building capacity, based on work with other organizations and in Wikimedia.
- Specific capacities that could be strengthened, based on the results of the questionnaire.
- A framework for the Wikimedia Movement
- To dive deeper and create its own prioritized capacity building plans.
- To plan for the OE Tool’s ongoing use and development.
Potential strategies to build capacity
- Support organizations in interpreting OE Tool results to create their own master capacity-building plans.
- Offer one-on-one interpretation sessions to help organizations look at strengths and challenges, prioritize areas for growth, and decide how to implement recommendations. This would require organizations sharing their results.
- Host working sessions for groups with similar capacity challenges or interests. Allow groups to lead the sessions, but offer input about content or resources available; make connections.
- Conduct in-depth research with groups. Be a part of a pilot group on any of these strategies to better understand how to execute strategies more effectively.
- Facilitate connections among groups. “Matchmake” among groups that may be able to learn from one another but that haven’t yet connected. Proactively reach out to other organizations to build resources that are relevant to organizations. Build understanding around incentives for improving OE.
- Understand the benefit of building capacity in complex or conceptually difficult areas (like learning behavior). Provide easily tailored documents and templates.
- Consider materials like a volunteer handbook for core strategies such as work with volunteers and learning. These are already being created for specific programs, but more resources may be needed in non-program areas.
Specific recommendations for the movement: partnerships
- Better understand and document the informal and formal steps to build partnerships with GLAMs and education institutions.
- Understand why organizations in the global south have more difficulty with partnership models used elsewhere in the movement.
- Create criteria to identify the highest impact partnerships for organizations, so they can apply this in their own local contexts.
- Understand how specific partnerships lead to impact for organizations and the movement overall.
- Facilitate partnerships at a regional level and/or advocate for policy changes that could result in new, creative and high-impact partnerships for all organizations.
Specific recommendations for the movement: working with volunteers
- Only 25% of organizations agree that volunteers know their roles and what they should be doing in a given time frame. Define what volunteers are to do, and when.
- Make volunteer trainings more effective. Identify what works and find ways to make it easier for volunteers to tell you if trainings are working (e.g. create or use existing templates to evaluate trainings).
- 6% of organizations agree they actively recruit volunteers.
- Explore different ways to recruit volunteers on and offline, and in different contexts. Make volunteer recruitment an explicit goal, and integrate this into existing activities.
- Understand why most organizations are not acknowledging volunteer contributions, and learn from organizations that are experimenting with volunteer recognition.
Future of the organizational effectiveness tool
- Is a self-assessment for capacity building useful to organizations? Does it fill a need?
- Is the tool in its current version useful? Why or why not?
- How are organizations currently using the tool? How might organizations use a tool like this in the future?
- Should we invest more resources in improving and maintaining the tool (e.g. moving the tool to a permanent platform, continuing to add strategies, promoting use of the tool on a regular basis)?
Improving the tool
- How can the tool be improved?
- Should we keep results anonymized?
- How can we make it easier to interpret and use the results?
- Is the tool equally useful to all types of Wikimedia organizations and groups?
Questions about the future of OE
- Should we continue to spend time discussing and exploring OE as a movement? What are the potential long term benefits of this work?
- Who should be responsible for moving discussion about OE forward? What kind of support, if any, is needed to keep this work going?
- Based on the research we have done so far, are there particular strategies or areas of OE that we should explore together as a movement?
Feedback from the first round of the tool
Applicability to All Organizations
- Translate it!
- Make it more applicable to informal organizations, User Groups, member organizations, and small-budget organizations.
Questions
- There should be fewer, more dissimilar questions.
- Question types should also be varied – not all Likert scale.
- Questions about data should be clearer.
- Questions should be framed from an individual perspective – not asking one person to respond on behalf of the entire organization.
Structure
- Perhaps the Questionnaire should be split into two separate surveys (e.g. core, optional) to cut down on fatigue.
- There should be a “progress bar” to show how many questions are left.
- Consider having one person take the Questionnaire, gathering input from volunteers as necessary; this would be less burdensome on volunteers.
Appendix A: Supplementary analysis on core strategies by category
Background
Analysis of data for core strategies
- This analysis is based on the first round of results from the organizational effectiveness tool questionnaire, taken by Wikimedia organizations in November 2014 - January 2015, and administered by the TCC Group, a consulting firm engaged to develop this tool.
- The scores for core strategies here have been cleaned (e.g. negative scores have been reversed; a short illustrative sketch of this step follows this list) and sufficiently anonymized in order to release the aggregate results to the Wikimedia movement. We are presenting some data here based on means across different questions, categories, strategies, and types of organizations in order to start a discussion about what the results may mean for organizations in the Wikimedia movement.
- We have grouped questions into categories to make the scores more meaningful, and to make it easier to see useful distinctions, but data for averages by question is also provided here, for those who need a more granular look at what comprises each category. TCC has already pulled out some high level observations in its initial report, and this detailed report is designed to supplement those high level observations.
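As a small, hedged illustration of the cleaning step mentioned above: “reversing negative scores” on a 1-5 Likert scale is assumed here to mean the standard 6 − x flip for negatively worded items before averaging items into a category mean. The item labels and the set of reversed items below are hypothetical placeholders, not the actual questionnaire items.

```python
# Hedged sketch of the assumed cleaning step: reverse-score negatively worded
# Likert items (6 - x on a 1-5 scale) so that higher always means stronger
# capacity, then average the cleaned items into a category mean.
# Item labels are hypothetical placeholders, not actual questionnaire items.

REVERSED_ITEMS = {"item_c"}  # hypothetical negatively worded item

def clean_score(item: str, raw: int) -> float:
    """Reverse-score items in REVERSED_ITEMS; pass others through unchanged."""
    return 6.0 - raw if item in REVERSED_ITEMS else float(raw)

def category_mean(responses: dict) -> float:
    """Average cleaned item scores into a single category score."""
    cleaned = [clean_score(item, raw) for item, raw in responses.items()]
    return sum(cleaned) / len(cleaned)

# One hypothetical respondent's answers for a three-item category:
print(round(category_mean({"item_a": 4, "item_b": 3, "item_c": 2}), 2))  # 3.67
```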
Core strategies
- We have included some strategies that are probably used by all Wikimedia organizations. These include:
- (A) Volunteer engagement
- (B) Diversity
- (C) Online contributors
- (D) Learning
- (E) Resources
Key questions
These five strategies are core to what Wikimedia organizations do.
- What do high scores tell us about where Wikimedia organizations see their strengths?
- What do high scores tell us about where Wikimedia organizations can grow?
- Are there any interesting differences among chapters and user groups? Global north and global south organizations? Organizations of different ages or budget sizes?
Comparison of strategies by mean scores across all types
- Resources: 3.77
- Diversity: 3.58
- Online contributors: 3.58
- Learning: 3.38
- Volunteer engagement: 3.11
Highs and lows: core strategies
- The lowest overall score by question is 2.00: “When a volunteer leaves, we have an exit interview or other discussion to understand why.”
- The highest overall score by question is 4.20: “Our organization has in-person events that support and sustain communities.”
- The lowest overall score by category is Volunteer engagement (Recruitment): 2.48.
- The highest overall score by category is Online contributors (Values): 4.03.
Observations on overall scores
- Organizations scored themselves highly for mobilizing resources, but scored lower overall in the area of online engagement.
- Overall, organizations highly value supporting online contributors.
- Global south organizations generally scored themselves lower than global north organizations, except for the volunteer engagement strategy.
- User groups generally scored themselves lower than chapters, but had similar scores in the area of resources.
- Across budget sizes, organizations had similar scores. Smaller organizations gave themselves lower scores in most categories other than resources.
- Newer organizations rated themselves higher than medium-aged or older organizations for almost all strategies, except volunteer engagement. The oldest organizations rated themselves the highest for this strategy.
Volunteer engagement
- This strategy is about effectively engaging volunteers that work with your organization. Volunteers who work to contribute to the Wikimedia projects are considered separately, as part of another strategy.
Volunteers are people who do work with your organization without pay because they care about your organization's mission. Many organizations have volunteers who serve on their boards, who organize activities, or who do other specific jobs. Volunteers may have a relationship with your organization over time, from the time they are recruited or discover your organization, through years of training and service, to the time they end their service. Engagement in this case refers to the way your organization interacts with these volunteers at every stage of the relationship. Volunteers are not only important to your organization because of the work they do, but also because they are people who care deeply enough about your organization's work to give their time.
Categories for volunteer engagement
- Roles (A-C): volunteers and organization are clear on roles that make sense.
- Training (D-J): organization trains volunteers and provides support.
- Recognition (K,L): organization welcomes and acknowledges volunteers.
- Flexibility (M, N): volunteer roles can shift as needed.
- Recruitment (O,P): organization can get the volunteers it needs.
Observations on volunteer engagement
- Organizations rated themselves as flexible with respect to how volunteers work in their organizations, but gave themselves low scores in areas around recruitment and training.
- The smallest organizations rated themselves as more flexible, and global south organizations rated their training more highly than global north organizations.
- Overall scores by category were fairly consistent across types.
Diversity
- Being inclusive of people with diverse backgrounds may help your organization be effective in many other areas. For example, a more diverse board can lead to more effective governance. If your organization is trying to increase participation and improve content on the Wikimedia sites, including people with many different backgrounds in your organization's work as volunteers, donors, staff, and partners may be a good basis for increasing participation among a more diverse group of contributors. The knowledge and perspectives of a diverse group will make your organization stronger. Organizations may create a more inclusive and welcoming environment to ensure that people of many different backgrounds feel comfortable interacting with organizations once they are engaged.
Categories for diversity
- Engagement (A): organization actively engages diverse volunteers.
- Policy (B): organization has policies around harassment and friendly spaces.
- Accessibility (C): organization projects are easy to understand for diverse volunteers.
- Values (D): organization believes diversity is important.
- Perspectives (E): organization includes perspectives of people with different backgrounds.
- Expertise (F): organization uses external expertise (outside of Wikimedia).
Observations on diversity
- Organizations overall value diversity, but are less strong in making programs accessible and including diverse perspectives.
- Chapters rated themselves higher than user groups in this strategy, and global north organizations rated themselves higher than global south organizations, especially in the area of engaging diverse contributors.
- Chapters with medium-sized budgets also rated themselves higher than chapters with smaller or larger budgets.
Online contributors
- This strategy focuses on how organizations offer support to the people who work on the Wikimedia projects (for example, people who create or improve articles on Wikipedia or upload images to Wikimedia Commons), to help build communities. Communities in this context are groups of people who share a common interest: contributing to the Wikimedia projects. Communities on the Wikimedia projects may define themselves in many different ways: they may be defined geographically, by language groups, or by specific projects. Building communities is an important aspect of supporting more contributions to the Wikimedia projects. While communities will grow themselves, organizations can play a supporting role in strengthening them by communicating with communities about their needs and supporting ways people can interact more effectively.
Categories for online contributors
- Support (A, H, I): organization supports and consults contributors effectively through its activities.
- Values (B): organization believes supporting online contributors is important.
- Offline activities (C-E): organization supports contributors through offline activities.
- Online activities (F, G): organization supports contributors through online activities.
- Resources (J, K): organization has the resources to support contributors and uses them effectively.
Observations on online contributors
- Organizations highly valued supporting online contributors, but organizations overall did not feel they had the resources to do so effectively. Chapters especially valued this strategy.
- Global north organizations rated themselves slightly higher than global south organizations, especially with respect to resources.
Learning
- Effective Wikimedia organizations learn from their work to improve over time, and also share their learning with others and learn from other Wikimedia organizations, to help Wikimedia organizations overall become more effective. Effective learning may include documenting what your organization has learned, collecting information that will help your organization learn, applying what your organization has learned to improve your organization's activities, and proactively sharing that information with others. Wikimedia organizations at different stages will approach this differently, as larger organizations doing many activities may need a more systematic approach.
Categories for learning
- Organizational learning (A-E): organization learns about itself and how it works.
- Values (F): organization believes learning is important.
- Environmental learning (H-K): organization learns about the environment in which it works.
- Applying learning (G, L-N): organization applies what it learns to improve.
Observations on learning
- While organizations overall valued learning, they rated themselves higher in learning about their environments than learning about their organization and activities.
- While user groups valued this even more than chapters, they rated themselves lower in most categories.
- Especially in the area of organizational learning, chapters with larger budgets rated themselves higher.
Resources
- Resource mobilization is about gaining and using the resources your organization needs to do its work. Resources may include money (donations, grants from institutions, money earned through fees), in-kind resources (for example, office space or materials to do your organization's activities), and pro-bono services (professionals donating their time to your organization for a specific purpose, who would normally charge for that service). It is important to consider your organization's reputation as a key resource your organization can leverage in its work. Those partners investing resources in your organization are also building a relationship with your organization in addition to offering your organization money, in-kind services, or pro-bono services. Volunteers are also a resource since they give your organization time, but they are considered through other strategies.
Categories for resources
- In-kind (A, B): organization is making use of in-kind donations.
- Sustainability (C, D): organization’s strategies for resources can be maintained over time.
- Values (E): organization believes resources are important.
Observations on resources
- Organizations overall value resources very highly, but still need capacity in building sustainable strategies for raising resources and leveraging in-kind resources.
- Across categories, global north organizations rate themselves more highly than global south organizations.
- On the other hand, user groups and small budget organizations value this strategy highly, but rate themselves lower across capacity areas.