User:MCruz (WMF)/Sandbox/Program Reports/Executive Summary
The Wikimedia Evaluation initiative started two years ago with the aim of bringing forward successful programs, understanding what works best for our shared goals, and building capacity in the movement toward a culture of accountability and data-driven decisions. Learn more about Evaluation... the Wikimedia way!
Data and analysis
- We have many questions, and the data reported has helped us answer them:
- What do these programs cost in terms of dollars and hours invested, and what other costs may be hidden in donated resources?
- What is the reach of these programs in terms of accessing new and existing editors/contributors?
- How much content do programs produce in terms of bytes, pages, or photos/media added?
- What are the costs in terms of dollars and hours input per unit of content (text pages or photos/media added) or per participant/recruit (for workshops which produce no content)?
- To what extent do program outputs increase the quality of Wikimedia projects?
- To what extent does program participation produce new active editors/contributors, or retain active editors, at 3- and 6-month retention points?
- To what extent does the program have examples for easy sharing and replication?
Priority goal setting
We worked with the community to discover the most commonly seen goals for programs. The June pilot workshop in Budapest served as a way for our team to identify 18 commonly seen outcomes across programs, which were discovered through conversations, break-out sessions, and logic modeling. Participants in the Data Collection Survey were asked to select the outcomes and targets that the programs they reported on had. These 18 priority goals are:
- Building and engaging community
- Increasing accuracy and/or quality of contributions (e.g. clean, high-resolution photographs placed in the proper articles)
- Increasing people's awareness of Wikimedia projects
- Increasing people's buy-in for the free knowledge/open knowledge/culture movements
- Increasing contributions to the projects
- Increasing diversity of contributions and content
- Increasing diversity of contributors
- Increasing positive perceptions about Wikimedia projects
- Increasing reader satisfaction
- Increasing the usefulness, usability, and use of contributions
- Increasing the use and access to projects
- Increasing people's editing/contributing skills
- Increasing volunteer motivation and commitment
- Increasing respect for the projects (e.g. acceptance in higher education)
- Making contributing fun
- Making contributing easier
- Recruiting new editors/contributors
- Retaining existing editors/contributors
In the survey, program leaders could also write in other goals in a section titled "other". This set of 18, with the "other" option, was presented for each program that program leaders reported on, and they were asked to select priority goals for their reported programs. The number of goals program leaders selected ranged from five to 11, and the overall mean for any given program was nine selected priority goals. In general, program leaders demonstrated difficulty in "prioritizing": out of all reports, only 12.5% selected five or fewer priority goals.
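As a minimal sketch of how summary figures like these could be derived from the survey responses (this is not the evaluation team's actual tooling, and the per-report counts below are hypothetical):

```python
# Hypothetical counts of priority goals selected per report, one entry per report.
goals_per_report = [9, 11, 7, 5, 10, 8, 9, 6]

# Mean number of goals selected across reports.
mean_goals = sum(goals_per_report) / len(goals_per_report)

# Share of reports that selected five or fewer priority goals.
share_five_or_fewer = sum(1 for n in goals_per_report if n <= 5) / len(goals_per_report)

print(f"Mean priority goals selected: {mean_goals:.1f}")
print(f"Reports selecting five or fewer goals: {share_five_or_fewer:.1%}")
```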
Inputs and participation
Inputs
- The majority of program leaders reported some type of budget, but most did not report data about the hours it took to implement their programs.
Regarding inputs, program leaders were asked to report:
- Budget – how much it cost them to produce their program in US dollars
- Staff and volunteer hours – How many actual or estimated hours staff and volunteers put into their program from beginning to end
- Donated resources – Including equipment, prizes, give-aways, meeting space, and other similar things donated by organizations or individuals to support the program
A majority of program leaders reported budget data, while a larger share left data about hours unreported. Across the 119 report responses:
- 55% included budget data (22% of budgets reported were zero dollars)
- 34% included staff hours (51% of staff hours reported were zero)
- 44% included volunteer hours (2% of volunteer hours reported were zero)
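As an illustration of how such reporting rates could be tallied, the sketch below walks over hypothetical survey records and computes, for each input field, the share of reports that included it and the share of reported values that were zero. The field names and records are assumptions for illustration, not the actual survey schema.

```python
# Hypothetical survey records; None means the field was left unreported.
reports = [
    {"budget_usd": 250.0, "staff_hours": 0, "volunteer_hours": 40},
    {"budget_usd": None, "staff_hours": None, "volunteer_hours": 12},
    {"budget_usd": 0.0, "staff_hours": 8, "volunteer_hours": None},
]

def completeness(reports, field):
    """Return (share of reports with the field reported, share of reported values that are zero)."""
    reported = [r[field] for r in reports if r[field] is not None]
    share_reported = len(reported) / len(reports)
    share_zero = (sum(1 for v in reported if v == 0) / len(reported)) if reported else 0.0
    return share_reported, share_zero

for field in ("budget_usd", "staff_hours", "volunteer_hours"):
    reported, zero = completeness(reports, field)
    print(f"{field}: {reported:.0%} reported, {zero:.0%} of reported values were zero")
```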
Participation
- The majority of program leaders could report how many people participated in their program, while a little over half were able to tell us how many new editors made accounts for their programs.
Regarding participation, program leaders were asked to report:
- Total number of program participants
- Number of participants that created new user accounts during the program
The majority of program leaders reported the total number of participants (98%), while a little over half (57%) reported the number of new user accounts created during their program.
GLAM content donations had a slightly different reporting request about participation:
- Total number of GLAM volunteers involved in the program (78% reported)
- Total number of GLAM staff involved in the program (89% reported)
Program leaders were also asked to provide the dates and, if applicable, times for their program.
Content production and quality improvement
[edit]Content production
- Most program leaders were able to tell us how much media was added during their program, but only a minority were able to report how many characters (bytes) were added during their events, let alone how many editors were actually editing during their programs.
Regarding content production, program leaders were asked to provide various types of data about what happened during their program, depending on the level of data they were able to record and track. These data types included the amount of media added, the number of characters (bytes) added, and the number of editors actively editing.
Content production metrics were not requested of those who reported about editing workshops, since content production is not the main goal of that type of program.
Quality improvement
- Most program leaders were able to report how many of the images uploaded during their program were used in the projects after the program ended. However, most were unable to report on the quality of articles and images, and most that did report on it stated that no featured, good, or valued articles or images came out of their event.
The survey also asked that program leaders report on the quality of the content that was produced during the program. They could report on the use of uploaded images in the projects after the program ended and on whether any featured, good, or valued articles or images resulted.
Those who reported about edit-a-thons and workshops were not asked to report about image use and quality.
Recruitment and retention
- Just over half of respondents were able to tell us how many of their participants were active 3 months following their program, and less than half were able to do so 6 months after.
Tools like Wikimetrics can make this possible, which means tracking usernames is important for learning about retention. For edit-a-thons and workshops, the majority of those reported on did not retain new editors six months after the event ended. A retained "active" editor was one who had averaged five or more edits a month.[1]
Regarding the recruitment and retention of active editors, program leaders were asked to report two areas of data. An "active editor" is defined as making 5+ edits a month.[2]
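As a rough illustration of how this retention check could be computed once usernames and monthly edit counts have been collected (for example via Wikimetrics), the sketch below flags a participant as retained if they averaged five or more edits per month over a 3- or 6-month window. The usernames, edit counts, and exact windowing are hypothetical assumptions, not the evaluation team's actual method.

```python
from statistics import mean

ACTIVE_THRESHOLD = 5  # "active editor": averages 5+ edits per month

# Hypothetical data: edits per month for each tracked participant, starting with
# the month of the program. Indexes 0-2 cover the 3-month window, 0-5 the 6-month window.
monthly_edits = {
    "Example_User_A": [30, 12, 8, 6, 7, 5],
    "Example_User_B": [15, 2, 0, 0, 0, 0],
    "Example_User_C": [8, 6, 5, 4, 3, 2],
}

def retained(edits_by_month, months):
    """True if the user averaged ACTIVE_THRESHOLD+ edits per month over the window."""
    return mean(edits_by_month[:months]) >= ACTIVE_THRESHOLD

for months in (3, 6):
    kept = [user for user, edits in monthly_edits.items() if retained(edits, months)]
    print(f"{months}-month retention: {len(kept)}/{len(monthly_edits)} retained ({kept})")
```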
If the program reported on was an edit-a-thon or workshop, program participants may have been split into two groups, new editors and existing editors, in order to learn the retention details about each cohort. This is important, since both edit-a-thons and workshops often attract new and experienced editors, unlike on-wiki writing contests, which generally target existing contributors, and the Wikipedia Education Program, which generally targets new editors.
In terms of recruitment and 6-month retention of new editors (those who made accounts at or for the event):
- Edit-a-thons (44% reported, 85% of those reported zero retained)
- Editing workshop programs (56% reported, 78% of those reported zero retained)
- Wiki Loves Monuments recruitment and retention data was mined using the entire set of uploader usernames for the 2012 events.
- We asked different questions about recruitment and retention for Wikipedia Education Program and GLAM content donations.
For these programs, we asked about the retention of partnerships with educational institutions or cultural organizations instead of editor retention. Wikipedia Education Program respondents were asked to identify how many instructors were participating in the program, and GLAM content donation respondents were asked if the GLAM they worked with would continue their partnership with Wikimedia and if the content donation would lead to other GLAM partnerships.
Replication and shared learning
We wanted to learn if program leaders believed their program(s) could be recreated (or replicated) by others. We also wanted to know if program leaders had developed resources such as booklets, handouts, blogs, press coverage, guides, or how-tos regarding their program. We asked if the program:
- had been run by an experienced program leader who could help others do the same;
- had brochures and printed materials developed to tell others about it;
- had blogs or other online information written to tell others about it (published by yourself or others);
- had a guide or instructions for how to implement a similar project.
Reporting format and goals
References