Learning and Evaluation/Evaluation reports/2015/Wiki Loves Monuments/Limitations
This report systematically measures a specific set of inputs, outputs, and outcomes across Wikimedia programs in order to learn about evaluation and reporting capacity as well as program impact. Importantly, it is not a comprehensive review of all program activity or of all potential impact measures, but rather a review of those program events for which data was accessible through voluntary and grants reporting.
Read this page to understand what the data tells you about the program and what it does not.
Response rates and data quality/limitations
Data was received for a total of 72 Wiki Loves Monuments contests from 2013 and 2014. Data on these contests were collected from three sources:
(1) directly from the program leaders;
(2) from publicly available information on organizer websites and on-wiki reports; and
(3) through WMF Labs tools such as Wikimetrics, Quarry and Catscan.[1]
Through these sources we were able to collect information about: event priority goals, donated resources, budgets, staff and volunteer hours contributed, number of participants, number of new users, user retention, number of media files uploaded, number of unique media used in Wikimedia projects, and ratings of image quality (Featured, Quality, Valued). In total, we evaluated 72 Wiki Loves Monuments implementations: 37 from 2013 and 35 from 2014.
Values for a number of the key metrics were reported for only a minority of events, and in the cases of budgets and volunteer and staff hours these values could not be mined from other sources. Furthermore, data for the contests were not normally distributed; the distributions are, for the most part, skewed. This is partly due to the small sample size and partly to natural variation, and it does not allow for comparisons of means or other analyses that require normal distributions. Instead, we present the medians and ranges of metrics and use the term "average" to refer to the median, since the median is a more statistically robust measure of central tendency than the arithmetic mean. To give a more comprehensive picture of the distribution of the data, we also include the means and standard deviations as references. For the summary statistics of the data reported and mined, including counts, sums, arithmetic means, and standard deviations, see the appendix.
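To illustrate this choice of summary statistics, the following minimal Python sketch uses hypothetical upload counts (not data from this report): for a skewed sample, the median and range describe a typical contest better than the arithmetic mean, which a few very large contests pull upward.

 # Minimal sketch with hypothetical data: why medians and ranges are reported
 # for skewed metrics instead of comparing arithmetic means.
 from statistics import mean, median, stdev
 
 # Hypothetical upload counts per contest; two large contests skew the distribution.
 uploads = [120, 150, 180, 200, 220, 260, 300, 350, 420, 500, 2400, 9800]
 
 print("median:", median(uploads))          # robust central value, reported as the "average"
 print("range: ", (min(uploads), max(uploads)))
 print("mean:  ", round(mean(uploads), 1))  # pulled upward by the two largest contests
 print("stdev: ", round(stdev(uploads), 1)) # included only as a reference

Here the median (280) sits near the bulk of the contests, while the mean (about 1,240) reflects mostly the two outliers, which is why means and standard deviations appear only as references.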
Priority goals
As the table below shows, eight priority goals were selected by over 50% of reporting program leaders. The two most popular goals, noted as a priority by at least 85% of program leaders each year, were to increase contributions and to build and engage community. Furthermore, around three-quarters of program leaders stated that increasing awareness of Wikimedia projects and recruiting and/or converting new users were priorities for their contest.