Learning and Evaluation/Evaluation reports/2015/Limitations
Wikimedia Programs Evaluation 2015
How deep do the data go?
This report systematically measures a specific set of inputs, outputs, and outcomes across Wikimedia programs in order to learn about evaluation and reporting capacity as well as program impact. Importantly, it is not a comprehensive review of all program activity or all potential impact measures, but of those program events for which data were accessible through voluntary and grants reporting.
Read this page to understand what the data tells you about the program and what it does not.
Data and analysis

Priority goal setting
Since 2013, the Program Evaluation and Design team has worked with the community to identify the most common goals for programs. Since the June pilot workshop, online activities and conference workshops, such as the Metrics Dialogue, have helped our team identify common goals across programs through conversations, breakout sessions, and logic modeling. Most often, these general goals [1] have been to: