New in-person pre-conference workshop day for Wikimedia Conference!
Evaluation and Program Design Pre-Conference

The Learning and Evaluation team is planning to present a half-day pre-conference workshop on Wednesday, May 13, 2015, in advance of Wikimedia Conference 2015. The workshop will include an introduction to program evaluation and design, plus an overview of tools and strategies for evaluation planning, program monitoring, and outcomes assessment.
Important information
To start planning your conference, see the conference page!
Agenda

- Wednesday
- Thursday

*Times and session order subject to change.
Workshop outputs

Wednesday, May 13

Goals

At the beginning of the workshop, participants shared the goals each of them had for the session that day. This is the captured list:
- How to structure projects along a spectrum and keep people interested.
- Bring back hope and learning from this session.
- Learn to problem-solve.
- Learn to fly higher.
- Suggestions for how to keep working on current and future projects.
- Learn how to have more impact and quality (be more daring).
- How to divide labor for unified evaluation without relying on one person.
- How to expand education to a whole country.
- Learn the frustrations and difficulties of evaluation.
- Talk, share, and learn together.
- Learn how not to burn out.
- How to document for evaluation.
- New ways of thinking and adapting.
- Learn tools!
- How to improve our work -> success!
- More structured programs: logic models.
- How to get things done.
- New ideas.
- Understand the diversity of our work.
- Improve evaluation and share.
- Feel positive about evaluation and next steps.
- Meet everyone, learn successes and challenges, and get to know how to support with grants.
- How to grow education initiatives.
- Wikimetrics.
Hopes and Fears

Participants also shared the hopes and fears they had for the workshop.
Logic Models

This resource was presented by Christof Pins and Manuel Merz (Wikimedia Deutschland). The group worked through the notion of a staircase logic model, then broke into small groups to create logic models for specific Wikimedia programs.

- Learn more about logic models on their page in the evaluation portal.
Ah-ha's

After the session ended, we asked participants to share one thing that sparked a particular realization for them.
Next actions

We also asked participants what they foresaw as next steps based on their takeaways. This is what they told us:
- Use GLAMorous
- Translate tutorial video on Commons and transcribe script
- Use metrics
- Make an intro to the community about all tools.
- I will use the tools presented and introduce them to other people.
- Workshops / Training on building context-relevant metrics.
- Share a logic model as a best practice for doing a project.
- Use logic models and education program extension.
- The Education Program extension is a must; it shall be used in India.
- Put order in the house (after Wikimania).
- I will use CatScan and GLAMorous to learn more about Wiki Loves Monuments in Spain in 2014, and improve WLM 2015 and 2016.
- Use logic models (or at least learn how to use them).
- Dig deeper in potential of Education extension for evaluation.
- Ask the community about deploying new extensions/tools.
- Use Wikimetrics more.
- Get data from our education program (300 schools in BE).
- Continue with discussions around Education Program extension.
- Build logic models for our activities/projects.
- Ask professors to agree to our using the Education Program extension.
- Translate tools! Our community needs them.
Remaining questions

As a reminder that we keep working to find an evaluation framework that helps us all, we asked participants to share remaining questions after the workshop:
- How to find realistic metrics (Set realistic goals)?
- How to build up a roadmap of learning these tools?
- How to break up project management tasks among different roles?
- How to learn when the external factors don’t work as we plan?
- I will think about it.
- How to answer cranky admins who oppose new tools/strategies?
- How to share the learnings from the workshops with local volunteers in an inspiring way?
- List of custom tools needed.
- A category-wise user-contribution breakdown tool, without making it public?
- How to read data from GLAMorous / CatScan?
Further resources

- Logic Models
- Data sources: Gathering evidence for your program story
Thursday, May 14

This day was dedicated to an evaluation clinic and presentations on specific topics around evaluation, such as survey design and SMART planning. We also presented program toolkits and a draft of the storytelling toolkit.
Further resources

- Designing SMART proposals and Annual Plans
- Global Metrics Tutorial
- Wikimetrics Overview
- Designing effective questions