
Grants talk:APG/Proposals/2012-2013 round1/Wikimédia France/Impact report form


Notes on reporting process


WMFR submitted an Impact Report for its 6-month bridge funding period, having been awarded 6 months of bridge funding in Round 1 of the 2012-2013 FDC process. As a result, WMFR is the only entity submitting an FDC Impact Report at this time. We asked WMFR to submit this report within 30 days of the end of the funding period, although entities typically have 90 days after the end of a funding period to prepare an Impact Report. This is because the report closes WMFR’s bridge funding period while also serving as an important indicator of the current progress of WMFR’s programs, which continue into its new 12-month funding period that began on 1 July 2013. The discussion of WMFR’s Impact Report will therefore not be typical of future Impact Reports.

We thank WMFR for preparing this Impact Report for the 6-month bridge funding period, and look forward to its Impact Report on its first full year of FDC funding, which will be submitted by September 28, 2014.

Overview

  • We recognize that WMFR was working hard to maintain its program activities during this period of bridge funding and without an Executive Director in place. We also recognize that WMFR was preparing this report while planning for its new funding period and working on hiring an ED.
  • We appreciate that WMFR is the first entity to use this Impact Report form. We welcome feedback on the form as we make improvements for future rounds of Impact Reports. For example, the grid format for reporting progress against the year’s goals is difficult to read, and clearer guidelines for the case study sections are needed.
  • Noting that WMFR’s bridge funding and the timing of this Impact Report are an exception to the FDC process, we expect that entities completing Impact Reports in the future will reflect more on past successes and challenges, and offer more insight into their ways of working, programs, and operations, as well as detailed metrics. This greater depth will correspond to entities reporting on their fully executed annual plans for a 12-month funding period.

Spending and revenues

  • Congratulations to WMFR on surpassing its goals for membership fees and donations!
  • WMFR spent 72.4% of its planned expenses during the bridge funding period. While spending less than planned in most areas, WMFR spent 142% of its anticipated expenses on fundraising. WMFR offered detailed explanations for these variances from plan, and has adapted to changing plans and circumstances, especially around staffing transitions. Underspending does not necessarily indicate inactivity in a program area; in some cases WMFR found low-cost solutions. Some of this learning should be put into practice in future planning.
  • We understand that WMFR is facing financial uncertainty. In addition, WMFR is attempting to raise funds from outside the movement, which is a challenge many entities are facing.

Appreciation

  • Congratulations to WMFR on hiring its new Executive Director, Nathalie! We welcome her to the movement, and we look forward to working with her.
  • We notice that WMFR highly values its relationships with partners and its strong reputation within France as a tool for achieving effective programs. We congratulate WMFR for continuing to successfully maintain its good relationships and reputation as a partner. 10 new partnerships is a lot! Congratulations. We are also glad that WMFR is thinking about how and whether it is possible to take on new partnerships. We look forward to learning more about how WMFR does this in the future.
  • We are interested in WMFR’s strategies to address volunteer decline (in the WMFR chapter) through offering different types of direct support to its community. We are glad WMFR is prioritizing this important challenge.
  • We are excited to see elements of WMFR’s Afripédia programs focused on editing projects other than Wikipedia (for example, Wiktionary) in local languages without large Wikipedias.
  • We look forward to seeing WMFR multiply or expand the impact of its training activities by reaching more trainers.
  • We are excited to see your numbers showing growth: not only by your success here, but also by your ability to track growth in these areas. We look forward to learning more about the quality and retention of active editors over time.

We would like to know more


Questions about spending

  • We understand that WMFR may retain reasonable fundraising expenses (up to 10% of the fundraising revenue collected). We did not see a total included for the revenue offset for fundraising expenses that is allowed under the fundraising agreement. Can you tell us where it is, if we’ve missed it?
  • We notice a line item for “training payment” in “other revenues” (Table 2); did the GLAM institutions pay for WMFR to provide training?
  • Kindly provide some more information about the review of WMFR’s microgrants process. Is this review available publicly? Many other entities are facing challenges in this area as well, and may learn from your report. For example, regarding your underspending on community grants, do you think this was because community members were unaware of the program, or because the activities taking place did not require funding?

Answers

  1. We have retained 62,976.86€ (78,721.08 USD), comprising 21,434.33€ (26,792.91 USD) of staff costs and 41,542€ (51,928.16 USD) of other fundraising costs. The 43,985€ (54,981 USD) of fundraising costs in the impact report also includes costs for developing other revenue sources. For details, you can read our fundraising costs report (a short arithmetic cross-check of these figures is sketched after these answers).
  2. Our trainings are free; however, institutions like universities may consider Wikimedia France an external training provider and pay us according to their usual rates. We never ask for money for trainings, but some institutions insist on paying (it is a French bureaucracy thing: it is simpler for everyone to let the institution pay for the training).
  3. The review we conducted was not formalized as a document that could be shared publicly; that said, it sparked some reflections on such programs, which were shared (still a work in progress, though). The challenges identified in the review were detailed in our answers to the FDC staff during the Q1 report. We do think the underspending is mainly caused by unawareness of the program, coupled with the inefficiency of the process as it was designed: the microgrants process used to be handled entirely on the "private" website of the chapter, so you needed to be a member (or to know a member) in order to receive money from the microgrants. Now the process is public on Meta and we have more volunteers involved, meaning we are able to process requests more quickly.
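
As a rough arithmetic cross-check of the figures in answer 1 (our own reading of the published numbers, not a reconciliation supplied by WMFR): the two cost components sum to the stated total up to rounding, and the USD amounts are consistent with an assumed conversion rate of about 1.25 USD per euro.

```latex
% Cross-check of the reported fundraising figures (assumed rate of ~1.25 USD/EUR;
% the small gaps come from rounding in the published amounts).
\begin{aligned}
21{,}434.33~\text{EUR (staff)} + 41{,}542~\text{EUR (other)} &\approx 62{,}976.86~\text{EUR retained}\\
62{,}976.86~\text{EUR} \times 1.25 &\approx 78{,}721~\text{USD}\\
43{,}985~\text{EUR} \times 1.25 &\approx 54{,}981~\text{USD (fundraising costs in the report)}
\end{aligned}
```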

Questions about metrics

  • We appreciate the metrics you shared in the report on new contributors gained thanks to WMFR’s programs, including the number of articles affected and the number of uploads to Commons. Please explain in more detail how you attribute some of these gains to WMFR’s activities.
  • We have a few questions about the outcomes of some of your projects and activities:
    1. You mention in “Highlights” that 69 pictures were featured as of April 2013, but later in the text (in “Growth”) you say 21 pictures were featured. Can you clarify the difference between the two figures?
    2. Would you provide the number of institutional reusers of DBPedia:fr you recorded?
    3. Did you organize a contest to encourage data visualization and other apps to improve Wikipedia as you planned?
    4. You’ve done many workshops and trainings with GLAM institutions and noted one of your goals was “15% of people being autonomous after 1 month.” How close did you come to that goal?
    5. You conducted a large number of trainings as part of the education program in order to reach new editors and promote Wikimedia. It seems these trainings were popular! Please tell us more about the outcomes of these trainings: for example, was new content generated or were new editors retained?
    6. In your “success” case study, you write about the long-term training with PhD students. Now that some time has passed, have you reached your goals of 10% of participants still editing 6 months after the end of the session, more than one article per PhD student, and 80% of changes not reverted?

Answers

  • We appreciate the metrics you shared in the report on new contributors gained thanks to WMFR’s programs, including the number of articles affected and the number of uploads to Commons. Please explain in more detail how you attribute some of these gains to WMFR’s activities.
    The new contributors metric in our report counts contributors who started editing Wikimedia projects after a workshop or training organized by WMFR. We track the usernames of people involved in our activities, which helps us build these metrics (an illustrative sketch of this kind of tracking follows these answers).
  • We have a few questions about the outcomes of some of your projects and activities:
    1. You mention in “Highlights” that 69 pictures were featured as of April 2013, but later in the text (in “Growth”) you say 21 pictures were featured. Can you clarify the difference between the two figures?
      As of April 2013, Wikimedia France had supported the creation of 69 Featured pictures (all-time results). During the reporting period we tracked 21 new Featured pictures (see [1]).
    2. Would you provide the number of institutional reusers of DBPedia:fr you recorded?
      As of November 2013, 3 institutional reusers are recorded:
      - http://museophile.fr/start?langage=fr : a cultural application owned by the Ministry of Culture (http://museophile.fr/apropos?language=en)
      - http://www.citedelamusique.fr/francais/ : the website of the National Center of Music, which is currently working on using DBPedia Fr
      - http://hdalab.iri-research.org/hdalab/ : HdALab, a database of pedagogical resources for the history of arts (Ministry of Culture)
    3. Did you organize a contest to encourage data visualization and other apps to improve Wikipedia as you planned?
      We did not have time to organize the contest before the summer; time was allocated to organizing the transfer of hosting of the project. With the support and help of the Ministry of Culture, we will organize the contest at the end of 2013 or the beginning of 2014. The ministry has just created a part-time position, starting in early December, to help us organize the contest.
    4. You’ve done many workshops and trainings with GLAM institutions and noted one of your goals was “15% of people being autonomous after 1 month.” How close did you come to that goal?
      In general, around 20-30% of participants make at least one edit on their own after a workshop, but situations vary widely: some workshops leave 50% of attendees able to contribute autonomously, others 0-10%. After a few months, the proportion of editors still involved is less than 15%. During this period we only ran "one-shot" workshops for GLAM institutions, not regular workshops (which generally give better results).
    5. You conducted a large number of trainings as part of the education program in order to reach new editors and promote Wikimedia. It seems these trainings were popular! Please tell us more about the outcomes of these trainings: for example, was new content generated or were new editors retained?
      We have several goals for our trainings, depending on the audience. With teachers from secondary schools, the goals are to reassure them, improve their understanding of Wikimedia projects, and give them working methods and tips for working with their high school students on Wikipedia (which may include adding content). Our role is to assist them in this process. In higher education we train researchers in charge of courses with the same goals, but we also offer training sessions for researchers and PhD students aimed at adding content. The majority of secondary-school courses do not result in the addition of content, but they do improve the understanding of Wikimedia projects. In higher education, we have specific trainings (as with PhD students) dedicated to adding quality content.
        • 6 trainings with researchers and librarians in charge of courses: 92 attendees --> outcomes: we measure the goals (improving the understanding of Wikimedia projects and encouraging researchers in charge of courses and university librarians to work with their students on Wikipedia) with a survey sent by email after each training session, and we identify the number of teachers who have started an educational project on Wikipedia. The outcomes: of 33 responses, 42% wish to contribute to Wikipedia. It is hard to say whether attendees have contributed after the training, because the purpose of these trainings is not directly to contribute (no creation of user accounts, no communication of their usernames, etc.).
        • 4 trainings with PhD students: 67 attendees --> 33 articles added and 800 kB of content added. During the trainings, each student had to add a specific userbox to their user page, which allowed us to track precisely how much content they added to the projects.
        • 5 trainings with teachers: 86 attendees --> outcomes: we measure the goals (improving the understanding of Wikimedia projects and encouraging teachers to work with their students on Wikipedia) with a survey sent by email after each training session, and we identify the number of teachers who have started an educational project on Wikipedia. The outcomes: of 23 responses, 39% wish to contribute to Wikipedia and 78% wish to start an educational project on Wikipedia. It is hard to say whether attendees have contributed, because the purpose of these trainings is not directly to contribute (no creation of user accounts, etc.). Among these attendees, 15 teachers have worked on Wikipedia with their students (most of them during the Wikicontest with high school students).
        • 5 trainings with students: 223 attendees --> outcomes: we measure the goal (improving the understanding of Wikimedia projects) with a survey before and after the training sessions, when their teachers could relay it. The outcomes: of 138 responses, 77% have a better understanding of how Wikipedia works and want to learn more, and 38% wish to contribute. As with the teachers, in secondary school it is hard to say whether they have contributed.
    6. In your “success” case study, you write about the long-term training with PhD students. Now that some time has passed, have you reached your goals of 10% of participants still editing 6 months after the end of the session, more than one article per PhD student, and 80% of changes not reverted?
      We trained 67 PhD students. Of those 67, only 11 continued to contribute in the weeks following the end of the long-term training session. Six months later, we have just a handful who have contributed once or twice. Interestingly, most of them did not contribute in their area of expertise but on entertainment topics. We have not reached our goal, but we have identified some points to improve our results for the two sessions scheduled this year. PhD students are very busy, especially in the final year of the doctoral thesis, so we will favour registration of first-year students. We plan to focus more on the value of contributing to Wikipedia for their résumés and especially for their scientific careers. For future training sessions, we also plan to require the creation of an article. We had planned to send a survey a few months after the training session; that has not been done yet. Regarding reverts: some contributions were reverted, less than 10% in total, most of them for copyright violations. These reverts appeared very quickly during the training sessions, which allowed us to emphasize authors’ rights and free licences.
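
As an illustration of the username-based tracking described in the answer on new-contributor metrics above, here is a minimal sketch of how subsequent edits by a known list of workshop attendees could be counted through the public MediaWiki API. This is not WMFR's actual tooling; the wiki URL, usernames, and date below are placeholders, and a real pipeline would also need result paging and appropriate privacy safeguards.

```python
# Minimal sketch (assumed workflow, not WMFR's actual scripts): count edits made
# since a workshop by attendees whose usernames were collected at the event.
import requests

API_URL = "https://fr.wikipedia.org/w/api.php"   # placeholder target project
ATTENDEES = ["ExampleUser1", "ExampleUser2"]     # hypothetical usernames collected at a workshop
WORKSHOP_DATE = "2013-01-15T00:00:00Z"           # example date of the event

def edits_since(username, since):
    """Return up to 500 contributions made by `username` since `since` (newest first)."""
    params = {
        "action": "query",
        "list": "usercontribs",
        "ucuser": username,
        "ucend": since,       # usercontribs lists newest first, so the older bound goes in ucend
        "uclimit": "500",
        "format": "json",
    }
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json().get("query", {}).get("usercontribs", [])

if __name__ == "__main__":
    for user in ATTENDEES:
        contribs = edits_since(user, WORKSHOP_DATE)
        print(f"{user}: {len(contribs)} edits since the workshop")
```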

Questions about challenges and future strategies

  1. Given your challenges with last year’s Research Award program, would you describe the changes you will make to the program to improve outcomes, and also discuss why you’ve decided to continue this program despite the challenges?
  2. We note your challenges around some of your Afripedia programs due to lack of dedicated support on the ground and low rates of retention and participation and would like to know more. Have you considered partnership models that might better support this work on the ground? Would you discuss how you are planning to change these programs based on what you’ve learned in order to achieve better outcomes?
  3. Please tell us more about the board’s decision to withdraw from the Europeana project.
  4. You observe that having stronger themes for your workshops will make your activities more successful. Would you please explain this a little more? How and why do you think stronger themes will lead to better outcomes or more impact?


Answers

  1. Given your challenges with last year’s Research Award program, would you describe the changes you will make to the program to improve outcomes, and also discuss why you’ve decided to continue this program despite the challenges?
    Some elements are given in the Impact Report form. For the next Research Award (scheduled for autumn 2014), we will:
    • pay more attention:
      • to the calendar for a sitenotice on the Wikimedia projects, in order to encourage more participation in the voting;
      • to providing summaries of the scientific papers in several languages and other media (for example, interviews with co-authors and jury members) on our dedicated website;
      • to involving more Wikimedia France volunteers (for translation and communication);
    • expand the network for the call for votes (find other mailing lists involved in these issues);
    • find external partners to better promote the project, and plan an award ceremony;
    • find a dedicatee: for example Roy Rosenzweig (the first recipient).
    • Planned schedule for the Research Award 2014:
      • August-September 2014: contact jury members, contact the Centre for History and New Media to propose Roy Rosenzweig as dedicatee, update the Meta page and the dedicated website, contact external partners.
      • Early September: call for the Wikimedia community's participation; preparation of messages/sitenotices in different languages.
      • End of September: paper submission deadline; the jury starts selecting 5 shortlisted papers.
      • Mid-October: 5 shortlisted papers chosen by the jury; publication of summaries of the shortlisted papers (in several languages) and discussion; press kit; update of the dedicated website for communication.
      • Mid-November: closing of votes.
      • End of November: public announcement of the winner; press release.
    With all these issues and improvements identified, we thought it would be a shame to stop here. We think (including the few volunteers involved in 2012) that the project has great potential for the research community working on the Wikimedia projects and for the Wikimedia community. If the Award did not work as well as we hoped, it is because (as we said) we had trouble with organization and communication for this first edition. Second, the cost (time and money) of this international project is not that high compared to the impact it could have.
  2. We note your challenges around some of your Afripedia programs due to lack of dedicated support on the ground and low rates of retention and participation and would like to know more. Have you considered partnership models that might better support this work on the ground? Would you discuss how you are planning to change these programs based on what you’ve learned in order to achieve better outcomes?
    • For the third training session, organized in Yaoundé in October 2013, we experimented with a better selection of trainees. We organized an online presentation with Q&A for the future trainees. At the end of the presentation, we asked them to send us a motivation letter explaining their understanding of the project and their intended involvement.
    • We plan a type of “certification” for the trainers we train, to help them demonstrate the value of the project within their institutions. The certification will be in two parts: one for offline access to Wikipedia, and one for training people to edit Wikipedia. The certification will be offered to all the people we have already trained and, in the future, to people trained by the first Afripedia users interested in disseminating the project.
    • We are discussing the opportunity to have a community manager based in Africa to support the project, both online and “IRL”. We have a plan in Mali, currently under discussion between the Afripedia team and the board of WMFR.
  3. Please tell us more about the board’s decision to withdraw from the Europeana project.
    Our withdrawal from the Europeana GlamWiki Toolset project was almost entirely driven by financial considerations. As part of our budget revisions, we had to make heavy cuts to our programs. As it was executed by an external organisation, the GlamWiki Toolset project was one of the easiest to cut, since it did not impact our in-house programs. We were comforted in our decision by a similar one (as far as we know) made by the Dutch chapter.
    We do believe in the GlamWiki Toolset’s potential. We think it will greatly help the movement with GLAM mass uploads (including us: we have a backlog in this area, and these uploads take a heavy toll on a few volunteers’ availability), and it aligns perfectly with our goal of enabling and empowering GLAM institutions to contribute themselves.
    With hindsight, the withdrawal from this project may have been careless. It stemmed from an incomplete view of our financial situation (considering our current underspending, we could have afforded to keep the project on board) and from a lack of visibility for the board on the project and its deliverables, itself largely caused by deficient monitoring of the project on our side.
  4. You observe that having stronger themes for your workshops will make your activities more successful. Would you please explain this a little more? How and why do you think stronger themes will lead to better outcomes or more impact?
    Having a special theme for a workshop, such as “GLAM”, “Science”, or “WikiSource”, has several aims.
    • It allows us to target a specific audience.
      • We can specifically invite contacts who are still at an early stage (e.g. GLAM institutions we have just met) and write targeted email invitations.
      • With a general audience, the attendees tend to be people who are discovering Wikipedia, which often means people who are not digital natives. Thus, a lot of time is spent teaching basic computer skills instead of specific Wikimedia-related skills. On the other hand, strong themes attract people passionate about a subject, which means they are already confident in their relative expertise; they are thus less reluctant to contribute, removing the “I don't edit Wikipedia because I'm not confident in my own knowledge” mental barrier.
    • A workshop with a strong theme means a more synergetic atmosphere, since people are more likely to contribute to connected articles or projects: each person is contributing to a part of a consistent whole, which is much more satisfying than writing on an isolated topic. People from the same professional background can easily discover, share, and discuss Wikimedia projects and philosophy in the light of their own experience.
    • We gain feedback on the format and organisation, such as the time or the place, which can be adapted (e.g. not at lunch break) to better suit a given audience.
    • We can gain insight into local opportunities: if a lot more people show up to the Science workshop than to the GLAM one, it helps us decide where our efforts should be focused and what human resources we may need to reinforce the workshop (for example, a GLAM partner to talk to GLAM newbies).

Suggestions for future reports

  • In future reports, and especially future Impact Reports, we encourage WMFR to also provide images, sounds, or videos highlighting its work, to tell its story in more meaningful and engaging ways. This could include, for example, charts of growth rates, or some of the images that were major successes among the contributions.
  • Also, it would be helpful to include links to the specific projects that you ran. For instance, the “BROCAS” project sounds interesting, but it would be best to include links to documents published about this work.
  • Case Study sections are opportunities to tell a story and should not be presented as bulleted lists. We realize that this is the first time any entity used the Impact Report form, and so we see this as a clear area where the form needs to be improved.
  • For future reports, it would be useful to include links to communications posts.
  • When submitting the Impact Report, please report on all of the metrics listed in the original proposal, except in cases when a program or activity was cancelled or in cases when you could not measure it as planned. We also hope changes to the Impact Report form will help with this aspect of reporting.
  • In general, Impact Reports are an opportunity to share deeper reflections on both successes and challenges, and to show how an organization is growing and changing its strategies based on what it is learning. We encourage WMFR and other entities to take advantage of this opportunity to reflect.


Thanks a lot for these suggestions, this is helpful: we will definitely take them into account for future reports. Regarding images or videos: we definitely agree that the report would gain from them. One suggestion, though: we felt that the “Progress against past year's goals/objectives” table format did not lend itself well to featuring additional media; maybe the form could be improved in that particular regard for future impact reports? --Nathalie Martin (WMFR) (talk) 14:42, 15 November 2013 (UTC)

Excellent point, Nathalie Martin (WMFR), on the table format: we completely agree and are making the revisions. If you have any other suggestions to make this a better final reporting process, please do let us know. We've started a list here. Many thanks! KLove (WMF) (talk) 16:46, 15 November 2013 (UTC)