User:Stefan Schneider (WMDE)/test
Current Status
- March / April 2018 - Planning the next campaign.
- 2018-01-(01-15) - Thank You campaign in January 2018. This was the last campaign of the cycle we initially planned and the first one of the next cycle in 2018.
- 2017-10-(04-17) - Drawing on the findings of the previous campaigns and a pretest of the banners and landing pages, the New Editors Campaign in autumn was carried out.
- 2017-07-(11-18) - To get more information on live tracking and the perception of banners, we ran an additional New Editors Campaign in summer.
- 2017-04-(12-22) - In spring we ran our first campaign with detailed, working tracking of clicks and registrations.
- January 2017 - Our very first New Editors Campaign started.
Background
Since 2007 the number of active editors has been constantly declining. To counter this development, WMDE is testing online campaigning to convince Wikipedia readers to edit Wikipedia.
Motivation and Scope
Our motivation was to stop the downward trend of new editors (>10 edits). Therefore we planned four campaigns throughout 2017, including the January campaign in 2018. With these campaigns we tested the underlying assumption that online campaigning on Wikipedia can attract Wikipedia readers to get involved and become editors (>10 edits).
As this was our first attempt to attract users with online campaigning in Wikipedia, we decided to use an iterative approach: we ran the first two campaigns on a low banner diet (between 5 and 20%) to learn more about the effects of campaigning.
Questions we were interested in:
- What effect does online campaigning have on readers and editors of Wikipedia?
- What works and what does not work to attract new users and motivate them to edit Wikipedia (banner design, calls to action, user journeys, videos, approaches to “learn Wikipedia”, etc.)?
- What means are there to measure campaigns?
After running the first campaigns on a low banner diet, we used the findings to create a big campaign with an 80% banner diet. With that approach we made sure to gain the crucial learnings at the beginning, to apply only best practices at a big scale, and to guarantee the most positive impact of campaigning.
Key findings and results
- Use simple and actionable calls to action (CTAs) for banners (e.g. You can improve the accuracy of Wikipedia! CTA: Learn how to improve articles).
- Use a factual and clean design for banners to attract new users. This is perceived as more accurate and professional than colorful and creative banner designs (and is better accepted by Wikipedia readers).
- Explain the next steps to edit Wikipedia briefly and clearly on a landing page; videos help people to understand and learn.
- Guided tours help people to get an overview of Wikipedia’s “buttons” before making their first edit.
- Do not schedule campaign periods during holidays. People use Wikipedia most on Mondays and least on weekends (at least in Germany).
- Get as much feedback as possible from the community and if possible from external organisations to test messages and designs before running a campaign.
- Implement daily monitoring of the key indicators so that corrections can be made at any given moment.
Campaign results in detail
Initially we planned four campaigns with an ascending banner diet:
- Thank You Campaign in January 2017 (100%. But: the main part of the banner came from the Fundraising team and was aimed at attracting new members of our NGO. Only one small button directed users to our landing page for new editors.)
- Spring Campaign in April 2017 (5-10%)
- Autumn Campaign in October 2017 (80%)
- Thank You Campaign in January 2018 (100%. But: the user first saw a banner from the Fundraising team; only after four impressions was the 'new editors banner' shown.)
While conducting the first two campaigns, we found that we needed more testing and information to execute a large-scale campaign in autumn with a banner diet of 80% or more. Therefore we developed an additional fifth campaign:
- Summer Campaign in July 2017
Below you can find the results of all campaigns in detail:
Thank You - Campaign
Banner
In short:
Our first campaign to attract new editors was supplementary to the Fundraising Thank You campaign. The Thank You banner originally says thank you to the donors of the fundraising campaign. The banner contained three elements:
- Thank You text
- Become a member of Wikimedia Deutschland e. V.
- Get involved as an editor
With every campaign we wanted to answer a set of questions.
Questions:
- Do illustrations support the message / call to action?
- Are introductory videos helpful for new editors?
- Which of the videos is more attractive to users?
Framing:
- Length: 6 days (2017-01-01 – 2017-01-07)
- banner diet: 100%
- User journey: banner -> landing page -> registrations
Learnings:
- Illustrations attract people to get involved.
- Wikimedia Commons is not suitable as a media source for campaigning. The number of clicks could only be measured with a complicated workaround; in the meantime we have solved that problem with a video tool. An additional problem was that the videos needed to be rendered in different screen sizes, which takes a lot of time. Hence the videos need to be produced at least one week before using them.
- Tracking of registrations was not possible – only clicks on the button on the landing page.
- There were too many options in the Thank You banner. It was impossible to direct as many people to the registration as we had assumed.
- The landing page also offered too many options to get involved. Potential new users didn’t really know what to do next.
Landing page (German): https://de.wikipedia.org/wiki/Wikipedia:Wikimedia_Deutschland/Mach_mit
Spring Campaign
Banner - correct mistakes
In short:
With the second banner campaign in spring we wanted to attract users with one simple call to action: Correcting errors in Wikipedia. In contrast to the first campaign we used a very simple and plain design.
Questions:
- How appealing is the design of the banner and the landing page for new editors?
- How does a simple call to action appeal to new editors?
- Does a guided tour help to get started in Wikipedia?
Framing:
- length: 10 days (2017-04-12 – 2017-04-22)
- banner diet: 5 - 10% (of not logged in users)
- user journey: banner -> landing page -> registration -> guided tour
Learnings:
- a simple call to action works very well (significantly higher conversion rate)
- tracking successfully implemented (to solve this we needed SQL know-how from our software team and much trial and error with the Campaigns extension)
- planned impressions and registrations were not reached. Idea for future campaigns: implement live monitoring of impressions and registrations, making it possible to raise the banner diet while the campaign is running (see the sketch after this list).
- in the short term the guided tour did not influence the number of new editors' edits, but the numbers are not high enough to confirm this finding
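A minimal sketch of the live-monitoring idea mentioned in the learnings above: compare the registrations counted so far with the pace needed to reach a campaign target, and flag when the banner diet should be raised. All numbers and names below are illustrative assumptions, not part of the actual campaign tooling.

    # Minimal sketch of daily campaign monitoring (illustrative values only).
    # It checks whether the registrations counted so far keep up with a linear
    # pace towards the campaign target and suggests raising the banner diet if not.
    def check_pace(registrations_so_far, day, campaign_days, target_registrations):
        expected_by_now = target_registrations * day / campaign_days
        return registrations_so_far >= expected_by_now, expected_by_now

    # Hypothetical example: day 4 of a 10-day campaign aiming for 1,000 registrations.
    on_track, expected = check_pace(registrations_so_far=250, day=4,
                                    campaign_days=10, target_registrations=1000)
    if on_track:
        print("On pace: keep the current banner diet.")
    else:
        print(f"Behind pace (250 < {expected:.0f}): consider raising the banner diet.")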
Landing page (German): https://de.wikipedia.org/wiki/Wikipedia:Wikimedia_Deutschland/Fehler_korrigieren
Summer Campaign (additional)
Banner 1 - factual approach
Banner 2 - emotional approach
In short:
As described in Motivation and Scope, we decided to use an iterative approach to learn about the aspects of a successful banner campaign with every campaign. Having conducted two campaigns did not equip us with all the information we needed to run an autumn campaign at maximum scale. We decided to add a summer campaign to get additional information (see Questions).
Questions:
- Which approach is more appealing to potential editors: emotional illustration and text or a factual look & feel?
- Does live monitoring in Wikipedia work?
- Which of our video tutorials works best?
- Does a guided tour really make no difference to new editors' edits?
Framing:
- length: 8 days (2017-07-11 – 2017-07-18)
- banner diet: 20 - 30%
- user journeys:
- banner (emotional) -> landing page (emotional & not wiki) -> registration -> guided tour
- banner (factual) -> landing page (factual & on-wiki) -> registration -> guided tour
Learnings:
- the emotional design did not yield significantly higher conversion rates than the factual design (see the significance-test sketch after this list)
- daily monitoring works
- the message of the banner was not comprehensible to users (the German community also criticised the emotional message).
- the video at the top of the page was clicked most often.
- click rates of videos were higher on the external (not wiki) landing page.
- tracking of external pages is difficult and not comparable to wiki page views.
- guided tours have a slightly positive effect on new editors edit numbers.
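To make the significance claim above concrete: one standard way to compare the conversion rates of two banner variants is a two-proportion z-test. The impression and registration counts below are made up for illustration; the actual per-variant figures of the summer campaign are not listed here.

    # Sketch of comparing two banner variants with a two-proportion z-test.
    # The counts are hypothetical, NOT the actual campaign figures.
    from math import sqrt

    def two_proportion_z(successes_a, n_a, successes_b, n_b):
        p_a, p_b = successes_a / n_a, successes_b / n_b
        p_pool = (successes_a + successes_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Hypothetical: emotional banner 80 registrations out of 6,000,000 impressions,
    # factual banner 78 registrations out of 6,000,000 impressions.
    z = two_proportion_z(80, 6_000_000, 78, 6_000_000)
    print(f"z = {z:.2f}")  # |z| < 1.96 means no significant difference at the 5% level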
Landing pages (German):
#1 (factual) https://de.wikipedia.org/wiki/Wikipedia:Wikimedia_Deutschland/Entdeckungen-sortieren
#2 (emotional) https://entdecke.wikipedia.de
Autumn Campaign
Banner 1 - correct articles
Banner 2 - add pictures
Banner 3 - add sources
Banner 4 - get involved
In short:
The previous three campaigns were meant to gather all the important learnings for the autumn campaign – as this was the campaign with the highest banner diet. In this campaign we used our findings and decided to test further with four different banners. We used a quite formal and factual design with three simple calls to action, one per banner, to get people started:
- You can make Wikipedia more vivid! CTA: Learn how to add pictures to articles (German: Du kannst Wikipedia anschaulicher machen! CTA: So bebilderst du Artikel)
- You can improve the accuracy of Wikipedia! CTA: Learn how to improve articles (German: Du kannst Wikipedia genauer machen! CTA: So überarbeitest du Artikel)
- You can improve the reliability of Wikipedia! CTA: Learn how to add citations (German: Du kannst Wikipedia noch verlässlicher machen! CTA: So ergänzt du Belege)
In addition to the three simple entry points, we decided to simply call for registration in the fourth banner.
Questions:
- Do the users need further information on a landing page to start editing Wikipedia?
- How does just a call for registration perform compared to the three different calls to action that have an additional landing page explaining next steps?
- Which of the three simple entry points works best to get people to start editing?
Learnings:
- Currently a proper a/b testing is not possible with central notice options. Hence four (more than two) banners are shown randomly to probably the same users instead of showing only one banner repeatedly to only one reader.
- users provided with more detailed information and a concrete call to action are more likely to register and also to edit after registering
- the following call to action worked best to get people started: “You can improve the accuracy of Wikipedia! CTA: Learn how to improve articles” (German: Du kannst Wikipedia genauer machen! CTA: So überarbeitest du Artikel)
- feedback from the community (online and offline) and a pretest with an external company on the different messages and the design led to the best product possible
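The A/B-testing limitation can be illustrated with a toy calculation: if the four banners rotate uniformly at random per impression, a reader who sees more than one impression is very likely exposed to several variants, so clean per-variant attribution breaks down. The model below (uniform, independent selection per impression) is a simplifying assumption, not a description of how CentralNotice actually allocates banners.

    # Toy model: probability that a reader who sees k impressions is exposed to
    # more than one of the four banner variants, assuming each impression picks
    # a variant uniformly and independently (a simplifying assumption).
    def prob_mixed_exposure(k, variants=4):
        # P(all k impressions show the same variant) = variants * (1/variants)**k
        return 1 - variants * (1 / variants) ** k

    for k in (2, 3, 5):
        print(k, round(prob_mixed_exposure(k), 2))  # -> 0.75, 0.94, 1.0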
Landing page (German): https://de.wikipedia.org/wiki/Wikipedia:Wikimedia_Deutschland/JetztMitmachen
Thank You - Campaign 2018
Banner
In short:
Like our first campaign to attract new editors, this campaign was supplementary to the Fundraising Thank You campaign. The Thank You banner originally says thank you to the donors of the fundraising campaign. This time our banner was shown separately, after the fundraising banners had been shown to the readers. Thus very few users actually saw a banner aimed at attracting new editors.
Under these limited conditions we focused on finding out more about the training modules that we transferred from the English modules.
Questions:
- How well does a Thank You banner attract readers to contribute as new editors?
- Do users complete the training modules once they start them?
- How does the completion of training modules affect the editing behavior?
Learnings:
- The very low banner diet and also the mixed messages – saying thank you for donating and also asking for contributions through editing – resulted in half of the average conversion rate in page clicks.
- The number of users completing a training module is very low (29 people) and thus cannot be generalised into conclusions about the editing behaviour that follows. Nevertheless, the tendencies do not show any differences between people who completed a training module and those who did not.
Landing page (German): https://de.wikipedia.org/wiki/Wikipedia:Wikimedia_Deutschland/LerneWikipedia
| | Thank You Campaign | Spring Campaign | Summer Campaign (additional) | Autumn Campaign | Thank You Campaign 2018 |
| Banner diet | 100% (attracting new users was just a small button in the fundraising thank-you banner) | 5-10% | 20-30% | 80% | 100% (our banner was shown only after the user had seen four fundraising banners) |
| Impressions | 29.5 m | 7.5 m | 12 m | 34.8 m | 16.2 m |
| Page views | 1,318 | 44,576 | 68,333 | 231,251 | 17,081 |
| Registrations | not trackable (375 clicks on the button directing to the registration page) | 319 (with 1,033 clicks on the button directing to the registration page) | 158 | 1,054 | 120 |
| Conversion (from impression to registration) | 0.000012 (based on the 375 clicks to the registration page) | 0.000043 | 0.000013 | 0.000030 | 0.000007 |
| Attracted users with > 10 edits | (no proper tracking) | 4 | 9 | 14 | 2 |
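The conversion row of the table is simply registrations divided by impressions (for the 2017 Thank You campaign, the 375 clicks to the registration page stand in for the registrations, which were not trackable). A short sketch reproducing the figures from the table values; it agrees with the printed row up to rounding in the last digit.

    # Recompute the conversion row: registrations / impressions.
    # For the 2017 Thank You campaign the 375 clicks to the registration page
    # are used instead, since registrations were not trackable.
    campaigns = {
        "Thank You 2017": (29_500_000, 375),
        "Spring 2017": (7_500_000, 319),
        "Summer 2017": (12_000_000, 158),
        "Autumn 2017": (34_800_000, 1_054),
        "Thank You 2018": (16_200_000, 120),
    }
    for name, (impressions, registrations) in campaigns.items():
        print(f"{name}: {registrations / impressions:.6f}")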
Activities
Community Feedback: To guarantee an acceptable design of the banners, we always asked the community on Wikimedia Deutschland’s page on Wikipedia at least 2-4 weeks before the campaign started. We presented the banner designs, landing pages, user journeys and other additional features like guided tours. In this wiki we summarised all plans, the steps we undertook to create each campaign and finally the results of the campaigns.
Other Feedback: In addition, we strove to get as much feedback as possible. In community workshops and at community conferences (like Wikimania and WikiCon) we discussed our upcoming campaign and further ideas on attracting new editors with the community. We also used our general assembly to ask non-Wikipedians. Direct questions on design and user journey, supported by concrete drafts, helped most to improve the campaign.
For the Autumn Campaign it was very helpful to work with an external company on a pretest of our campaign. Asking a set of people about our user journey and the design of the steps and banners brought in a totally different point of view and really improved the campaign design.
Community Workshops: Some Wikipedians also decided to conduct a workshop dedicated to onboarding options that help newcomers get started in Wikipedia.
Associated research
Survey: Welcoming new editors
To get a first impression of how new editors are welcomed in Wikipedia, we commissioned a survey in January 2016. In this survey we asked community members of different ages who contribute to Wikipedia. In total, 686 people answered questions about how new editors are welcomed, what the difficulties are in starting to edit, and who is willing to engage in new ways of welcoming new editors – to name a few.
One of the main findings was that the current culture does not welcome new editors. Respondents reported that newbies are often treated rudely or do not have good opportunities to learn the rules of Wikipedia or how to communicate with others.
Survey: https://meta.wikimedia.org/wiki/Wikimedia_Deutschland/Editor_Survey_2016
Qualitative Interviews
In this survey, conducted by a German company called GIM, we wanted to know more about the video tutorials we created for new editors. In these videos we explain the main rules of Wikipedia and the how-tos of editing. Additionally, the survey was intended to give us more information on the motivational aspects of new editors and on when and how newbies would expect help in the process of getting started.
The main findings are:
- Wikipedia is very relevant for the respondents and they expect a high quality of articles in Wikipedia
- People feel that they need to be very competent in a topic before they feel able to start editing, and often do not know that they can edit Wikipedia
- WikiCode is perceived as very complicated and many people are not willing to invest time in learning it
- The videos are perceived as very informative but at times quite amateurish
Survey: https://de.wikipedia.org/wiki/Datei:Report_WMDE_Online-Kampagne_Neuautorengewinnung.pdf
Benchmark Analysis
We also did a benchmark analysis in which community-building strategies of different organisations (NGOs and commercial companies) were compared. The following aspects were identified as potentially helpful to integrate into Wikipedia:
- establish more opportunities to share contributions or contributors on Social Media
- opportunity to follow editors or their contributions to create more interconnectedness
- exclusive events for editors
- establish exclusive functions for editors
Analysis: https://de.wikipedia.org/wiki/Datei:Wikimedia_Benchmarkanalyse_zum_Community-Building.pdf