Learning patterns/Country editor base survey of South Africa 2015
What problem does this solve?
Typically it is very difficult to serve a country-wide community covering more than eleven language versions of Wikipedia if you do not know who is editing what. This lack of knowledge also creates problems when trying to address other important issues, such as diversity.
What is the solution?
Conduct a nationwide survey of registered editors, advertised through public banner ads on Wikimedia projects such as Wikipedia within the country in question; in this case, South Africa.
Key things with national surveying
- Make sure the people relevant to the survey participate; in this case, editors based in South Africa. This is achieved by making the banner notification visible only in the geographic area you are interested in: South Africa, in Wikimedia South Africa's case.
- Response rates are important; it's not just about who you reach, but who actually answers your survey.
- Give careful consideration to the questions being asked. This is hugely important to get right if you want to conduct follow-up surveys in the future to track change: even a slightly different wording of a question can greatly change the answers survey participants give.
- Work with the Wikimedia Foundation. The Foundation can be very helpful with designing, hosting, and organising the banner ad (CentralNotice) used to host the survey.
- Give the survey population sufficient time to participate. Leaving the survey up for a sufficient length of time increases the proportion of the local community that takes part, since not all editors edit every month. We left ours up for three months in mid-2015.
- The fewer questions the better. People do not like answering long surveys, and short surveys improve both the quality of the data collected and the response rate. Ten questions is great, twenty is okay, thirty is a long survey, and more than thirty is too many.
- Minimise the number of qualitative (free-text) questions. Save such questions for issues you really want to dive deep into or know very little about; most other questions can be simplified to single- or multiple-choice. At most around 10%-20% of your questions should ask for written answers, because survey takers do not like giving written answers and tend to skip such questions when there are too many of them (see the sketch after this list).
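As a rough illustration of the rules of thumb above, here is a minimal Python sketch. It is not part of any Wikimedia tool; the function names, example questions, and numbers are all hypothetical. It checks a draft questionnaire against the question-count and free-text guidelines, and computes a response rate once the survey closes.

```python
def check_questionnaire(questions):
    """questions: list of (text, kind) pairs, where kind is 'choice' or 'free_text'."""
    total = len(questions)
    free_text = sum(1 for _, kind in questions if kind == "free_text")

    # Question-count heuristics from the list above.
    if total <= 10:
        print(f"{total} questions: great length.")
    elif total <= 20:
        print(f"{total} questions: okay.")
    elif total <= 30:
        print(f"{total} questions: a long survey.")
    else:
        print(f"{total} questions: too many; cut some.")

    # Keep free-text questions to roughly 10-20% of the total at most.
    share = free_text / total if total else 0.0
    if share > 0.20:
        print(f"Free-text share is {share:.0%}, above the ~20% rule of thumb.")

def response_rate(completed, reached):
    """Share of the people reached by the banner who actually answered."""
    return completed / reached

# Example with made-up numbers:
draft = [
    ("Which city do you live in?", "choice"),
    ("Which languages do you edit in?", "choice"),
    ("What would help you edit more?", "free_text"),
]
check_questionnaire(draft)
print(f"Response rate: {response_rate(120, 4000):.1%}")
```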
Findings
The infographic drawn up and displayed to the left illustrates the key findings of the survey. It summarises the survey data whilst, hopefully, honestly letting the data tell the story of what the profile of the editing community in South Africa looked like in 2015, and it does so in an easy-to-understand, memorable, and engaging way.
The main finding, or 'story', that emerged from the survey data was the lack of diversity amongst Wikipedia editors in South Africa. In addition to a need for more editors of all backgrounds, there was an even greater need for female editors and for editors eager to edit in languages other than English and, to a lesser degree, Afrikaans. This means that outreach activities should focus more on encouraging women to edit, as well as speakers of isiZulu, isiXhosa, and other languages under-represented on Wikipedia.
Another interesting finding was the heavy concentration of editors in only two of South Africa's urban centres: Cape Town and Johannesburg/Pretoria. This indicates that the chapter can focus on these two cities to serve the existing community, but that efforts to increase diversity might productively be directed outside these areas.
That only 31% of editors had heard of Wikimedia ZA highlighted the need for the chapter to be more public and to engage with the existing community, whilst other results highlighted the 'middle-class' origin of most editors. The age data, by contrast, showed a relatively high level of diversity on that indicator.
Learnings for the future
- Give greater consideration to the survey questions asked. Whilst we did take this seriously, there were questions that, with hindsight, we should have worded differently, dropped, added, or changed to get better insight for follow-up surveys; for example, asking for participants' self-identified ethnic or racial group. That information would have helped track and measure diversity, which in turn would have helped shape future outreach activities to address diversity issues. In future, conducting a focus group beforehand to help draw up and shape the questions would improve this.
- Once the data has been collected, be mindful of which information is most important to highlight in the final analysis. A lot of data is collected, and people not directly involved often lack the interest to wade through all of the statistics. This places a great responsibility on the analyst to identify the important and relevant information to highlight for others.
- Make the analysed information as easily accessible to people as possible. This could involve using multimedia, such as an infographic that simply, concisely, and (very importantly) honestly displays the most important information and gives the most effective portrayal of the community's profile. The infographic displayed to the left is an example of this, created to summarise the findings of the 2015 survey.
- Plan to conduct follow-up surveys. Getting a better understanding of the community the chapter needs to support is good; using that information to develop projects that better serve that community is great; running follow-up surveys to measure the impact of the chapter's efforts is best of all. A sketch of comparing two survey waves follows this list.
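To make "measuring change" between waves concrete, here is a minimal Python sketch of a standard two-proportion z-test. All counts are made up for illustration and the function name is hypothetical; it simply asks whether the share of respondents answering "yes" to the same, identically worded question changed between two waves by more than chance alone would suggest.

```python
import math

def two_proportion_z_test(yes1, n1, yes2, n2):
    """Return (z, two-sided p-value) for a change in a 'yes' proportion
    between two survey waves of sizes n1 and n2."""
    p1, p2 = yes1 / n1, yes2 / n2
    pooled = (yes1 + yes2) / (n1 + n2)  # proportion under "no change"
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 31% of 200 respondents had heard of the chapter
# in the first wave, 45% of 180 in a later wave.
z, p = two_proportion_z_test(62, 200, 81, 180)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests a real change, not noise
```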