
Wikimedia Foundation messaging strategy: 2014–16 communications audit

In late 2016, Minassian Media, the public relations firm working with the Wikimedia Foundation Communications department, conducted a communications audit. To develop an effective Wikimedia Foundation messaging strategy, the Communications department first wanted an assessment of our recent communications efforts; the audit was commissioned to provide that assessment.

The audit analyzes all coverage garnered by Wikipedia and the Wikimedia Foundation across two time periods, October 2014 – October 2015 and November 2015 – May 2016, which the Foundation identified in order to understand how coverage changes over time.

Download report

The report is available in English as a PDF. Due to the copyrighted content within the report, the file is available via Wikimedia Foundation's website rather than Wikimedia Commons.

Methodology

Minassian Media began their analysis by identifying 91 selected publications, a list created in consultation with the Wikimedia Foundation's Communications team.

After defining these outlets, Minassian Media began researching articles written about the Wikimedia Foundation. They started by assessing the catalogue of articles housed on the Wikipedia press coverage page, from which they identified the articles written during period one and period two.

To ensure Minassian Media found all articles written by the selected publications, they continued their analysis by searching Google News for additional content. They did this by running two searches per outlet, one for “Wikipedia” and one for “Wikimedia”, each combined with site:"publication website address". They also visited each outlet’s website separately from Google News and used its search function to find all articles written on “Wikipedia” and “Wikimedia” during the two periods analyzed.
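For illustration, the two query patterns can be sketched as follows. This is a minimal sketch; the outlet names and domains are placeholders, not the audit's actual publication list.

```python
# Sketch: building the two Google News search strings used per outlet.
# The outlet names and domains below are placeholders, not the audit's real list of 91.
outlets = {
    "Example Tech News": "example-tech.com",
    "Example Daily": "exampledaily.org",
}

for name, domain in outlets.items():
    for term in ("Wikipedia", "Wikimedia"):
        query = f"{term} site:{domain}"   # e.g. 'Wikipedia site:example-tech.com'
        print(f"{name}: {query}")
```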

Minassian Media kept track of all articles in a Google Sheet and organized each article by a multitude of factors. First, they read the articles to make sure each was an article about Wikipedia and was not a caption credit for Wikimedia Commons or an article that had nothing to do with our organization.

Minassian Media then coded each article along the following metrics: outlet, author, title of article, date, total number of words, topics mentioned, story origins, sentiment, and an array of 16 keywords. They then categorized articles into period one and period two, based on the date of publication.
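As an illustration of what each coded row might look like, here is a rough sketch using hypothetical field names and the period boundaries given under Scope; it is not the audit's actual spreadsheet schema.

```python
# Sketch of one coded article record; field names are illustrative, not the audit's schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CodedArticle:
    outlet: str
    author: str
    title: str
    published: date
    word_count: int
    topic: str                 # single overriding topic
    story_origin: str          # "proactive", "reactive", or "passive"
    sentiment: dict            # e.g. {"neutral": 70, "positive": 30, "negative": 0}
    keyword_mentions: dict = field(default_factory=dict)  # keyword -> number of mentions

    @property
    def period(self) -> str:
        """Assign period one or two from the publication date (P1 ends October 2015)."""
        return "P1" if self.published <= date(2015, 10, 31) else "P2"
```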

The section that analyzes the number of articles written, words written, and outlet share of voice was created by tallying the totals for each metric and ranking each by category.

The number of articles by country section was created by tallying the total number of articles by outlet. Minassian Media then added the number of articles from each outlet to a list of corresponding countries. To do this, they used each publication’s headquarters as its geographic position (e.g., AllAfrica is based in Nigeria, but as the name implies publishes stories throughout Africa).
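A minimal sketch of that tally, assuming a hypothetical outlet-to-headquarters mapping and coded records like the sketch above:

```python
# Sketch: tally article counts by the headquarters country of each outlet.
from collections import Counter

# Tiny illustrative subset; the real mapping covers all 91 selected outlets.
OUTLET_COUNTRY = {
    "AllAfrica": "Nigeria",            # HQ used as the geographic position, per the methodology
    "The Guardian": "United Kingdom",
    "The New York Times": "United States",
}

def articles_by_country(articles):
    """Count coded articles per country, keyed on each outlet's headquarters."""
    counts = Counter()
    for article in articles:
        country = OUTLET_COUNTRY.get(article.outlet)
        if country is not None:
            counts[country] += 1
    return counts
```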

The topic mentions section was created by carefully reading each article and choosing a single, overriding topic. Because some articles touch on a number of topics or themes, Minassian Media chose the most prominent for the purposes of this audit.

In order to understand how the Wikimedia Foundation's keywords played out in media coverage, Minassian Media identified 16 words that were most important to the Foundation’s work and purpose. This was done in coordination with the Wikimedia Foundation Communications team. Minassian Media then read all articles and noted the number of times each of the 16 keywords was mentioned or discussed.
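A rough sketch of such a keyword count, using placeholder keywords rather than the Foundation's actual list of 16:

```python
# Sketch: count mentions of each keyword in an article's text.
import re
from collections import Counter

# Placeholder keywords; the audit's actual 16-word list is not reproduced here.
KEYWORDS = ["free knowledge", "neutrality", "volunteers"]

def keyword_mentions(text: str) -> Counter:
    """Return how many times each keyword appears in the article text."""
    lowered = text.lower()
    return Counter({kw: len(re.findall(re.escape(kw), lowered)) for kw in KEYWORDS})
```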

Sentiment analysis was conducted by dividing the tone and sentiment of an article among three categories: neutral, positive, and negative. Minassian Media assigned a percent value for each article, with some being 100% in one category and others being dispersed across all three categories. In the case that an article’s sentiment was not 100% in one category, they rounded sentiment to the nearest 10% (e.g., 70% positive, 30% neutral).
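A minimal sketch of that rounding step, with illustrative values; the audit's exact adjustment rule for totals that do not land exactly on 100% is not documented here.

```python
# Sketch: round each sentiment share to the nearest 10%, as described above.
def round_sentiment(neutral: float, positive: float, negative: float) -> dict:
    raw = {"neutral": neutral, "positive": positive, "negative": negative}
    # Naive rounding can leave the total slightly off 100%; the audit's exact
    # handling of such cases is not documented here.
    return {category: round(share / 10) * 10 for category, share in raw.items()}

print(round_sentiment(68, 27, 5))  # -> {'neutral': 70, 'positive': 30, 'negative': 0}
```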

For the top authors section, Minassian Media simply tallied the number of articles each journalist wrote and noted the top ten.

Wikimedia coverage arises in a number of ways. For the purposes of this report, Minassian Media categorized story origins into three groups. “Proactive” describes an article that was part of a strategic media push (e.g., Wikipedia 15); “Reactive” describes an article for which we were asked for a comment and provided one (e.g., Knowledge Engine); and finally “Passive” describes an article that was written completely independently of the Foundation, or one in which we may have been asked to comment and passed on the inquiry.

Approach

We conducted the media audit in two parts:

  • Influential media: 91 selected publications representing the world's most influential and widely circulated outlets, in addition to publications covering specific geographies (Nigeria, India, Mexico, Brazil, Indonesia and Egypt) and audiences (e.g., influential technology publications).
  • Meltwater analysis: we used a third party media resource, Meltwater, to determine the extent and tone of coverage from a much wider cast of publications (roughly 200,000 online sources).

Objectives

  1. Establish benchmarks for measuring our impact
  2. Understand current media perception
  3. Identify areas for improvement in messaging and outreach

Scope

Two timeframes:

  • Period one (P1): October 2014 – October 2015 (13 months)
  • Period two (P2): November 2015 – May 2016 (8 months)

Topics include:

  • Wikipedia
  • The other Wikimedia projects
  • The Wikimedia movement
  • The Wikimedia Foundation

Key findings

These are some, but not all, of the findings from the audit.

Coverage over time

The spikes in Figure 1 indicate larger stories that garnered more coverage in a small amount of time.

  • In March 2015, the Wikimedia versus NSA lawsuit was announced, resulting in more coverage. There were a total of 86 articles written that month, 27 of which were NSA-related.
  • In September 2015, there were 64 articles written, with 20 published on Wikipedia’s decision to ban 381 “sock puppet” accounts.
  • Another large spike in coverage occurred in January 2016 due to Wikipedia 15, Wikipedia’s strongest messaging campaign across both periods. 100 articles were written that month; 46 of these were on Wikipedia’s 15th birthday.
Source: Independent Study

Top topics

Topic mentions identify what conversations are taking place around the community and the Foundation. The prominence of Wikipedia 15 shows that a coordinated campaign within the Foundation can produce a high volume of positive content.

  • After reading each article, we chose a single, overriding topic. Because some articles touch on a number of topics or themes, we chose the most prominent for the purposes of this audit.
  • The highest share of topic mentions across all articles went to vandalism and to articles that mention our organization only in passing; these two topics drove the most media coverage in both time periods.
  • Wikipedia 15 stood out as the Foundation’s best planned and executed messaging strategy between both time periods.

Top Topics Mentioned: Period One

Source: Independent Study

Top Topics Mentioned: Period Two

Source: Independent Study

Top outlets

The outlets and authors that write about Wikipedia are loyal. With one exception, the top five outlets remained the same across both time periods.

Top Outlets: Period One

Source: Independent Study

Top Outlets: Period Two

Source: Independent Study

Sentiment

Sentiment tells us that writers and the public like and trust our organization. The sentiment analysis section finds that coverage was heavily neutral and that sentiment remained nearly constant across both time periods and across all outlets. Articles were neutral 65% of the time and positive 22.5% of the time, on average.

Sentiment Analysis: Period One

Source: Independent Study

Sentiment Analysis: Period Two

Source: Independent Study

Story origins

The origin of stories stayed the same across time, but that might be changing. Passive story origins predominated in both time periods, and none of the three categories changed by more than ±4%. As our social media presence becomes more robust, we will likely see proactive origins increase substantially.

Share of Story Origin: Period One

Source: Independent Study

Share of Story Origin: Period Two

Source: Independent Study

Coverage concentration

Coverage is concentrated in Western Europe and North America.

  • Countries in Western Europe and North America (where we raise the bulk of our funds) have the highest share of articles published about Wikipedia/Wikimedia.
  • Awareness is high in those areas, but there is space to emphasize how Wikipedia works and what our core values are.
  • Coverage is comparatively low across East Asia and in countries where the New Readers project is focused (Nigeria, India, Mexico, Brazil, Indonesia and Egypt).

Source: Independent Study

Country                   Total articles: Period one    Total articles: Period two
Australia                 9                             8
Brazil                    2                             3
Canada                    11                            9
China                     1                             3
Egypt                     0                             0
France                    11                            14
Germany                   13                            24
India                     20                            12
Indonesia                 1                             2
Italy                     3                             10
Mexico                    4                             3
New Zealand               1                             1
Nigeria                   0                             4
Spain                     13                            3
United Kingdom            166                           41
United States             379                           207

Source: Independent Study

Recommendations

Campaigns work

When we take the time to plan sufficiently around big news (e.g. the 15th birthday or suing the NSA), we get big results.

Recommendations:

  • Engage communities.
  • Develop supporting multimedia materials for big pushes.
  • Share new, previously unreleased information (e.g. the most edited articles for WP15).
  • Activate around actual news.
  • Leverage talent across teams (e.g. product, fundraising, legal + community).
  • Advance planning is key for everything: big campaigns, product launches, getting any news!

Engage globally

As a global organization, we need to be embedded in the global community. We tout the hundreds of languages in which Wikipedia is available, yet we mostly engage in dialogue with reporters in countries that happen to be English-speaking.

Recommendations:

  • Identify spokespeople within local Wikimedia chapters.
  • Ensure that there is advance translation of materials for global announcements.
  • Create a global list of Wikipedia-dedicated journalists and include them in press announcements, big or small.
  • Focus on Asia (with an emphasis on Hong Kong and Japan). Note: Japan is one of our largest sources of traffic, but receives a comparatively small share of coverage relative to countries with lower traffic.
  • Take advantage of ink in Spanish-language wires: little effort, huge payoff (e.g. the Asturias award).

Vandalism pushback

Vandalism is one of the most common topics across all articles. This presents a problem for our messaging strategy, because it directs global readers' attention disproportionately toward one specific aspect of Wikipedia.

Vandalism accounted for almost 20% of all articles across both time periods.

However, according to a recent study, most edits are constructive, with vandalism occurring in only 7% of edits.

Recommendation: We can turn negative press coverage into positive press coverage through coordinated efforts to change the narrative.

  • Highlight the good work our community does in policing vandalism (e.g. a story highlighting the top five “vigilante” editors who frequently catch and fix vandalism would put a face on Wikipedia’s reliability and accuracy).
  • Use media inquiries around vandalism to get out positive messages of contribution and encourage people to fight back by helping, not hurting. We should rapidly issue short responses (two sentences max) containing only that main message, to get it out consistently.

Leader in the open movement

We have an opportunity to double-down on several elements of the “open” narrative. Currently, we aren’t getting a ton of credit for our work in open software development. But that is only part of the equation.

Recommendations:

  • Do more storytelling about our work on open software development.
  • Continue to have conversations on Open vs. Closed (as opposed to Left vs. Right).
  • Maintain a continued presence at events like MozFest and the OpenStreetMap conference.

Talk neutrality post-election

Given the record levels of partisanship, polarization, and opinion-driven media (embodied perhaps most of all in the U.S. election), we have an opportunity to build on the neutrality message. The post-election space gives us an opening to do this, and that climate is unlikely to change anytime soon.

Recommendations:

  • Think about a post-election gag about getting "back to facts now."
  • Bring Wikipedia’s "Citation needed" to the media (both mainstream and fringe) by pushing out our balanced coverage of the most controversial subjects.
  • Think about introducing a breaking news Twitter feed that pushes out neutral content when controversy breaks.

Find the controversy

If it bleeds, it leads. We saw this when we sued the NSA, and we also tend to get the most coverage outside the U.S. when there is a legislative issue or event.

Recommendations:

  • Find hot-button issues that affect us.
  • Don’t be afraid to get in the fray, clap back.
  • Quickly find a non-Foundation partner.

Leverage studies of us

Right now, we get relatively little coverage out of studies about our reliability and neutrality.

Recommendation: Think about positioning Wikimania as (in part) a scientific conference. Encourage people to publish major research about Wikipedia, free knowledge, open movements, etc. at our annual conference to get more coverage.

Future audits

We recommend doing audits on a regular, annual basis. If this methodology is satisfactory, we can continue this way. But we also welcome any suggestions for improvements.

Proposed timeline: We recommend starting a new audit in Q1 2017 and delivering it in July 2017.

Questions to consider:

  • Which outlets?
  • Which keywords?
  • What timeframes?
  • Quarterly review of future audits?