Strategy/Wikimedia movement/2017/Sources/Considering 2030: Credibility versus propaganda and misinformation
As the Wikimedia movement looks toward 2030, how can we help people find a reliable source of knowledge?[Note 1]
As part of the Wikimedia 2030 strategy process, the Wikimedia Foundation is working with independent research consultants to understand the key trends that will affect the future of free knowledge, and to share those findings with the Wikimedia movement.[1] This report was prepared by Dot Connector Studio, a Philadelphia-based media strategy and research firm with a particular focus on how emerging platforms can be used to influence society, together with Lutman & Associates, a St. Paul-based firm specializing in strategic planning and evaluation at the intersections of culture and media.
Building a strong foundation
Since Wikipedia's launch in 2001, we have built a transparent, effective, and durable editorial policy for our encyclopedia: we rely on contributions from volunteer writers, who cite the sources their material comes from, so that readers can check where a piece of information originated and verify whether it is correct and accurate. Editors can always change Wikipedia's content when it no longer complies with policies and guidelines, but in such cases they must explain the edits they make.
To be sure, the process by which Wikipedia evolves can be somewhat complicated. Editorial disputes arise, especially in articles on politically or culturally sensitive topics,[2][3] or on topics that could affect an entity's commercial success and profit.[4] Wikimedia projects (across their many languages and countries) can also be affected by censorship and by misinformation serving governmental, political, cultural, or commercial interests, or even by the insertion of entirely fabricated content. All of these problems may be compounded by current and future developments in how information is published and verified through audio and visual media. News organizations today worry about several challenges, chief among them "fake news" and the misinformation spreading around the world, and these may well be the same challenges Wikipedia faces in the years ahead as it struggles to keep its content reliable. Problems with educational resources (which can serve as references for Wikipedia articles) offer another example of how content can be manipulated.
As part of our work on the Wikimedia 2030 strategy process, we reviewed more than one hundred reports, articles, and academic studies to identify the most important current and future problems in the spread of misinformation, and how they might affect Wikipedia. The results of this review are summarized in the table below.
The future landscape of misinformation
To consider the misinformation problems that Wikimedia may have to contend with over the next ten to fifteen years, it is useful to divide them into two basic categories, "content" and "access", each shaped by three important worldwide influences.
- Content: the problems that can affect the sources Wikipedians rely on when writing articles.
- Access: the ability of internet users to reach and benefit from Wikipedia.
The influences shaping these two dimensions worldwide include technology, governments and their politics, and commerce.
| | Content | Access |
| --- | --- | --- |
| Technology | Information is generated in new ways, such as artificial intelligence, bots, big data, and 3D | Content reaches readers through new channels, such as wearables, voice-activated software, and more |
| Government and politics | The growing spread of misinformation, and threats to academic and press freedom | Censorship and blocking of Wikipedia or other sites, barring users from the internet, and surveillance of what they browse |
| Commerce | Sponsored research, commercial advertising, and promotional content | The spread of filter bubbles |
Influence: technology
Technology continues to evolve rapidly, which makes keeping up with it a constant challenge.
Content
Technology creates many conveniences, and many difficulties, in content production. For example, bots have contributed to Wikimedia projects since their earliest days; they are designed to perform many routine maintenance tasks, such as inserting links, correcting spelling errors, and fighting vandalism. Some Wikipedia language editions have a bot policy that requires bots to be registered and approved before being put to use. On English Wikipedia, a body of contributors (called the Bot Approvals Group) oversees these programs' contributions.
Technology-driven changes to content may be only the first of our problems. Over the next fifteen years, artificial intelligence software is expected to become far more sophisticated; as a recent article in Scientific American puts it: "It is expected that supercomputers will surpass human capabilities in almost all areas, somewhere between 2020 and 2060."[5] Big data will also keep growing, relying ever more on automated means of collecting, organizing, and analyzing information.
Wikimedia has already begun to benefit from this future. We now have sophisticated tools such as the Objective Revision Evaluation Service (ORES), which uses artificial intelligence to help Wikipedia editors identify damaging edits, based on how other editors have reverted similar edits in the past. In an "Ask Me Anything" thread posted to Reddit in June 2017, Wikimedia Foundation research scientist Aaron Halfaker described both the promise and the pitfalls of this new service: "Automated predictions can easily sway human judgment. If our AI insists on labeling certain edits as damaging, a human contributor may be convinced even when the edits are (in fact) beneficial... so we are trying to be very careful about how we develop our software."[6]
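Services like ORES expose their judgments as probability scores, and tools built on top of them typically route edits by thresholding those scores. The sketch below is a hypothetical illustration of that pattern, not ORES's actual pipeline; the function name and threshold values are invented. It shows one way to keep human judgment in the loop, as Halfaker urges: low-scoring edits pass through, mid-scoring edits go to a human reviewer, and only very confident scores are flagged outright.

```python
def triage(damage_prob: float,
           review_threshold: float = 0.5,
           auto_flag_threshold: float = 0.9) -> str:
    """Route an edit based on a model's predicted probability of damage.

    The model informs rather than replaces human judgment: only scores
    above a high-confidence threshold are flagged automatically, while
    the ambiguous middle band is deferred to a human reviewer.
    """
    if not 0.0 <= damage_prob <= 1.0:
        raise ValueError("damage_prob must be a probability in [0, 1]")
    if damage_prob >= auto_flag_threshold:
        return "flag"            # model is very confident: surface prominently
    if damage_prob >= review_threshold:
        return "human review"    # ambiguous: defer to a person
    return "accept"              # likely good-faith: let it through

# Example routing for three hypothetical edits:
for p in (0.95, 0.60, 0.10):
    print(p, "->", triage(p))
```

Lowering `review_threshold` widens the band of edits a human must examine, trading reviewer workload against the risk the model's confident mistakes go unchecked.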
This unprecedented automation of knowledge production and information analysis will bring us both conveniences and challenges.
On the bright side, such software will help us produce information. For example, researchers are now developing ways to use artificial intelligence in television programming to help microphones pick up speech from the correct source.[7] Such advances could help contributors to Wikimedia projects create visual content as they work. In a report on how journalism may change in the future, the Associated Press notes that advances in artificial intelligence will help journalists and others "analyze data, discover the trends hidden within it, draw insights from multiple sources to see things the human eye could never see, turn data and spoken words into text and text into audio and video, understand people's sentiment, and analyze video for objects, faces, and colors, and much more."[8]
AI can also help shape learning environments, directing users to appropriate knowledge sources based on data about how they have previously interacted with similar information resources, and revealing insights about how and when these resources are valuable.[9] As a result, for example, AI might even be used to assemble various Wikipedia articles into custom textbooks on the fly. It is not difficult to imagine how Wikimedia editors could deploy such advancements to strengthen Wikimedia content.
However, the development of new tools also can lead to more misleading content that could pose challenges when sourcing entries: "At corporations and universities across [the U.S.], incipient technologies appear likely to soon obliterate the line between real and fake….[A]dvancements in audio and video technology are becoming so sophisticated that they will be able to replicate real news—real TV broadcasts, for instance, or radio interviews—in unprecedented, and truly indecipherable, ways," predicts Vanity Fair writer Nick Bilton.[10]
In response, between now and 2030, the Wikimedia movement will need to remain vigilant and develop new methods of verification that match these new technological capabilities. In turn, this means the process for determining verifiability and reliable sources may need to evolve, or the movement may need to build its own corresponding tools to keep up with edits from competing interests.
As Kevin Kelly observes in The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, "Wikipedia continues to evolve its process. Each year more structure is layered in. Controversial articles can be 'frozen' by top editors so they can no longer be edited by any random person, only designated editors. There are more rules about what is permissible to write, more required formatting, more approval needed. But the quality improves too. I would guess that in 50 years a significant portion of Wikipedia articles will have controlled edits, peer review, verification locks, authentication certificates and so on."[11]
Access
Technology also presents myriad obstacles to accessing the content served up on Wikimedia platforms. For example, the browser-based, computer-accessed model for Wikipedia is already challenged; depending on where in the world you live, you are likely to be reading and editing Wikipedia entries on a mobile device rather than a computer. This trend toward mobile is well underway and increasingly global.[12] (In future briefs, we'll explore how other developments in technology, such as the increasing sophistication of wearables and the rise of audio-based virtual assistants, will affect access to Wikipedia and other Wikimedia projects.)
Influence: government and politics
Governments and political actors have the power to both suppress and distort content, and to restrict access to Wikimedia platforms. For example, the Turkish Internet Regulator blocked access to all language versions of Wikipedia on April 29, 2017.
Content
Governments around the world can and do monitor and crack down on activists, journalists, academics, and other citizens who might otherwise create reliable source material for Wikimedia projects, or create and edit Wikimedia content themselves. Textbooks and reference content are also a target for repressive regimes, as noted in a June 2017 report from the U.S.-based political rights organization Freedom House.[13] In India, for example, model textbooks published by the National Council of Educational Research and Training have been accused of reflecting the political views of those in power.
This will continue to be a crucial challenge to realizing Wikimedia's vision over the next 15 years. The most recent analysis of press freedom worldwide from Reporters Without Borders declared that the globe is "darkening", as press freedom decreases in specific nations, and generally across the globe.[14]
Meanwhile, authoritarian governments continue to persecute and prosecute journalists. Turkey jailed 120 journalists between July and November 2016, following a failed coup against the government in power.[15] Reporters Without Borders argues that the "political rhetoric" used by U.S. President Donald Trump in the 2016 U.S. presidential election has influenced political discourse on press freedom around the globe. In Tanzania, for example, President John Magufuli warned newspapers that "their days are numbered."[16] Such government repression not only reduces the source material available to Wikipedia editors, but can also chill those seeking to produce or verify information.
A related trend among governments and political actors is purposeful propagation of misinformation, disinformation, or propaganda. This sort of manipulation may weaken the overall information ecosystem, creating an overall culture of doubt related to the reliability of online information. This could have an impact on the reliability of sources, and therefore content, on Wikipedia. The current global battle to identify and control misinformation seems likely to shape the information environment over the next 15 years. And not all disinformation is the same: conflation of what, exactly, constitutes "fake news," inspired Claire Wardle, research director for First Draft News, to create a taxonomy of "mis" and "dis" information, ranging from satire to propaganda to all-out fabricated accounts.[17]
For example, a 2016 paper published by RAND Corporation charges Russia with trafficking in a "firehose of falsehood," broadcasting "partial truths or outright fictions" on high numbers of communication channels, following a philosophy of quantity over quality.[18] BuzzFeed reported in January 2017 that in France, Trump supporters in the U.S. were posing as French citizens online to manipulate the French election outcomes.[19] Days before the French election, hackers breached servers for leading candidate (and eventual victor) Emmanuel Macron, in an attack likened to the theft of emails from the U.S. Democratic National Committee.[20]
This spread of misinformation online is occurring despite recent growth in the number of organizations dedicated to fact-checking: worldwide, at least 114 "dedicated fact-checking teams" are working in 47 countries.[21]
Looking into the future, what can we safely expect? First, global freedom of expression will wax and wane with national and international political developments. Less clear is whether global trends toward autocracy will continue, or whether free societies will have a resurgence, grappling successfully with pressures on the press and the academy and with the politicization of facts into merely individual, biased perspectives.
Second, we can expect that politically motivated disinformation and misinformation campaigns will always be with us. Indeed, the phenomenon of "fake news," misinformation, or "alternative facts" can be traced to some of the earliest recorded history, with examples dating back to ancient times.[22]
The Wikimedia movement will need to remain nimble, and its editors well-versed in the ever-morphing means by which information can be misleading or falsified. It will help to keep abreast of the verification techniques developed and used by journalists and researchers, such as those described in the Verification Handbook, available in several languages.[23]
Access
A May 2017 study commissioned by the Wikimedia Foundation from Harvard's Berkman Klein Center shows that outright censoring of the Wikipedia platform or articles is trending down globally, and was doing so even before June 2015, when the Wikimedia Foundation deployed HTTPS, a technology that makes blocking individual pages more difficult.[2]
With HTTPS, censors cannot see which page within a website was visited, so to restrict access to a single page they must block the entire site. The Berkman Klein study combined client-side availability monitoring data from 40 countries with server-side data used to find anomalies in requests for a set of 1.7 million articles across 286 Wikipedia language projects.[2]
However, while the overall trend may be down, the analysis did show that certain governments, such as China, are continuing to censor, with full censorship of the zh.wikipedia.org domain through 2016. (Access appears to be allowed to other Wikimedia subdomains.) The analysis also found anomalies that could indicate censorship, but have not been explained, such as Thailand's apparent blocking of Yiddish-language Wikipedia.[2]
Overall, the authors conclude: "[T]he shift to HTTPS has been a good one in terms of ensuring accessibility to knowledge." A technological change on the back end, in other words, has improved access on the front end. In general, technological solutions (HTTPS) to technological problems (outright blocking) can sometimes bring relief—until the next technological challenge emerges.[2]
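The mechanism behind this finding is a basic property of TLS: a passive observer can typically learn which hostname is being contacted (via DNS lookups and the TLS Server Name Indication field), but the path of the requested page travels encrypted. The sketch below is a simplified, hypothetical illustration of that visibility split; the function and its output format are invented for this example and do not model real packet capture.

```python
from urllib.parse import urlparse

def observer_view(url: str) -> dict:
    """Approximate what a passive network observer sees for one request.

    Under HTTPS, the hostname leaks (DNS, TLS SNI) but the path and
    query stay encrypted; under plain HTTP the full URL is on the wire.
    """
    parts = urlparse(url)
    if parts.scheme == "https":
        return {"visible": parts.hostname, "hidden": parts.path}
    return {"visible": f"{parts.hostname}{parts.path}", "hidden": ""}

# A censor watching HTTPS traffic to a wiki sees only the domain, so it
# must block the whole site to suppress any one article:
print(observer_view("https://zh.wikipedia.org/wiki/Some_Article"))
```

This is why the study observes whole-domain blocking (as with zh.wikipedia.org) rather than page-level censorship after the HTTPS transition: the per-page signal a censor would need is simply no longer visible.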
Influence: commerce
Commercial actors may also influence both access and content for the Wikimedia movement, which deliberately models a platform that is free and open to contributions.
Content

The rise of commercial social media platforms such as Twitter and Facebook over the last decade has been accompanied by a concurrent decline of traditional media institutions and of trust in them. This is true even within open societies that once enjoyed a competitive and productive press sector, creating concerns about the new ways misinformation is filtered, delivered online, and used in public discourse and decision-making.
There is also some overlap with other challenges noted above—for example, spurious technology claims may be disseminated in order to drive up stock prices, or misinformation campaigns aimed at affecting public policy may be run for profit-seeking reasons, as when companies or industry groups pay for research to prove a policy point. For example, an examination of industry documents over time showed that the U.S. sugar industry throughout the 1960s and 1970s sponsored research that promoted dietary fat as a bigger health risk than sugar.[24]
Online platforms have recently announced steps to address the dissemination of misinformation. Google has introduced changes to its search function, returning fact-checking organizations' content alongside results. It has also introduced ways to report problematic results (e.g., autocomplete suggestions questioning whether the Holocaust happened).[25] Facebook has offered fact-checks on articles based on their URLs, along with tips on media literacy.[26]
Feature and functionality changes by these large platforms in the coming years may both inform and compete with parallel approaches developed for the Wikimedia projects.
The next frontier in understanding how to combat misinformation involves developing a more sophisticated grasp on how networks help to spread it. The Europe-based Public Data Lab released the first several chapters of a "Field Guide to Fake News," which emphasizes the importance of building tools that reveal the "thick web" of how fake news spreads online—showing how a story or idea is shared across networks to help viewers put it in context.[27]
The next level of innovation may involve ubiquitous fact-checking—a solution that could potentially leverage Wikipedia content as a key source. For example, the nonprofit organization Hypothes.is continues to develop an open platform that supplies a layer on the web where users can create annotations that give context to the content.[28] In addition, "big data" can be harnessed to help provide context to public discourse, for example by tapping into data about the flow of money between political entities engaged in misinformation.[Note 2]
With the infusion of philanthropic investment and widespread experimentation, there is a chance that social media networks or online information companies such as Google may successfully tweak algorithms to prevent some of the most widespread sharing of false information online. However, new technologies are likely to inspire profit-seekers and political actors to develop new ways to manipulate systems, and extend these manipulations beyond text to other media, such as data, videos, photos, and more.
Access
Concerns over how commercial companies could limit access to Wikimedia platforms over the next 15 years will be addressed in forthcoming briefs on the future of the commons and the emergence and use of new platforms. These concerns include, but are not limited to, battles over net neutrality, the rise of proprietary apps and platforms, and corporations' willingness (or unwillingness) to provide access to Wikipedia content from within their own content properties and devices.
Concluding thoughts and questions
How might Wikimedia plan for combating misinformation and censorship in the decades to come?
- Encourage and embrace experiments in artificial intelligence and machine learning that could help enrich Wikipedia content.
- Track developments in journalism and academia for new ways to fact-check and verify information that may be used as sources on Wikimedia platforms, such as techniques for evaluating video and other new media, which are also valuable for content.
- Collaborate with other public interest organizations to advocate for press freedom, free speech, universal internet access, and other policy goals that ensure access and the free flow of information.
- Continue to carefully monitor access to Wikimedia platforms from around the globe, deploying technical changes where appropriate.
- Monitor the solutions being developed by commercial platforms and publications, both to see how they might improve content verification methods on Wikimedia platforms and to see whether they offer opportunities for increasing access to that content.
Notes
- ↑ Many of the links on this page point to English Wikipedia, the language in which it was originally written; however, corresponding links to wiki policies and guidelines exist on most Wikimedia sites.
- ↑ See, for example, data available from the Center for Responsive Politics on campaign donors, lobbyists, and more in the U.S.; and Transparency International's data on lobbyists in the European Union as well as other data.
References
- ↑ "How will external forces help or hinder the Wikimedia movement in the future? – Wikimedia Blog". Retrieved 2017-07-13.
- ↑ a b c d e Clark, Justin, Robert Faris, Rebekah Heacock Jones. Analyzing Accessibility of Wikipedia Projects Around the World. Cambridge: Berkman Klein Center for Internet & Society, 2017. Accessed May 25, 2017.
- ↑ Alcantara, Chris. "The most challenging job of the 2016 race: Editing the candidates' Wikipedia pages." Washington Post. October 27, 2016. Accessed May 25, 2017.
- ↑ Kiberd, Roison. "The Brutal Edit War Over a 3D Printer's Wikipedia Page." Motherboard. March 23, 2016. Accessed June 1, 2017.
- ↑ Helbing, Dirk, Bruno S. Frey, Gerd Gigerenzer, Ernst Hafen, Michael Hagner, Yvonne Hofstetter, Jeroen van den Hoven, Roberto V. Zicari, and Andrej Zwitter, "Will Democracy Survive Big Data and Artificial Intelligence?" Scientific American. February 25, 2017. Accessed May 28, 2017. https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/.
- ↑ Halfaker, Aaron. "I'm the principal research scientist at the nonprofit behind Wikipedia. I study AIs and community dynamics. AMA!" Reddit. June 2, 2017. Accessed June 7, 2017.
- ↑ Watzman, Nancy. "Internet Archive's Trump Archive launches today." Internet Archive Blogs. January 5, 2017. Accessed May 19, 2017.
- ↑ Marconi, Francesco, Alex Siegman, and Machine Journalist. The Future of Augmented Journalism. New York: Associated Press, 2017. Accessed May 30, 2017.
- ↑ Luckin, Rose, Wayne Holmes, Mark Griffiths, and Laurie B. Forcier. Intelligence Unleashed: An Argument for AI in Education. London: Pearson, 2016. Accessed June 8, 2017.
- ↑ Bilton, Nick. "Fake news is about to get even scarier than you ever dreamed." Vanity Fair. January 26, 2017. Accessed May 30, 2017.
- ↑ Kelly, Kevin. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Viking, 2016.
- ↑ GSMA. "The Mobile Economy 2017." Accessed June 1, 2017.
- ↑ Puddington, Arch. Breaking Down Democracy: Goals, Strategies, and Methods of Modern Authoritarians. Washington, D.C.: Freedom House, 2017. Accessed June 8, 2017.
- ↑ Reporters Without Borders. "2017 World Press Freedom Index – tipping point?" April 26, 2017. Updated May 15, 2017. Accessed May 28, 2017.
- ↑ Nordland, Rod. "Turkey's Free Press Withers as Erdogan Jails 120 Journalists." The New York Times. November 17, 2016. Accessed June 7, 2017.
- ↑ Reporters Without Borders. "Journalism weakened by democracy's erosion." Accessed May 29, 2017.
- ↑ Wardle, Claire. "Fake News. It's Complicated." First Draft News. February 16, 2017. Accessed June 7, 2017.
- ↑ Paul, Christopher and Miriam Matthews. The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica: RAND Corporation, 2016.
- ↑ Broderick, Ryan. "Trump Supporters Online Are Pretending To Be French To Manipulate France's Election." BuzzFeed. January 24, 2017. Accessed June 7, 2017.
- ↑ Tufekci, Zeynep. "Dear France: You Just Got Hacked. Don't Make The Same Mistakes We Did." BuzzFeed. May 5, 2017. Accessed June 7, 2017.
- ↑ Stencel, Mark. "International Fact-Checking Gains Ground, Duke Census Finds." Duke Reporters Lab. February 28, 2017. Accessed June 7, 2017. https://reporterslab.org/international-fact-checking-gains-ground/.
- ↑ Darnton, Robert. "The True History of Fake News." The New York Review of Books. February 13, 2017. Accessed June 7, 2017.
- ↑ Silverman, Craig, ed. Verification Handbook: A Definitive Guide to Verifying Digital Content for Emergency Coverage. Maastricht: European Journalism Centre, 2016. Accessed May 29, 2017.
- ↑ Kearns, Cristin E., Laura A. Schmidt, and Stanton A. Glantz. "Sugar Industry and Coronary Heart Disease Research: A Historical Analysis of Internal Industry Documents." JAMA Intern Med 176, no. 11 (2016): 1680-1685. Accessed June 8, 2017. doi:10.1001/jamainternmed.2016.5394.
- ↑ Gomes, Ben. "Our latest quality improvements for search." The Keyword. Google. April 25, 2017. Accessed May 19, 2017.
- ↑ Simo, Fidji. "Introducing: the Facebook Journalism Project." Facebook Media. January 11, 2017. Accessed May 19, 2017.
- ↑ Public Data Lab and First Draft News. "A Field Guide to Fake News." Accessed May 19, 2017.
- ↑ The Hypothesis Project. "To Enable a Conversation Over the World's Knowledge: Hypothesis Mission." Accessed 22 May 2017.