Grants:Project/JackieKoerner/Investigating the Impact of Implicit Bias on Wikipedia

status: not selected
summary: Implicit bias is a problem for Wikipedia because it affects the range of available content and inhibits contributor retention. Implicit bias is unconscious bias that is learned beginning in childhood. It is our default setting. Without challenging our biases and reflecting on how they impact our behavior, our biases become problematic. This investigation will examine the impact implicit bias has on Wikipedia and how to change it.
target: English Wikipedia
amount: $79,117 USD
grantee: Jackiekoerner
contact: jackie.koerner(_AT_)gmail.com
volunteer: Barbara Page
researcher: Jackiekoerner
created on: 03:35, 27 September 2017 (UTC)


Project idea


What is the problem you're trying to solve?


Implicit bias is a problem for Wikipedia because it affects the range of available content and inhibits contributor retention. Knowledge is powerful: it can preserve cultures, histories, and societies, yet implicit bias stands in the way of curating it. The implicit bias contributors experience on Wikipedia has a very real impact: it keeps content off Wikipedia and excludes people from the community. If we don't have everyone contributing, and at their fullest potential, we have no hope of collecting the sum of all human knowledge. This proposed project is an in-depth investigation of the impact of implicit bias on English Wikipedia. The data from this proposed project will help the community and me better understand the problem implicit bias poses for Wikipedia. The data will inform educational solutions and policy recommendations, both of which I will design with community input, to begin to change the impact of implicit bias on Wikipedia.

Implicit bias is unconscious bias that affects our actions, behaviors, and understanding without our being aware of it. It is learned through socialization beginning in childhood, and it is reinforced through social interactions with like-minded individuals and through messages in social media, literature, and other cultural and environmental channels. Without challenging our biases and reflecting on how they affect our behavior, our biases can become problematic. They already have become problematic for Wikipedia.

The problems caused by implicit bias on Wikipedia show up in many ways, including barriers to participation, inhibited communication and collaboration, and content limited by unequally applied notability and verifiability standards.[1] The technological learning curve, together with the potential for (and reality of) adversarial experiences with community members, keeps people from participating in the fullest sense. These situations both originate in and are perpetuated by implicit bias.[2] We need to "break down the social, political, and technical barriers preventing people from accessing and contributing to knowledge." All of these barriers and experiences place negative pressure on volunteers seeking to contribute.

Wikipedia relies on volunteers to create and curate its content. Volunteers contribute to Wikipedia because of what is important to them as individuals. To quote the 2017 strategic direction: "What brings us together is not what we do; it's why we do it." Motivations range from something as broad as advancing free knowledge to something as simple as fixing a specific grammatical error. Experiences can change a person's decision to participate in Wikipedia: a bad experience might lead them to contribute sparingly, or to leave altogether. All volunteers are valuable, but are we as a community valuing everyone? Wikipedia is the encyclopedia anyone can edit. Yet not everyone is contributing. Not everyone feels welcome. Not everyone gets to see their culture on Wikipedia. Contributors reported avoiding Wikimedia projects more frequently than Facebook and Twitter because they felt unsafe and uncomfortable. This is not exclusively about harassment. It's about something more subtle, but just as damaging. It's unconscious, and it seeps into all of our actions and interactions. Implicit bias is preventing Wikipedia from reaching its true potential as a community: "a world in which every single human being can freely share in the sum of all knowledge."

Contributors on Wikipedia are passionate individuals, and communication becomes complicated when people feel passionately about situations. Text-based communication complicates things further by removing the human aspect of dialogue: there are no facial expressions or vocal inflections to help decode the intended meaning of actions or words. Additionally, responses are fueled by implicit bias, as bias underlies our personal beliefs. Implicit bias is our default setting. In situations where people feel threatened, feel passionate about something, or feel that something they are passionate about is threatened, implicit bias drives the response. These responses can lack empathy, civil discourse, and objectivity.[3] The Community Engagement Insights survey noted contributors found both group discussions and consensus building dissatisfying. These issues with bias affecting communication and collaboration show up in the social aspects of Wikipedia. Their real impact is yet to be investigated, which is what this proposed project intends to do.

Further, content on Wikipedia is limited by implicit bias, which shows up in discussions and policies about notability and verifiability. The existing content and policies are driven by the interests of the most active contributors, who are largely white and cisgender male.[4] Studies have shown 84% of Wikipedia articles focus on Europe and North America.[5] Additionally, many of the existing articles about emerging communities are authored by contributors in developed communities, who construct content through their own implicit biases, which can lead to imbalanced or biased content.[6] This bias does not affect only articles about emerging communities; it is simply more obvious there.

Wikipedia started as an experiment, and we are still learning and growing as a community. We need to find out more about ourselves in order to grow into a community truly committed to curating the sum of all human knowledge. As humans are the authors of Wikipedia, our contributions reflect our own implicit biases, and without critical reflection we may not recognize their true impact. Implicit bias is affecting Wikipedia's content, collaboration, communication, and policy development. This is a cyclical problem woven into interconnected facets of Wikipedia. This project proposal connects with each of the themes from the strategic direction conversations. Knowing where implicit bias affects content creation will help us develop reliable, neutral, high-quality content. Making implicit bias something we recognize in ourselves and challenge in others will strengthen community health. Acting on our implicit biases can be destructive. For the sake of our values as a movement for knowledge, outreach, and advancing education, we need to take a "we don't do that here" approach where implicit bias becomes destructive. The first step is truly understanding implicit bias on Wikipedia. Understanding its impact will get us to where we want to be by 2030. Only by further understanding the impact can we begin to develop solutions.

What is your solution to this problem?


I will investigate the impact of implicit bias on English Wikipedia and use the data from the investigation to design tools to educate the community about implicit bias. Educational content will show contributors how to recognize bias, limit its impact on contributions and interactions, and challenge bias on-wiki. I will also make recommendations for policies and policy changes. Further solutions may be created based on the data from the investigation.

Project goals


This proposed project will fulfill the following goals:

  1. Conduct an in-depth investigation of implicit bias’ impact on English Wikipedia to better understand the problem.
  2. Use data to develop educational solutions to change implicit bias’ impact on Wikipedia.
  3. Increase knowledge about how bias impacts participation, policy, and infrastructure.
  4. Create recommendations for policies and policy changes.

Project impact


How will you know if you have met your goals?


This proposed project does not have quantifiable goals. I will know I have met my goals by completing what I set out to do and by listening to community feedback; I encourage the community to be involved throughout and to share their feedback. I expand on how I plan to achieve my goals below:

During this proposed project, I will observe and review talk pages, discussion pages, and other locations where conversations take place between contributors, whether on- or off-wiki. I will begin with topics where implicit bias has an overt impact, such as gendered topics. I will then examine conversations on popular pages, the Village Pump, Articles for Deletion, Requests for Comment, policy and procedure pages, and others as appropriate for the investigation. I will also welcome community participation in providing examples of implicit bias documented on-wiki, in conversations off-wiki, and through personal experiences. I will examine the documented policies and procedures. Throughout the proposed project, I will continue to review published material by other authors regarding bias on Wikipedia and related topics.
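
The proposal does not prescribe tooling for collecting this on-wiki discussion. As a minimal sketch of one possible approach, assuming Python with the requests library and the public MediaWiki API (the page title below is only an example), recent revisions of a discussion page could be pulled like this:

  import requests

  API = "https://en.wikipedia.org/w/api.php"

  def fetch_revisions(title, limit=20):
      """Yield recent revisions (author, timestamp, wikitext) of a page."""
      params = {
          "action": "query",
          "format": "json",
          "formatversion": 2,
          "prop": "revisions",
          "titles": title,
          "rvprop": "content|user|timestamp",
          "rvslots": "main",
          "rvlimit": limit,
      }
      data = requests.get(API, params=params).json()
      for rev in data["query"]["pages"][0].get("revisions", []):
          yield rev["user"], rev["timestamp"], rev["slots"]["main"]["content"]

  # Example: skim recent activity on a Village Pump page (illustrative title).
  for user, ts, text in fetch_revisions("Wikipedia:Village pump (policy)"):
      print(ts, user, len(text))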

The investigation and supporting information will be developed into a narrative and a report, both published on-wiki upon completion. The narrative will give readers an overview of the findings and the impact of bias on contributors; the report will contain more detailed information.

Once the investigation portion of the proposed project is over, educational materials will be developed with targeted learning outcomes. Other solutions, as guided by the data, will also be developed. These solutions will reduce the impact of implicit bias on Wikipedia by teaching people how to challenge their own biases and how to become allies who challenge implicit bias on Wikipedia. A post-survey will be available to all contributors who use the educational materials, to gauge how well the materials address the learning outcomes.

As determined by the 2016-17 Community Engagement Insights report, self-awareness in the community is low. Responses were low on the community-health questions about empathizing with others, being aware of one's biases, and being aware of defensiveness. In other words, people on Wikipedia struggle to empathize with others, are not aware of their biases, and are not aware of their defensiveness. While not all contributors will consume the educational materials developed through this project, the hope is that by educating some contributors about implicit bias, the community culture will be further encouraged to shift in a positive direction.

Do you have any goals around participation or content?


Community participation, through both feedback and direct involvement, will enhance the activities of this proposed project. The broader Wikimedia community will be able to follow the progress of this proposed project on Meta and provide feedback on educational materials in the draft stages. A community discussion session can be organized among interested Wikimedia volunteers.

The community will participate by sharing information about this project and raising awareness of it.

I plan to have community members participate in interviews and share examples of bias with me. I also seek their input on the educational materials and policy recommendations to be developed.

I will convene with community members at Wikimania 2018 and other conferences I attend to share information about the project as well as to listen to their thoughts about it.

After the educational materials are available to the community, I will rely on community members to continue sharing the materials so their impact persists.

I welcome additional community participation throughout the project.

Project plan


Activities


The methodological approach for this proposed project is grounded theory: the data found through the investigation will drive the solutions and further research.

For this proposed project, I will examine written dialogue, capture experiences through semi-structured interviews, and review policies and procedures. The written dialogue will be talk pages, discussion pages, and other written conversations between contributors, both on- and off-wiki. The semi-structured interviews will take place via audio or video call, depending on the individual's comfort level. If the participant agrees, the interview will be audio recorded so I can review it later for clarification (see Privacy below). All audio recordings and identifiable information will be kept strictly private. The interviews will be transcribed and coded; a sketch of what coding produces appears after the protocol below. While the interview protocol is still evolving, and will continue to do so during the community input process, the draft protocol is:

  1. Tell me about what you do on Wikipedia.
  2. Why do you contribute?
  3. What areas of Wikipedia are you most passionate about?
  4. What areas of Wikipedia do you avoid? Can you tell me why?
  5. How have you been involved with policy and procedure development or other initiatives?
  6. Share a story about a time you were involved with (policy/procedure/initiative).
  7. How often do you find yourself working on Wikipedia, both on- and off-wiki? Have you ever taken a break? Have you thought about it?
  8. What is bias?
  9. How does bias show up on Wikipedia?
  10. What could be done to reduce bias?
  11. What do you do when you notice bias in yourself?
  12. What do you do when you notice bias in others?
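
The budget names Dedoose as the coding tool; no code is prescribed by the proposal. Purely as a hypothetical illustration of what "transcribed and coded" yields, this Python sketch tallies made-up codes applied to anonymized interview excerpts and groups them into candidate themes (in grounded theory, the themes would emerge from, and be revised against, the data):

  from collections import Counter, defaultdict

  # Hypothetical coded excerpts: (anonymized participant ID, code applied).
  # Real codes would come from the interview transcripts.
  coded_excerpts = [
      ("P01", "avoids gendered topics"),
      ("P01", "felt unwelcome at AfD"),
      ("P02", "felt unwelcome at AfD"),
      ("P03", "avoids gendered topics"),
  ]

  # Draft mapping of codes to candidate themes, revised as analysis proceeds.
  themes = {
      "avoids gendered topics": "barriers to participation",
      "felt unwelcome at AfD": "adversarial experiences",
  }

  theme_counts = Counter(themes[code] for _, code in coded_excerpts)
  participants = defaultdict(set)
  for pid, code in coded_excerpts:
      participants[themes[code]].add(pid)

  for theme, n in theme_counts.most_common():
      print(f"{theme}: {n} excerpts from {len(participants[theme])} participants")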

Additionally, the policies and infrastructure of English Wikipedia will be examined for implicit bias. Throughout the proposed project, I will continue to review published materials about Wikipedia, as new findings by other authors and researchers will inform and guide this proposed project.

As noted above, the investigation and supporting information will be developed into a narrative and a full report, both published on-wiki upon completion.

In an effort to remain reflexive and as unbiased as possible (go ahead, laugh; that was kind of funny), I will maintain a research journal on-wiki. Keeping a research journal is beneficial in qualitative research, helping the researcher identify bias and keep a record of thoughts and interpretations. Essentially, the researcher is the tool with which the data are reviewed. The journal aids transparency, so readers and researchers interested in replicating the proposed project can see how the researcher reached these conclusions.

I will travel to Wikimania 2018 in Cape Town, South Africa to present findings while the proposed project is in progress. I will aim to travel to several other conferences and community gatherings as the budget permits, and I will seek the community's input on which destinations to choose.

After completion of the research stage, educational materials will be developed with targeted learning outcomes. Other solutions, as guided by the data, will also be developed. These solutions will reduce the impact of implicit bias on Wikipedia by teaching people how to challenge their own biases and become allies who challenge implicit bias on Wikipedia. A post-survey will be available to all contributors who use the educational materials. This post-survey will be anonymous and short to encourage participation, and will enable me to gauge the effectiveness of the educational materials; a sketch of how responses could be summarized follows.
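
The survey instrument has not yet been designed. Purely as an illustration, assuming short Likert-style items (scored 1-5) mapped to the learning outcomes, anonymous responses could be summarized per outcome like this:

  from collections import defaultdict
  from statistics import mean

  # Hypothetical anonymous responses: (learning outcome, Likert score 1-5).
  responses = [
      ("recognize own bias", 4),
      ("recognize own bias", 5),
      ("challenge bias on-wiki", 3),
      ("challenge bias on-wiki", 2),
  ]

  by_outcome = defaultdict(list)
  for outcome, score in responses:
      by_outcome[outcome].append(score)

  for outcome, scores in by_outcome.items():
      print(f"{outcome}: mean {mean(scores):.1f} (n={len(scores)})")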

While not all contributors will consume the educational materials, the hope is that by educating some contributors about implicit bias, the community culture will begin to shift in a positive direction.

Privacy


While the Wikimedia community does not have an Institutional Review Board (IRB), I take the privacy of participants seriously and will adhere to the privacy standards I followed on other projects that required IRB approval. The materials from interviews will be anonymized. If a participant gives consent, the full transcript of their interview will be published; otherwise, only excerpts and the essence of the interview will be published. In either case, participants will be able to review the written content from their interviews before publication in order to correct or clarify it. Names and other identifying information about the participants will not be published. Quotations from off-wiki conversations will not be published; only the essence of those conversations will be. This protects the identities of the participants and avoids violating the trust these communities have developed in their off-wiki communication. If I am able to access off-wiki communication on asynchronous platforms (Slack, Telegram, etc.), I will purely observe: no quotes will be taken and no names identified. Only observations of the essence of the conversation, as it pertains to and derives from bias, will be included in the proposed project notes and published documents.

Timeline

  • December 2017 - August 2018: Examine implicit bias on English Wikipedia
  • December 2017 - August 2018: Hold interviews with community members
  • December 2017 - August 2018: Analyze data and code for themes
  • March 2018 - June 2018: Begin developing draft educational materials and policy recommendations
  • June 2018 - July 2018: Finalize presentation for Wikimania 2018
  • July 2018: Travel to South Africa to present at Wikimania 2018
  • July 2018 - August 2018: Reach out to community members one final time about participating in the interviews or sharing documentation of bias
  • August 2018 - December 2018: Begin drafting the narrative and final report
  • August 2018 - December 2018: Develop educational materials and policy recommendations further with community input
  • November 2018 - December 2018: Develop the post-survey for the educational materials
  • December 2018: Publish the narrative, full report, educational materials, post-survey, and policy recommendations on-wiki
  • December 2018: Reach out to community and stakeholders to share project outcomes (narrative, report, educational materials, etc.)
  • December 2018: Initiate discussions about the policy recommendations

Budget

  • Software: 9-month subscription to Dedoose (my CAQDAS of choice) during the coding phase: $117 ($13/mo.)
  • Salary: full-time, 12-month term, based on the average salary of a qualitative research analyst: $70,000
  • Travel: Wikimania 2018 (hotel for 5 nights, flight, conference fee): $4,000
  • Travel: other conferences where closing the gaps is a topic (perhaps WikiWomenCamp or the Wikimedia Diversity Conference): $5,000
  • Translation: research findings and the produced educational materials: $25 per page, per language
  • Total (without translation costs, as I am still awaiting community feedback): $79,117
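
As a quick arithmetic check of the fixed line items (translation is excluded because its total depends on page and language counts still to be decided; the page and language figures below are hypothetical):

  # Fixed budget line items, in USD.
  items = {
      "software (Dedoose, 9 months x $13/mo.)": 9 * 13,  # $117
      "salary (12-month full-time term)": 70_000,
      "travel (Wikimania 2018)": 4_000,
      "travel (other conferences)": 5_000,
  }
  print(sum(items.values()))  # 79117, matching the requested $79,117

  # Translation scales separately at $25 per page, per language.
  pages, languages = 40, 3  # hypothetical figures
  print(25 * pages * languages)  # $3,000 for this scenario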

I will fulfill the role of qualitative research analyst and will additionally schedule interviews, transcribe interviews, code data, write and edit the narrative and report, and design the educational materials and the corresponding learning-outcome survey.

Community engagement


I will engage with communities and organizations with interest in the proposal subject-matter both on- and off-wiki, like:

  1. WikiProjects focused on diversity and systemic bias (for example: English Wikipedia's Gender Gap Task Force, English Wikipedia's Women in Red, WikiProject:Disability)
  2. Groups that work both on- and off-wiki on systemic bias (e.g. AfroCROWD, Art+Feminism, Wiki Loves Women, Wiki Loves Africa)
  3. User groups focused on diversity and systemic bias (e.g. WikiWomen)
  4. Whose Knowledge, which "aims to correct the skewed representations of knowledge on the internet."

This list of communities and organizations was developed because these groups have existing knowledge of, and experience dealing with, bias on Wikipedia through their efforts and the subject matter they address. The broader Wikimedia community can follow the progress of this proposed project on Meta and provide feedback on educational materials in the draft stages. A community discussion session can be organized among interested Wikimedia volunteers. The community interested in changing implicit bias' impact on Wikipedia will make this proposed project stronger. Suggestions and community participation are welcome! This list will be updated as further community engagement occurs.

Get involved


Participants


Jackie Koerner, PhD (project proposer): qualitative research analyst, academic, educator, advocate for open education

Community notification


I have placed notifications in the following locations:

This list will be updated as further community notifications occur.

Endorsements


Do you think this project should be selected for a Project Grant? Please add your name and rationale for endorsing this project below! (Other constructive feedback is welcome on the discussion page).

  • Endorse While we have some studies on implicit bias on Wikipedia, we do need to continue to study this and go more in depth. I personally am interested in seeing how implicit bias may or may not affect choices made at AfD. It seems to me that there is a bias involved in some cases, especially in the selection of articles to delete and for the reasons stated for deletion, but without a study it's impossible to know if my experience is typical or not. In cases like these, I "feel" that something is off, but without empirical data backing up my feelings, they aren't very useful. It's also good to know where to best direct efforts. Are WikiProjects helping correct bias? Is there bias on Wikimedia Commons? (I'd say yes, in the way that photos are categorized.) Targeting the biggest problems will certainly help us out as a community. I am glad that you are also working on the issue of people writing others' stories. This is an area that I find personally problematic. As a white woman, I am often editing articles about countries in the Global South and civil rights topics for various individuals around the world. I find when I do this that I learn a lot about experiences I have never had. However, I know that my own lack of experience means that I may not write a fully balanced article. I strive to do so, but while I think it's perfectly acceptable for people to write about topics they are not "connected" to, it's even better to have people tell their own stories and contribute to what we have started on Wikipedia and then expand it. How to reach and connect with such editors, I am unsure. I hope you can explore this further. It may be interesting to see how other communities outside of Wikipedia view Wikipedia itself. For example, how do Mexican Americans identifying as Chicano view Wikipedia? This may be out of scope, but I would love to see something that addresses an issue like this. So thanks for taking this on! Megalibrarygirl (talk) 16:12, 27 September 2017 (UTC)
  • Endorse this kind of research is a necessary part of building self-awareness and skills within the Wikimedia community. There are many, many contributors who work in areas directly impacting marginalized communities and who need the skills and tools to check their biases, especially when there is an opportunity to fill gaps in our content or practices. Sadads (talk) 20:22, 27 September 2017 (UTC)
  • Endorse - I heartily endorse this project AND can give you very specific examples of implicit bias related to my own editing of Women's health topics. Thank you for this proposal, and if you want a volunteer to help, let me know on my Wikipedia talk page. Best Regards, Barbara (WVS) (talk)
  • Strongly endorse - This research needs to be done and I think Jackie is capable of giving it the rigor it requires. --Rosiestep (talk) 23:17, 27 September 2017 (UTC)
  • Endorse - as Megalibrarygirl said, I work almost exclusively to add women, and specifically minority women, i.e. non-English speakers or English as a Second Language, to Wikipedia. I will be interested in the findings as regards both AfD and AfC, which seem to function as gatekeeping projects, limiting contributions to Anglo, upper/middle, CIS standards. SusunW (talk) 06:38, 28 September 2017 (UTC)
  • Strongly endorse. The bias manifests itself in the choice of sources, text, images, in talk-page discussions, in article-review processes, in the way experienced editors and admins interact to decide what the accepted position is on any given issue (content or behaviour), and in the way policy is written and enforced. As Jackiekoerner wrote: "This is not exclusively about harassment. It’s about something more subtle, but just as damaging." SarahSV talk 17:12, 28 September 2017 (UTC)
  • Endorse – Important topic and capable researcher. Implicit bias is a real problem that needs to be understood and addressed in order to make this encyclopedia more truly representative of the diverse world we inhabit. Funcrunch (talk) 20:06, 29 September 2017 (UTC)
  • Endorse - I endorse this project as it may help us to better understand this important issue that exists between the people who work on and with Wikipedia and the technologies used. As issues of bias may be under-examined in the literature related to Wikipedia, this has the potential to help make some of those factors as transparent as everything else within this sphere. I am willing to help if there is a need as well. FULBERT (talk) 14:15, 30 September 2017 (UTC)
  • endorse Slowking4 (talk) 23:27, 30 September 2017 (UTC)
  • Strongly endorse - it is important for projects that tackle the gender gap to have academic and serious resources to share with participants, to reflect on how to deal practically with these issues. This helps to direct action; in other words, theory is needed! I would love to have your findings translated into French to share on our project page les sans pages (French), and also shared with wikimujeres and wikidonne (maybe you could think a little more about how you will share with the global movement in languages other than English). There is a Meta page listing all concrete projects dealing with the gender gap, and I propose that you share there as well. Nattes à chat
  • Endorse - This kind of scholarly work is needed, not just as an important contribution to internet studies, but to help inform the Wikipedia development and work on its problems: its online and f2f communities, its content, and its future directions. Mozucat (talk) 02:24, 6 October 2017 (UTC)
  • Endorse My primary interest is disability, which is subject to a lot of bias - in a recent AFD discussion someone claimed (without a shred of evidence) that the SNG for cricket players does not apply to blind cricket - a fairly crude example of the type of bias this study will examine. As a South African I am also keenly aware of the biases affecting coverage of the global south and Africa in particular. In various discussions I sometimes need to remind other editors that "this is not the Yankopedia". A systematic study of the issues, as proposed here, will certainly enhance our understanding of such biases, and consequently inform efforts to ameliorate their effects. Dodger67 (talk) 22:36, 7 October 2017 (UTC)
  • Endorse - Bias on Wikipedia (or volunteer projects, encyclopedias, open access projects, etc.) is the subject of a good bit of scholarship, but there's a whole lot more to do. This long-term deep dive conducted by someone with both the necessary academic/methodological training and experience with/within the Wikipedia community has the potential to form a substantial contribution to that research, with implications beyond Wikipedia. I know Jackie through her participation in the Wikipedia Visiting Scholars program (I work on that program with Wiki Education). Jackie joined me and others at this year's Wikimania for a session about Visiting Scholars, but to me the highlight was her presentation about bias, delivered to a standing-room-only crowd. She has been traveling to talk about subjects like this one at multiple events/conferences, demonstrating a commitment to the study of bias and to the Wikimedia community as a volunteer, and I'd love to see what she can do when empowered to take this on in a more full-time capacity. --Ryan (Wiki Ed) (talk) 15:33, 13 October 2017 (UTC)
  • Endorse I endorse JackieKoerner doing a study of implicit bias in English Wikipedia. There is a strong need for a better understanding of how policies and procedures influence implicit bias. She is qualified to do the research and knowledgeable about English Wikipedia. Sydney Poore/FloNight (talk) 21:19, 15 October 2017 (UTC)
  • Endorse. Not sure what to add that hasn't already been said better by others above. It is important to support scholarly work of this type. Gamaliel (talk) 18:39, 17 October 2017 (UTC)
  • Strongly endorse, implicit bias may come from a variety of sources, and most of the time we cannot distinguish their root causes. I hope this research can be a starting point for the global community to learn how to tackle the implicit bias issue. --Liang(WMTW) (talk) 05:02, 18 October 2017 (UTC)
  • Yes. Jaluj (talk) 23:50, 11 November 2017 (UTC)
  • Endorse I think this is a very important subject to be researched, both for the sake of Wikipedia's readers and for Wikipedia's editors. Trevor Bolliger (talk) 21:25, 27 November 2017 (UTC)
  • Endorse Just finished volunteering with the first Wikipedia + Libraries course. This is exactly the kind of research needed to support Wikipedia's educational programming. Avery Jensen (talk) 01:26, 7 December 2017 (UTC)

References

  1. Vetter, Matthew A. (May 2015). "Teaching Wikipedia: The pedagogy and politics of an open access writing community".
  2. Ford, Heather (2016). "'Anyone can edit' not everyone does: Wikipedia and the gender gap". Social Studies of Science.
  3. Colón-Aguirre, Mónica; Fleming-May, Rachel A. (11 October 2012). ""You just type in what you are looking for": Undergraduates' use of library resources vs. Wikipedia". The Journal of Academic Librarianship 38 (6).
  4. Samoilenko, Anna; Yasseri, Taha (2014). "The distorted mirror of Wikipedia: a quantitative analysis of Wikipedia coverage of academics". EPJ Data Science 3 (1).
  5. Flick, Corinne M. (ed.). "Geographies of the World's Knowledge" (PDF). University of Oxford. Retrieved 27 September 2017.
  6. Graham, M.; Hogan, B.; Straumann, R. K.; Medhat, A. (2014). "Uneven Geographies of User-Generated Information: Patterns of Increasing Informational Poverty". Annals of the Association of American Geographers 104 (4).