Incident Reporting System

    The Foundation wants to improve the way users who experience harassment or other problematic behaviour can report it, in order to make communities healthier and safer places.

    The new Trust and Safety Tools team has been asked to create a Private Incident Reporting System (PIRS). The goal is to make it easier for users to report incidents privately and securely.

    Background

    Reporting and processing of harmful incidents has been a topic of interest for Wikimedia communities for many years. With the new Universal Code of Conduct being set up, it is crucial to also have a discussion about user reporting systems.

    The way incidents of misbehaviour or policy violations are handled is still evolving, varies by project, and is highly decentralized.

    Each Wikimedia project or community has its own procedures. Incidents can be reported, and those reports handled, in several ways:

    • on talk pages
    • on noticeboards
    • by email
    • through private off-wiki channels (IRC, Discord, etc.)

    For many users it is not clear what to do when they want to report an incident: where to go, who to talk to, how to file the report, what information to include, or what happens once it has been submitted, among other things.

    Users need to be able to find out how and where to report an incident. There is also little information about what happens once an incident has been reported and what the reporter can expect.

    Some users do not feel safe reporting incidents because of the complexity of the procedure or because of privacy concerns.

    There is currently no standard procedure that allows users to report incidents privately.

    Project focus

    Therefore, the main focus of this project is making it easier to deal with harassment and other harmful incidents.

    We want to ensure the privacy and safety of those reporting. We also want to ensure that reports have the right information and are routed to the appropriate entity that needs to process them, while not putting extra pressure on the ones who do the processing.

    The Trust and Safety Product Team is looking at this incident reporting system as part of the larger Wikimedia community incident management ecosystem. We are studying how the individual systems work and how everything can connect together.

    Updates

    Test the Incident Reporting System beta version – 10 November 2023

    Editors are invited to test a beta version known as the initial Minimum Testable Product (MTP) of the Incident Reporting System. The Trust & Safety Product team has built a basic version of the product that lets a user file a report from the talk page where an incident occurs. Note: this version of the product is meant to help us learn about filing reports to a private email address (for example, emergency(_AT_)wikimedia.org or a group of administrators). It does not cover every scenario, such as reporting to a public noticeboard. Your feedback is needed to determine whether this initial approach is effective.

    "Para probar:"

    "1. "' Visite cualquier página de discusión de Wikipedia en Beta que contenga contenido. Tenemos muestras de páginas de conversación disponibles en User talk:Testing y Talk:African Wild Dog que puede usar e iniciar sesión.

    "2. "' A continuación, haga clic en el botón de desbordamiento (elipsis vertical), cerca del enlace "'Responder' de cualquier comentario, para abrir el menú de desbordamientos y haga clic en "'Reporte' (ver diapositiva 1). También puede utilizar el enlace "'Informe"' en el menú "' Herramientas"' (ver diapositiva 2).

    3. Proceed to file a report: fill in the form and submit it. An email will be sent to the Trust and Safety Product team, who will be the only ones to see your report. Please note that this is a test; do not use it to report real incidents.

    "4. "Conforme va probando, reflexione sobre estas preguntas:

    • ¿Qué piensa de este proceso de presentación de informes? Diga especialmente lo que le gusta o no en ella.
    • Si usted está familiarizado con las extensiones, ¿cómo se sentiría si tuviera esta extensión implementada en su wiki?
    • ¿Qué temas nos hemos perdido en esta etapa inicial de informes?

    5. After testing, leave your feedback on the talk page.

    If you cannot find the overflow menu or the Report links, or if the form does not submit, make sure that:

    • You are logged in
    • The email address of your Beta account is confirmed
    • Your account is more than 3 hours old and has at least 1 edit
    • You have DiscussionTools enabled, because the MTP is integrated with DiscussionTools

    "'Si las herramientas de discusión no se cargan"', puede presentar un informe desde el menú herramientas. Si no puede presentar un segundo informe, tenga en cuenta que hay un límite de 1 informe por día para usuarios no confirmados y 5 informes por día para usuarios con confirmación automática. Estos requisitos antes de realizar la prueba ayudan a reducir la posibilidad de que usuarios malintencionados abusen del sistema.

    Please see the research section of the page for more.

    Process

    Project phases

    Figuring out how to manage incident reporting in the Wikimedia space is not an easy task. There are a lot of risks and a lot of unknowns.

    As this is a complex project it needs to be split into multiple iterations and project phases. For each of these phases we will hold one or several cycles of discussions in order to ensure that we are on the right track and that we incorporate community feedback early, before jumping into large chunks of work.

    Phase 1

    Preliminary research: collect feedback and read through existing documentation.

    Conduct interviews in order to better understand the problem space and identify critical questions we need to answer.

    Define and discuss possible product direction and scope of project. Identify possible pilot wikis.

    At the end of this phase we should have a solid understanding of what we are trying to do.

    Phase 2

    Create prototypes to illustrate the ideas that came up in Phase 1.

    Create a list of possible options for more in-depth consultation and review.

    Phase 3

    Identify and prioritize the best possible ideas.

    Transition to software development and break down work in Phabricator tickets.

    Continue the cycle for the next iterations.

    Research

    July 2024: Incident Reporting System user testing summary

    In March 2024, the Trust & Safety Product team conducted user testing of the Minimum Viable Product (MVP) of the Incident Reporting System to learn if users know where to go to report an emergency incident, and if the user flow makes sense and feels intuitive.

    We learned the following:

    • During user testing, all participants found the entry point for reporting an incident, and the current user flow was well understood.
    • There was some confusion over two of the reporting options: “someone might cause self-harm” and “public harm threatening message”.

    Two participants also made assumptions about the system being automated. One participant was concerned about automation and wanted a human response, whereas the other felt reassured by the idea that it would check whether the abuser had any past history of threats and offences, and delete the offensive comment accordingly. All participants expected a timely response (within 2-3 days on average) after submitting a report. Read more.

    September 2023: Sharing incident reporting research findings

    Research Findings Report on Incident Reporting 2023

    The Incident Reporting System project has completed research about harassment on selected pilot wikis.

    The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

    The findings of the studies have been published.

    In summary, we received valuable insights on the improvements needed for both onwiki and offwiki incident reporting. We also learned more about the communities' needs, which can be used as valuable input for the Incident Reporting tool.

    We are keen to share these findings with you; the report has more comprehensive information.

    Please leave any feedback and questions on the talkpage.

    Pre-project research

    The following document is a completed review of the research on online harassment on Wikimedia projects that the Wikimedia Foundation conducted between 2015 and 2022. In this review we've identified major themes, insights, and areas of concern and provided direct links to the literature.

    The Trust and Safety Tools team has been studying previous research and community consultations to inform our work. We revisited the Community health initiative User reporting system proposal and the User reporting system consultation of 2019. We have also been trying to map out some of the conflict resolution flows across wikis to understand how communities are currently managing conflicts. Below is a map of the Italian Wiki conflict resolution flow. It has notes on opportunities for automation.

    On Italian Wikipedia, there's a 3-step policy in place for conflict resolution. This map visualizes this process and tries to identify opportunities for automation for both editors and admins.

    Frequently Asked Questions

    Questions and answers from Phase 1 of the project

    Q: The project used to be called PIRS, Private Incident Reporting System. Why was the P dropped?


    Q: Is there data available about how many incidents are reported per year?

    A: Right now there is not a lot of clear data we can use. There are a couple of reasons for this. First, issues are reported in various ways and those differ from community to community. Capturing that data completely and cleanly is highly complicated and would be very time consuming. Second, the interpretation of issues also differs. Some things that are interpreted as harassment are just wiki business (e.g. deleting a promotional article). Review of harassment may also need cultural or community context. We cannot automate and visualize data or count it objectively. The incident reporting system is an opportunity to solve some of these data needs.

    Q: How is harassment being defined?

    A: Please see the definitions in the Universal Code of Conduct.

    Q: How many staff and volunteers will be needed to support the IRS?

    A: Currently the magnitude of the problem is not known, so the number of people needed to support this is also not known. Experimenting with the minimum viable product will provide some insight into the number of people needed to support the IRS.

    Q: What is the purpose of the MVP (minimum viable product)?

    A: The MVP is an experiment and opportunity to learn. This first experimental work will answer the questions that we have right now. Then results will guide the future plans.

    Q: What questions are you trying to answer with the minimum viable product?

    A: Here are the questions we need to answer:

    • What kind of reports will people file?
    • How many people will file reports?
    • How many people would we need in order to process them?
    • How big is this problem?
    • Can we get a clearer picture of the magnitude of harassment issues? Can we get some data around the number of reports? Is harassment underreported or overreported?
    • Are people currently not reporting harassment because it doesn’t happen or because they don’t know how?
    • Will this be a lot to handle with our current setup, or not?
    • How many are valid complaints compared to people who don't understand the wiki process? Can we distinguish/filter valid complaints, and filter invalid reports to save volunteer or staff time?
    • Will we receive lots of reports filed by people who are upset that their edits were reverted or their page was deleted? What will we do with them?

    Q: How does the Wikimedia movement compare to how other big platforms like Facebook/Reddit handle harassment?

    A: While we do not have any identical online affinity groups, the Wikimedia movement is most often compared with Facebook and Reddit in regard to how we handle harassment. What is important to consider is that nobody has solved the problem of harassment. Other platforms struggle with content moderation, and they often have paid staff who try to deal with it. Two huge differences between us and Reddit and Facebook are the globally collaborative nature of our projects and how communities work to resolve harassment at the community level.

    Q: Is WMF trying to change existing community processes?

    A: Our plan for the IRS is not to change any community process. The goal is to connect to existing processes. The ultimate goals are to:

    • Make it easier for people who experience harassment to get help.
    • Eliminate situations in which people do not report because they don’t know how to report harassment.
    • Ensure harassment reports reach the right entities that handle them per local community processes.
    • Ensure responders receive good reports and redirect unfounded complaints and issues to be handled elsewhere.