Incident Reporting System

    The Wikimedia Foundation wants to improve the way people targeted by harassment and other harmful incidents can report those incidents, in order to provide a safer and healthier environment for the communities.

    The Trust and Safety Product team is tasked with building the Incident Reporting System (IRS), previously known as the Private Incident Reporting System (PIRS). Our goal is to make it easy, safe, and private for users to report harmful incidents.

    Project background

    How harassment and harmful incidents are reported and handled has long been a topic of concern for Wikimedia communities. With the establishment of the new Universal Code of Conduct, it is also very important to discuss a user reporting system. The Universal Code of Conduct (UCoC) is a policy that defines a minimum standard of conduct and unacceptable behavior for all participants across Wikimedia projects and spaces.

    The ways in which cases, misconduct, and policy violations are handled have been implemented and have grown organically across different Wikimedia spaces and projects, and they vary between communities.

    Each Wikimedia project or community has its own way of managing these issues. Incidents are reported and handled in a number of ways:

    • Via wiki talk pages
    • Via noticeboards
    • Via email
    • Via private discussion channels off-wiki (Discord, IRC, etc.)

    For many users it is unclear what to do when an incident happens: where to go, whom to talk to, how to report, what information to include in a report, how the report will be handled, what happens afterwards, and so on.

    Users have to know how and where to report a problem. There is also little information about what happens after a report is submitted and what users should expect.

    Because of the complexity of the reporting process and privacy considerations, some users do not feel safe reporting incidents.

    There is currently no standardized way for users to submit reports privately.

    Project focus

    The high-level goal of this project is to make it easier to address harassment and harmful incidents, and to ensure that these reports reach the appropriate entities that need to handle them.

    We want to ensure the privacy and safety of reporters. We also want to ensure that reports contain the right information and are routed to the appropriate entities that need to handle them, without putting additional strain on the people who process them.

    The Trust and Safety Product team is looking at this Incident Reporting System as part of the larger incident management ecosystem of the Wikimedia communities. We are studying how the individual systems work and how everything connects together.

    Updates

    Test the Incident Reporting System in Beta – November 2023

    We invite editors to test the initial version of the Incident Reporting System. It makes it possible to file a report from the talk page where an incident occurs. This version is for learning about filing reports to a private email address (e.g., emergency(_AT_)wikimedia.org or an admin group). It doesn't cover all scenarios, such as reporting to a public noticeboard. We need your opinions to see if this approach is effective.

    To test:

    1. Log in to Wikipedia in Beta and visit any talk namespace page that contains discussions. Sample talk pages you can use are available at User talk:Testing and Talk:African Wild Dog.

    2. Next, click on the overflow button (vertical ellipsis) near the Reply link of any comment to open the overflow menu and click Report (see slide 1). You can also use the Report link in the Tools menu (see slide 2).

    3. Proceed to file a report: fill in the form and submit it. An email will be sent to the Trust and Safety Product team, who will be the only ones to see your report. Please note that this is a test, so do not use it to report real incidents.

    4. As you test, ponder these questions:

    • What do you think about this reporting process? In particular, what do you like or dislike about it?
    • If you are familiar with extensions, how would you feel about having this on your wiki as an extension?
    • Which issues have we missed at this initial reporting stage?

    5. Following your test, please leave your feedback on the talk page.

    If you can't find the overflow menu or Report links, or the form fails to submit, please ensure that:

    • You have logged in
    • Your Beta account email address is confirmed
    • Your account is more than 3 hours old and has at least 1 edit
    • You have enabled DiscussionTools, because the MVP is integrated with DiscussionTools

    If DiscussionTools doesn’t load, a report can be filed from the Tools menu.

    If you can't file a second report, please note that there is a limit of 1 report per day for non-confirmed users and 5 reports per day for autoconfirmed users. These requirements and limits help to reduce the possibility of malicious users abusing the system.
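
    As an illustration only, here is a minimal sketch in Python of the eligibility and rate-limit checks described above (login, confirmed email, account age, edit count, daily limits). The names, data structure, and flow are hypothetical assumptions made for explanation; they are not taken from the actual system's code.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    # Thresholds restated from the testing notes above.
    MIN_ACCOUNT_AGE = timedelta(hours=3)      # account must be older than 3 hours
    MIN_EDIT_COUNT = 1                        # at least 1 edit
    DAILY_LIMIT_NON_CONFIRMED = 1             # reports per day for non-confirmed users
    DAILY_LIMIT_AUTOCONFIRMED = 5             # reports per day for autoconfirmed users

    @dataclass
    class Reporter:
        logged_in: bool
        email_confirmed: bool
        registered_at: datetime
        edit_count: int
        autoconfirmed: bool
        reports_filed_today: int

    def can_file_report(user: Reporter, now: datetime) -> tuple[bool, str]:
        """Return (allowed, reason) for an attempted report."""
        if not user.logged_in:
            return False, "You must be logged in."
        if not user.email_confirmed:
            return False, "Your email address must be confirmed."
        if now - user.registered_at < MIN_ACCOUNT_AGE or user.edit_count < MIN_EDIT_COUNT:
            return False, "Your account must be over 3 hours old and have at least 1 edit."
        limit = DAILY_LIMIT_AUTOCONFIRMED if user.autoconfirmed else DAILY_LIMIT_NON_CONFIRMED
        if user.reports_filed_today >= limit:
            return False, "Daily report limit reached."
        return True, "OK"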

    See the Research section of this page to learn more.

    Process

    Project phases

    Figuring out how to manage incident reporting in the Wikimedia space is not an easy task. There are a lot of risks and a lot of unknowns.

    As this is a complex project it needs to be split into multiple iterations and project phases. For each of these phases we will hold one or several cycles of discussions in order to ensure that we are on the right track and that we incorporate community feedback early, before jumping into large chunks of work.

    Phase 1

    Preliminary research: collecting feedback and reading through existing documentation.

    Conduct interviews in order to better understand the problem space and identify critical questions we need to answer.

    Define and discuss a possible product direction and the scope of the project. Identify possible pilot wikis.

    At the end of this phase we should have a solid understanding of what we are trying to do.

    Phase 2

    Create prototypes to illustrate the ideas that came up in Phase 1.

    Create a list of possible options for more in-depth consultation and review.

    Phase 3

    Identify and prioritize the best possible ideas.

    Transition to software development and break down the work into Phabricator tickets.

    Continue the cycle for the next iterations.

    Research

    July 2024: Incident Reporting System user testing summary

    In March 2024, the Trust & Safety Product team conducted user testing of the Minimum Viable Product (MVP) of the Incident Reporting System to learn if users know where to go to report an emergency incident, and if the user flow makes sense and feels intuitive.

    We learned the following:

    • During user testing, all participants found the entry point for reporting an incident, and the current user flow was well understood.
    • There was some confusion over two of the reporting options: “someone might cause self-harm” and “public harm threatening message”.

    Two participants also made assumptions about the system being automated. One participant was concerned about automation and wanted a human response, whereas the other participant felt assured by the idea it would check if the abuser had any past history of threats and offences, and delete the offensive comment accordingly. All participants expected a timely response (an average of 2-3 days) after submitting a report. Read more.

    September 2023: Sharing incident reporting research findings

    Research Findings Report on Incident Reporting 2023

    The Incident Reporting System project has completed research about harassment on selected pilot wikis.

    The research, which started in early 2023, studied the Indonesian and Korean Wikipedias to understand harassment, how harassment is reported and how responders to reports go about their work.

    The findings of the studies have been published.

    In summary, we received valuable insights on the improvements needed for both onwiki and offwiki incident reporting. We also learned more about the communities' needs, which can be used as valuable input for the Incident Reporting tool.

    We are keen to share these findings with you; the report has more comprehensive information.

    Please leave any feedback and questions on the talk page.

    Pre-project research

    The following document is a completed review of research that the Wikimedia Foundation conducted from 2015–2022 on online harassment on Wikimedia projects. In this review we've identified major themes, insights, and areas of concern, and provided direct links to the literature.

    The Trust and Safety Tools team has been studying previous research and community consultations to inform our work. We revisited the Community health initiative User reporting system proposal and the User reporting system consultation of 2019. We have also been trying to map out some of the conflict resolution flows across wikis to understand how communities are currently managing conflicts. Below is a map of the Italian Wiki conflict resolution flow. It has notes on opportunities for automation.

    On Italian Wikipedia, there's a 3-step policy in place for conflict resolution. This map visualizes this process and tries to identify opportunities for automation for both editors and admins.

    Frequently asked questions

    Questions and answers from Phase 1 of the project

    Q: The project used to be called PIRS, Private Incident Reporting System. Why was the P dropped?

    A: We have renamed the Private Incident Reporting System to the Incident Reporting System, dropping the word "Private". In the context of harassment and the Universal Code of Conduct, "private" refers to respecting the privacy of community members and keeping them safe. It does not mean that every stage of reporting is confidential. We received feedback that the term was therefore confusing and difficult to translate into other languages, so it was changed.

    Q: Is there data available about how many incidents are reported per year?

    A: Right now there is not a lot of clear data we can use. There are a couple of reasons for this. First, issues are reported in various ways and those differ from community to community. Capturing that data completely and cleanly is highly complicated and would be very time consuming. Second, the interpretation of issues also differs. Some things that are interpreted as harassment are just wiki business (e.g. deleting a promotional article). Review of harassment may also need cultural or community context. We cannot automate and visualize data or count it objectively. The incident reporting system is an opportunity to solve some of these data needs.

    Q: How is harassment being defined?

    A: Please see the definitions in the Universal Code of Conduct.

    Q: How many staff and volunteers will be needed to support the IRS?

    A: Currently the magnitude of the problem is not known, so the number of people needed to support this is also not known. Experimenting with the minimum viable product will provide some insight into the number of people needed to support the IRS.

    Q: What is the purpose of the MVP (minimal viable product)?

    A: The MVP is an experiment and an opportunity to learn. This first experimental work will answer the questions we have right now, and the results will then guide future plans.

    Q: What questions are you trying to answer with the minimum viable product?

    A: Here are the questions we need to answer:

    • What kind of reports will people file?
    • How many people will file reports?
    • How many people would we need in order to process them?
    • How big is this problem?
    • Can we get a clearer picture of the magnitude of harassment issues? Can we get some data around the number of reports? Is harassment underreported or overreported?
    • Are people currently not reporting harassment because it doesn’t happen or because they don’t know how?
    • Will this be a lot to handle with our current setup, or not?
    • How many are valid complaints compared to people who don't understand the wiki process? Can we distinguish/filter valid complaints, and filter invalid reports to save volunteer or staff time?
    • Will we receive lots of reports filed by people who are upset that their edits were reverted or their page was deleted? What will we do with them?

    Q: How does the Wikimedia movement compare to how other big platforms like Facebook/Reddit handle harassment?

    A: While we do not have any identical online affinity groups, the Wikimedia movement is most often compared with Facebook and Reddit in regard to how we handle harassment. What is important to consider is that nobody has resolved harassment. Other platforms struggle with content moderation, and they often have paid staff who try to deal with it. Two huge differences between us and Reddit or Facebook are the globally collaborative nature of our projects and how communities work to resolve harassment at the community level.

    Q: Is WMF trying to change existing community processes?

    A: Our plan for the IRS is not to change any community process. The goal is to connect to existing processes. The ultimate goals are to:

    • Make it easier for people who experience harassment to get help.
    • Eliminate situations in which people do not report because they don’t know how to report harassment.
    • Ensure harassment reports reach the right entities that handle them per local community processes.
    • Ensure responders receive good reports and redirect unfounded complaints and issues to be handled elsewhere.