Community Wishlist Survey 2021/Wikidata/Anti-vandalism tools for Wikidata


Anti-vandalism tools for Wikidata

  • Problem: Wikidata receives a lot of vandalism, and the existing tools to fight it are inadequate.
  • Who would benefit: Admins, all wikis that use Wikidata
  • Proposed solution: Develop tools to fight vandalism. A start would be w:en:WP:TWINKLE, to automate reverting, warning a user, blocking, and leaving the right templated messages in all those cases (a minimal API sketch follows this list). Then we could also get something like w:en:WP:HUGGLE to load changes in real time and get them patrolled.
  • More comments: This will attract an army of editors to fight vandalism on Wikidata, similar to English Wikipedia's, and will improve trust in Wikidata.
  • Phabricator tickets:
  • Proposer: Rschen7754 01:57, 17 November 2020 (UTC)
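
For illustration, here is a minimal sketch of the revert-and-warn flow a Twinkle-like tool would automate, using the MediaWiki Action API. It assumes an already-authenticated requests.Session, and it assumes a Wikidata equivalent of the enwiki {{subst:uw-vandalism1}} warning template exists; it is not an existing tool's implementation, and error handling is omitted.

```python
# Sketch of the revert-and-warn flow a Twinkle-like tool would automate.
# Assumes an authenticated requests.Session; error handling is omitted.
import requests

API = "https://www.wikidata.org/w/api.php"

def get_csrf_token(session: requests.Session) -> str:
    """Fetch a CSRF token, required for all write actions."""
    r = session.get(API, params={"action": "query", "meta": "tokens",
                                 "format": "json"})
    return r.json()["query"]["tokens"]["csrftoken"]

def revert(session: requests.Session, title: str, revid: int, token: str) -> None:
    """Undo a single revision via action=edit&undo."""
    session.post(API, data={"action": "edit", "title": title,
                            "undo": revid, "summary": "Reverting vandalism",
                            "token": token, "format": "json"})

def warn(session: requests.Session, username: str, token: str) -> None:
    """Append a templated warning to the vandal's talk page.
    {{subst:uw-vandalism1}} is the enwiki convention; a Wikidata
    equivalent template is assumed here."""
    session.post(API, data={"action": "edit",
                            "title": f"User talk:{username}",
                            "appendtext": "\n\n{{subst:uw-vandalism1}} ~~~~",
                            "summary": "Warning: vandalism",
                            "token": token, "format": "json"})
```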

Discussion

  • Note that there is currently a ticket for Huggle (T183141) marked high-priority for (at least partial) support of Wikidata. Courtesy ping since they authored said ticket: Petrb. Perryprog (talk) 02:39, 17 November 2020 (UTC)
  • Yes, this is needed so badly. Sdkb (talk) 02:43, 17 November 2020 (UTC)
  • There are some bugs with issuing warnings, but otherwise Huggle seems to work mostly OK for me... I always keep Wikidata in my feed alongside English Wikipedia. My only real complaint is phab:T199500, which is actually a big problem, but all things considered, using Huggle is still more efficient than Special:RecentChanges. As for Twinkle, the first step is to get the UI localized; that is in progress now and slated to be completed by the end of the year. MusikAnimal talk 02:57, 17 November 2020 (UTC)
  • There is also the fact that recreations are a lot harder to track, since they appear at new entity IDs instead of at the same title, so title blacklisting and creation protection aren't available... --DannyS712 (talk) 03:27, 17 November 2020 (UTC)
  • Nice idea; yes, we do need a Twinkle-like tool that works on Wikidata. Would modifying a global Twinkle to fit the requirements of Wikidata, Commons, and Wikisource be a good idea? QueerEcofeminist [they/them/their] 05:40, 18 November 2020 (UTC)
  • This is an important proposal, but I do not think that Huggle and Twinkle can be made particularly useful for Wikidata RC patrolling. Those tools essentially support ad-hoc revision-based assessments, but we instead need tools for a much more user-based patrolling process. In other words: the question is usually whether a given user generally makes good-faith edits (and, to a lesser degree, has the skills to get it right), rather than whether a particular edit was made in good faith.
    Modifications in Wikidata are often composed of several “atomic” edits (i.e. a collection of edits that are each close to the smallest possible increment); it is often not useful to look at individual edits (diffs), as only the overall picture tells the full story. This is even more important in situations involving more than one item (sitelink moves, mergers, etc.).
    Editorial filter options are another key feature for Wikidata patrolling, so that the patroller can efficiently filter edits by type of modification (e.g. limited to a specific language or property). There are some tools out there doing exactly this, but improvements are certainly possible (an illustrative grouping-and-filtering sketch follows at the end of this discussion). —MisterSynergy (talk) 09:20, 18 November 2020 (UTC)
  • Since more and more wikis rely on Wikidata, it's fundamental to protect the integrity of the information in it. --Andyrom75 (talk) 22:12, 20 November 2020 (UTC)
  • I know this proposal is vague, but I think that is okay. More than any particular vandalism tool, we need some vandalism tool, so that we can encourage more discussion about what kinds of vandalism Wikidata experiences and how we prepare a long-term effort to counter it. This works as an open proposal - just start with the technical development of any vandalism tools on Wikidata. Blue Rasberry (talk) 18:17, 24 November 2020 (UTC)
  • Tools are needed, but rather than Twinkle or Huggle, you need an ORES AI built on the existing reversion dataset. High-confidence vandalism can be reverted by bot, with the less confident vandalism sent to a response queue for human intervention, by editor rather than by edit, trying to pivot editors to productive editing (a score-and-route sketch follows at the end of this discussion). Good-faith mistaken edits should go to a coaching queue. You would need to train a task force of responders to act in a firm but fair way; edit warring over individual edits is a failed model. The tools will not attract the vandal fighters; rather, the response team must be recruited to increase data quality. Slowking4 (talk) 02:13, 27 November 2020 (UTC)
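
To make MisterSynergy's point concrete, here is an illustrative sketch (not an existing tool) that bundles recent changes by user and item, so the patroller reviews the overall modification rather than each atomic diff, and filters edits of a certain type. The summary-substring filter is a crude stand-in for real editorial filters, and all function names are assumptions.

```python
# Sketch of user/item-grouped, filterable patrolling: pull recent
# changes, bundle edits by (user, item), filter on the edit summary
# (e.g. a property such as "P31" or a language code).
from collections import defaultdict
import requests

API = "https://www.wikidata.org/w/api.php"

def fetch_recent_changes(limit: int = 500) -> list[dict]:
    """Fetch recent item-namespace changes via list=recentchanges."""
    r = requests.get(API, params={
        "action": "query", "list": "recentchanges",
        "rcprop": "user|title|ids|comment|timestamp",
        "rcnamespace": 0, "rclimit": limit, "format": "json"})
    return r.json()["query"]["recentchanges"]

def group_edits(changes: list[dict], summary_filter: str = "") -> dict:
    """Bundle edits by (user, item) so each bundle is reviewed as one
    overall change rather than as isolated atomic diffs."""
    groups = defaultdict(list)
    for rc in changes:
        if summary_filter and summary_filter not in rc.get("comment", ""):
            continue
        groups[(rc["user"], rc["title"])].append(rc)
    return groups

# e.g. review only edits touching P31, one bundle per user/item:
for (user, item), edits in group_edits(fetch_recent_changes(), "P31").items():
    print(f"{user} made {len(edits)} edit(s) to {item}")
```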
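
And a sketch of the score-and-route triage Slowking4 describes, using the ORES v3 scoring API, which serves "damaging" and "goodfaith" models for wikidatawiki. The cutoff values and queue names are illustrative assumptions, not tuned thresholds.

```python
# Sketch of ORES-based triage: bot-revert high-confidence damage,
# queue uncertain cases for humans, route good-faith mistakes to coaching.
import requests

ORES = "https://ores.wikimedia.org/v3/scores/wikidatawiki/"

def triage(revids, bot_cutoff=0.9, queue_cutoff=0.6):
    """Yield (revid, queue) pairs; cutoffs are illustrative assumptions."""
    r = requests.get(ORES, params={"models": "damaging|goodfaith",
                                   "revids": "|".join(map(str, revids))})
    for revid, data in r.json()["wikidatawiki"]["scores"].items():
        damaging = data["damaging"]["score"]["probability"]["true"]
        goodfaith = data["goodfaith"]["score"]["probability"]["true"]
        if damaging >= bot_cutoff:
            yield revid, "bot-revert"      # high confidence: revert automatically
        elif damaging >= queue_cutoff:
            # uncertain: a human decides, per editor rather than per edit
            yield revid, "coaching-queue" if goodfaith >= 0.5 else "response-queue"
        else:
            yield revid, "auto-patrol"     # probably fine: mark as patrolled

# e.g.: for revid, queue in triage([1234567890]): print(revid, queue)
```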

Voting