Hi,
I noticed that there is a large number of suspicious edits that may be vandalism but are never reverted, because the people dealing with vandals at that moment (using some automated tool) couldn't decide whether they were vandalism or not. For example: "smart" changes to statistical data, dates, football scores, changes that look odd but aren't clearly vandalism, etc. These edits should be reviewed by an expert on the topic, but at the moment they aren't collected anywhere.
I think we should create a new service (on Tool Labs?) that would allow these tools to insert such edits into a queue (or database) of "suspicious edits" for later review by experts. This categorized database / queue could be browsed by people who are experts on the given topics, and the edits would get reviewed or reverted by them.
The database would need to be periodically scanned, and all changes that have since been reverted would be removed from it. The people who review the edits could also flag them as "ok".
This way we could improve the effectiveness of anti-vandalism tools by covering the edits that are ignored or skipped today.
Any suggestions or ideas on how to implement such a feature?
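To make the idea concrete, here is a minimal sketch of the storage such a Tool Labs service might sit on, in Python with SQLite. Every table, column, and function name below is invented for illustration; this is not an existing tool:

```python
# Hypothetical storage layer for a "suspicious edits" queue on Tool Labs.
import sqlite3
import time

SCHEMA = """
CREATE TABLE IF NOT EXISTS suspicious_edits (
    rev_id     INTEGER PRIMARY KEY,   -- revision that needs expert review
    page_title TEXT NOT NULL,
    category   TEXT NOT NULL,         -- topic bucket, e.g. "geography"
    reporter   TEXT NOT NULL,         -- tool user who flagged it
    reported   INTEGER NOT NULL,      -- unix timestamp
    status     TEXT NOT NULL DEFAULT 'open'  -- open | ok | reverted
);
"""

def open_queue(path="suspicious_edits.db"):
    db = sqlite3.connect(path)
    db.executescript(SCHEMA)
    return db

def enqueue(db, rev_id, page_title, category, reporter):
    """Called by Huggle/Twinkle-style tools instead of a plain 'skip'."""
    db.execute(
        "INSERT OR IGNORE INTO suspicious_edits VALUES (?, ?, ?, ?, ?, 'open')",
        (rev_id, page_title, category, reporter, int(time.time())),
    )
    db.commit()

def open_edits(db, category):
    """Experts browse the queue for a topic they know."""
    return db.execute(
        "SELECT rev_id, page_title FROM suspicious_edits "
        "WHERE status = 'open' AND category = ?", (category,),
    ).fetchall()

def resolve(db, rev_id, status):
    """Reviewers mark an edit 'ok'; the periodic scan marks reverted ones."""
    db.execute("UPDATE suspicious_edits SET status = ? WHERE rev_id = ?",
               (status, rev_id))
    db.commit()
```

The status column carries the whole workflow: tools write 'open' rows, reviewers move rows to 'ok', and the periodic scan retires rows whose revisions were reverted on-wiki.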
That idea sounds like something that could already be done by the FlaggedRevs extension.
Given that many of those suspicious edits can be extremely subtle, like minor changes to mathematical equations and statistics, articles with lots of potential for that kind of subtle vandalism would probably be better handled by having edits approved through FlaggedRevs, allowing trusted editors (like members of a relevant WikiProject in those fields) to review them.
I doubt Twinkle or Huggle would be ideal for such vandalism, as it would be easy to mistake legitimate edits for vandal edits, and automated vandal detection/reversion processes would generally have a high error rate on such subtle vandalism.
Hi,
That makes sense, but I don't think it is going to happen. FlaggedRevs has never really been popular on English Wikipedia, and this is a real problem that should be solved somehow. These edits are already being seen by people using Huggle or Twinkle, so this change couldn't make things worse; those people already ignore them. I am talking about manual review (even in a browser); this is not about the reviewing itself, but rather about a mechanism for collecting the edits that appear suspicious, for later review.
I just don't believe that flagged revisions are going to be used for anything useful on English Wikipedia in the next... 10 years? That doesn't mean I don't like FlaggedRevs, nor that I think it wouldn't be a solution to this problem. It's just that people don't want it on Wikipedia...
Hi Petr,
I can see the value, although I'm not entirely sure how you were planning on identifying these edits -- are you thinking all our current tools would have another classification (like "more review needed") and submit them? Or would these be identified by another, new bot?
Yes, I mean that kind of identification. The tools would have a button like "needs review by expert", which would have a similar effect to "skip", except that the edit would be enqueued somewhere so that experts could review it later and revert it if it wasn't correct.
The only task that would need to be done by a bot would be cleanup of this list (categorizing the edits, etc.).
For example, I know something about computers, but definitely not about geography or history. So I could list all "suspicious edits" for pages in the category "information technology" and revert, or confirm as OK, the edits in it (or those I know about). Meanwhile someone else, an expert on geography for example, could review the edits where someone changed the size or population of some country, or added dubious or weird content about it, which I can't confirm as wrong or correct myself.
These days I am testing Huggle 3 as I work on it (even on production, shame on me), so I find myself in the role of "vandal fighter". And I can tell you that every day I skip hundreds of edits that I personally think should be reviewed by someone, because they look weird to me, but which I don't revert because I can't confirm they are vandalism either. Having a record of such edits would help us not overlook these changes.
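A sketch of what the two halves of that workflow could look like from the client side, assuming the hypothetical Labs service above exposes a small REST API -- the URL, endpoints, and JSON fields are all made up:

```python
# How a vandal-fighting tool and an expert might talk to the
# (hypothetical) suspicious-edits queue service.
import requests

API = "https://tools.wmflabs.org/suspicious-edits/api"  # made-up URL

def flag_for_expert(rev_id, page_title, category, reporter):
    """The 'needs review by expert' button: skip locally, enqueue remotely."""
    r = requests.post(f"{API}/edits", json={
        "rev_id": rev_id,
        "page_title": page_title,
        "category": category,   # e.g. "information technology"
        "reporter": reporter,
    }, timeout=10)
    r.raise_for_status()

def edits_to_review(category):
    """An expert pulls only the open edits in a topic they know."""
    r = requests.get(f"{API}/edits",
                     params={"category": category, "status": "open"},
                     timeout=10)
    r.raise_for_status()
    return r.json()
```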
This queue already exists: it's the absolute complement of [[Help:Patrolled edit]]s. https://meta.wikimedia.org/wiki/Help:Patrolled_edit
Nemo
But that works the other way around: every edit starts out "suspicious", and users can flag the ones that appear to be OK. What I am talking about is the reverse. Vandal fighters would flag the edits that look weird to them, and experts would review only those edits, not all of them.
That's a problem in the client, not in MediaWiki. To implement that with the current code, you can patrol everything that is not suspicious and you'll get what you describe; if your patrolling bot is error-prone you may hypothetically need an "unpatrol" feature, but then just fix the bot.
Nemo
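For reference, the patrol-everything-good approach maps onto the real action API's patrol module. A sketch in Python, with token handling as in current MediaWiki; a logged-in requests session with the patrol right is assumed:

```python
# Mark a revision as patrolled via the MediaWiki action API, so that
# only the dubious edits are left unpatrolled.
import requests

API = "https://en.wikipedia.org/w/api.php"

def patrol_revision(session, revid):
    # Fetch a patrol token, then mark the revision patrolled.
    token = session.get(API, params={
        "action": "query", "meta": "tokens",
        "type": "patrol", "format": "json",
    }).json()["query"]["tokens"]["patroltoken"]
    r = session.post(API, data={
        "action": "patrol", "revid": revid,
        "token": token, "format": "json",
    })
    r.raise_for_status()
    return r.json()
```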
No, I wouldn't get what I describe. The queue would start filling up with good edits if everyone who uses Huggle disconnected or stopped using it. The current system clearly isn't sufficient for this. We need to cherry-pick the bad edits, not the good ones. The current system only allows flagging good edits as "don't need review", which isn't really useful for anything...
I am also not talking about MediaWiki at all. This record of edits that need further review could be stored off-wiki, for example on Wikimedia Labs, behind some universal interface that all anti-vandalism tools can use.
I used to just revert them automatically when such changes appeared on my watchlist. If someone changes the population of Denmark or the formation enthalpy of carbon tetrachloride, without providing any reference or any suggestion that it is a revert, the chances that the new information is more accurate than the old information are extremely low.
In many cases, you would have to go to a university library to check that the original reference was correct, which seems like too high a burden to place on reviewers, considering that this sort of vandalism is extremely common.
I would be quite happy if AbuseFilter or Huggle flagged unreferenced changes to numbers for immediate reversion.
-- Tim Starling
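A rough illustration of the kind of heuristic Tim describes -- flagging edits that change digits without adding a reference. This is a toy Python sketch, not AbuseFilter syntax; a production filter would live in AbuseFilter or in Huggle's scoring:

```python
# Flag edits that change numbers without adding any reference.
import re

NUMBER = re.compile(r"\d[\d,.]*")

def changes_numbers_without_ref(old_text, new_text):
    old_nums = NUMBER.findall(old_text)
    new_nums = NUMBER.findall(new_text)
    numbers_changed = old_nums != new_nums
    ref_added = new_text.count("<ref") > old_text.count("<ref")
    return numbers_changed and not ref_added

# Example: silently bumping a population figure trips the heuristic.
old = "The population of Denmark is 5,580,516.<ref>Statistics Denmark</ref>"
new = "The population of Denmark is 5,680,516.<ref>Statistics Denmark</ref>"
assert changes_numbers_without_ref(old, new)
```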
Tim Starling wrote:
I used to just revert them automatically when such changes appeared on my watchlist. [...]
I don't follow this logic at all. It seems to be the exact opposite of "assume good faith." And obviously statistics such as the population of Denmark are mutable. If someone were changing, for example, a chemical element's properties, there might be more reasonable concern or suspicion, but even then it'd be pretty dickish to simply revert on sight.
Much of the content on Wikipedia and other Wikimedia wikis comes from non-vested contributors. That is, many, many helpful additions and corrections come from people who will make only a few edits in their lifetime. While I can't disagree with the suggestion that reverting is easier than fact-checking, I very much doubt that assuming bad faith helps build a better project or a better community. And this is to say nothing of the fact that the seemingly simple act of providing a reference is often painful and unintuitive, particularly in established articles that employ complicated markup (infoboxes, citation templates, and ref tags).
MZMcBride
My first 2 edits at TV Tropes had this property: not only were they reverted, they were both reverted with snotty comments about procedure, and *the second one was me doing what the first one had yelled at me for not doing*. And I got yelled at the second time for following instructions.
I gave up. It's fun to read, but not worth my time to contribute to.
I concur with MZM: We don't want to become that.
Cheers, -- jra
This is drifting away from what I wanted to discuss... I didn't want to debate what should or shouldn't be reverted on sight. The problem is that right now a lot of vandal fighters see a certain number of dubious edits that they skip because they can't verify whether they are correct; those edits are then ignored and get lost in the editing history. That's a fact. This problem could easily be solved if these specific edits could be highlighted somehow, so that they would get the attention of people who understand the topic well enough to check whether they are OK. But there is no such system / mechanism that would allow us to do that. I think this is worth implementing somehow, because it could significantly improve the reliability of the encyclopedia's content. There is a lot of vandalism that remains unnoticed even for months.
Really, you have just described FlaggedRevs. It could be enabled by default for all articles and would solve all of your problems. Many large Wikipedias already use it, including pl.wp and de.wp.
What I described is FlaggedRevs the other way around. Is it possible to enable it in reverse mode, so that all edits are flagged as good, but editors can flag them as bad? If not, I can't see how it could be useful for this purpose...
"flagging as bad"? Do you mean reverting?
I just don't see what you are trying to accomplish. Sorry.
I think he's looking to flag as suspect, not to revert as bad. Are you really not following, or just not agreeing?
A possibility is to use a maintenance template, like {{cn}} or {{dubious}}, but this solution shares a drawback with using FlaggedRevs for it (which would otherwise be a great solution): it might be viewed as negative by the en.wp community.
I thought FlaggedRevs prevented the newest version of the page from being shown until it has been approved?
"Flagged Revisions allows for Editor and Reviewer users to rate revisions of articles and set those revisions as the default revision to show upon normal page view. These revisions will remain the same even if included templates are changed or images are overwritten."
I think he also wants the edit to go through and be visible right away. I believe that he is trying to assume good faith in these types of edits. Trust, but verify, if you will.
I'm not entirely sure that FlaggedRevs is the best solution here.
Thank you, Derric Atzrott
The 'trust, but verify' model is patrolled edits, which Nemo mentioned earlier.
Combine unpatrolled edits with abuse filter tags, and a nice interface like huggle, and this sounds like a great tool.
-- John
Not really; I can't see how tags help here at all. We are talking about any kind of edit (nothing that can be matched by a regex) that seems suspicious to a human vandal fighter who can't be sure whether it's vandalism or not. Neither AbuseFilter nor patrolled edits can help here (unless we marked every single edit as patrolled, and the people who see such a suspicious edit marked it as un-patrolled, or something like that).
If I understand correctly, you want user-applied tags on revisions, which is bug 1189.
https://bugzilla.wikimedia.org/show_bug.cgi?id=1189
All edits start out unpatrolled.
If your interface shows the unpatrolled edits that carry a huggle-user-selected tag (be it tor, abusefilter, or a manually added tag)...
'Experts' edit the page if required, and then mark the revision as patrolled so it no longer appears in the queue.
-- John Vandenberg
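A sketch of that tag-then-patrol workflow against the action API. It assumes a wiki where user-applied change tags are available (the point of bug 1189; MediaWiki later gained an action=tag module) and an account with the patrol right; the tag name here is made up:

```python
# Tag a dubious revision, then let experts query and clear the queue.
import requests

API = "https://en.wikipedia.org/w/api.php"
TAG = "needs-expert-review"  # hypothetical tag name

def csrf_token(session):
    return session.get(API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]

def flag_revision(session, revid):
    """Huggle-side: tag the dubious revision instead of just skipping it."""
    session.post(API, data={
        "action": "tag", "revid": revid, "add": TAG,
        "token": csrf_token(session), "format": "json",
    }).raise_for_status()

def review_queue(session):
    """Expert-side: unpatrolled recent changes carrying the tag.
    rcshow=!patrolled requires the patrol right."""
    r = session.get(API, params={
        "action": "query", "list": "recentchanges",
        "rctag": TAG, "rcshow": "!patrolled",
        "rcprop": "ids|title", "format": "json",
    })
    return r.json()["query"]["recentchanges"]
```

Patrolling a reviewed revision (as in the earlier patrol sketch) then removes it from this queue, exactly as John describes.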
I've got to say that this problem seems pretty straightforward. Essentially, we need something lighter than 'revert' for edits that need a second set of eyes.
What we really want is a queue of suspect revisions that allows Wikipedians to flag new revisions, query current flagged revisions and remove revisions from the list after review.
I see two clear options:
*3rd party tool.* A queue of suspect revisions can be created as a 3rd party tool (e.g. webapp + API running on Labs). Then gadgets and other 3rd party tools make use of the API to add, remove, update & query the set of flagged edits. I worry about this option due to the lack of good identity sharing between Wikipedia and 3rd party wiki tools, but otherwise it seems trivial to implement.
*Make use of infrastructure in MediaWiki.* We can either build on top of the features currently deployed or on top of new features in the pipeline.
- Current MW: Someone brought up the example of adding a template to articles that have recent revisions needing review. Such templates could appear on the talk page so as to not clutter the article. I've got to admit that this sounds messy, but the user warning level system employed by Huggle, ClueBot NG, Twinkle, etc. is equally messy.
- New features: If arbitrary tags could be manually added to revisions and queried from MediaWiki (preferably via the API), the functionality of the third-party tool described above could be captured without needing an external tool. This might require a little bit of gadget support for common actions taken on the "suspicious edit queue".
Hi,
I think you summarized this issue perfectly. I like the first solution (a 3rd party tool on Wikimedia Labs with a well-documented API interface), but I must admit that identity sharing might be a bit of a problem (if some troll figured out this system and we weren't using any authentication at all, they could easily wipe all the edits).
Having this directly in MW, as tags that can be applied by users, would probably be the best solution, but I am afraid it's going to take ages for that to happen.
I wonder if the refactor described in https://www.mediawiki.org/wiki/Requests_for_comment/Support_for_user-specifi... could be adapted to help with this use case. I suspect it would be highly useful to be able to create publicly viewable watchlists of suspicious edits.
With a little bit of tweaking, maybe when viewing the watchlist only the suspicious edits would show in the list, and users could collectively 'unwatch' them once reviewed?
On Fri, 27 Sep 2013 15:18:01 +0200, Derric Atzrott datzrott@alizeepathology.com wrote:
I thought FlaggedRevs prevented the newest version of the page from being shown until it has been approved? [...]
This can be configured either way. Almost everything about FlaggedRevs can be configured.
Yes, having https://bugzilla.wikimedia.org/show_bug.cgi?id=1189 would definitely be a solution. But the question is whether it's ever going to happen in production.
On Sep 27, 2013 5:06 PM, "Bartosz Dziewoński" matma.rex@gmail.com wrote:
Really, you have just described FlaggedRevs. It could be enabled by default for all articles and would solve all of your problems. [...]
And en.wp does have Pending Changes enabled. It would be great to have dev resources thrown at improving it to resolve the issues preventing wider use.
All wikis have AbuseFilter, which provides tagging of suspicious edits. That extension has lots of bugs and feature requests against it; fixing them would make it more flexible. Does Huggle make use of the abuse filter tags?
-- John Vandenberg
All this is unnecessary complication. If you use Huggle and see something ok (= not to be reverted), Huggle must mark it patrolled; if you're unsure, you should be able to tell so to Huggle and it will be left unpatrolled.
If you're emotionally attached to the idea of doing the opposite, you can still do so with patrolled edits: just store the suspect list somewhere and periodically mark as patrolled all the edits at the bottom of the queue (say, $wgRCMaxAge - 72 h) for manual review.
Nemo
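One reading of that fallback, sketched abstractly so it is not tied to any particular storage: keep the suspect list off-wiki and patrol every aging edit that is not on it, so the unpatrolled backlog converges to the suspect list. The constants follow Nemo's numbers; the callables stand in for the earlier sketches:

```python
# Periodic sweep: patrol aging non-suspect edits before they fall out
# of recentchanges, leaving only the suspect list unpatrolled.
import time

RC_MAX_AGE = 30 * 24 * 3600   # a typical $wgRCMaxAge, in seconds
GRACE = 72 * 3600             # leave 72 h for manual review

def sweep(unpatrolled, suspects, patrol, now=None):
    """unpatrolled: iterable of (revid, unix_timestamp) from the API;
    suspects: set of revids flagged by vandal fighters;
    patrol: callable that marks one revid as patrolled."""
    now = now or time.time()
    cutoff = now - (RC_MAX_AGE - GRACE)
    for revid, ts in unpatrolled:
        if ts < cutoff and revid not in suspects:
            patrol(revid)

# Example: revid 2 stays unpatrolled because it is on the suspect list.
seen = []
sweep([(1, 0), (2, 0), (3, time.time())], {2}, seen.append)
assert seen == [1]
```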
If you use Huggle and see something ok (= not to be reverted), Huggle must mark it patrolled; if you're unsure, you should be able to tell so to Huggle and it will be left unpatrolled.
This is not the same. Surely, most edits would appear in such an "unpatrolled" list. Most edits are not seen by Huggle users. We'd instead be looking for a list of edits that were seen by Huggle users, not reverted, but also not patrolled.
just store the suspect list somewhere
Where? How will others access the list?
I think a few different concepts are being muddled here.
Flagged revisions (and its variant, pending changes, on enwiki) is applied to individual articles to hold *all* edits from certain user classes for review.
What Petr is looking for is a way to flag *individual edits* to an article (not the whole article) for review.
Flagging edits for review through either method means creating an expectation that someone else will review the edit. The use of FR/PC is (to date) only enabled following community discussion and usually community determination of the applicable rules for its use. I suspect that enabling a means to flag individual edits would also require some sort of community consensus for its desirability before it is enabled; however, someone's going to have to write the code first before that happens.
This does come back to basic socialization of the editing/reviewing process, and likely some (project specific) rules of thumb for when to revert and when to take a few minutes and research the new data would be worthwhile. For example, I'd probably not revert the result of a sporting match held within the past 48 hours, but I'd probably revert the same edit if the sporting match was six years ago. Dates of birth are particularly sensitive and changes to them should always either be sourced or the prior consensus verified. It's often better to review fewer changes and verify information (most of which is usually available somewhere online) than to get as many edit reviews as possible done. I can't tell you how many times I used to get edit-conflicted by people using review tools when I used to manually review (and fix) recent changes.
Risker/Anne
I don't know about others, but thanks to my experience with "establishing consensus", I first ask for it and then code. It's a huge waste of time when you spend months coding and then receive a "sorry, we don't want this" response from the community...
On that we agree, Petr!
I don't honestly know how various communities would respond. English Wikipedia has been very cautious in returning to the use of pending changes -- there are fewer than 800 pages on PC right now, and looking at the list I'd estimate probably a quarter could get it yanked if someone cleaned up the list to match the criteria. On the other hand, being able to flag individual edits rather than entire pages might well draw significant support. I wonder if this might be something that would have a greater chance of success on a project other than enwiki, particularly one that doesn't have the flagged-revisions option in place. I realise that might mean running more than one version of Huggle, and that's a definite burden for someone who doesn't have a huge team behind him.
Risker