Before I give examples, let me first remind you that this is only a sample. So, of course, it cannot be statistically inferred that the 37% I mentioned holds for the data in general in exactly this way. Secondly, this number arises under the assumption that at least 80% of all participants agreed that an edit pair is a revert; there were of course cases in the sample where people disagreed, and even cases where a majority (just not more than 79%) voted for an MD5-detected pair being a full revert. I chose this threshold to make the differences clear; I could also have selected some other arbitrary value. That is exactly why we did not put it in the paper: the analysis in the paper is a much better basis for making statistical inferences about the data that forms the "basic population" of this analysis.
Now, let me give you some examples of false positives generated by the MD5 hash method:
1. One constructed example (inspired by observations) is given, almost identically, in the paper:
RevID | RevContent (after edit) | Edit    | Hash
1     | Peanut                  | +Peanut | Hash1
2     | Peanut Apple            | +Apple  | Hash2
3     | Peanut Apple Banana     | +Banana | Hash3
4     | Peanut Banana           | -Apple  | Hash4
5     | Peanut                  | -Banana | Hash1
MD5 assigns 5 as the reverting edit of 2, 3, and 4.
DIFF assigns 5 as the reverting edit of 3, and 4 as the reverting edit of 2.
False positives in this case (according to the Wikipedia definition) for MD5: 5 reverting 4 and 5 reverting 2
(4 is unrelated to what 5 is doing, and 2's contribution was already removed by 4, so it cannot be undone again by 5).
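The MD5 method on this toy sequence can be sketched as follows. The function name and the exact bookkeeping are mine, not our actual pipeline; the idea is simply that a revision whose content hash matches an earlier revision's hash is taken to revert everything in between:

```python
import hashlib

def md5_reverts(revisions):
    """Hash-based revert detection: if revision j's content hash equals
    an earlier revision i's hash, j is assigned as the reverting edit of
    every revision strictly between i and j."""
    seen = {}      # content hash -> first revision id with that content
    reverts = {}   # reverting rev id -> list of reverted rev ids
    for rev_id, content in revisions:
        h = hashlib.md5(content.encode("utf-8")).hexdigest()
        if h in seen:
            first = seen[h]
            reverts[rev_id] = [r for r, _ in revisions if first < r < rev_id]
        else:
            seen[h] = rev_id
    return reverts

# The toy sequence from the table above.
revs = [
    (1, "Peanut"),
    (2, "Peanut Apple"),
    (3, "Peanut Apple Banana"),
    (4, "Peanut Banana"),
    (5, "Peanut"),
]
print(md5_reverts(revs))  # {5: [2, 3, 4]} -- 5 reverting 2 and 4 are false positives
```

A DIFF-based method would instead compare the actual inserted/removed text and assign 5 as reverting only 3, and 4 as reverting 2.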
2. "Real-life" examples rated as false positives in the user evaluation:
When you asked me for the examples, I started digging them up from the data sample we used and realized that many of the MD5 method's false positives are in fact related to self-reverts. As this is no issue for our data-extraction aims (we want self-reverts in the results as well) and was not considered when randomly drawing edit pairs from the two methods' results, we did not discuss it in the paper. If you do not consider self-reverts to be reverts in the sense of the Wikipedia definition, they could be filtered out by collapsing subsequent edits by one editor before running the revert analysis with the MD5 method. I assume that would notably reduce the number of false positives; I will certainly look into it.
If you do not collapse these edits, however (which is not regularly done before reporting or using revert-detection results), the number of false positives will be quite high: the edits to be collapsed (and prone to being misinterpreted) appear quite often, and their span can at times be considerably large. And of course there are also cases not related to self-reverts.
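The collapsing step mentioned above could be sketched like this; the function name and the toy editor/content tuples are hypothetical, and I only keep the last revision of each consecutive run by one editor, since its content is the net result of that run:

```python
def collapse_consecutive(revisions):
    """Collapse runs of consecutive revisions by the same editor into one,
    keeping only the last revision of each run (the run's net result).
    Each revision is a (rev_id, editor, content) tuple."""
    collapsed = []
    for rev in revisions:
        if collapsed and collapsed[-1][1] == rev[1]:
            collapsed[-1] = rev  # same editor as before: keep only the latest state
        else:
            collapsed.append(rev)
    return collapsed

# Hypothetical sequence in which editor B self-reverts (revisions 2 and 3)
# before the MD5 comparison runs.
revs = [
    (1, "A", "Peanut"),
    (2, "B", "Peanut Apple"),
    (3, "B", "Peanut"),        # B's self-revert
    (4, "A", "Peanut Banana"),
]
print(collapse_consecutive(revs))
# [(1, 'A', 'Peanut'), (3, 'B', 'Peanut'), (4, 'A', 'Peanut Banana')]
```

Running the MD5 method on the collapsed sequence then no longer flags B's net-zero run as being reverted by anyone.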
I tried to select examples representative of the sample, which received very few or no votes for being full reverts (as detected by MD5):
Example A
The detected-as-reverting edit removes only "insomnia" from the detected-as-reverted edit, i.e., it is no full revert, as some insertions from previous edits had already been deleted by the reverted editor himself.
It would be a correct full revert if you collapsed the reverted editor's edits into one.
Example B
The introduced vandalism ("kirsty u tit") is self-reverted before the second editor's revert, so the detected-as-reverted edit can no longer actually be reverted. This would also be remedied by collapsing the first editor's edits.
Example C
Not related to a self-revert; this is an example of an incomplete vandalism repair, which is subsequently completed:
Example D
Not related to a self-revert.
A revert is carried out by TheJazzDalek targeting the edits by 74.131.204.39, but in the same edit TheJazzDalek also deletes something, leading to a new, unique revision content. As 74.131.204.39 in the next edit reverts this deletion by TheJazzDalek, but not the initial revert of his (74.131.204.39's) own edits, it is erroneously concluded that 74.131.204.39 reverts himself, which is not the case.
Example E
First, the reverting editor (Laser brain) undoes (rather than rolling back to / creating a duplicate revision) some edits by another editor, before deleting the result of 67.162.68.255's edits (one of which was detected here as reverted). The detected-as-reverted revision is partly self-reverted by 67.162.68.255. The other part, a date change in an "accessdate=" field, is not "undone" as such; instead, the whole "accessdate=" part (stemming from a third editor) is deleted.
Example F
Here, between the "reverted" and the "reverting" edit, there occurs a mixture of self-reverts, reverts, and different forms of vandalism:
If I have failed to answer any of your questions, please excuse me and ask again.