Hi,
Thanks for the positive feedback.
It's reasonably fast for medium-sized documents. The problem is that it relies on a word-for-word LCS pass: if a line averages, say, 30 words, the number of elements grows by a factor of 30, and because LCS is quadratic the worst-case execution time grows by a factor of roughly 900.
In Daisy this has not turned out to be a problem. There are heuristics that run in constant time, and in practice the LCS cost is O(N) instead of O(N²). Performance might still be an issue though, and investigating all options in that department would be part of the project itself.
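To give a rough idea of the kind of heuristic I mean (this is a toy sketch, not the actual Daisy code, and the class and method names are made up): trimming the common prefix and suffix of the word lists is linear work, and for a typical edit it leaves only a small middle section for the quadratic LCS pass.

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class WordDiffSketch {

    public static void main(String[] args) {
        String oldText = "the quick brown fox jumps over the lazy dog";
        String newText = "the quick red fox leaps over the lazy dog";
        System.out.println(diff(oldText.split("\\s+"), newText.split("\\s+")));
    }

    static List<String> diff(String[] a, String[] b) {
        // Heuristic 1: skip the common prefix (constant work per word).
        int start = 0;
        while (start < a.length && start < b.length && a[start].equals(b[start])) {
            start++;
        }
        // Heuristic 2: skip the common suffix.
        int endA = a.length, endB = b.length;
        while (endA > start && endB > start && a[endA - 1].equals(b[endB - 1])) {
            endA--;
            endB--;
        }
        // Only the middle slices still need the quadratic LCS pass.
        String[] midA = Arrays.copyOfRange(a, start, endA);
        String[] midB = Arrays.copyOfRange(b, start, endB);
        boolean[] keepA = lcsMembership(midA, midB);

        List<String> out = new ArrayList<>();
        for (int i = 0; i < start; i++) out.add("  " + a[i]);        // unchanged prefix
        for (int i = 0; i < midA.length; i++) {
            out.add((keepA[i] ? "  " : "- ") + midA[i]);             // deletions from the old text
        }
        // (Insertions from midB are omitted to keep the sketch short.)
        for (int i = endA; i < a.length; i++) out.add("  " + a[i]);  // unchanged suffix
        return out;
    }

    // Classic O(N*M) LCS table; marks which words of a belong to the LCS.
    static boolean[] lcsMembership(String[] a, String[] b) {
        int[][] len = new int[a.length + 1][b.length + 1];
        for (int i = a.length - 1; i >= 0; i--) {
            for (int j = b.length - 1; j >= 0; j--) {
                len[i][j] = a[i].equals(b[j])
                        ? len[i + 1][j + 1] + 1
                        : Math.max(len[i + 1][j], len[i][j + 1]);
            }
        }
        boolean[] inLcs = new boolean[a.length];
        int i = 0, j = 0;
        while (i < a.length && j < b.length) {
            if (a[i].equals(b[j])) {
                inLcs[i] = true; i++; j++;
            } else if (len[i + 1][j] >= len[i][j + 1]) {
                i++;
            } else {
                j++;
            }
        }
        return inLcs;
    }
}

Since most words in a typical revision are untouched, the quadratic table is usually built over only a handful of words, which is why the practical cost stays close to linear.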
Even if speed turns out to be a problem for large installs, the project can still be very useful for smaller installs where ease of use has a higher priority. Note that I don't want to get rid of the old diff page just yet :)
--Guy
2008/3/23, Mohamed Magdy mohamed.m.k@gmail.com:
On Sun, Mar 23, 2008 at 12:28 AM, Guy Van den Broeck guyvdb@gmail.com wrote:
Hi,
I want to get some feedback on a possible Summer of Code project proposal.
For last year's GSoC I created an HTML diffing library for Daisy CMS. The algorithm has proven to work well and I'm thinking of porting it to MediaWiki.
What the algorithm does is take the source of two pages and merge them to visualize the diff. The code I have already does something like this: http://users.pandora.be/guyvdb/wikipediadiff.jpg
Is this a feasible project for Wikimedia? I'm personally not very impressed with the current "diff pages". I think a visual diff would bring that part of MediaWiki up to par with the rest of the software.
It would be a neat feature to add alongside the normal diff (imo).. how fast is that algorithm in MW compared to the normal diff?
It is fast enough in Daisy afaics:
http://cocoondev.org/daisy/index/version/34/diff?otherVersion=36
--alnokta