For image comparisons I have used perceptualdiff [1]; there are also wraith [2] from the BBC and automated-screenshot-diff [3].
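A rough sketch of how perceptualdiff could be wired into an automated check, assuming the perceptualdiff binary is on the PATH and exits non-zero when the two images differ (worth double-checking against the tool's documentation); the file names are illustrative only:

public class PerceptualDiffCheck {
    public static void main(String[] args) throws Exception {
        // Compare a stored baseline screenshot against a freshly captured one.
        ProcessBuilder pb = new ProcessBuilder(
                "perceptualdiff", "nearby-baseline.png", "nearby-current.png");
        pb.inheritIO(); // show the tool's own output on our console
        Process p = pb.start();
        int exitCode = p.waitFor();
        if (exitCode != 0) {
            System.out.println("The Special:Nearby page on Vector looks different "
                    + "from how it used to. Please check everything is okay.");
        }
    }
}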
Erik B
[1] http://pdiff.sourceforge.net/
[2] https://github.com/BBC-News/wraith
[3] https://github.com/igorescobar/automated-screenshot-diff
On Mon, Mar 10, 2014 at 2:07 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
I believe the Java Selenium driver can take screenshots. No idea about performing comparisons.
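For reference, a minimal sketch of grabbing a screenshot with the Java Selenium driver; the target URL and output file name are just placeholders, and depending on the Selenium and browser versions an extra driver binary may need to be configured:

import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class TakeScreenshot {
    public static void main(String[] args) throws Exception {
        WebDriver driver = new FirefoxDriver();
        try {
            // Placeholder page; in practice this would iterate over a list of pages.
            driver.get("https://en.wikipedia.org/wiki/Special:Nearby");
            File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            Files.copy(shot.toPath(), Paths.get("nearby-current.png"),
                    StandardCopyOption.REPLACE_EXISTING);
        } finally {
            driver.quit();
        }
    }
}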
Kaldari
On Mon, Mar 10, 2014 at 2:04 PM, Jon Robson jdlrobson@gmail.com wrote:
I was just wondering whether anyone doing MediaWiki development has experience with catching CSS regressions.
We have had a few issues recently in mobile land where we've made big CSS changes and broken buttons on hidden-away special pages, particularly now that we're involved in the development of MediaWiki UI and are moving mobile towards using it.
My vision of how this might work: an automated tool visits a list of given pages in various browsers, takes screenshots of how they look, and then compares the images with the last known state. The tool checks how similar the images are and complains if they are not the same; this might be a comment on the Gerrit patch or an e-mail saying something user-friendly like "The Special:Nearby page on Vector looks different from how it used to. Please check everything is okay."
This would catch a host of issues and prevent a lot of CSS regression bugs.
Any experience in catching this sort of thing? Any ideas on how we could make this happen?
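To make the idea concrete, here is a minimal sketch of the comparison step, assuming two screenshots of the same page have already been captured somehow. It does a naive exact-pixel comparison with a made-up 1% threshold, whereas a tool like perceptualdiff would do a smarter perceptual comparison; the file names are placeholders:

import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.File;

public class NaivePixelDiff {
    // Fraction of pixels that differ between two same-sized screenshots.
    static double differingFraction(File a, File b) throws Exception {
        BufferedImage imgA = ImageIO.read(a);
        BufferedImage imgB = ImageIO.read(b);
        if (imgA.getWidth() != imgB.getWidth() || imgA.getHeight() != imgB.getHeight()) {
            return 1.0; // treat a size change as a full mismatch
        }
        long differing = 0;
        long total = (long) imgA.getWidth() * imgA.getHeight();
        for (int y = 0; y < imgA.getHeight(); y++) {
            for (int x = 0; x < imgA.getWidth(); x++) {
                if (imgA.getRGB(x, y) != imgB.getRGB(x, y)) {
                    differing++;
                }
            }
        }
        return (double) differing / total;
    }

    public static void main(String[] args) throws Exception {
        // The 1% threshold is illustrative only.
        double diff = differingFraction(new File("nearby-baseline.png"),
                new File("nearby-current.png"));
        if (diff > 0.01) {
            System.out.println("The Special:Nearby page on Vector looks different "
                    + "from how it used to. Please check everything is okay.");
        }
    }
}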
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l