I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions?
We have had a few issues recently in mobile land where we've made big CSS changes and broken buttons on hidden-away special pages - particularly now that we are involved in the development of MediaWiki UI and moving mobile towards using it.
My vision of how this might work: we have an automated tool that visits a list of given pages in various browsers, takes screenshots of how they look, and then compares the images with the last known state. The tool checks how similar the images are and complains if they are not the same - this might be a comment on the Gerrit patch or an e-mail saying something user-friendly like "The Special:Nearby page on Vector looks different from how it used to. Please check everything is okay."
This would catch a host of issues and prevent a lot of CSS regression bugs.
Any experience in catching this sort of thing? Any ideas on how we could make this happen?
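To make that concrete, the capture half might look something like this in Ruby with the selenium-webdriver gem (the page list and file naming here are just illustrative):

require 'selenium-webdriver'

# Pages whose rendering we want to keep an eye on (illustrative list).
PAGES = %w[
  https://en.m.wikipedia.org/wiki/Special:Nearby
  https://en.m.wikipedia.org/wiki/Special:Uploads
]

driver = Selenium::WebDriver.for :firefox
PAGES.each do |url|
  driver.get(url)
  # e.g. ".../Special:Nearby" -> "Special-Nearby.png"
  driver.save_screenshot(url.split('/').last.tr(':', '-') + '.png')
end
driver.quit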
I believe the Java Selenium driver can take screenshots. No idea about performing comparisons.
Kaldari
On Mon, Mar 10, 2014 at 2:04 PM, Jon Robson jdlrobson@gmail.com wrote:
I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions?
This looks like it may be an option.
https://github.com/Huddle/PhantomCSS
On Mon, Mar 10, 2014 at 2:07 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
I believe the Java Selenium driver can take screenshots. No idea about performing comparisons.
For image comparisons I have used perceptualdiff [1]; there is also wraith [2] from the BBC and automated-screenshot-diff [3].
Erik B
[1] http://pdiff.sourceforge.net/
[2] https://github.com/BBC-News/wraith
[3] https://github.com/igorescobar/automated-screenshot-diff
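If perceptualdiff were plugged into the comparison step, the test could just shell out to it; a rough sketch, assuming perceptualdiff is on the PATH (if I remember its CLI right, -threshold sets how many differing pixels are tolerated and it exits non-zero when the comparison fails):

# system returns false when perceptualdiff exits non-zero, i.e. the images differ.
ok = system('perceptualdiff', 'baseline.png', 'current.png',
            '-threshold', '100')
puts 'The page looks different from how it used to!' unless ok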
On Mon, Mar 10, 2014 at 2:07 PM, Ryan Kaldari rkaldari@wikimedia.org wrote:
I believe the Java Selenium driver can take screenshots. No idea about performing comparisons.
On 03/10/2014 02:04 PM, Jon Robson wrote:
I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions?
We don't have experience with this yet in Parsoid land, but are looking for something very similar. We are interested in mass rendering comparisons of PHP parser HTML + CSS with Parsoid HTML + RDFa + CSS, likely as part of our distributed testreduce system.
Gabriel
http://sauceio.com/index.php/2014/03/shotsonsauce-by-jim-eisenhauer/
"Now you can grab the screenshots from all the browsers and OS platforms you want using Sauce and compare them using the little known image diff feature on Github."
Not sure I'm going to have time to poke at this, but it seems reasonable.
-Chris
On Mon, Mar 10, 2014 at 2:04 PM, Jon Robson jdlrobson@gmail.com wrote:
I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions?
On Mon, Mar 10, 2014 at 10:04 PM, Jon Robson jdlrobson@gmail.com wrote:
My vision of how this might work: we have an automated tool that visits a list of given pages in various browsers, takes screenshots of how they look, and then compares the images with the last known state. The tool checks how similar the images are and complains if they are not the same
I have built something similar before. It is doable with our existing Jenkins+Selenium+Ruby test infrastructure. Something like this (see the sketch after the list):
- open a page
- take a screenshot
- compare it (pixel by pixel) with the previous screenshot
- report if more than x percent of the pixels differ
Repeat for interesting pages and browser+version/OS+version combinations.
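A minimal sketch of those steps, assuming the selenium-webdriver and chunky_png gems (the URL, file names, and 1% threshold are placeholders):

require 'selenium-webdriver'
require 'chunky_png'

THRESHOLD = 1.0 # percent of pixels allowed to differ (placeholder)

# Open a page and take a screenshot.
driver = Selenium::WebDriver.for :firefox
driver.get('https://en.m.wikipedia.org/wiki/Special:Nearby')
driver.save_screenshot('current.png')
driver.quit

baseline = ChunkyPNG::Image.from_file('baseline.png')
current  = ChunkyPNG::Image.from_file('current.png')

# A change in dimensions already means the layout changed.
if baseline.width != current.width || baseline.height != current.height
  abort 'Screenshot dimensions changed - the layout probably changed.'
end

# Compare pixel by pixel with the previous screenshot.
differing = 0
baseline.height.times do |y|
  baseline.width.times do |x|
    differing += 1 unless baseline[x, y] == current[x, y]
  end
end

# Report if more than THRESHOLD percent of the pixels are not the same.
percent = 100.0 * differing / (baseline.width * baseline.height)
puts format('%.2f%% of pixels differ!', percent) if percent > THRESHOLD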
Željko
On Mon, Mar 10, 2014 at 10:04 PM, Jon Robson jdlrobson@gmail.com wrote:
My vision of how this might work: we have an automated tool that visits a list of given pages in various browsers, takes screenshots of how they look, and then compares the images with the last known state. The tool checks how similar the images are and complains if they are not the same
I have noticed this today:
http://www.jameseisenhauer.com/2014/01/25/simple-screen-shot-comparison-tool...
Željko
See also:
https://bugzilla.wikimedia.org/show_bug.cgi?id=62633
-- Krinkle
On 10 Mar 2014, at 22:04, Jon Robson jdlrobson@gmail.com wrote:
I just wondered if anyone doing MediaWiki development had any experience in catching CSS regressions?
Thanks so much for all these pointers and ideas, everyone - lots of food for thought here. From the MobileFrontend perspective, in a perfect world all our existing browser tests would visual-diff a screenshot at each individual step of our tests against the previously known screenshot and report any oddities.
Will sit down and think about this some more, but thanks so much - and please don't forget this thread - post any work you do in this area here so we can keep tabs on what the options are!
<quote name="Jon Robson" date="2014-03-14" time="11:07:34 -0700">
Will sit down and think about this some more, but thanks so much - and please don't forget this thread - post any work you do in this area here so we can keep tabs on what the options are!
Not work, but Ryan Lane pointed out: https://github.com/facebook/huxley
cc'ing QA list, cuz well...
QA list readers, see also: https://bugzilla.wikimedia.org/show_bug.cgi?id=62633