This looks like it may be an option.

https://github.com/Huddle/PhantomCSS


On Mon, Mar 10, 2014 at 2:07 PM, Ryan Kaldari <rkaldari@wikimedia.org> wrote:
I believe the Java Selenium driver can take screenshots. No idea about performing comparisons.

Kaldari


On Mon, Mar 10, 2014 at 2:04 PM, Jon Robson <jdlrobson@gmail.com> wrote:
I just wondered if anyone doing MediaWiki development had any
experience in catching CSS regressions?

We have had a few issues recently in mobile land where we've made big
CSS changes and broken buttons on hidden-away special pages -
particularly now that we've been involved in the development of
MediaWiki UI and are moving mobile towards using it.

My vision of how this might work is that we have an automated tool that
visits a given list of pages in various browsers, takes screenshots of
how they look, and then compares the images with the last known state.
The tool checks how similar the images are and complains if they are
not the same - this might be a comment on the Gerrit patch or an
e-mail saying something user-friendly like "The Special:Nearby page on
Vector looks different from how it used to. Please check everything is
okay."
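
The workflow above boils down to an image diff with a tolerance. Here's a minimal Python sketch of just the comparison step - the function names and the 1% threshold are illustrative choices of mine, and it assumes the screenshots have already been captured (e.g. by Selenium) and decoded into flat lists of RGB pixel tuples:

```python
# Sketch of the "compare and complain" step. Assumes two same-sized
# screenshots represented as flat lists of (R, G, B) tuples; capture
# and decoding are out of scope here.

def diff_ratio(baseline, current):
    """Return the fraction of pixels that differ between two images."""
    if len(baseline) != len(current):
        raise ValueError("screenshots have different sizes")
    differing = sum(1 for a, b in zip(baseline, current) if a != b)
    return differing / len(baseline)

def check_page(page_name, baseline, current, threshold=0.01):
    """Return a friendly complaint if more than `threshold` of the
    pixels changed since the last known state, else None."""
    ratio = diff_ratio(baseline, current)
    if ratio > threshold:
        return ("The %s page looks different from how it used to "
                "(%.1f%% of pixels changed). Please check everything "
                "is okay." % (page_name, ratio * 100))
    return None
```

A real tool would also want to ignore anti-aliasing noise (hence the tolerance rather than exact equality) and update the "last known state" once a change is approved.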

This would catch a host of issues and prevent a lot of CSS regression bugs.

Any experience in catching this sort of thing? Any ideas on how we
could make this happen?

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


_______________________________________________
Mobile-l mailing list
Mobile-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mobile-l