I've written a little tool [1] that shows file duplicates between a Wikipedia and Commons, as well as internal duplicates. It runs off a static list created from the toolserver databases; currently, German and English are available. I will have to regenerate the data for other Wikipedias and for updates manually.
But for now, there are ~29,000 dupes between en.wp and Commons, as well as ~8,500 between de.wp and Commons, so it might take you guys a while ;-)
A subset of images (default: 25) is selected randomly from the list, so you might run into images that already have {{NowCommons}}.
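For anyone curious, the random-subset step presumably amounts to something like the following sketch (function and variable names are mine, not the tool's; the actual tool is a Perl CGI script):

```python
import random

def sample_dupes(dupes, n=25, seed=None):
    """Pick up to n entries at random (without replacement) from a
    precomputed static list of duplicate pairs."""
    rng = random.Random(seed)
    return rng.sample(dupes, min(n, len(dupes)))

# Hypothetical (local, Commons) duplicate filename pairs
dupes = [("File:Example_%d.jpg" % i, "File:Example_%d.jpg" % i)
         for i in range(100)]

subset = sample_dupes(dupes)
print(len(subset))  # 25
```

Because the subset is drawn fresh from the static list on each run, an image can reappear even after someone has already tagged it with {{NowCommons}}.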
Cheers, Magnus
[1] http://toolserver.org/~magnus/cgi-bin/duplicate_images_across.pl?lang=en&...