Hoi,
Work is being done on software that compares data from Wikidata with data from external sources. It is by someone connected to Wikidata, though the details are not clear to me. It is supposed to become available in 2015.
What I am looking for is a way to learn what level of quality we have. The problem we face is that Wikipedians are not convinced of the quality of Wikidata because there are no sources. Their observation is correct: there is hardly any credible source information. But that does not necessarily imply that the quality is worse than the information at other sources or in Wikipedia itself; the two things are not really related. My blog post is one approach to quality. What it does is make it plausible that comparing data with data in linked sources may help improve quality. It does not, however, provide an argument that is easy to digest: it does not rate quality in percentages, and it does not show in numbers how quality improves when this approach is taken. Those are the kinds of arguments that may convince Wikipedians that Wikidata is safe to use even without the sources they seek.
I am NOT saying that sources on statements are not good to have. What I am saying is that it is unlikely that the many millions of statements will have credible sources any time soon. Consequently it is best to concentrate on sourcing potentially problematic statements, and to keep statistics on the statements flagged as problematic by comparisons with other sources. With numbers like these, we encourage people to do the hard work by showing how much of a difference they make.
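To make this concrete, a minimal sketch of such a comparison might look like the following. Everything in it is hypothetical: the item IDs, the values, and the function name are made up for illustration, and a real tool would fetch the statements from Wikidata and from the external source instead of using literals.

# A minimal sketch of the comparison idea: take one property (say,
# date of birth) for a set of items, as recorded in Wikidata and in
# an external source, then count mismatches and report a percentage.
# All names and data here are hypothetical.

def mismatch_statistics(wikidata_values, external_values):
    """Compare two {item_id: value} mappings for one property.

    Returns (compared, mismatched, mismatch_percentage)."""
    common = set(wikidata_values) & set(external_values)
    mismatched = [q for q in common
                  if wikidata_values[q] != external_values[q]]
    compared = len(common)
    pct = 100.0 * len(mismatched) / compared if compared else 0.0
    return compared, len(mismatched), pct

# Hypothetical sample: dates of birth for three items.
wikidata = {"Q1": "1879-03-14", "Q2": "1867-11-07", "Q3": "1856-07-10"}
external = {"Q1": "1879-03-14", "Q2": "1867-11-07", "Q3": "1857-07-10"}

compared, bad, pct = mismatch_statistics(wikidata, external)
print(f"Compared {compared} statements, {bad} mismatched ({pct:.1f}%).")

The items in the mismatch list are exactly the "potentially problematic statements" worth sourcing or fixing first, and the percentage, tracked over time, is the kind of number that could show Wikipedians how quality is improving.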
Finding such numbers is exactly what research is about. This is why I put this challenge to you: I am not a scientist, nor do I have the right skills.