Christoph,
Thank you for your reply. I am happy to help and glad you have made
good decisions.
Are you familiar with Priyanka's work on Accuracy Review?
Fabian Floeck and
On Wed, Apr 18, 2018 at 7:07 AM, Christoph Hube <hube@l3s.de> wrote:
Hi James,
thanks a lot for your interest in our work!
The problem of crowdworkers being biased is definitely not to be
neglected. Majority vote can help to sort out the extreme views of
individual workers, but if many workers are strongly biased, then I
agree that this might not be enough. We are actually already thinking
about methods to improve future crowdsourced bias datasets. One way to
improve quality is to define the task very precisely, leaving little
room for subjective interpretation. For example, instead of letting the
workers decide whether a statement is biased or not, we asked more
specifically whether the statement reflects an opinion or contains bias
words. Of course, the decision whether a statement reflects a fact or
an opinion is still subjective in many cases. Given your example, it is
hard to make a decision (even for an unbiased worker) without the
proper background knowledge. That is why our work mostly focuses on
language bias, i.e. bias that is introduced through the use of
judgemental language. Since there are many cases of bias that do not
use judgemental language, we are definitely interested in developing
good approaches that cover those cases as well. Ideas and suggestions
are always welcome!
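To make the vote aggregation concrete, here is a rough sketch in Python
of majority voting with a consensus threshold. It is illustrative only
(the label names and the 0.6 threshold are placeholders, not our actual
pipeline); the point is that low-agreement items are withheld for
re-annotation rather than forced into the ground truth.

    import collections

    def majority_vote(labels, min_agreement=0.6):
        """Aggregate one statement's worker labels by majority vote.

        Returns the winning label, or None when the winner's share of
        the votes falls below min_agreement, so that low-consensus
        items can be re-annotated instead of entering the ground truth.
        """
        counts = collections.Counter(labels)
        label, votes = counts.most_common(1)[0]
        if votes / len(labels) < min_agreement:
            return None  # no reliable consensus; collect more judgments
        return label

    # One extremist view among five workers is voted down:
    majority_vote(["biased", "biased", "neutral", "biased", "biased"])  # 'biased'
    # An even split yields no label at all:
    majority_vote(["biased", "neutral", "biased", "neutral"])           # None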
Another thing we are planning for future crowdsourcing jobs is to ask
workers about their political views and to take that background
information into account when creating the ground truth data.
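One simple way to use that background information, sketched below in
Python, would be to reweight votes so that each political group carries
equal total weight; the group and label names are placeholders, and we
have not yet settled on a concrete scheme:

    import collections

    def balanced_vote(judgments):
        """Aggregate (group, label) pairs so that each self-reported
        political group carries equal total weight, regardless of how
        many workers it contributed."""
        group_sizes = collections.Counter(group for group, _ in judgments)
        weights = collections.Counter()
        for group, label in judgments:
            # each group's votes sum to 1.0, so a large bloc of
            # like-minded workers cannot dominate the outcome
            weights[label] += 1.0 / group_sizes[group]
        return weights.most_common(1)[0][0]

    # A 5-worker "left" bloc wins a raw majority 4 to 3, but not
    # after per-group reweighting:
    judgments = ([("left", "biased")] * 4 + [("left", "neutral")]
                 + [("right", "neutral")] * 2)
    balanced_vote(judgments)  # 'neutral'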
Best regards,
Christoph
On 4/18/2018 at 2:22 PM, James Salsman wrote:
... Accepted papers
Christoph Hube and Besnik Fetahu
Detecting Biased Statements in Wikipedia
http://wikiworkshop.org/2018/papers/wikiworkshop2018_paper_1.pdf
...
Hi Christoph and Besnik,
Having worked with several thousand Amazon Mechanical Turk workers over
the past year, I am not convinced that their judgments of bias, even in
aggregate, are themselves unbiased. Did you take any steps to measure
the bias against accuracy in your crowdworkers?
Here is an example of what I expect they would get wrong:
"Tax cuts allow consumers to increase their spending, which boosts
aggregate demand."
That statement, added by en:User:Bkwillwm in 2012,[1] is still part of
the English Wikipedia's Economics article. However, the statement is
strictly inaccurate (the demand effect of a tax cut depends on who
receives it, since much of a cut for high earners is saved rather than
spent), and heavily biased in favor of trickle-down economics and
austerity policy.[2] It and statements like it, pervasive throughout
many if not most of the popular-language Wikipedias, directly support
increases in income inequality, which in turn is a terrible scourge
affecting both health[3] and economic growth.[4]
How can you measure whether your crowdworkers are truly unbiased
relative to accuracy, instead of just reflecting the
propaganda-influenced whims of the populist center?
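Concretely, the check I have in mind is to seed the task with gold
items whose labels come from independent fact-checking rather than from
worker consensus, and then score each worker against them. Here is a
rough sketch in Python; the label names and gold set are hypothetical:

    def worker_accuracy(worker_labels, gold):
        """Score one worker against independently fact-checked gold items.

        worker_labels and gold map statement ids to labels. Returns
        (accuracy, missed_bias_rate), where the second number counts how
        often the worker called a gold-"biased" statement "neutral" --
        the failure mode the tax-cut example above would probe.
        """
        scored = [s for s in worker_labels if s in gold]
        if not scored:
            return None  # this worker saw no gold items
        correct = sum(worker_labels[s] == gold[s] for s in scored)
        missed = sum(worker_labels[s] == "neutral" and gold[s] == "biased"
                     for s in scored)
        return correct / len(scored), missed / len(scored)

    # Hypothetical gold set: the tax-cut sentence, labeled by fact-checkers
    gold = {"econ-1": "biased"}
    worker_accuracy({"econ-1": "neutral", "econ-2": "biased"}, gold)
    # -> (0.0, 1.0): the worker was scored only on econ-1 and missed it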
Sincerely,
James Salsman
[1] https://en.wikipedia.org/w/index.php?title=Economics&diff=prev&oldi…
[2] https://en.wikipedia.org/wiki/Talk:Economics/Archive_7#Tax_cut_claim_in_Fis…
[3] http://talknicer.com/ehip.pdf
[4] http://talknicer.com/egma.pdf