Dear all,
I am writing to you on behalf of my group (www.liquendatalab.com).
Over the last few months we have been working on an online bias detection tool to help fight systemic bias in Wikipedia, but unfortunately we lost our developer before the project was finished. This is why I would like to ask for your assistance in finishing this digital project in time for the next Art+Feminism edit-a-thon, which is in less than four weeks.
Our prototype has been designed to allow women, Black and queer folks to react to biased content without exposing themselves to online abuse, taking into consideration issues of oppressed group identities within the Wikipedia community.
Access the prototype here --> http://net.wanderingliquen.com/

A brief description of the project:
When accessing the prototype for Art+Feminism, users can explore 7 Wikipedia categories and react with the feedback buttons when they think there is biased information. Users need to register before accessing the tool. After registering, the app offers a series of maps (vector graphs) to explore Wikipedia categories. The maps are built with vector graphs where nodes represent pages and subcategories. Red nodes indicate controversy in the page (controversy is calculated by counting the number of comments on the discussion page of each Wikipedia entry). When clicking a node, a module appears on the left side of the screen showing the article abstract and a link to the Wikipedia page. Users can then check the information on Wikipedia and, if they think it is biased, react with the feedback buttons.
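For anyone considering helping, here is a minimal sketch (my own, not our previous developer's code) of how the controversy count described above could be approximated with the public MediaWiki API, assuming the number of revisions of an article's Talk page is taken as a stand-in for the number of comments:

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def talk_page_activity(title):
        # Approximate "controversy" by counting revisions of the article's Talk page.
        params = {
            "action": "query",
            "format": "json",
            "prop": "revisions",
            "titles": "Talk:" + title,
            "rvprop": "ids",
            "rvlimit": "max",
        }
        count = 0
        while True:
            data = requests.get(API, params=params).json()
            for page in data["query"]["pages"].values():
                count += len(page.get("revisions", []))
            if "continue" not in data:
                break
            params.update(data["continue"])  # follow API continuation for long histories
        return count

    # A node could be coloured red above some threshold, e.g.:
    print(talk_page_activity("Ada Lovelace") > 200)

The revision-count proxy and the threshold are placeholders; the actual metric is whatever the prototype already computes.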
The bias button aims to tackle the time-consuming process of locating biased content. The feedback buttons work in a similar way to the Facebook Like button, but instead of implying ‘I like this content’ with their reaction, users can imply ‘I think this content is biased’ because it is islamophobic, sexist, ageist, ableist, homophobic, etc. This feature may prove useful, first, to gather quantitative evidence regarding biased content in Wikipedia, because the community is pointing out where we have issues and what those issues are. It is also expected that this information, together with other features for collaborative working still to be developed, will be of assistance to the community of editors, especially for online collaboration during the edit-a-thons that Wikimedia organises yearly around Women’s Day. The long-term goal of the application is to give feminist communities an alternative space, away from sexist environments, that allows them to stay focused on women-centred knowledge production.
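To give a prospective developer a concrete idea, a single button press could be stored as a record along these lines; the field names below are illustrative assumptions on my part, not the prototype's actual schema:

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class BiasReaction:
        user_id: str        # registered user pressing the button
        article_title: str  # Wikipedia entry being flagged
        bias_type: str      # e.g. "sexist", "islamophobic", "ageist", "ableist", "homophobic"
        created_at: str     # UTC timestamp of the reaction

    reaction = BiasReaction("u42", "Ada Lovelace", "sexist",
                            datetime.now(timezone.utc).isoformat())
    print(asdict(reaction))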
Nevertheless, there is still some work to do on the app, such as redesigning the backend and the frontend with a focus on usability, and improving some of the functionality by adding a text selector and charts to visualize the information about user reactions gathered by the app.
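As a rough illustration of what those charts could plot (again only a sketch under my own assumptions about how reactions are stored), the gathered reactions would simply be aggregated per article and bias type:

    from collections import Counter

    # (article_title, bias_type) pairs as they would come out of the app's database
    reactions = [
        ("Ada Lovelace", "sexist"),
        ("Ada Lovelace", "sexist"),
        ("Frida Kahlo", "ableist"),
    ]

    # Counts per (article, bias type): the table a bar chart would visualise.
    for (article, bias_type), n in Counter(reactions).most_common():
        print(f"{article}: {bias_type} x{n}")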
We are looking for a volunteer developer willing to help us during this last stage, and I am wondering if we could find them within this community.
Please find attached a more comprehensive report on the proposal (also available at https://docs.google.com/document/d/1zTOWYaTRvgDyTb_OgkFTzzJoDU4ooyH0PuQX-tjT7Ks/edit?usp=sharing), and please don't hesitate to get in touch if you have any questions or comments.
Many thanks in advance,
Marta Delatte
creative director and co-founder
@martadelatte
PhD Researcher
Media & Memory Research Initiative
University of Hull (UK)
+44 (0) 7923 528128
@liquen_