For those interested in "high tempo" editing on Wikipedia, or in editors'
behaviour on breaking news articles, I've found an interesting piece of
work by Heather Ford:
-
http://ethnographymatters.net/2012/07/31/beyond-reliability-an-ethnographic…
Besides looking at how sources are dealt with, it relates to the
open-source platform SwiftRiver
(http://ushahidi.com/index.php/products/swiftriver-platform) and to the
Ushahidi <http://ushahidi.com/> projects on information collection,
visualization and interactive mapping.
In her own words:
Almost a year ago, I was hired by Ushahidi <http://ushahidi.com/> to work
as an ethnographic researcher on a project to understand how Wikipedians
managed sources during breaking news events. Ushahidi cares a great deal
about this kind of work because of a new project called SwiftRiver
<http://ushahidi.com/index.php/products/swiftriver-platform> that seeks
to collect and enable the collaborative curation of streams of data from
the real time web about a particular issue or event. If another Haiti
<http://blog.ushahidi.com/index.php/2012/01/12/haiti-and-the-power-…>
earthquake happened, for example, would there be a way for us to filter
out the irrelevant, the misinformation and build a stream of relevant,
meaningful and accurate content about what was happening for those who
needed it? And on Wikipedia’s side, could the same tools be used to help
editors curate a stream of relevant sources as a team rather than
individuals?
--
Tomás Saorín / Profesor asociado / Facultad de Comunicación y
Documentación / Universidad de Murcia / 868 88 82 32 / tsp(a)um.es
Tomás Saorín, Ph.D. / Dep. of Information and Documentation / Faculty of
Communication and Documentation / University of Murcia