On 8/21/05, Cormac Lawler <cormaggio(a)gmail.com> wrote:
> If you know some shell scripting, then you can automate this
> somewhat: use curl/wget to fetch these pages, then use some
> combo of grep/wc to actually find out how many user pages, project
> pages, talk pages, etc. link to policy pages.
> This is all Klingon to me - is there an encyclopedia of that somewhere? :-)
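For anyone else who finds this Klingon, the kind of pipeline being described might look like the sketch below. This is only an illustration, not a polished tool: the "What links here" URL and the link patterns are assumptions about how such a page's HTML looks, and here a tiny inline sample stands in for the fetched page so the grep step can be shown end to end.

```shell
# In practice you would fetch the real backlinks page first, e.g.:
#   curl -s 'http://en.wikipedia.org/wiki/Special:Whatlinkshere/Wikipedia:Neutral_point_of_view' > links.html
# (URL is illustrative; wget works the same way.)
# Inline sample standing in for the fetched HTML:
cat > links.html <<'EOF'
<li><a href="/wiki/User:Example" title="User:Example">User:Example</a></li>
<li><a href="/wiki/User_talk:Example" title="User talk:Example">User talk:Example</a></li>
<li><a href="/wiki/Wikipedia:Village_pump" title="Wikipedia:Village pump">Wikipedia:Village pump</a></li>
EOF

# grep -c counts matching lines, i.e. linking pages per namespace.
echo "User pages:    $(grep -c 'href="/wiki/User:' links.html)"
echo "Talk pages:    $(grep -c 'href="/wiki/User_talk:' links.html)"
echo "Project pages: $(grep -c 'href="/wiki/Wikipedia:' links.html)"
```

The same idea scales up by looping over a list of policy-page URLs and summing the counts with wc or awk.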
Looking back, it would have been useful to have a "research tools"
section at Wikimania, during the days before the formal conference.
Jakob V. and Erik Z. touched on some of this, by discussing some of
the visualization tools and software tools they used, but it might be
useful to have a step-by-step hands-on session for folks who don't
know much about UNIX command line and text processing tools. We should
definitely make it a part of WM2006.
Unfortunately, my suggestion of a roundtable for research didn't
materialize. But we should keep the conversation going. Perhaps we
could put together a tutorial on Meta on where to start when using
Wikipedia data for research.
-Andrew (User:Fuzheado)