Dear members,
I am new to wikis and I want to do research on a wiki page. I would like to know how to get the visit log of the page, i.e., hourly/daily hit counts and, if possible, visitor IP addresses, user agents, etc.
If you happen to know how, please point me to it.
Thanks in advance for your kind help!
With best regards,
Grain
On Wed, Jan 28, 2009 at 5:45 PM, Albert Grain grainbackup@gmail.com wrote:
Dear members,
I am new to wikis and I want to do research on a wiki page. I would like to know how to get the visit log of the page, i.e., hourly/daily hit counts and, if possible, visitor IP addresses, user agents, etc.
dammit.lt/wikistats has per-hour page request counts. It does not contain IP addresses, user agents or any other data apart from a) the number of requests in a given hour to b) the exact URL, with c) the number of bytes transferred.
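If you want to pull out a single page's figure without unpacking the whole file, a small Python sketch along these lines should work (assuming the four-field "project title count bytes" line layout described above; the file name is just whichever hourly dump you downloaded):

import gzip

def hourly_count(path, project, title):
    # Scan one hourly pagecounts dump for a single page and return
    # (requests, bytes); the field order is assumed from the description above.
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            fields = line.split()
            if len(fields) == 4 and fields[0] == project and fields[1] == title:
                return int(fields[2]), int(fields[3])
    return None

# e.g. hourly_count("pagecounts-20090102-100000.gz", "en", "Main_Page")

The title has to appear exactly as it does in the file, i.e. with underscores instead of spaces and non-Latin characters percent-encoded.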
Mathias
Thank you, Mathias. I have downloaded one file from the website you suggested, but I found the files are too large after decompression. Moreover, I could not locate the wiki page I need, which is in Chinese. Is it possible to download just some of the data with something like SQL queries? The log information I am looking for is only a few KB.
Anyway, thanks again!
On Thu, Jan 29, 2009 at 4:31 PM, Albert Grain grainbackup@gmail.com wrote:
Thank you, Mathias. I have downloaded one file from the website you suggested, but I found the files are too large after decompression. Moreover, I could not locate the wiki page I need, which is in Chinese.
You can use zgrep to save some disk space. Non-Latin characters are URL-encoded; the front page of the Chinese-language Wikipedia should be something like:
%E9%A6%96%E9%A1%B5
Hence, type
mathias@lenovo-r60:~/wikipedia/stats/200901$ zgrep "^zh %E9%A6%96%E9%A1%B5 " pagecounts-20090102-100000.gz
zh %E9%A6%96%E9%A1%B5 2661 56018771
to get the number of page requests for the front page of the Chinese-language edition on 2 January 2009 from 10 am to 11 am UTC. There were 2661 requests.
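If you need the encoded form of some other title, Python can produce it; a tiny sketch (assuming titles are UTF-8, percent-encoded, with spaces written as underscores, as in the example above):

import urllib.parse

title = "首页"  # the Chinese Wikipedia front page, used here only as an example
encoded = urllib.parse.quote(title.replace(" ", "_"))
print(encoded)  # prints %E9%A6%96%E9%A1%B5

The printed string can then be dropped straight into the zgrep pattern above.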
Mathias
I think I need to learn some Linux commands first... At least I know there is a method to get the wiki page hit counts :)
With best regards,
Grain