On Sat, Aug 19, 2006 at 12:08:33PM +0200, Jürgen Herz wrote:
Domas Mituzas wrote:
This hasn't been done for a while, so I'll try to sum up changes in our operations since November, 2005. [...] And of course, as always, the team has been marvelous ;-) Thanks!
Thanks to the team for all the work and for your summary.
Reading about so much new hardware, and presumably more spare server capacity, what about re-enabling access statistics? The old problem still stands: with plain access logs, requests answered directly from the squids are not counted at all.
But wouldn't an approach like the one at [[de:Benutzer:LeonWeber/WikiCharts]] be a solution? There, a short JS snippet issues a tiny request to the toolserver, which logs it. In principle every page view could be logged this way, but because of resource limits on the toolserver only every 600th request is actually recorded.
Is LeonWeber's talk page still the second most visited page? The problem with these JS tools is that they are easy to fake as long as they sample: when only 1 in 600 requests is logged, every counted hit is extrapolated to 600 views, so a handful of deliberate requests can push a page far up the list.
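For reference, the sampling mechanism behind such a tool is roughly the following. This is only a sketch of the idea (TypeScript-flavoured); the endpoint URL, names and rate handling are made up for illustration, it is not Leon's actual code:

  // Hypothetical sketch of a 1-in-600 sampled page view counter,
  // in the spirit of WikiCharts. URL and names are placeholders.
  const SAMPLE_RATE = 600;

  function logSampledView(pageTitle: string): void {
    // Only roughly one request in 600 ever reaches the logging host.
    if (Math.floor(Math.random() * SAMPLE_RATE) !== 0) {
      return;
    }
    // A tiny image request carries the page title to the counter;
    // the counter later multiplies each logged hit by 600 to estimate views.
    const beacon = new Image();
    beacon.src =
      'http://tools.example.org/count?page=' + encodeURIComponent(pageTitle) +
      '&r=' + Date.now(); // cache buster
  }

  logSampledView(document.title);

The multiplication by 600 in the last step is exactly what makes the numbers easy to inflate.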
The squid developers are working on code, which will very likely be in the next patch set for the stable squid release, that allows specifying a remote loghost, apparently using cheap UDP datagrams to ship the log entries.
With two or three dedicated servers serving a one-pixel image and logging its requests, wouldn't it be possible to re-enable reliable article view statistics?
Not with pixel sampling, but with remote logging squids, yes. This is being worked on. Give us some time, and enjoy Leon's statistics in the meantime.
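To give an idea of what the loghost end of that could look like, here is a rough sketch (TypeScript on Node, just to illustrate; the listening port, the assumed log line layout and the aggregation are placeholders, not squid's actual remote-logging format):

  // Rough sketch of a loghost: listen for UDP datagrams carrying squid
  // log lines and tally hits per requested URL.
  import * as dgram from 'node:dgram';

  const LISTEN_PORT = 5140; // arbitrary choice for this example
  const hits = new Map<string, number>();

  const socket = dgram.createSocket('udp4');

  socket.on('message', (msg) => {
    // Assume whitespace-separated fields with the request URL among them.
    const fields = msg.toString('utf8').trim().split(/\s+/);
    const url = fields.find((f) => f.startsWith('http://'));
    if (url) {
      hits.set(url, (hits.get(url) ?? 0) + 1);
    }
  });

  socket.bind(LISTEN_PORT, () => {
    console.log(`loghost listening on UDP ${LISTEN_PORT}`);
  });

  // Dump the counters once a minute; a real setup would aggregate per
  // article title and persist the counts somewhere.
  setInterval(() => {
    for (const [url, count] of hits) {
      console.log(`${count}\t${url}`);
    }
    hits.clear();
  }, 60_000);

Since every request passes through the squids, counting on that side avoids both the sampling problem and the undercounting of cache hits.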
Regards,
jens