On 9/8/06, Gregory Kohs thekohser@gmail.com wrote:
I am curious whether there is any factual data about how many clicks per day, on average, a run-of-the-mill outbound link receives in the "External links" section of a typical Wikipedia article. My guess is that it's somewhere around 3 or 4, but that's just me looking at it as a [[Fermi problem]].
On Fri, 8 Sep 2006 14:52:31 -0400, "Gregory Maxwell" gmaxwell@gmail.com wrote:
You should know, you spammed your blog on a number of pages.
If your customers are really interested in this data, perhaps you could fund wikimedia to perform a proper study.
Surprise, surprise, that past mistakes would come back to haunt me. Still, I felt that the content so linked on my blog would be of informational use to the community. Being that the 3 or 4 inbound hits per day that I received tended to spend an average of 1 to 5 minutes on the article, I guess it actually was of some value to most people, until the links were (appropriately) removed. (Remember, even Jimmy Wales edited his own article a number of times before "learning the rules".)
Anyway, you surely won't believe it, but my customers are not at all the reason I'm asking this question. Instead, I have a larger, more universally interesting reason for asking; but I'm not quite ready to disclose my agenda. I will assure you, though, that it is in the interest of underscoring a major "conflict of interest" problem within Wikipedia -- not for my personal financial gain.
As for funding Wikimedia to get a legitimate answer to the question... since I've already been a multi-time donor to Wikimedia fund drives, I would certainly entertain that. How much do you think it would cost to conduct such a study? Or, were you being facetious?
Greg
On 9/8/06, Gregory Kohs thekohser@gmail.com wrote:
Surprise, surprise, that past mistakes would come back to haunt me. Still, I felt that the content so linked on my blog would be of informational use to the community. Being that the 3 or 4 inbound hits per day that I received tended to spend an average of 1 to 5 minutes on the article, I guess it actually was of some value to most people, until the links were (appropriately) removed. (Remember, even Jimmy Wales edited his own article a number of times before "learning the rules".)
There is no strict prohibition against editing your own biography, but you are required to conform strictly to WP:V and WP:NPOV, and since *outsiders* often don't understand this, such edits are generally discouraged.
I'm curious, how are you measuring how long people spent on your site?
Anyway, you surely won't believe it, but my customers are not at all the reason I'm asking this question. Instead, I have a larger, more universally interesting reason for asking; but I'm not quite ready to disclose my agenda. I will assure you, though, that it is in the interest of underscoring a major "conflict of interest" problem within Wikipedia -- not for my personal financial gain.
Honestly, it is highly variable, although I don't have real data to support that, nor do I know what factors influence the numbers.
As for funding Wikimedia to get a legitimate answer to the question... since I've already been a multi-time donor to Wikimedia fund drives, I would certainly entertain that. How much do you think it would cost to conduct such a study? Or, were you being facetious?
I'm not sure, but I was indeed being serious. It would be useful data that we don't currently have.
I'm sure it wouldn't be very expensive. All that needs to be done is to have the external (non-interwiki) link code pass URLs through a script on a server that counts the number of clicks on that particular link and then redirects the user to the appropriate site, which would probably be specified through GET. I'll code it if I have time today; it's very simple in PHP, and using a database means that statistics can be generated VERY easily.
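To make the proposal concrete, here is a minimal sketch of the counting-redirect idea as described above. The thread suggests PHP; this sketch uses Python with SQLite purely for illustration, and every name in it (the `/out?to=...` URL shape, the table layout) is an assumption, not Wikipedia's actual code.

```python
# Sketch (not Wikipedia's actual code) of the click-counting redirect
# described in the thread: external links point at a counter script
# with the real target passed via GET; the script logs the click and
# issues an HTTP redirect to the destination.
import sqlite3
from urllib.parse import urlparse, parse_qs

def open_db(path=":memory:"):
    # One row per external URL with its running click count.
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS clicks (url TEXT PRIMARY KEY, n INTEGER NOT NULL)"
    )
    return conn

def record_click(conn, target):
    # Portable upsert: insert the row if it's new, then bump the counter.
    conn.execute("INSERT OR IGNORE INTO clicks (url, n) VALUES (?, 0)", (target,))
    conn.execute("UPDATE clicks SET n = n + 1 WHERE url = ?", (target,))
    conn.commit()

def redirect_response(conn, request_uri):
    # Parse a request like /out?to=http%3A%2F%2Fexample.org%2F, log the
    # click, and return the status line and Location header to emit.
    qs = parse_qs(urlparse(request_uri).query)
    target = qs.get("to", [None])[0]
    if target is None:
        return "400 Bad Request", {}
    record_click(conn, target)
    return "302 Found", {"Location": target}

def click_counts(conn):
    # As the message notes, statistics fall straight out of the database.
    return dict(conn.execute("SELECT url, n FROM clicks"))
```

The same few lines translate almost directly to the PHP-plus-MySQL setup proposed in the thread; the only design point that matters is that the counter writes one row per URL rather than one row per click, which keeps the table small.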
[snip]
As for funding Wikimedia to get a legitimate answer to the question... since I've already been a multi-time donor to Wikimedia fund drives, I would certainly entertain that. How much do you think it would cost to conduct such a study? Or, were you being facetious?
I'm not sure, but I was indeed being serious. It would be useful data that we don't currently have.
[snip]
On 9/8/06, Akash Mehta draicone@gmail.com wrote:
I'm sure it wouldn't be very expensive. All that needs to be done is to have the external (non-interwiki) link code pass URLs through a script on a server that counts the number of clicks on that particular link and then redirects the user to the appropriate site, which would probably be specified through GET. I'll code it if I have time today; it's very simple in PHP, and using a database means that statistics can be generated VERY easily.
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
On 9/8/06, Gregory Maxwell gmaxwell@gmail.com wrote:
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
Yeah, I can see that may be troublesome. It might be interesting to do it for a subset of pages... like the featured articles, for instance. I'm not sure how difficult it would be to implement, though...
-Rich [[W:en:User:Rholton]]
Just for, say, 50 links, it would be very easy to implement, if there is consensus and approval.
On 9/9/06, Richard Holton richholton@gmail.com wrote:
On 9/8/06, Gregory Maxwell gmaxwell@gmail.com wrote:
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
Yeah, I can see that may be troublesome. It might be interesting to do it for a subset of pages... like the featured articles, for instance. I'm not sure how difficult it would be to implement, though...
-Rich [[W:en:User:Rholton]]
_______________________________________________
WikiEN-l mailing list
WikiEN-l@Wikipedia.org
To unsubscribe from this mailing list, visit:
http://mail.wikipedia.org/mailman/listinfo/wikien-l
That's the only problem. We can't AWB every single link in the database and then AWB it back just for the purpose of a study, which is why we would have to do it somehow at the software parser level for a day or so. And we could easily host the file on the toolserver and handle redirects / logging from there. I have a spare low-traffic MySQL db server we could use. Then again, it would be simpler to build the functionality into the Wikipedia toolbar, but that's a violation of privacy, so it would have to be optional, and there'd be a clearly skewed demographic left of those who allowed the logging to take place...
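The parser-level approach floated here (rewrite external links at render time rather than editing articles) can be sketched in a few lines. This is a hypothetical illustration, not MediaWiki code: the redirector address is made up, and it only handles the bracketed `[http://... label]` external-link syntax.

```python
# Hypothetical sketch of rewriting external links at the parser level so
# they route through a counting redirector. The COUNTER address below is
# invented for illustration; real MediaWiki parsing is far more involved.
import re
from urllib.parse import quote

COUNTER = "http://tools.example.org/out.php?to="  # hypothetical redirector

def rewrite_external_links(wikitext):
    # Match bracketed external links like [http://example.org Label]
    # and point them at the redirector, percent-encoding the target.
    pattern = re.compile(r"\[(https?://[^\s\]]+)([^\]]*)\]")
    def repl(m):
        target, label = m.group(1), m.group(2)
        return "[" + COUNTER + quote(target, safe="") + label + "]"
    return pattern.sub(repl, wikitext)
```

Because the rewrite happens at render time, switching the study on and off is a configuration change rather than millions of bot edits, which is exactly the advantage over the AWB approach dismissed above.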
On 9/9/06, Gregory Maxwell gmaxwell@gmail.com wrote:
On 9/8/06, Akash Mehta draicone@gmail.com wrote:
I'm sure it wouldn't be very expensive. All that needs to be done is to have the external (non-interwiki) link code pass URLs through a script on a server that counts the number of clicks on that particular link and then redirects the user to the appropriate site, which would probably be specified through GET. I'll code it if I have time today; it's very simple in PHP, and using a database means that statistics can be generated VERY easily.
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
Akash Mehta wrote: [fixed top posting]
On 9/9/06, Gregory Maxwell gmaxwell@gmail.com wrote:
On 9/8/06, Akash Mehta draicone@gmail.com wrote:
I'm sure it wouldn't be very expensive. All that needs to be done is to have the external (non-interwiki) link code pass URLs through a script on a server that counts the number of clicks on that particular link and then redirects the user to the appropriate site, which would probably be specified through GET. I'll code it if I have time today; it's very simple in PHP, and using a database means that statistics can be generated VERY easily.
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
That's the only problem. We can't AWB every single link in the database and then AWB it back just for the purpose of a study, which is why we would have to do it somehow at the software parser level for a day or so. And we could easily host the file on the toolserver and handle redirects / logging from there.
The toolserver is overworked enough as it is, thank you very much.
I have a spare low-traffic MySQL db server we could use. Then again, it would be simpler to build the functionality into the Wikipedia toolbar, but that's a violation of privacy, so it would have to be optional, and there'd be a clearly skewed demographic left of those who allowed the logging to take place...
Logging of such things can /always/ be circumvented if you try hard enough.
Gregory Maxwell wrote:
On 9/8/06, Akash Mehta draicone@gmail.com wrote:
I'm sure it wouldn't be very expensive. All that needs to be done is to have the external (non-interwiki) link code pass URLs through a script on a server that counts the number of clicks on that particular link and then redirects the user to the appropriate site, which would probably be specified through GET. I'll code it if I have time today; it's very simple in PHP, and using a database means that statistics can be generated VERY easily.
Turning every external link follow into a database write on Wikipedia is simply not going to be acceptable.
We don't currently log HTTP GETs for a reason...
I was going to suggest log parsing when I first saw this, but then I realized from my own work how unrealistic it would be to do very much logging on Wikipedia.
SKL