http://www.labnol.org/internet/pictures/flickr-wikipedia-google-juice-pagera...
(Does anyone who isn't an SEO use the phrase "Google juice"?)
- d.
On Sun, Feb 24, 2008 at 1:25 PM, David Gerard dgerard@gmail.com wrote:
http://www.labnol.org/internet/pictures/flickr-wikipedia-google-juice-pagera...
(Does anyone who isn't an SEO use the phrase "Google juice"?)
I did, when I gave a talk at Google in Zurich and when we went to lunch afterwards.
Mathias
That they refuse to understand why websites have to do this is rather amusing.
-Matt
I still say that Wikipedia needs a mechanism to turn off nofollow on selected links; for example, links added by, or marked out by, high-edit-count users. There needs to be a bot or something that whitelists links.
On 25/02/2008, Ian Woollard wrote:
I still say that Wikipedia needs a mechanism to turn off nofollow on selected links; for example, links added by, or marked out by, high-edit-count users. There needs to be a bot or something that whitelists links.
How about basing it on link maturity? A fresh link (e.g. under 20 days old) has nofollow to prevent spam links from being added, while links which have survived many eyes for over 20 days have nofollow removed.
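The check itself would only be a few lines. A minimal sketch in Python (the 20-day threshold is from the proposal above; the "first added" timestamp is an assumed input, since a real implementation would have to derive it from page history):

from datetime import datetime, timedelta

MATURITY_THRESHOLD = timedelta(days=20)   # assumed threshold from the proposal

def rel_for_link(first_added, now=None):
    """Return the rel attribute an external link should carry, given the
    timestamp at which it first appeared in the article."""
    now = now or datetime.utcnow()
    # Fresh links keep nofollow so spammers gain nothing immediately;
    # links that have survived many eyes for 20+ days pass PageRank.
    return "" if (now - first_added) >= MATURITY_THRESHOLD else "nofollow"

# e.g. a link added 25 days ago no longer carries nofollow:
print(rel_for_link(datetime.utcnow() - timedelta(days=25)))  # -> ""
print(rel_for_link(datetime.utcnow() - timedelta(days=5)))   # -> "nofollow"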
-- Oldak Quill (oldakquill@gmail.com)
On Mon, Feb 25, 2008 at 2:05 PM, Oldak Quill oldakquill@gmail.com wrote:
How about basing it on link maturity? A fresh link (e.g. under 20 days old) has nofollow to prevent spam links from being added, while links which have survived many eyes for over 20 days have nofollow removed.
Last time I did a big scrub I found plenty of overt nastiness (direct links to trojans, browser-crashing popup spam sites, etc.) that had been in articles for long spans of time.
My working theory was that established heavy-duty users don't follow external links very often, since those links take them away from the site, and casual users don't know or care enough to remove bad ones. True or not, external links very clearly get little oversight on EnWP.
First there needs to be some level of recorded review/oversight. Simply trusting that the links will get seen, as is done today, is demonstrably failure-prone. Once that exists, teaching nofollow to follow it is 'just' a matter of technical details.
On 25/02/2008, Gregory Maxwell gmaxwell@gmail.com wrote:
On Mon, Feb 25, 2008 at 2:05 PM, Oldak Quill oldakquill@gmail.com wrote:
How about basing it on link maturity? A fresh link (e.g. under 20 days old) has nofollow to prevent spam links from being added, while links which have survived many eyes for over 20 days have nofollow removed.
Last time I did a big scrub I found plenty of overt nastiness (direct links to trojans, browser-crashing popup spam sites, etc.) that had been in articles for long spans of time.
My working theory was that established heavy-duty users don't follow external links very often, since those links take them away from the site, and casual users don't know or care enough to remove bad ones. True or not, external links very clearly get little oversight on EnWP.
First there needs to be some level of recorded review/oversight. Simply trusting that the links will get seen, as is done today, is demonstrably failure-prone. Once that exists, teaching nofollow to follow it is 'just' a matter of technical details.
How about new URLs being submitted to a Special:Externallinks page, where each item on the list has a "verify" or "reject" button which trusted users can use to work through them systematically? Once a URL is either verified or rejected, it is removed from the list.
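Roughly like the sketch below, perhaps (illustrative Python only, not MediaWiki's actual PHP; the queue and its "verify"/"reject" handling are invented names for the idea, not an existing feature):

from collections import OrderedDict

class ExternalLinkQueue:
    """Hypothetical backing store for a Special:Externallinks review page."""

    def __init__(self):
        self.pending = OrderedDict()   # URL -> user who added it, oldest first
        self.verified = set()          # reviewed and accepted (nofollow removed)
        self.rejected = set()          # reviewed and refused

    def submit(self, url, added_by):
        # New URLs land here until a trusted user reviews them.
        if url not in self.verified and url not in self.rejected:
            self.pending.setdefault(url, added_by)

    def verify(self, url):
        # The "verify" button: record the decision and drop it from the list.
        self.pending.pop(url, None)
        self.verified.add(url)

    def reject(self, url):
        # The "reject" button: record the decision and drop it from the list.
        self.pending.pop(url, None)
        self.rejected.add(url)

queue = ExternalLinkQueue()
queue.submit("http://example.org/source", added_by="NewUser123")
queue.verify("http://example.org/source")
print("http://example.org/source" in queue.verified)   # True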
On 25/02/2008, Oldak Quill oldakquill@gmail.com wrote:
How about new URLs being submitted to a Special:Externallinks page, where each item on the list has a "verify" or "reject" button which trusted users can use to work through them systematically? Once a URL is either verified or rejected, it is removed from the list.
Doesn't help with domain squatters, though, which is a pretty recurrent problem: a domain lapses, gets snapped up by a squatter, gets filled with porn ads, and we continue linking to it because no one notices for six months.
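Catching that probably means a periodic re-check of links that already passed review, not just a one-off look. A rough sketch of the kind of check such a bot might run (the parking-page heuristics here are invented for illustration and would need real tuning):

import urllib.request
import urllib.error

# Crude signs that a domain has lapsed and been parked.
PARKING_HINTS = ("this domain is for sale", "buy this domain", "domain parking")

def looks_squatted(url, timeout=10):
    """Return True if the URL is unreachable or now looks like a parked page."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(20000).decode("utf-8", errors="ignore").lower()
    except (urllib.error.URLError, ValueError, OSError):
        return True   # unreachable: flag it for human review
    return any(hint in body for hint in PARKING_HINTS)

# A bot could run this weekly over every accepted link and re-queue anything
# that now looks squatted, instead of waiting six months for someone to notice.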
On 25/02/2008, Gregory Maxwell gmaxwell@gmail.com wrote:
Last time I did a big scrub I found plenty of overt nastiness (direct links to trojans, browser-crashing popup spam sites, etc.) that had been in articles for long spans of time.
First there needs to be some level of recorded review/oversight. Simply trusting that the links will get seen, as is done today, is demonstrably failure-prone. Once that exists, teaching nofollow to follow it is 'just' a matter of technical details.
I think that first we need a working, protected whitelist.
Once there's a whitelist, it shouldn't be too hard to come up with a policy for adding links to it, and if necessary somebody could write a bot to help do that.
It seems reasonable that well-established users could put a comment mark next to a link that is good; a bot could check that these marks were placed by a trusted user and add those links to the whitelist, and if all the marks were removed, the link would be delisted. A rough sketch of the bot side is below.
But the policy is less relevant than the whitelist; once there's a functioning whitelist, I'm sure we could rustle up a policy.
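(A minimal sketch in Python; the <!--goodlink--> marker, the trusted-user list, and the way the mark's author is looked up are all made up for illustration, not existing MediaWiki or bot features.)

import re

MARK = "<!--goodlink-->"                            # hypothetical wikitext marker
TRUSTED_USERS = {"ExampleAdmin", "ExampleVeteran"}  # e.g. high-edit-count users

LINK_RE = re.compile(r"(https?://\S+)\s*" + re.escape(MARK))

def whitelist_from_page(wikitext, mark_placed_by):
    """Collect marked URLs whose mark was added by a trusted user.

    `mark_placed_by` maps each URL to the user who added its mark; a real bot
    would work this out from the page's revision history.
    """
    whitelist = set()
    for match in LINK_RE.finditer(wikitext):
        url = match.group(1)
        if mark_placed_by.get(url) in TRUSTED_USERS:
            whitelist.add(url)
    return whitelist

text = "See http://example.org/paper <!--goodlink--> for details."
print(whitelist_from_page(text, {"http://example.org/paper": "ExampleAdmin"}))
# -> {'http://example.org/paper'}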
On 2/25/08, Ian Woollard ian.woollard@gmail.com wrote:
I still say that Wikipedia needs a mechanism to turn off nofollow on selected links; for example, links added by, or marked out by, high-edit-count users. There needs to be a bot or something that whitelists links.
Why even bother? I mean, yes, there are a handful of sites which we consider useful and reliable enough to link to in very high volume, so high that removing "nofollow" would create a conspicuous spike in the job queue, and so high that we've created templates to make them easier to link to. IMDB comes to mind, but somehow I doubt they would be bothered enough to care whether links from Wikipedia affect IMDB's PageRank, or whether this would help IMDB pass "Photobucket" in the top 20. But whether they care or not, why should we?
—C.W.
Why even bother? I mean, yes, there are a handful of sites which we consider useful and reliable enough to link to in very high volume, so high that removing "nofollow" would create a conspicuous spike in the job queue, and so high that we've created templates to make them easier to link to. IMDB comes to mind, but somehow I doubt they would be bothered enough to care whether links from Wikipedia affect IMDB's PageRank, or whether this would help IMDB pass "Photobucket" in the top 20. But whether they care or not, why should we?
Wikipedia is a very major site; I imagine they would care. I'm not sure why we should, though... do we owe them anything for allowing us to link to them? I struggle to see how we would, since they're just putting the information out there; they aren't actively doing anything for us specifically. It would be nice of us to give them a helping hand in their search rankings, but we don't really exist to be nice.
On 27/02/2008, Charlotte Webb charlottethewebb@gmail.com wrote:
Why even bother? I mean, yes, there are a handful of sites which we consider useful and reliable enough to link to in very high volume, so high that removing "nofollow" would create a conspicuous spike in the job queue, and so high that we've created templates to make them easier to link to. IMDB comes to mind, but somehow I doubt they would be bothered enough to care whether links from Wikipedia affect IMDB's PageRank, or whether this would help IMDB pass "Photobucket" in the top 20. But whether they care or not, why should we?
Because it may make a huge difference to those sites, and we want them to hang around, or we wouldn't link to them. If we link to a site, it can end up above or close to Wikipedia in the Google ranking, and so it would get direct traffic from Google searches. So we'd be gardening good sites that contain things we can't include for copyright or other reasons. They grow and we water them; they can advertise, recoup their bandwidth costs and so forth, and then they tend to stick around longer. Right now we're leeches: we link to them and they may get traffic from us, but none of that shows up in the Google ranking, so they've got less incentive to hang around.
—C.W.