From the WMF office in San Francisco, webcache.googleusercontent.com resolves to something geographically close, and the network round-trip time is around 25ms versus 86ms for en.wikipedia.org, 61ms in Google's favor.
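If anyone wants to reproduce that from their own vantage point, a quick sketch like the following should land in the same ballpark (this is only an illustration, not how I measured it; it times a bare TCP connect as a rough stand-in for one round trip):

    import socket, time

    def connect_ms(host, port=80, samples=5):
        # Best of a few TCP connects ~= one network round trip, ignoring DNS.
        best = None
        for _ in range(samples):
            start = time.time()
            socket.create_connection((host, port), timeout=5).close()
            elapsed = (time.time() - start) * 1000
            best = elapsed if best is None else min(best, elapsed)
        return best

    for host in ("webcache.googleusercontent.com", "en.wikipedia.org"):
        print("%s: %.0f ms" % (host, connect_ms(host)))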
From Chrome in incognito mode, to avoid sending Wikipedia cookies, it takes me 391ms to fetch just the HTML for http://webcache.googleusercontent.com/search?q=cache:FXQPcAQ_2WIJ:en.wikiped... vs 503ms for http://en.wikipedia.org/wiki/Devo.
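Outside the browser you can get a comparable number by timing a bare GET for the HTML. A rough sketch (urllib standing in for Chrome; the Google cache URL would be the long one quoted above):

    import time, urllib.request

    def fetch_ms(url):
        # Time one GET for the page HTML only (no images, CSS, or JS).
        start = time.time()
        urllib.request.urlopen(url).read()
        return (time.time() - start) * 1000

    print("%.0f ms" % fetch_ms("http://en.wikipedia.org/wiki/Devo"))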
That difference of 112ms is less than the 122ms you'd expect from the latency difference alone over two round trips, and a full fetch involves at least that many round trips, which means our Squids are actually serving the content faster than Google is. Pulling http://en.wikipedia.org/wiki/Devo from a host in our Tampa datacenter takes an average of 3ms. If we had a west coast caching presence, I think we'd beat Google's cache from our office, but I doubt we'll ever be able to compete with Google on global points of caching presence or network connectivity.
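The back-of-envelope version of that argument, assuming a fetch costs roughly two round trips (the TCP handshake plus the HTTP request/response):

    rtt_google, rtt_wp = 25, 86        # ms, measured from the SF office
    fetch_google, fetch_wp = 391, 503  # ms, single HTML fetch from Chrome

    latency_gap = 2 * (rtt_wp - rtt_google)  # 122 ms attributable to the network
    observed_gap = fetch_wp - fetch_google   # 112 ms actually observed
    print(latency_gap - observed_gap)        # ~10 ms in the Squids' favor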
Note that if you're using Wikipedia from a browser that has been logged in within the last month, it is likely still sending cookies that bypass our Squid caches even when logged out.
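One way to check whether a given response actually came out of a cache is to look at the cache-status headers on it; a rough sketch (I believe the Squids add an X-Cache header, though the exact header depends on the cache configuration):

    import urllib.request

    resp = urllib.request.urlopen("http://en.wikipedia.org/wiki/Devo")
    # Expect something like "HIT from ..." when a Squid answered from cache,
    # or "MISS ..." when the request went through to the backend.
    print(resp.headers.get("X-Cache"))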
On Fri, Sep 30, 2011 at 3:48 PM, jidanni@jidanni.org wrote:
Fellows,
This is Google's cache of http://en.wikipedia.org/wiki/Devo. It is a snapshot of the page as it appeared on 28 Sep 2011 09:22:50 GMT. The current page could have changed in the meantime. Learn more ...
Like why is it so much faster than the real thing? Even when not logged in.
Nope, you may be one of the top-ranked websites, but not by speed.
So if you can't beat 'em join 'em. Somehow use Google's caches instead of your own. Something, anything, for a little more speed.