Instead of writing it as an extra header to the HTTP protocol ... why don't they write it as a proxy to Wikimedia (or any other site they want to temporally proxy)? Getting a new HTTP header out there is not an easy task: at best a small percentage of sites will support it, and then you need to deploy clients and write user interfaces that support it as well.
If viewing old versions of sites is what interests them, it would probably be best to write a Firefox extension or Greasemonkey script that builds a "temporal" interface of their liking on top of the MediaWiki API (presumably the "history" button fails to represent their vision?)... for non-MediaWiki sites it could fall back to the Wayback Machine.
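For what it's worth, the API call such an interface would sit on already exists. Roughly like this (sketched in Python for brevity; the actual userscript would of course be JavaScript, and the en.wikipedia.org endpoint is just an example):

  import json
  import urllib.parse
  import urllib.request

  API = "https://en.wikipedia.org/w/api.php"   # any MediaWiki install works

  def revision_as_of(title, iso_timestamp):
      """Return the newest revision of `title` at or before `iso_timestamp`."""
      params = urllib.parse.urlencode({
          "action": "query",
          "prop": "revisions",
          "titles": title,
          "rvlimit": 1,
          "rvdir": "older",          # enumerate backwards in time...
          "rvstart": iso_timestamp,  # ...starting here, e.g. "2008-11-04T00:00:00Z"
          "rvprop": "ids|timestamp",
          "format": "json",
      })
      req = urllib.request.Request(API + "?" + params,
                                   headers={"User-Agent": "temporal-ui-sketch/0.1"})
      data = json.load(urllib.request.urlopen(req))
      page = next(iter(data["query"]["pages"].values()))
      return page["revisions"][0]    # dict with "revid" and "timestamp"

  # The matching page view is then just /w/index.php?oldid=<revid>.
  print(revision_as_of("Main Page", "2008-11-04T00:00:00Z"))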
If the purpose is to support searching or archival, then it's probably best to route the MediaWiki API through a proxy that they set up which supports those temporal requests across all sites (i.e. an enhanced interface to the Wayback Machine?).
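And a really minimal version of that proxy isn't much code either. Something along these lines (an untested sketch under lots of assumptions: it only knows about one wiki, it redirects instead of rewriting the response, and it assumes the client puts an ISO 8601 value in the header):

  # temporal_proxy.py -- hypothetical sketch, not a real implementation
  from http.server import BaseHTTPRequestHandler, HTTPServer
  import json, urllib.parse, urllib.request

  SITE = "https://en.wikipedia.org"          # site being "temporally proxied"

  def oldid_for(title, when):
      """Resolve (title, ISO timestamp) to a rev id via the standard query API."""
      q = urllib.parse.urlencode({
          "action": "query", "prop": "revisions", "titles": title,
          "rvlimit": 1, "rvdir": "older", "rvstart": when,
          "rvprop": "ids", "format": "json"})
      req = urllib.request.Request(SITE + "/w/api.php?" + q,
                                   headers={"User-Agent": "temporal-proxy-sketch/0.1"})
      page = next(iter(json.load(urllib.request.urlopen(req))["query"]["pages"].values()))
      return page["revisions"][0]["revid"]

  class TemporalProxy(BaseHTTPRequestHandler):
      def do_GET(self):
          title = urllib.parse.unquote(self.path.lstrip("/"))
          # NOTE: assuming an ISO 8601 value here; a real proxy would have to
          # parse whatever date format the protocol ends up specifying.
          when = self.headers.get("X-Accept-Datetime")
          if when:
              target = SITE + "/w/index.php?oldid=%d" % oldid_for(title, when)
          else:
              target = SITE + "/wiki/" + urllib.parse.quote(title)
          # Redirect rather than fetch-and-rewrite, to keep the sketch short.
          self.send_response(302)
          self.send_header("Location", target)
          self.end_headers()

  if __name__ == "__main__":
      HTTPServer(("localhost", 8080), TemporalProxy).serve_forever()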
--michael
Daniel Kinzler wrote:
Hi all
The Memento Project http://www.mementoweb.org/ (including the Los Alamos National Laboratory (!) featuring Herbert Van de Sompel of OpenURL fame) is proposing a new HTTP header, X-Accept-Datetime, to fetch old versions of a web resource. They already wrote a MediaWiki extension for this http://www.mediawiki.org/wiki/Extension:Memento - which would of course be particularly interesting for use on Wikipedia.
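For illustration, the client side presumably amounts to no more than adding the header to an ordinary request, something like this (just a sketch; I'm guessing at the date format, and whether the server answers with the old content directly or with a redirect to an oldid URL is up to the extension; a server without the extension simply ignores the header):

  import urllib.request

  req = urllib.request.Request("https://en.wikipedia.org/wiki/Main_Page")
  req.add_header("User-Agent", "memento-client-sketch/0.1")
  # The proposed header: "give me this resource as it was at this moment".
  req.add_header("X-Accept-Datetime", "Tue, 04 Nov 2008 00:00:00 GMT")
  with urllib.request.urlopen(req) as resp:
      print(resp.geturl(), resp.status)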
Do you think we could have this for Wikimedia projects? I think that would be very nice indeed. I recall that ways to look at last week's main page have been discussed before, and I see several issues:
- the timestamp isn't a unique identifier; multiple revisions *might* have the same timestamp. We need a tiebreak (rev_id would be the obvious choice; a small sketch follows below this list).
- templates and images also need to be "time warped". It seems like the extension does not address this at the moment. For flagged revisions we do have such a mechanism, right? Could that be used here?
- Squids would need to know about the new header, and bypass the cache when it's used.
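To make the tiebreak concrete: in SQL terms it amounts to ORDER BY rev_timestamp DESC, rev_id DESC LIMIT 1 against the revision table, or in plain code (a sketch, MediaWiki-style YYYYMMDDHHMMSS timestamps assumed):

  def pick_revision(revisions, target_ts):
      """revisions: (rev_timestamp, rev_id) pairs for one page.  Serve the
      newest revision at or before target_ts, using rev_id as the tiebreak
      when timestamps are equal."""
      eligible = [r for r in revisions if r[0] <= target_ts]
      if not eligible:
          return None                  # the page did not exist yet
      return max(eligible)[1]          # tuple compare: timestamp, then rev_id

  revs = [("20081104000000", 101), ("20081104000000", 102), ("20081105120000", 103)]
  assert pick_revision(revs, "20081104235959") == 102   # tie broken by rev_id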
So, what do you think? What does it take? Can we point them to the missing bits?
-- daniel