As a bot programmer, I would very much regret this possibility. Not because of the waiting time itself (the bots already slow down their edits so as not to clutter recentchanges, and throttle their other calls to reduce their use of server time), but because bots often do not have pages 'served' to them the way normal users do: in many cases a bot gets its pages through Special:Export, which allows fetching several pages with a single HTTP request.
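For example, a batch fetch can look something like this rough Python sketch (the page titles are made up, and the 'pages' and 'curonly' parameters reflect Special:Export's interface as I understand it):

from urllib.parse import urlencode
from urllib.request import urlopen

EXPORT_URL = "https://en.wikipedia.org/wiki/Special:Export"

def fetch_pages(titles):
    """Fetch the current wikitext of several pages as one XML dump."""
    # 'pages' takes newline-separated titles; 'curonly' asks for only
    # the latest revision of each page.
    query = urlencode({"pages": "\n".join(titles), "curonly": "1"})
    with urlopen(f"{EXPORT_URL}?{query}") as response:
        return response.read().decode("utf-8")

xml_dump = fetch_pages(["Amsterdam", "Rotterdam", "Utrecht"])

A per-page 'serve time' has no obvious meaning for a batch fetch like this, since no individual page was ever served in the normal sense.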
Another issue is that it might seriously impact response time. The date of viewing must either be supplied by the submitter or stored on the Wikimedia servers. In the first case it offers very little security, because a bot programmer can easily adapt their bot to report whatever viewing date suits it best. In the second case, some kind of list or database field has to be updated on every page view as well as checked on every edit.
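To make the second case concrete, here is a rough sketch of what the server would have to maintain (the names, like serve_times and MIN_EDIT_DELAY, are illustrative, not anything MediaWiki actually has):

import time

MIN_EDIT_DELAY = 5.0  # seconds a page must be 'out' before an edit

# (session_id, page_title) -> time the page was served; in a real wiki
# this would be a database table or cache, not an in-memory dict.
serve_times = {}

def record_page_served(session_id, page_title):
    # This write happens on *every* page view, which is the extra
    # per-view cost described above.
    serve_times[(session_id, page_title)] = time.time()

def may_edit(session_id, page_title):
    served = serve_times.get((session_id, page_title))
    if served is None:
        return False  # the page was never served to this session
    return time.time() - served >= MIN_EDIT_DELAY

Note that record_page_served runs on every page view, so the cost scales with views rather than edits.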
Andre Engels
On Wed, 9 Feb 2005 00:11:12 +0100, Paul Youlten paul.youlten@gmail.com wrote:
I don't know much about bots - but they must be much faster than humans at making changes? Would it be possible to reject edits that happen less than 5 seconds after the page is served? And/or ask users to take a CAPTCHA test if they were making an edit to a page less than 15 seconds after it was served?
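In rough Python terms, the proposed tiered rule might look like this sketch (classify_edit is an illustrative name, and elapsed_seconds is assumed to be measured server-side, per the discussion above):

def classify_edit(elapsed_seconds):
    """Apply the tiered rule: reject very fast edits, challenge fast ones."""
    if elapsed_seconds < 5:
        return "reject"   # faster than any plausible human edit
    if elapsed_seconds < 15:
        return "captcha"  # suspiciously fast: ask for a CAPTCHA first
    return "accept"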