At 3/1/2005 06:09 PM, Rowan Collins wrote:
> process was [detect tag] -> [recreate content], whereas it would actually be more like [detect tag] -> [flag article as dynamic] and then later [detect dynamic flag] -> [ignore cached copy]
You got it, that's what I meant.
> After thinking about it for a while, I went further and, rather than having an is_dynamic flag (which would be just one field in the cur / revision / whatever table, you don't need a whole new table for it)
Yes, definitely, but I did not want to be flamed too early on by suggesting a new column in 'cur' :)
Now, more seriously, out of curiosity: we would have been talking about an extra bit or byte per page record, wouldn't we? Does that kind of overhead worry you for sites the size of Wikipedia? My gut feeling is that the list of pages using extensions or dynamic content would probably be *way* smaller than the total number of pages. So maybe a whole new table would not have been a totally stupid idea: it would have saved space and computational resources, and made MediaWiki easier to maintain or upgrade once we figured out that there was a better or different way to solve this caching issue.
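A back-of-envelope sketch of that storage argument (the page counts and byte sizes are made-up assumptions, purely illustrative):

```python
# Rough comparison of the two storage options discussed above.
# All numbers are hypothetical, not measured from Wikipedia.

total_pages = 1_000_000      # assumed total rows in 'cur'
num_dynamic = 2_000          # assumed pages using dynamic content (a small minority)

# Option 1: an is_dynamic column -- at least one extra byte on *every* row
flag_column_bytes = total_pages * 1

# Option 2: a separate table holding only the ids of dynamic pages
# (say, a 4-byte integer page id per row, ignoring index overhead)
separate_table_bytes = num_dynamic * 4

# If dynamic pages are rare, the separate table is far smaller.
assert separate_table_bytes < flag_column_bytes
```

Of course, the per-row flag buys a simpler query (no join), which is likely why it was the less flame-prone suggestion.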
> In other words:
> Current / normal: [parse] -> [store in cache], and on request [check cache] -> [load from cache]
> Proposal for "dynamic pages": [parse] {[detect tag] -> [disable cache]}, and on request [check cache] -> [can't load from cache, so parse again]
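The two flows above could be sketched roughly like this (function names, the `<dynamic>` tag, and the toy "parser" are all hypothetical stand-ins, not MediaWiki's actual API):

```python
cache = {}             # page title -> rendered output
dynamic_pages = set()  # titles flagged as containing dynamic content

def parse(title, wikitext):
    """Parse a page; if a dynamic tag is detected, disable caching for it."""
    html = wikitext.upper()            # stand-in for the real parser
    if "<dynamic>" in wikitext:
        dynamic_pages.add(title)       # [detect tag] -> [disable cache]
        cache.pop(title, None)
    else:
        cache[title] = html            # [parse] -> [store in cache]
    return html

def render(title, wikitext):
    """On request: serve from cache unless the page is flagged dynamic."""
    if title not in dynamic_pages and title in cache:
        return cache[title]            # [check cache] -> [load from cache]
    return parse(title, wikitext)      # dynamic: parse again on every request
```

A normal page is parsed once and then served from the cache; a page containing the dynamic tag skips the cache entirely and is re-parsed on each request.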
Sounds good.
> One gotcha for anyone wanting to implement this is that if you have a template with dynamic content in it, you've got to make sure that neither the template itself *nor* pages containing that template get cached.
I don't follow you. Aren't templates just like regular pages? I mean, if a template has dynamic content, it won't get cached. Therefore, when it's time to render a page that uses that template, the template will have to be re-parsed (and not cached again), so the page will be up to date, as will any other page using that template.
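That reasoning can be sketched as follows: if dynamic-ness propagates from a transcluded template up to the page that uses it, neither ends up cached. Everything here (the page store, the `<dynamic>` tag, the whitespace-split "parser") is a hypothetical toy, not how MediaWiki actually transcludes templates:

```python
pages = {
    "Template:Clock": "time is <dynamic>",
    "Main": "intro {{Template:Clock}} outro",
}
dynamic = set()  # titles known to contain dynamic content
cache = {}       # title -> rendered output

def render(title):
    """Render a page, recursively rendering {{...}} transclusions."""
    if title in cache and title not in dynamic:
        return cache[title]
    text = pages[title]
    if "<dynamic>" in text:
        dynamic.add(title)
    out = []
    for part in text.split():
        if part.startswith("{{") and part.endswith("}}"):
            inner = part[2:-2]
            out.append(render(inner))   # re-render the transcluded template
            if inner in dynamic:
                dynamic.add(title)      # dynamic-ness propagates upward
        else:
            out.append(part)
    html = " ".join(out)
    if title not in dynamic:
        cache[title] = html
    return html
```

Rendering "Main" marks both the template and the page as dynamic, so neither is cached and both are re-parsed on the next request. (Rowan's gotcha presumably concerns pages that were already cached *before* the template turned dynamic; those stale copies would need invalidating.)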
-- Sebastien Barre