I have put together a MySQL query that grabs the latest content of all of the subpages of an article. It seems to be working fine, but I am wondering whether I'm missing something critical that may come back to haunt me down the road.
* Does this seem like a safe way of grabbing the latest content of multiple articles?
SELECT page_id, page_title, rev_text_id, text.old_text
FROM wiki_page
INNER JOIN (
    SELECT * FROM wiki_revision
    GROUP BY rev_page
    ORDER BY rev_timestamp DESC
) AS rev ON page_id = rev_page
INNER JOIN wiki_text AS text ON rev_text_id = text.old_id
WHERE page_namespace = 0
  AND page_title LIKE 'Article/%'
ORDER BY page_latest
LIMIT 0, 30
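The main thing I'm unsure about is whether that GROUP BY subquery reliably picks the newest revision for each page. An alternative I considered is joining directly on page_latest, which as far as I understand holds the rev_id of the page's current revision (this is just a sketch, assuming the same wiki_ table prefix and the schema where revision still has rev_text_id):

-- Sketch: use page_latest instead of the GROUP BY subquery to
-- find each page's current revision, then fetch its text blob.
SELECT page_id, page_title, rev_text_id, text.old_text
FROM wiki_page
INNER JOIN wiki_revision AS rev ON rev_id = page_latest
INNER JOIN wiki_text AS text ON rev_text_id = text.old_id
WHERE page_namespace = 0
  AND page_title LIKE 'Article/%'
LIMIT 0, 30;

Is one of these clearly preferable, or am I overthinking it?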
Also ... * tables.sql in the maintenance folder mentions that the data in the text table may be compressed or encoded in a funky way. When is that the case? Would it stop me from being able to parse the content with $wgOut->parse($content)?
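If it helps, my understanding from reading tables.sql is that the old_flags column records how old_text is stored (values like 'utf-8', 'gzip', 'object', 'external'), so something along these lines should show whether any of my rows are stored in a non-plain form (flag values are my reading of the schema comments, not tested):

-- Sketch: list text rows whose flags indicate compressed, serialized,
-- or externally stored content that would need decoding first.
SELECT old_id, old_flags
FROM wiki_text
WHERE old_flags LIKE '%gzip%'
   OR old_flags LIKE '%object%'
   OR old_flags LIKE '%external%'
LIMIT 0, 30;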
Any insights greatly appreciated.