I've been working on a little project that intercepts ordinary wiki links so they don't reload the entire page; instead it requests the target page with AJAX, separates out the content, and replaces it on the current page. So far I've gotten this working with only the main content, but I also need to be able to safely grab 'mw-navigation' and 'footer', which seem to be standardized in most skins.
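For context, here is roughly the client-side interception I have working so far. This is a minimal sketch rather than my exact plugin code, and it assumes the main content lives in #mw-content-text (true for Vector, but not guaranteed in every skin):

```javascript
// Hijack clicks on ordinary same-origin links and swap in the
// fetched page's main content instead of doing a full reload.
document.addEventListener('click', function (e) {
    var link = e.target.closest('a');
    if (!link || link.origin !== location.origin) {
        return; // leave external links and non-links alone
    }
    e.preventDefault();
    fetch(link.href)
        .then(function (res) { return res.text(); })
        .then(function (text) {
            var doc = new DOMParser().parseFromString(text, 'text/html');
            // getElementById can be fooled by a replica ID placed in
            // the page content, which is exactly the problem below.
            var fresh = doc.getElementById('mw-content-text');
            if (fresh) {
                document.getElementById('mw-content-text').replaceWith(fresh);
                history.pushState(null, '', link.href);
            }
        });
});
```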
The problem is that client-side JS cannot safely parse the returned page for all three elements: users can place replica elements with the same IDs in the page content itself (e.g. a <div id="footer"> written straight into the wikitext), and client-side DOM parsing is neither optimized nor well supported.
What I'd like instead is to grab the full page output server-side before it's echoed, run it through DOMDocument in PHP, extract my elements, and echo them out as JSON. The client-side JS plugin would then just refresh those elements.
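Here is a minimal sketch of the extraction step I have in mind, assuming the rendered markup is somehow available as a string (getting hold of it before it's flushed is exactly the open question), and again assuming 'mw-content-text' as the main content ID:

```php
<?php
// Sketch only: pull the three skin elements out of a rendered page
// and emit them as JSON. $html stands in for the buffered page
// output; obtaining that inside MediaWiki is the unsolved part.
$html = file_get_contents('php://stdin');

$doc = new DOMDocument();
libxml_use_internal_errors(true); // silence warnings on HTML5 tags
$doc->loadHTML($html);
libxml_clear_errors();

$xpath = new DOMXPath($doc);
$parts = [];
foreach (['mw-content-text', 'mw-navigation', 'footer'] as $id) {
    // XPath lookup avoids getElementById's DTD requirements.
    $node = $xpath->query("//*[@id='$id']")->item(0);
    // saveHTML($node) serializes just that subtree, wrapper included.
    $parts[$id] = $node ? $doc->saveHTML($node) : null;
}

header('Content-Type: application/json');
echo json_encode($parts);
```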
I've done a good bit of trial and error, plus research in the MediaWiki docs, and it seems this isn't currently supported, because the active skin is what echoes out the entire page, not MediaWiki itself.
Am I wrong in my findings, and are there any plans to have MediaWiki itself assemble and echo out pages? It would break the current standard, but I feel that having skins build up an output string and pass it to MediaWiki, rather than delivering the output themselves, would be a better approach.
Thank you for your time.