Hans Voss <hans.voss@...> writes:
While this may seem a good idea, the first thought that sprang to mind was that this makes for a very exploitable structure (for wikispammers). With a "normal" page the spammers are annoying, but you simply revert the edit and the information is gone from the wiki (from the search, anyway). But how does this work with indices:
Hans understood what I meant.
Some more things: these tables are filled (records added and deleted) just after a page is updated, when it is parsed. If you add a "record" to a page using the inline syntax, it is added to the appropriate database table at that time too. If you revert a page to a previous version, or remove the declaration of an "inline" record, then the corresponding database record is deleted as well - for example if it was spam. Maybe database records should only be kept for the current versions of pages, since they can be recreated at any time from the page sources. So database records refer to pages by their name.
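A minimal sketch of that scheme, just to make the revert-handling concrete. All the names here (`sync_page`, the `records` table, the `record:` line syntax) are made up for illustration; the point is only that records live in a table keyed by page name and are rebuilt from the current source on every parse, so reverting a spammed page automatically drops the spam records:

```python
import sqlite3

def make_db():
    # One table of extracted records, keyed by page name.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE records (page TEXT, key TEXT, value TEXT)")
    return db

def parse_inline_records(source):
    # Toy stand-in for the real inline syntax: a line like
    # "record: author = hans" declares one record.
    out = []
    for line in source.splitlines():
        if line.startswith("record:"):
            key, _, value = line[len("record:"):].partition("=")
            out.append((key.strip(), value.strip()))
    return out

def sync_page(db, page_name, source):
    # Records exist only for the current version: drop the old ones
    # and re-create them from the page source just parsed. Saving an
    # older (pre-spam) version therefore removes the spam records too.
    db.execute("DELETE FROM records WHERE page = ?", (page_name,))
    db.executemany(
        "INSERT INTO records (page, key, value) VALUES (?, ?, ?)",
        [(page_name, k, v) for k, v in parse_inline_records(source)],
    )
    db.commit()
```

Calling `sync_page` after every save keeps the table consistent with the page sources, and since the table is derived data it can be rebuilt from scratch at any time.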
And I thought this indexing would be done by hand. BTW, I'm studying to be a "real" indexer (a librarian) at the moment...