I'm writing a parser function extension that outputs about 5000 lines of text (an organizational chart of a company) as a nested, bulleted list.
* Bob the CEO
** Jane Jones
** Mike Smith
*** etc.
It takes about 3 seconds (real time) for MediaWiki to render this list, which is acceptable. However, if I make it a list of links, which is more useful:
* [[User:Bob | Bob the CEO]]
** [[User:Jane | Jane Jones]]
** [[User:Mike | Mike Smith]]
the rendering time more than doubles to 6-8 seconds, which users perceive as too slow.
Is there a faster implementation for rendering a large number of links, rather than returning the wikitext list and having MediaWiki render it?
Thanks, DanB
My email address has changed to danb@cimpress.com. Please update your address book.
Cimpress is the new name for Vistaprint NV, the world's leader in mass customization. Read more about Cimpress at www.cimpress.com.
Probably the fastest thing would be to manually create the <ul><li> etc. and build them in a loop calling the linker functions (Linker::link).
https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a5...
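[Editor's sketch of the approach Brion describes, under assumptions not in the thread: the function name `renderOrgChartHtml` and the `$rows` data shape (depth, user name, display text) are hypothetical; `Linker::link` and `Title::makeTitleSafe` are real MediaWiki core methods.]

```php
<?php
// Sketch: build the nested <ul>/<li> HTML directly, calling Linker::link
// for each entry, instead of emitting wikitext and having the parser do it.
// $rows is a hypothetical array of [ depth, userName, displayText ] rows,
// with depth starting at 1 for the top level.
function renderOrgChartHtml( array $rows ) {
	$html = '';
	$depth = 0;
	foreach ( $rows as $row ) {
		$rowDepth = $row[0];
		$userName = $row[1];
		$displayText = $row[2];
		// Open or close <ul> levels until we match this row's nesting depth.
		while ( $depth < $rowDepth ) {
			$html .= '<ul>';
			$depth++;
		}
		while ( $depth > $rowDepth ) {
			$html .= '</ul>';
			$depth--;
		}
		$title = Title::makeTitleSafe( NS_USER, $userName );
		$html .= '<li>' . Linker::link( $title, htmlspecialchars( $displayText ) ) . '</li>';
	}
	// Close any list levels still open.
	$html .= str_repeat( '</ul>', $depth );
	return $html;
}
```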
-- brion
On Tue, Jan 27, 2015 at 1:33 PM, Daniel Barrett danb@cimpress.com wrote:
On Tue Jan 27 2015 at 1:37:36 PM Brion Vibber bvibber@wikimedia.org wrote:
Probably the fastest thing would be to manually create the <ul><li> etc. and build them in a loop calling the linker functions (Linker::link).
https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045
Another option could be using LinkBatch.
-Chad
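[Editor's sketch of Chad's LinkBatch suggestion, with an assumed `$userNames` array; `LinkBatch::add` and `LinkBatch::execute` are real MediaWiki classes/methods. The point is to prime the link cache in one database query so the subsequent per-link existence checks are free.]

```php
<?php
// Sketch: prime the LinkCache in a single query before rendering, so each
// Linker::link call does not hit the database to check page existence.
// $userNames is a hypothetical array of user page names.
$batch = new LinkBatch();
foreach ( $userNames as $name ) {
	$batch->add( NS_USER, $name );
}
$batch->execute(); // one DB query populates the LinkCache for all titles
// ...then loop over the names calling Linker::link() as in Brion's approach.
```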
You should be able to return something like this to make your parser function output raw HTML instead of WikiText.
return array( $output, 'noparse' => true, 'isHTML' => true );
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
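[Editor's sketch putting Dantman's return value in context: a parser function that emits raw HTML. The hook name `ParserFirstCallInit`, `Parser::setFunctionHook`, and the `isHTML`/`noparse` flags are real MediaWiki APIs; the magic word `orgchart` and the helper `buildOrgChartHtml()` are hypothetical names for illustration.]

```php
<?php
// Sketch: register a parser function whose output bypasses wikitext parsing.
$wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
	$parser->setFunctionHook( 'orgchart', function ( Parser $parser ) {
		// Hypothetical helper returning the prebuilt <ul>...</ul> HTML.
		$output = buildOrgChartHtml();
		// 'isHTML' tells MediaWiki the output is already HTML;
		// 'noparse' prevents further wikitext preprocessing of it.
		return array( $output, 'noparse' => true, 'isHTML' => true );
	} );
	return true;
};
```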
As of https://gerrit.wikimedia.org/r/#/c/29879/2/utils/MessageTable.php,cm, Linker::link took 20 KiB of memory per call. Cf. http://laxstrom.name/blag/2013/02/01/how-i-debug-performance-issues-in-media... I don't know whether such bugs/unfeatures and the related best practices have been written down anywhere.
Nemo