Mon, 13 Jan 2014, 12:59:18, user Pavel Astakhov
Hi! I would like to discuss an idea.
In MediaWiki it is not very convenient to do computations using wiki
syntax. We have to use several extensions such as Variables, Arrays,
ParserFunctions and others. If there is a lot of computation, for
example processing data received from Semantic MediaWiki, the speed of
page construction becomes unacceptable. To resolve this, yet another
extension has to be written (e.g. Semantic Maps displays data from SMW
on maps). There end up being a lot of these extensions, they don't work
well with each other, and they are time-consuming to maintain.
I know about the existence of the Scribunto extension, but I think this
problem can be solved in another, more natural way. I suggest using PHP
code in wiki pages, in the same way it is used in HTML files. In that
case, extensions can be unified: for example, get the data from
DynamicPageList, process it if necessary, and pass it on to other
extensions for display, such as Semantic Result Formats. This would give
users more freedom for creativity.
In order to execute PHP code safely, I decided to try to build a
controlled environment. I wrote it in pure PHP; it is lightweight and
could in future be included in the core. It can be viewed as the Foxway
extension. The first version is in the master branch. It gives an idea
of what is possible in principle, and there is even something like a
debugger. It does not work very quickly, so I decided to try to fix that
in the develop branch. There I created two classes, Compiler and
Runtime. The first processes PHP source code and converts it into a set
of instructions that the Runtime class can execute very quickly. I took
part of the code from the PHPUnit tests to check performance. On my
computer, pure PHP executes it on average in 0.0025 seconds, and the
Runtime class in 0.05 seconds. That is 20 times slower, but there is
room for even better results. I do not count the time spent in the
Compiler class, because it only needs to run once, when a wiki page is
saved. The data returned from this class can be serialized and stored in
the database. Also, if all dynamic data is handled as PHP code, the wiki
markup can be converted to HTML at save time and stored in the database.
Thus, when a wiki page is requested from the server, it will not be
necessary to build it every time (I know about the cache): just take the
already prepared data (for Runtime and HTML) and serve it. A cache is
certainly still needed, but only for pages with dynamic data, and the
lifetime of the objects in it can be greatly reduced, since performance
will be higher.
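To make the compile-once / execute-many idea concrete, here is a minimal
sketch. The instruction format (a serialized stack-machine program) and
the function names are my own illustration, not Foxway's actual Compiler
and Runtime classes:

```php
<?php
// Sketch: "compile" an expression once, at page-save time, into a
// serializable instruction list; execute it cheaply on every page view.
// The RPN instruction format here is hypothetical, for illustration only.

// Runs once at save time; the result can be stored in the database.
function compileToInstructions(array $rpn): string {
    return serialize($rpn);
}

// Runs on every request: a tiny stack machine over the stored program.
function runInstructions(string $stored): int {
    $stack = [];
    foreach (unserialize($stored) as $op) {
        if (is_int($op)) {
            $stack[] = $op;               // push operand
        } elseif ($op === '+') {
            $b = array_pop($stack);
            $a = array_pop($stack);
            $stack[] = $a + $b;
        } elseif ($op === '*') {
            $b = array_pop($stack);
            $a = array_pop($stack);
            $stack[] = $a * $b;
        }
    }
    return array_pop($stack);
}

// (2 + 3) * 4 in reverse Polish notation
$stored = compileToInstructions([2, 3, '+', 4, '*']);
echo runInstructions($stored), "\n"; // prints 20
```

The point of the split is that parsing (the expensive part) happens only
on save, while page views pay only for the cheap instruction walk.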
I also have other ideas associated with the features this implementation
provides. I have already made some steps in this direction, and I think
all of this is realistic and useful.
I am not saying that Foxway is ready for use. It shows that this idea
can work, and can work fast enough. It needs to be rewritten to make it
easier to maintain, and I believe it can work even faster.
I did not invent anything new: we all use HTML + PHP. Wiki markup
replaces difficult HTML and provides security, but what can replace the
PHP?
I would like to know your opinion: is this really useful, or am I
wasting my time?
Best wishes, Pavel Astakhov (pastakhov).
I implemented something similar before Scribunto was stable enough and
deployed. However, instead of creating an interpreter, I just used PHP's
built-in tokenizer via token_get_all() to sanitize the code (to disallow
some operators and calls), and then eval()'ed it once the security check
passed.
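The sanitize-then-eval approach can be sketched roughly like this. This
is only an illustration of the technique, not the actual extension's
code; the particular lists of forbidden tokens and calls are my own
assumption and are far from exhaustive:

```php
<?php
// Sketch: scan the code with PHP's built-in tokenizer and refuse to run
// it if any disallowed token or function call appears. The blacklists
// below are illustrative only, not a complete security policy.
function isCodeSafe(string $code): bool {
    $forbiddenTokens = [T_EVAL, T_INCLUDE, T_INCLUDE_ONCE,
                        T_REQUIRE, T_REQUIRE_ONCE];
    $forbiddenCalls  = ['exec', 'system', 'shell_exec', 'passthru'];

    // token_get_all() needs an opening tag to tokenize as PHP code.
    foreach (token_get_all('<?php ' . $code) as $token) {
        if (is_array($token)) {
            [$id, $text] = $token;
            if (in_array($id, $forbiddenTokens, true)) {
                return false;
            }
            if ($id === T_STRING &&
                in_array(strtolower($text), $forbiddenCalls, true)) {
                return false;
            }
        } elseif ($token === '`') {
            return false; // backtick operator runs shell commands
        }
    }
    return true;
}

function runSandboxed(string $code) {
    if (!isCodeSafe($code)) {
        throw new RuntimeException('Code failed the security check');
    }
    return eval($code);
}

var_dump(runSandboxed('return 2 + 3;'));   // int(5)
var_dump(isCodeSafe('shell_exec("ls");')); // bool(false)
```

A blacklist like this is simple but fragile (e.g. variable functions and
callbacks can slip past a name check), which is part of why a real
interpreter or a Lua sandbox such as Scribunto's is the safer design.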
I should probably convert everything to use Scribunto instead, because
many people say that Lua has a more secure VM and is a better language
in general, and that its VM is one of the fastest (only the JVM is
faster). And Wikimedia needs Scribunto because they had high CPU load
while executing some large templates on their servers.
However, it is of course a bit sad that PHP runkit is so outdated and abandoned, and