I've noticed a growing trend of extensions extending link syntax (namely
SMW's annotations, and other extensions using Embed:, Video:, or
theoretically even Audio: namespaces for embedding things).
However, all of these implementations have serious issues. MediaWiki
parses links internally, but when an extension does something similar
it's customary to use a regex rather than duplicate a small part of the
parser. This normally leads either to a limited syntax that falls short
of what the parser supports, or to a regex so complex that it causes
server errors when the syntax is slightly broken (e.g. a missing
trailing ]] ).
For that reason I'm looking into adding a new parser feature: Link
Hooks. Basically this would allow an extension to hook into link
processing for a namespace, or for a pattern.
I plan to support a number of flags, so it should handle most cases:
* Link/Media callbacks (link modification vs. embedding)
* Namespace/pattern (a namespace number, or a special pattern like SMW's ::)
* Multi-params (pipe-separated parameters rather than a single display text)
* Recursive parameters (things like Image:, where links can appear inside
  parameters)
* Recursive link text (for patterns which break things up and may contain
  links)
Unfortunately I hit a snag in the code when dealing with
[[Embedablens:Page|Content with [[link|displaytext]] inside]]. I can't
provide data to extensions in a sane way. Either plaintext is sent to
them and they work with that (albeit breaking nested constructs, as
usual), or I try to split on the |'s, which doesn't work with nesting,
or I parse the nested links first, but then extensions get a
hard-to-work-with mess passed to them as their data.
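To illustrate the splitting problem with a quick sketch (hypothetical Python, just to show the failure mode, not MediaWiki code): a naive split on | treats the nested link's pipe as a parameter separator:

```python
# Naively splitting a link's parameters on "|" mangles nested links,
# because the "|" inside [[link|displaytext]] is not a parameter separator.
inner = "Embedablens:Page|Content with [[link|displaytext]] inside"

parts = inner.split("|")
print(parts)
# -> ['Embedablens:Page', 'Content with [[link', 'displaytext]] inside']
```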
The nice way the preprocessor works with objects suggests to me that the
best way to make this work would probably be to recursively parse the
text into link objects, then do our expansion, while also giving
extensions special ways to access the tree (extract as wikitext, HTML,
or plain text).
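As a rough sketch of what I mean (illustrative Python only, not actual parser code; LinkNode and the function names are invented for this example), a recursive-descent pass can build link objects that offer wikitext and plain-text extraction:

```python
# Illustrative sketch only: recursively parse nested [[a|b]] syntax into
# link objects that can be re-serialized as wikitext or flattened to
# plain text. Class and function names are invented for this example.

class LinkNode:
    def __init__(self, parts):
        # parts: one list per pipe-separated parameter; each list holds
        # plain strings and nested LinkNodes.
        self.parts = parts

    def as_wikitext(self):
        inner = "|".join(
            "".join(p.as_wikitext() if isinstance(p, LinkNode) else p
                    for p in part)
            for part in self.parts)
        return "[[" + inner + "]]"

    def as_plaintext(self):
        # Use the last parameter (the display text), dropping markup.
        return "".join(p.as_plaintext() if isinstance(p, LinkNode) else p
                       for p in self.parts[-1])

def _parse_inner(text, pos):
    """Parse from just after '[['; return (LinkNode, pos after ']]')."""
    parts, buf = [[]], ""
    while pos < len(text):
        if text.startswith("[[", pos):
            if buf:
                parts[-1].append(buf)
                buf = ""
            node, pos = _parse_inner(text, pos + 2)
            parts[-1].append(node)
        elif text.startswith("]]", pos):
            if buf:
                parts[-1].append(buf)
            return LinkNode(parts), pos + 2
        elif text[pos] == "|":
            if buf:
                parts[-1].append(buf)
                buf = ""
            parts.append([])
            pos += 1
        else:
            buf += text[pos]
            pos += 1
    raise ValueError("unclosed link (missing ]])")

def parse_link(text):
    if not text.startswith("[["):
        raise ValueError("not a link")
    node, _ = _parse_inner(text, 2)
    return node
```

With that, parse_link("[[Embedablens:Page|Content with [[link|displaytext]] inside]]").as_plaintext() gives "Content with displaytext inside", while as_wikitext() round-trips the original markup.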
Doing some research into the way the parser handles links at first gave
me good results ([[link [[inside of]] link]] nicely gives you a link to
"inside of" with the outside text verbatim, just as the processor I have
in mind would do). However, I ran into an ugly, sticky mess with image
embedding.
http://dev.wiki-tools.com/wiki/LinkHook#Old_Tests
(Ignore the fact that my examples here don't have the frame option.)
[[Image:File.ext|Caption]] renders as an image with the caption "Caption".
[[Image:File.ext|[[Image:File.ext|Caption]]]] renders an image inside
another image, with the caption "Caption".
[[Image:File.ext|[[Image:File.ext|[[link]]]]]] renders [[link]] as a
link; the rest is completely verbatim.
Honestly, the syntax is inconsistent with itself. If we were trying to
stop embeds inside of embeds, then the last one should render as an
image, with a link to [[link]] and the other Image: verbatim as the caption.
I believe there is a bug filed about the second case; if anyone has it
handy I'd love a link. I hunted through Bugzilla but couldn't find it.
Some use cases describing what's expected would be nice.
My issue is that Image links are functionally supposed to be the same as
a setLinkHook using the Media, Multi-params, and Recursive parameters
options (embedded rather than prefixed with :, pipe-separated
parameters, and parameters that can have links inside them).
However, any extension using setLinkHook with the recursive-parameters
option would expect something different.
[[Embed:Title|[[Otherembed:Title]] and [[link]]]]
Would actually render as an embed with two links (since it's inside
another embed, the 'Otherembed' reverts to a link).
And: [[Embed:Title|[[Otherembed:Title|[[link]]]]]]
Would actually render as an embed, with a link to [[link]] and the rest
of the caption verbatim.
--
~Daniel Friesen(Dantman, Nadir-Seen-Fire) of:
-The Nadir-Point Group (http://nadir-point.com)
--It's Wiki-Tools subgroup (http://wiki-tools.com)
--The ElectronicMe project (http://electronic-me.org)
--Games-G.P.S. (http://ggps.org)
-And Wikia ACG on Wikia.com (http://wikia.com/wiki/Wikia_ACG)
--Animepedia (http://anime.wikia.com)
--Narutopedia (http://naruto.wikia.com)
---- Guy Van den Broeck <guyvdb(a)gmail.com> writes:
> r39406 contains a brand new implementation of the diff algorithm (in
> PHP). It's in the same family as the previous one but has better
> performance and a better heuristic for large diffs.
>
> This one is actually considerably faster than the old diff. In
> attachment I included
The attachment seems to have been scrubbed by the mailing list.
> benchmark output that I use with some of the
> biggest articles on wikipedia and some medium sized ones. It simulates
> the actual diff workload with a line based diff first, followed by a
> word based diff. The comparison is done between a set of sometimes
> distant versions for both the old algorithm as the new one. The
> summary in the end concludes that the new implementation is almost 30
> times faster.
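For anyone wanting to approximate that workload, the line-based-then-word-based pattern can be sketched in a few lines of Python with difflib (an illustration only; this is neither Guy's benchmark nor the PHP implementation):

```python
import difflib

old = ["the quick brown fox", "jumps over the dog", "unchanged line"]
new = ["the quick red fox", "jumps over the lazy dog", "unchanged line"]

# Pass 1: line-based diff to find changed regions.
line_ops = difflib.SequenceMatcher(a=old, b=new).get_opcodes()

# Pass 2: re-diff each replaced line pair word by word.
for tag, i1, i2, j1, j2 in line_ops:
    if tag == "replace":
        for a_line, b_line in zip(old[i1:i2], new[j1:j2]):
            word_ops = difflib.SequenceMatcher(
                a=a_line.split(), b=b_line.split()).get_opcodes()
            print(a_line, "->", b_line, word_ops)
```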
Faster than what? Than the old PHP DifferenceEngine, or than wikidiff2?
Roan Kattouw (Catrope)
---- siebrand(a)svn.wikimedia.org writes:
> Revision: 39314
> Author: siebrand
> Date: 2008-08-13 23:08:22 +0000 (Wed, 13 Aug 2008)
>
> Log Message:
> -----------
> (bug 15157) Special:Watchlist now has the same options as Special:Watchlist:
>
> (snip)
> +* (bug 15157) Special:Watchlist has the same options as Special:Watchlist:
> + Show/Hide logged in users, Show/Hide anonymous, Invert namespace selection
Should this be: "Special:Watchlist has the same options as *Special:Recentchanges*"?
Roan Kattouw (Catrope)
---- gbtkjd dgjksbg <the_rock3353(a)yahoo.com> writes:
> Hello all,
>
>
>
> Is it possible to change wiki page layout?
>
> ex: I want to remove watch tab and add new tab that adds new page.
Yes. Write an extension that uses the SkinTemplateContentActions hook [1].
Roan Kattouw (Catrope)
[1] http://www.mediawiki.org/wiki/Manual:Hooks/SkinTemplateContentActions
---- Tim Starling <tstarling(a)wikimedia.org> writes:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> MediaWiki 1.13.0 is now available.
Mediawiki-announce automatically forwards to wikitech-l, it seems, so the release announcement was posted twice.
Roan Kattouw (Catrope)
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
MediaWiki 1.13.0 is now available.
MediaWiki is now using a "continuous integration" development model
with quarterly snapshot releases. The latest development code is
always kept "ready to run", and in fact runs our own sites on
Wikipedia. MediaWiki 1.13.0 is our latest snapshot release, based on
the development code from three weeks ago, with several bugfixes applied.
Selected changes since MediaWiki 1.12.0:
* 59,000 new localised text messages have been added, taking the total
to 266,000, spread across 281 languages
* New special pages: FileDuplicateSearch, ListGroupRights
* Special:UserRights and Special:SpecialPages have been redesigned
* More options on Special:Recentchangeslinked and Special:WhatLinksHere
* New parser functions: PAGESINCATEGORY, PAGESIZE
* Can hide categories with __HIDDENCAT__
* Friendlier behaviour for users who click a red link but can't edit
* Image redirects are now enabled by default
* Drop-down AJAX search suggestions ($wgEnableMWSuggest)
* Search results show image thumbnails
* The search box in the MonoBook sidebar can be moved up by editing
[[MediaWiki:Sidebar]]
* Double redirects created by a page move can be fixed automatically
Changes since release candidate MediaWiki 1.13.0rc2:
* (bug 13770) Fixed incorrect detection of PHP's DOM module
* Fix regression from r37834: accesskey tooltip hint should be given for
  the minor edit and watch labels on the edit page.
* Updated Chinese simplified/traditional conversion tables
Full release notes:
http://svn.wikimedia.org/svnroot/mediawiki/tags/REL1_13_0/phase3/RELEASE-NO…
**********************************************************************
Download:
http://download.wikimedia.org/mediawiki/1.13/mediawiki-1.13.0.tar.gz
Patch to previous version (1.13.0rc2)
http://download.wikimedia.org/mediawiki/1.13/mediawiki-1.13.0.patch.gz
GPG signatures:
http://download.wikimedia.org/mediawiki/1.13/mediawiki-1.13.0.tar.gz.sig
http://download.wikimedia.org/mediawiki/1.13/mediawiki-1.13.0.patch.gz.sig
Public keys:
https://secure.wikimedia.org/keys.html
SHA-1 checksums:
f53f7548510a39fd9f365d3097e8a581fdb14353 mediawiki-1.13.0.tar.gz
7243a2b50edb78b6f1882ff09c0882e771502539 mediawiki-1.13.0.patch.gz
MD5 checksums:
6e51cb81fd57d90b870e984688734db6 mediawiki-1.13.0.tar.gz
a4880023b196238cb0a77af717b4fd8b mediawiki-1.13.0.patch.gz
- --
Tim Starling
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.6 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org
iD8DBQFIpE7CdWgrCOij/sQRAthwAKC9agD3P0CMjXeWTaaHGsmfXvWEfgCfYxSo
nKl2tvJq+BT8u16CjPrG3tI=
=Jk7Q
-----END PGP SIGNATURE-----
peter green wrote:
>> Commons seems to be a target for such an attack. Upload is easy, although I'm
>> not too sure about the damage potential. I suppose if an administrator's
>> account were compromised, an applet could be manufactured to mass-delete
>> content or mass-block users.
>>
>>
> If Commons is vulnerable, all Wikimedia wikis are, and there is nothing
> that local Commons users or admins can really do about this. MediaWiki
> should probably be modified to check for garbage on the end of image
> files if it does not already do so.
>
> Sending this on to wikitech-l so the devs can comment on it.
Replied on commons-l and fixed for default MediaWiki installations in r39203.
-- Tim Starling
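For context on the kind of check being discussed: a Java applet ships as a JAR, which is just a ZIP archive, so one simple heuristic is to reject uploads that contain a ZIP signature after the image data. This is a hypothetical sketch in Python, not the actual r39203 fix:

```python
# Hypothetical sketch: look for a ZIP local-file-header signature
# ("PK\x03\x04") anywhere past the start of an uploaded file. A Java
# applet JAR is a ZIP archive, so an image with a JAR appended to it
# will contain this signature. (Not the actual MediaWiki r39203 fix.)
ZIP_SIGNATURE = b"PK\x03\x04"

def looks_like_zip_smuggling(data: bytes) -> bool:
    # Start at offset 1: a file that is a plain ZIP at offset 0 is a
    # separate case already handled by normal MIME-type detection.
    return data.find(ZIP_SIGNATURE, 1) != -1

clean_image = b"\xff\xd8\xff\xe0 ...jpeg data... \xff\xd9"
evil_image = clean_image + b"PK\x03\x04 ...jar contents..."
```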
On Wed, Aug 13, 2008 at 6:29 PM, <aaron(a)svn.wikimedia.org> wrote:
> Log Message:
> -----------
> I really don't like the idea of invalid IPs sending these passwords out
> . . .
> Modified: trunk/phase3/includes/specials/SpecialUserlogin.php
> . . .
> - if ( '' == $ip ) { $ip = '(Unknown)'; }
> + if( !$ip ) {
> + return new WikiError( wfMsg( 'badipaddress' ) );
> + }
> + #if ( '' == $ip ) { $ip = '(Unknown)'; }
Under what circumstances would the $ip ever be invalid? Where
$_SERVER['REMOTE_ADDR'] is unset? When might that be? If there's no
known circumstance, this chunk of code should just be removed. If
there is one, wfGetIP()'s documentation should be updated (but whether
this change is reasonable depends on when wfGetIP() might fail).
Overall, I have a hard time imagining why a strange IP address should
merit blocking e-mail reset requests.