I know the redirect changes have been planned for
a while (bug 14418). I haven't had time to clean up
the patch yet, though.
-Chad
On Jun 6, 2009 1:58 PM, "Andrew Garrett" <agarrett(a)wikimedia.org> wrote:
On 06/06/2009, at 5:31 PM, Tim Starling wrote: > Please let me know if you
know of any schema chan...
There are some other AbuseFilter feature blockers that I'd like to run
(related to hiding and patrolling log entries), although the features
aren't yet written. I'll try to get them done in a few hours and tell
you the patch file names when I'm done.
--
Andrew Garrett
Contract Developer, Wikimedia Foundation
agarrett(a)wikimedia.org
http://werdn.us
Hi all,
We now have English Wikipedia fully migrated to the new servers / new search
backend. We cannot fully migrate the other wikis until we resolve some hardware
issues. In the meantime, here is an overview of the new features now deployed
on en.wiki:
1) Did you mean... - we now have search suggestions. Care has been taken to
provide suggestions that are context-sensitive, e.g. for phrases, proper
names, etc.
2) fuzzy and wildcard queries - a word can be made fuzzy by adding ~ to its
end, e.g. the query sarah~ thompson~ will give different spellings of, and
names similar to, sarah thompson. Wildcards can now be used as both prefix and
suffix, e.g. *stan will give various countries in Central Asia.
3) prefix: - using this magic prefix, queries can be limited to pages
beginning with a certain prefix, e.g.
mwsuggest prefix:Wikipedia:Village Pump
will search all village pumps and their archives for mwsuggest. This should be
especially useful for archive searching, in concert with inputbox or
searchbox.
4) intitle: - using this magic prefix, queries can be limited to titles only
(see the example queries after this list)
5) generally improved quality of search results via usage of related
articles (based on co-occurrence of links), anchor text, text abstracts,
proximity within articles, sections, redirects, improved stemming and such
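Some example queries exercising the operators above (exact result sets
will of course vary with the index):

  sarah~ thompson~
  intitle:mwsuggest
  mwsuggest prefix:Wikipedia:Village Pump

(a fuzzy name search, a title-only search, and a prefix-limited search,
respectively; the first and last are the same examples as above).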
Cheers, Robert
As you may know, I have been working on Firefogg integration with
MediaWiki. As you may also know, the mwEmbed library is being designed to
support embedding of these interfaces in arbitrary external contexts. I
wanted to quickly highlight a useful stand-alone usage example of the
library:
http://www.firefogg.org/make/advanced.html
This "Make Ogg" link will be something you can send to a person so they
can encode source footage to a local ogg video file with the latest and
greatest ogg encoders (presently the thusnelda theora encoder & vorbis
audio). Updates to thusnelda and other free codecs will be pushed out
via firefogg updates.
For Commons / Wikimedia usage we will directly integrate Firefogg (using
that same codebase). You can see an example of how that works on the
'new-upload' branch here:
http://sandbox.kaltura.com/testwiki/index.php/Special:Upload ...
hopefully we will start putting some of this on testing.wikipedia.org
~soonish ?~
The new-upload branch feature set is quite extensive, including the
script-loader, the jQuery JavaScript refactoring, the new upload API, the new
mv_embed video player, the add-media wizard, etc. Any feedback and specific
bug reports will be super helpful in gearing up for
merging this 'new-upload' branch.
For an overview see:
http://www.mediawiki.org/wiki/Media_Projects_Overview
peace,
--michael
Seems like (at least) the API of #pos in ParserFunctions is
different from the one in StringFunctions.
{{#pos: haystack|needle|offset}}
While the StringFunctions #pos in MediaWiki 1.14 returned an
empty string when the needle was not found, the ParserFunctions
implementation of #pos in SVN now returns -1.
This is most unfortunate, since current usage depends on the old behavior.
Example:
{{#if: {{#pos: abcd|b}} | found | not found }}
{{#if: {{#pos: abcd|x}} | found | not found }}
Now both of these examples will return "found"!
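Roughly — with invented function names, as I haven't traced the actual
code — the difference presumably boils down to how the two extensions map
strpos()'s false return value:

  // Sketch only; function names are made up for illustration.
  function posStringFunctions( $haystack, $needle ) {
      $pos = strpos( $haystack, $needle );
      // StringFunctions in 1.14: empty string when not found,
      // which {{#if:}} treats as false.
      return $pos === false ? '' : $pos;
  }

  function posParserFunctions( $haystack, $needle ) {
      $pos = strpos( $haystack, $needle );
      // ParserFunctions in SVN: -1 when not found, which renders as a
      // non-empty string, so {{#if:}} treats it as true.
      return $pos === false ? -1 : $pos;
  }

Either mapping is defensible on its own, but the change silently breaks
{{#if:}}-based tests like the ones above.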
Usage scenario:
I try to use #pos in template calls to implement a sort-of-database
functionality in a MediaWiki installation.
I have a big template that contains data in named parameters.
Those parameters get passed along to a template that can select "columns"
by rendering some of those named parameters and ignoring others.
Now I want to implement "row selection" by passing along a parameter name
and a substring that should be in the value of that parameter in order
for the data to be rendered.
Something like this:
{{#if: {{#pos: {{{ {{{selectionattribute}}} }}} | {{{selectionvalue}}} }} | render_row | render_nothing }}
If I want this to work in different MediaWiki installations I need
to rely on the API of #pos.
Currently there seems to be no way to use #pos in a way that works
both on 1.14 and on 1.15-svn.
cheers
-henrik
Hi,
I'm trying to format a link like this: [[musulman]]ă. On ro.wp, this
is equivalent to [[musulman|musulman]]ă (the special letter is not
included in the wiki link). While going through
http://www.mediawiki.org/wiki/Markup_spec I saw that:
<internal-link> ::= <internal-link-start> <article-link> [ "#"
<section-id> ] [ <pipe> [<link-description>] ] <internal-link-end>
[<extra-description>]
<extra-description> ::= <letter> [<extra-description>]
<letter> ::= <ucase-letter> | <lcase-letter>
<ucase-letter> ::= "A" | "B" | ... | "Y" | "Z"
<lcase-letter> ::= "a" | "b" | ... | "y" | "z"
This tells me that only ASCII letters are used for this type of
linking. However, on fr.wp I can write [[Ren]]é and this is equivalent
to [[Ren|René]].
How was this done? Is it something that can be set from a page, or
does some PHP need to be changed?
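(My guess: it's the per-language "link trail" regex, $linkTrail, defined
in the language's messages file, that decides which trailing letters are
pulled into the link. For ro.wp that would presumably mean something like
the following in languages/messages/MessagesRo.php — the character list
here is invented for illustration, I haven't checked the real file:

  $linkTrail = '/^([a-zăâîșț]+)(.*)$/sDu';

If that's right, it's a PHP-level setting, not something editable from a
wiki page.)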
Thanks,
Strainu
David Gerard wrote:
> Web bugs for statistical data are a legitimate want but potentially a
> horrible privacy violation.
>
> So I asked on wikitech-l, and the obvious answer appears to be to do
> it internally. Something like http://stats.grok.se/ only more so.
>
> So - if you want web bug data in a way that fits the privacy policy,
> please pop over to the wikitech-l thread with technical suggestions
> and solutions :-)
>
>
> - d.
Yes, modifying the http://stats.grok.se/ systems looks like the way to go.
What do people actually want to see from the traffic data? Do they want
referrers, anonymized user trails, or what?
-- Neil
Forwarding to wikitech-l, which is probably a more appropriate list
for this question.
---------- Forwarded message ----------
From: sl contrib <sl.contrib(a)googlemail.com>
Date: 2009/6/1
Subject: Re: [Mediawiki-api] Revisions since certain date / wiki mirror
To: MediaWiki API announcements & discussion <mediawiki-api(a)lists.wikimedia.org>
Hi Roan,
thanks again for the reply. Comments in line.
On Sun, May 31, 2009 at 8:05 PM, Roan Kattouw <roan.kattouw(a)gmail.com> wrote:
>
> 2009/5/31 sl contrib <sl.contrib(a)googlemail.com>:
> > On the other hand it seems
> > strange, though, that I can't easily get all 'events' between two dates.
> You can, with recentchanges. It has its limitations, but IMO you
> should be able to cope with them.
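For reference, I've been experimenting with queries along these lines
(parameter names as I read them from the API docs, so treat this as a
sketch):

  api.php?action=query&list=recentchanges&rcdir=newer&rcstart=2009-05-01T00:00:00Z&rcend=2009-05-31T23:59:59Z&rcprop=title|ids|timestamp|loginfo&rclimit=500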
While looking at this I noticed that log entries for moved pages don't
contain revids:
'logaction' => 'move',
'move' => {
    'new_ns' => 0,
    'new_title' => 'Sandpit/test2'
},
'logtype' => 'move',
'revid' => 0,
'timestamp' => '2009-05-31T21:47:11Z',
'old_revid' => 0,
This seems inconsistent: for edits, there's an old_revid and a
revid, and both are recorded in the log. When moving a page, there's
also an old_revid and a revid, but those are not recorded
in the log.
Any ideas as to why that is, and if it doesn't make sense, which bug
tracker it should go on?
>
> > Would it somehow be possible to build an intermediate solution? E.g. would
> > it be feasible to build a dedicated
> > action=query&prop=allchanges&start=...&end=...
> > that just solved that problem?
> For revisions, possibly. It wouldn't include log events, though.
To be able to query:
(a) all pages that changed between two dates (with the latest revision
of that page) and
(b) all revisions that were made between two dates
would be useful, with similar options to prop=revisions and in
particular rvprop (and going across all namespaces).
Merging this with log information would not be essential, as most
things would be visible from the revisions themselves. Only the
deletion log would have to be taken into account, but this could be
done in a second query.
Would that be feasible? Something like that would make a mirroring
process very easy, as you could just feed in the date of your last
update, and get the pages back that you need.
All the best,
Bjoern
Hi,
In the lab I'd have use for a MediaWiki configuration where some tables are
placed in separate databases. I've tried googling and also tried
searching at mediawiki.org, but I can't find out from there how to go
about it (or whether it's possible at all).
Does anyone have a hint or link where I can find out how to set up such
a configuration?
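The closest thing I've come across is the shared-database mechanism, which
(as I understand it) points individual tables at another database, e.g. in
LocalSettings.php:

  $wgSharedDB     = 'otherdatabase';   // database holding the shared tables
  $wgSharedPrefix = '';                // table prefix used in that database
  $wgSharedTables = array( 'user' );   // tables to read from there

('otherdatabase' is just a placeholder.) But I'm not sure whether this is
meant for arbitrary tables, or only for sharing the user table between
wikis.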
Best regards,
// Rolf Lampa