The MediaWiki API seems to add paragraph tags when it parses wikitext, but only sometimes. Example:
http://www.mediawiki.org/w/api.php?action=parse&page=Extension:Header/versi…
This page (Extension:Header/version) contains only the text "1.0" but gets turned into "<p>1.0</p>" when parsed via API. On the other hand, the <p> tag is absent if you transclude {{Extension:Header/version}} into another wiki article.
This doesn't happen if your wiki page contains just a table, so something tricky is going on here. :-)
Is there a way to make api.php suppress the <p> tag in my first example?
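As far as I know there's no parse-API parameter that suppresses the block-level wrapping, so one client-side workaround is to strip the wrapper after the call. A minimal sketch in Python, assuming the standard `action=parse` JSON response shape (the `strip_outer_p` helper is my own naive workaround, not a MediaWiki feature):

```python
import json
import re
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://www.mediawiki.org/w/api.php"  # assumed endpoint

def parse_page(title):
    """Fetch the rendered HTML of a page via action=parse."""
    params = urlencode({"action": "parse", "page": title,
                        "prop": "text", "format": "json"})
    with urlopen(API + "?" + params) as resp:
        return json.load(resp)["parse"]["text"]["*"]

def strip_outer_p(html):
    """Naively drop a single wrapping <p>...</p> pair, if present.

    Good enough for one-line pages like "1.0"; a page that is just a
    table has no wrapper, so it passes through unchanged."""
    return re.sub(r"^<p>(.*)</p>$", r"\1", html.strip(), flags=re.DOTALL)
```

For example, `strip_outer_p(parse_page("Extension:Header/version"))` would give back the bare "1.0".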
Thanks,
DanB
One of the products of our MediaWiki Release Management meetup here at
Wikimania yesterday is a new mailing list for people running Wiki Farms.
We want to start standardizing the tools and methods of setting up
these farms.
If you're interested, please join us:
https://lists.wikimedia.org/mailman/listinfo/wikifarm
On 08/09/2013 11:34 PM, Mark A. Hershberger wrote:
> Markus and I will be holding a meetup this afternoon at 1430 (HKT) in
> room N114
>
> If you have a wiki on shared hosting or run a wiki farm, we'd love to
> talk to you. Please come by.
>
> http://wikimania2013.wikimedia.org/wiki/Meetups#MediaWiki_Release_Management
>
> Time: Saturday, 1430-1530 in room N114 (next to Logo Square).
>
> Description: MediaWiki is mainly developed for the Wikimedia sites. So,
> naturally, there is some room for improvement when it comes to the
> tarball releases. Besides automating the release process, we care a lot
> about third party users and extension developers. So in this meetup,
> we'd like to listen to you. What are your needs? Where do you feel the
> current release (process) is not good enough? What are the biggest
> issues? And: do you want to help?
>
> _______________________________________________
> Wikimania-l mailing list
> Wikimania-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikimania-l
>
--
http://hexmode.com/
Love alone reveals the true shape of the universe.
-- "Everywhere Present", Stephen Freeman
Once in a while my mobile data connection hits its limit and then drops
into a 64 kbit/s mode. When that happens, pages can't be delivered (or are
delivered with a very long delay) over HTTP, but HTTPS keeps going.
Does anyone have an idea what's going on, and why HTTPS seems to work
while HTTP doesn't? And does the same thing happen to other users while
they are on a slow connection?
Hello all.
Here are your deployment highlights for the first week back from Wikimania
(next week).
It'll be another quiet week.
https://wikitech.wikimedia.org/wiki/Deployments#Week_of_August_12th
On Thursday the 15th we'll be upgrading the "group 0" wikis (i.e., all of
the test wikis, and mediawiki.org) to MediaWiki version 1.22wmf13.
Coming up the following week, however, are:
* August 19th - 9am Pacific, OAuth deploy to testwikis (Chris S)
* August 20th - Echo to first set of non-English pilot sites: French and
Polish (so far). See mw:Echo/Release Plan 2013.
* August 21st - 11-noon Pacific, SecureLogin by default (Chris S/Chad)
As always, questions welcome.
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
Hi all,
Wikistats has always used a fixed list of namespaces for which
article/edit/editor counts and other metrics are collected.
This list was pretty short: always ns 0; for Commons also 6 and 14 (files
and categories); for Wikisource 102, 104, 106; for strategy 106.
As more and more projects create extra namespaces after a community vote,
wikistats will from now on use the API to determine which namespaces to
include, in order to get and stay in sync.
Countable namespaces are flagged in the API response with content=""
Example:
api.php?action=query&meta=siteinfo&siprop=namespaces
http://tinyurl.com/mes5l4a
102: cookbooks
110: wikijunior
Here is the latest list of wikis with extra namespaces
http://tinyurl.com/n8thnfl (csv file)
( wb=wikibooks, wk=wiktionary, wn=wikinews, wo=wikivoyage, wp=wikipedia,
wq=wikiquote, ws=wikisource, wv=wikiversity )
Note that namespaces which were always counted will still be included even
when the API does not report them.
See e.g. http://tinyurl.com/mjc55bu (commons)
Here only ns 0 is flagged as content namespace, but we will still include ns
6 and 14.
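The lookup described above can be sketched as follows. This is my own illustration, not the wikistats code; it assumes the standard siteinfo JSON output, where the content flag appears as a "content" key (an empty string, mirroring content="" in the XML format) on each namespace entry:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def content_namespace_ids(siteinfo):
    """Given a decoded siteinfo response, return sorted content-namespace ids."""
    namespaces = siteinfo["query"]["namespaces"]
    # Content namespaces carry a "content" key; the others lack it entirely.
    return sorted(info["id"] for info in namespaces.values() if "content" in info)

def fetch_siteinfo(api_url):
    """Fetch meta=siteinfo&siprop=namespaces from a wiki's api.php."""
    params = urlencode({"action": "query", "meta": "siteinfo",
                        "siprop": "namespaces", "format": "json"})
    with urlopen(api_url + "?" + params) as resp:
        return json.load(resp)
```

For a wiki like Commons, where only ns 0 is flagged, the fixed extras (6 and 14) would then be merged in on top of this result.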
If you see namespaces which are missing, please flag them wherever that
should be done, so they will be picked up on next wikistats run.
Thanks,
Erik Zachte
While we're on the whole "make HTTPS secure" wave, might as well bring this
up:
https://bugzilla.wikimedia.org/show_bug.cgi?id=24413
DNSSEC is an authenticated method of retrieving DNS records, preventing
attackers from supplying forged IP-address resolutions to clients. Usually
this doesn't mean much when you're using HTTPS, since you're authenticating
with TLS anyway, but I still think it'd be a good idea to implement.
On a side note, there's also DANE (RFC 6698), which basically allows TLS
certificates to be verified through DNSSEC (usually in addition to CA
verification). That is another thing we could consider.
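For the record, DANE works by publishing TLSA records in the (DNSSEC-signed) zone. A purely illustrative zone-file fragment, with a hypothetical hostname and a placeholder digest:

```
; Hypothetical TLSA record (RFC 6698) for an HTTPS endpoint.
; Usage 3 (DANE-EE: match the end-entity cert itself),
; selector 1 (SubjectPublicKeyInfo), matching type 1 (SHA-256 digest).
; The digest below is a placeholder, not a real certificate hash.
_443._tcp.www.example.org. IN TLSA 3 1 1 (
    0000000000000000000000000000000000000000000000000000000000000000 )
```

A validating client would fetch this record over DNSSEC and check the served certificate's public key against the digest, alongside (or instead of) the usual CA chain check.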
--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerromeo(a)gmail.com