(apologies for the non-threading of this; I wasn't subscribed to
wikitech-l with this address when this message/thread was sent)
<quote name="Mark A. Hershberger" date="2013-02-19" time="23:09:18">
> On Tue 19 Feb 2013 04:39:25 PM EST, Sumana Harihareswara wrote:
> >> My longer term question is: Who is MediaWiki's release manager, and
> >> what
> >> can we expect of the person who has that role?
> >
> > I think the answer is now that Greg Grossmeier fills the role of
> > MediaWiki's release manager so he will have to answer this. :-)
>
> This subject has come up a couple of times in the past week so I look
> forward to working with Greg to implement some policy around MediaWiki
> releases -- especially the point releases for 1.19, the LTS release.
> There is a lot to discuss and I look forward to those conversations.
Hello!
To make this explicit:
Everyone: please do feel free to contact me (email or on IRC, I'm
greg-g) with any ideas, concerns, breakthroughs, gotchas, whatever
dealing with this topic. I might not be able to do anything about it
now, and I might not be the right person to deal with it in all cases,
but I can help route things and keep notes so that we don't lose track
of good ideas.
Generally, what can you expect from me in this new role? I hope the
email robla sent announcing my position can clarify much of it:
http://lists.wikimedia.org/pipermail/wikitech-l/2013-February/066672.html
Quoting robla:
> Greg will be managing the deployment process for the Wikimedia
> websites, focusing at first on improving release notes and outbound
> communication, freeing up folks like Sam to focus on the engineering
> aspects of the role. He'll help our Bug Wrangler (Andre) figure out
> how to deal with high priority deployment-related issues; Andre will
> continue to broadly manage the flow of all bugs, while Greg will
> narrowly focus on very high priority issues through fix deployment.
> He'll also take over coordination of our deployment calendar[1], and
> will likely be a little nosier than many of us have had the time to
> be. Over time, Greg will look more holistically at our deployment
> practice, and potentially lead a change over to a more continuous
> deployment model.
Best,
Greg
--
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| [[User:Greg G (WMF)]] A18D 1138 8E47 FAC8 1C7D |
Hi,
On Sat, 2013-03-09 at 02:16 -0600, wiki wrote:
> Sorry, I forgot to mention that I have in mind the English Wikipedia dump.
http://en.wikipedia.org/wiki/Wikipedia:Database_download says "The size
of the 3 January 2013 dump is approximately 9.0 GB compressed, 41.5 GB
uncompressed."
andre
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
I'm going to spam wikitech-l with release updates as well :)
---------- Forwarded message ----------
From: Yuvi Panda <yuvipanda(a)gmail.com>
Date: Mon, Mar 11, 2013 at 3:19 PM
Subject: Commons Android app v1.0 Beta 3 released to the play store
To: mobile-l <mobile-l(a)lists.wikimedia.org>
A new version of the Commons app will be released every Monday, and I'll
spam mobile-l with the CHANGELOG and the link :)
The release version of the app that uploads to Commons is available from
https://play.google.com/store/apps/details?id=org.wikimedia.commons
The nightly / beta version that uploads to testwiki can be downloaded from
https://bit.ly/commons-beta (redirects to
https://integration.mediawiki.org/nightly/mobile/android-commons/android-co…
)
CHANGELOG for the last two releases:
## v1.0 beta 3
- Fix reported crashes
- i18n updates
## v1.0 beta 2
- Fix bug with non-ASCII characters
- Preserve user and description information across upload restarts
- Rudimentary OGG uploading support (when shared from another app only)
- Transparent images now have a white background
- UI improvements for Login
New features planned for the next release: more bug fixes and multiple
file uploads!
Pull requests welcome at https://github.com/wikimedia/android-commons and
bug reports at https://bugzilla.wikimedia.org/enter_bug.cgi?product=CommonsApp
Thank you!
--
Yuvi Panda T
http://yuvi.in/blog
Hi guys,
I was trying to fix this bug:
https://bugzilla.wikimedia.org/show_bug.cgi?id=1542
I am a newbie to MediaWiki and it's the first bug I'm trying to
solve, so I don't know much.
I want to know about the spam blacklist: how it works, how it
triggers its actions, and what its logging mechanism is.
It would be great if someone could help me fix this bug.
Cheers,
Anubhav
Anubhav Agarwal | 4th Year | Computer Science & Engineering | IIT Roorkee
This page came up with raw mathtex, then I saw a "math rendering xx%"
counter at bottom right, then 15 seconds later I had the page:
https://en.wikipedia.org/wiki/Noether's_theorem
I admit this sort of page would make a good stress test ...
- d.
Hi,
I think the IRC feed of recent changes works great, but there is still
a lot of room for improvement.
As Ryan Lane once suggested, we could probably use a system of queues
instead of IRC, which would be even more advanced. My suggestion is to
create some kind of feed in a machine-parseable format, such as XML.
This feed would be distributed by some kind of dispatcher living on a
server such as feed.wikimedia.org, offering not just recent changes
but also recent history (for example, the last 5000 changes per
project).
If a service that parses this feed went down for a moment, it could
retrieve a backlog of the changes it missed when it came back up.
The current feed at irc.wikimedia.org should stay, but we could change
it so that the current bot retrieves its data from the new XML feed
instead of directly from the Apaches.
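To make the idea concrete, here is a minimal sketch (in Python) of
what a consumer of such a feed might look like. Everything specific in
it is an assumption: the feed.wikimedia.org endpoint, the query
parameters, and the <change> element layout are all hypothetical,
since no such service exists yet.

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical backlog endpoint; host, path, and parameters are all
# invented for this sketch.
FEED_URL = "http://feed.wikimedia.org/changes?project=enwiki&limit=5000"

def fetch_backlog(url=FEED_URL):
    """Fetch the recent-changes backlog after a period of downtime."""
    with urllib.request.urlopen(url) as response:
        root = ET.parse(response).getroot()
    # Assumed layout: <changes><change><title/><user/><comment/>...
    return [
        {
            "title": change.findtext("title"),
            "user": change.findtext("user"),
            "comment": change.findtext("comment"),
        }
        for change in root.iter("change")
    ]

for change in fetch_backlog():
    print(change["user"], "edited", change["title"])

A consumer like this could run the same code on reconnect as on
startup, which is the main practical win over the IRC feed: being
down for a few minutes would no longer mean losing data.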
> Sorry, I forgot to mention that I have in mind the English Wikipedia dump.
>
>wiki writes:
>
>> Hello.
>>
>> I'm a newbie who wants to start playing with the XML dumps. I've found
>> instructions here and there on how to import these. I'd like to seek
>> guidance though as to how much free disk space one is required to have for
>> the MySQL import to succeed? I.e., after I have already installed LAMP +
>> MediaWiki, and already allocated space for the bzip file and the converted
>> import statements file, roughly how much more space is needed?
Hi!
First (because someone else will probably tell you), you shouldn't
cross-post to multiple lists -- at least not without announcing it. (I
saw this post on both wikitech-l and Xmldatadumps-l.)
As to disk space, the text size of the English Wikipedia dump is
roughly 25 GB. I imagine this will come to roughly 33 GB in a MySQL
database (guesstimating a 75% page fill factor: 25 GB / 0.75 is about
33 GB).
However, I think that importing an XML dump is going to be quite
challenging -- especially for English Wikipedia. I've not done one,
but everything I've read indicates the process will probably take well
over 30 hours to complete. You can read more about it here:
https://meta.wikimedia.org/wiki/Data_dumps/ImportDump.php and here:
https://www.mediawiki.org/wiki/Manual:MWDumper. (I would also look at
https://www.mediawiki.org/wiki/Manual_talk:MWDumper to get an idea of
other people's experiences.)
There is probably going to be a lot of work involved. The official
importer (ImportDump.php) is said to be slow, and the other candidate
(mwdumper) does not seem to be supported. You will have to import
other tables as well (for example, categories). Images are an entirely
separate issue.
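For a rough idea of the mechanics, here is a minimal sketch (in
Python) of driving the official importer. The MW_ROOT path and the
dump filename are placeholder assumptions to adjust for your own
install; importDump.php reads the XML on stdin, so the dump can be
streamed through bunzip2 without ever writing the ~41.5 GB
uncompressed file to disk.

import subprocess

# Placeholder paths; adjust to your own install and dump file.
MW_ROOT = "/var/www/mediawiki"
DUMP = "enwiki-latest-pages-articles.xml.bz2"

# Stream-decompress the dump and pipe it straight into the importer.
bunzip = subprocess.Popen(["bunzip2", "-c", DUMP],
                          stdout=subprocess.PIPE)
importer = subprocess.Popen(
    ["php", MW_ROOT + "/maintenance/importDump.php"],
    stdin=bunzip.stdout)
bunzip.stdout.close()  # so bunzip2 gets SIGPIPE if the importer dies
importer.wait()

Note that the import itself is only part of the job; the manual also
suggests rebuilding derived data afterwards (for example with
maintenance/rebuildrecentchanges.php), which adds to the total time.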
If you want a more automated process, you can look at wp-mirror:
http://www.nongnu.org/wp-mirror/ . It is under development, but it
aims to produce "one-step" full mirror sites for any wiki (with
images). However, English Wikipedia will take about 2 months to set
up (5 million seconds).
If you just want a copy of English Wikipedia offline (and not a
MediaWiki installation), then you are probably better off with an
offline app. If so, you should try one of the following:
* Kiwix (http://www.kiwix.org) is the official offline app for
Wikipedia. It is complete, stable, well-featured, and fully functional
for any of the major Wikipedias. However, it uses the ZIM format (not
the XML dumps), and its copy of English Wikipedia dates from last
year.
* WikiTaxi (http://www.wikitaxi.org) works with any of the XML dumps.
It only works on a Windows machine (on Linux you can try WINE).
* XOWA (http://sourceforge.net/projects/xowa/) works with any of the
XML dumps. It handles images and allows editing. However, it is
relatively new and in an alpha state. Also, note that I am the XOWA
dev.
Hope this is useful.