I am working on an extension and would like to add some functionality to the 'usercontribs' API module (api/ApiQueryUserContributions). I cannot subclass the ApiQueryUserContributions class to get the functionality I want, because a number of its methods have 'private' scope. Is there a reason these are private rather than protected?
Daniel Renfro
Senior Software Engineer
T: 781-652-6465
Vistaprint Make an impression.
Business Cards are FREE at www.vistaprint.com!
Hey,
Do we trust that messages do not have evil (XSS) stuff in them? The reason
I ask is that I was just using .msg from mediawiki.jqueryMsg and realized
that things in the message do not get escaped. Since the function can take
HTML elements as input, this seems to be pretty inherent.
Is this "properly" escaped? (Any HTML in the message is not.)
http://pastebin.com/XaWL2bVJ
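For context, here is a minimal sketch of what "escaping totally" means for a message string. This is a hypothetical helper for illustration, not the mediawiki.jqueryMsg implementation:

```javascript
// Hypothetical helper: fully escape a message string so that any
// markup it contains is rendered as literal text, never as HTML.
// This is a sketch of the concept, NOT code from mediawiki.jqueryMsg.
function escapeHtml( s ) {
    return String( s )
        .replace( /&/g, '&amp;' )   // must come first, or we double-escape
        .replace( /</g, '&lt;' )
        .replace( />/g, '&gt;' )
        .replace( /"/g, '&quot;' )
        .replace( /'/g, '&#039;' );
}

var msg = 'Hello <script>alert(1)</script> world';
// Every metacharacter is neutralized, so nothing in the message can execute.
console.log( escapeHtml( msg ) );
```

A message run through a function like this is safe to insert as HTML; the question in this thread is what happens to messages that are *not* run through anything like it.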
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
I've arranged for MediaWiki to get some interns this spring. Four
bright college students -- they already know web development and have
had internships at Facebook, Google, and Microsoft -- need a mentor.
https://www.mediawiki.org/wiki/UCOSP_Spring_2012
It's a bit like Google Summer of Code, but a part-time team rather than
one full-timer. They'll each be working on MediaWiki 8-10 hours per
week between now and the end of April. I figure I need one experienced
MediaWiki developer as a mentor -- I'll be your admin, and community dev
Amgine will be your backup. Amgine will lead the students in person
next weekend during a code sprint in Vancouver.
These interns don't yet know what they want to work on, so you can choose
a project with them. I'd help you run two half-hour IRC meetings per
week and we'd run them through two-week iterations between now and the
end of April. I figure it'd be about 5 hours a week, possibly more at
the start as we ramp the students up.
Please let me know as soon as possible, preferably today or Monday. I'd
prefer that the students work on something that will be deployed on
Wikimedia sites.
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hello, this topic is from foundation-l; I think it is better suited to
wikitech-l.
-------- Original Message --------
What about sharing the whole databases among the millions of users, in
some p2p network with a lot of redundancy: something like a dense, cloudy
internet of databases that remains whole even if it loses part of itself?
Does that sound unworldly?
It could be a good complement to the server-based versions.
On 22/01/2012 20:50, Jussi-Ville Heiskanen wrote:
> The simple option that will just blow all this talk of lobbying away,
> is to migrate outside US jurisdiction entirely. It does entail some
> costs, and may well not be optimal, on many fronts.
>
> A medium option is to do a plan on the lines of the actions that
> Google has already put into force, of diversifying datacenters that
> have our non-fungible assets, so that for enforcement they would
> have to invade sovereign territory. But for a non-profit, our best line
> would be to say that we are making those plans, but actually want
> to let the US keep the PR benefit of being able to say that WMF-
> like entities find the US the best place to be incorporated in. And then grin
> very hard, so they know we mean business. Follow up by saying that
> the very real contingency plans cannot wait on their realizing they
> have the wrong end of the stick, so we have to act now.
>
> So we will put a few fallback datacenters elsewhere, just so our
> various communities and chapters realize we aren't going to be
> bullied by US jurisdiction. But we have a much more expansive
> plan which we say we will eventually realize. But the legislators
> in the US have to understand we are doing this all so they realize
> what they are working on is harmful to prosperity around the globe.
>
> And if they play ball, (we won't give a cent of tribute, sorry) we will
> not accelerate the rate at which we realize the full international
> nature of the Wikimedia Foundation.
>
> That is pretty much the line of "education" that might be effective,
> without costing the Foundation a single backhander.
> Does anyone have any stats on how far short we are of that
> goal? As in, what fraction of accounts on all wikis are still part
> of the 'messy' part of SUL rather than the 'clean' part?
>
> --HM
What's messy and what's not? Real username conflicts seem pretty rare
nowadays, and they're not really a problem in general, nor can you
predict when one will arise.[1]
Really messy situations arise when one user has more edits and would get
the global account but another is a sysop somewhere else, or when
unattached accounts are active on multilingual wikis.
Anyway, it could be useful to know (and not hard to query) how many user
accounts there are 1) with some activity in the last year, 2) not fully
unified on all CentralAuth wikis, and 3) not fully unifiable by email or
password.
Point 3) is very important: there are still many unique usernames whose
owners didn't care to unify yet, and perhaps we should suggest en masse
that they merge their accounts.
Nemo
[1] For instance, there are dozens of "Nemo" who happily live alone in
their pet projects because I didn't try to usurp them.
> Message: 10
> Date: Tue, 24 Jan 2012 15:43:14 -0800
> From: Neil Kandalgaonkar <neilk(a)wikimedia.org>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Subject: Re: [Wikitech-l] Escaping messages
> Message-ID: <4F1F4212.9050003(a)wikimedia.org>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> On 1/23/12 9:59 PM, Daniel Friesen wrote:
>
> >> 2 - We could ensure that the message library never emits scripts, by
> >> applying a simple jQuery filter to the final result.
> > Don't delude yourself into thinking that you can easily blacklist the
> > elements that would run a script.
> > http://ha.ckers.org/xss.html
>
> Thanks for the pointer. You're right, I wasn't being careful enough.
>
> Even so I think we have some reason for limited optimism in this case,
> because jQuery operates on nodes in browser, not strings on the server.
> Adding something to a DOM usually normalizes it, so there's less chance
> of missing something due to unusual ways of encoding, escaping, or
> delimiting input.
>
> As far as I know these are the main dangers:
> - SCRIPT, STYLE tags
> - LINK, IFRAME, FRAME, FRAMESET, META, OBJECT, EMBED tags
> - inherently scripted attributes, such as "onclick".
> - attribute values beginning with 'javascript:', 'vbscript:',
> 'mocha:', 'livescript:', matched case-insensitively.
> - hardest one: element styles with values that, once cleaned of
> comments, contain the script words above or /expression(.*)/
>
> However there are other dangers too. Yesterday I discovered that in
> Chrome, a script will be executed if you .append() it to anything, even
> if it's not part of the document. Annoying.
>
> Anyway I'm not going to war on this, but some reasonable efforts can be
> made.
>
> --
> Neil Kandalgaonkar <neilk(a)wikimedia.org>
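The case-insensitive scheme check from the list above can be sketched as follows. This is a hypothetical helper illustrating the idea, not actual MediaWiki code, and as the linked XSS cheat sheet shows, real-world vectors can embed whitespace or entities inside the scheme, which a naive string check like this misses:

```javascript
// Flag attribute values whose URL scheme is a known script vector,
// matched case-insensitively after leading whitespace.
// Hypothetical sketch of the check described above; note that payloads
// with characters embedded inside the scheme (e.g. "java\tscript:")
// would slip past this simple pattern.
var BAD_SCHEMES = /^\s*(javascript|vbscript|mocha|livescript):/i;

function isDangerousAttrValue( value ) {
    return BAD_SCHEMES.test( String( value ) );
}

console.log( isDangerousAttrValue( 'JaVaScRiPt:alert(1)' ) );  // true
console.log( isDangerousAttrValue( 'https://example.org/' ) ); // false
```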
That's a really scary approach to security, in my opinion. While it's
true that browsers may generally normalize things, are we sure every
browser ever made (including browsers not yet released) does that? Are
we sure there aren't certain weird situations (i.e. bugs) where the
browser would not normalize something?
Things should either be escaped totally (so we know it's safe) or not
escaped at all (so we know it's dangerous and treat it as such). Half
measures of just stripping some tags on a blacklist will lull people
into a false sense of security.
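The objection can be demonstrated with a deliberately naive blacklist filter: stripping the listed tag names still lets an event-handler attribute straight through, whereas escaping everything is unambiguous. Both functions below are hypothetical strawmen for illustration, not code from MediaWiki:

```javascript
// Deliberately weak strawman: strip blacklisted tags by name only.
function naiveBlacklist( html ) {
    return html.replace(
        /<\/?(script|style|iframe|frame|frameset|link|meta|object|embed)\b[^>]*>/gi,
        ''
    );
}

// The blacklist never looks at attributes, so this payload survives
// intact and fires as soon as the broken image loads:
var payload = '<img src=x onerror=alert(1)>';
console.log( naiveBlacklist( payload ) );

// Escaping totally, by contrast, leaves nothing executable:
function escapeHtml( s ) {
    return String( s ).replace( /[&<>"']/g, function ( c ) {
        return '&#' + c.charCodeAt( 0 ) + ';';
    } );
}
console.log( escapeHtml( payload ) );
```

The blacklist happily removes `<script>` tags, which is exactly what makes it dangerous: it appears to work on the obvious cases while missing the rest of the attack surface.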
-bawolff
Hi everyone,
Please welcome new committer Thibaut Horel (username: zaran), who
plans to help maintain the Proofread Page extension. Thibaut is an
active contributor on fr.wikisource.org and is here at the San
Francisco Hackathon this weekend.
Rob