Hey everybody,
So today at the iSEC Partners security open forum I heard a talk from Zane Lackey, the former security lead for Etsy, concerning the effectiveness of bug bounties.
He made two points:
1) Bug bounties are unlikely to cause harm, especially for Wikipedia, which I asked him about, because the mere popularity of our service means we are already being scanned, pentested, etc. With a bounty program, there will be incentive for people to report those bugs rather than pastebin them.
2) Even without a monetary reward, which I imagine WMF would not be able to supply, crackers are motivated simply by the “hall of fame”, i.e. being recognized for their efforts.
Therefore, I thought it may be beneficial to take that over to Wikipedia and start our own bug bounty program. Most likely, it would be strictly a hall-of-fame-like structure where people would be recognized for submitting bug reports (maybe we could even use the OpenBadges extension *wink* *wink*). It would help by increasing the number of bugs (both security and non-security) that are found and reported to us.
Any thoughts? (Of course, Chris would have to approve of this program before we even consider it.)
-- Tyler Romeo 0xC86B42DF
On Wed, Jun 25, 2014 at 4:28 PM, Tyler Romeo tylerromeo@gmail.com wrote:
<snip>
Some time ago I ran a number of public exercises testing various aspects of Wikipedia. I ran into a number of issues:
1) It takes a lot of preparation and time spent to do well.
2) Essentially 100% of bugs reported by naive reporters are DUPLICATE, WONTFIX, or are in the backlog of some feature already.
3) Reporting bugs directly in bugzilla creates a lot of noise and annoys people who monitor traffic there. (Mozilla runs things like this from time to time; from them I learned to have people report in a separate system, e.g. etherpad or email, and have someone triage and sort the reports before creating Bugzilla tickets, see point 1) above.)
Google, who spends a lot of money doing stuff like this for security exploits, narrows the circumstances radically: http://www.chromium.org/Home/chromium-security/pwnium-4 .
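That pre-filing triage step (collect reports somewhere informal, weed out duplicates, then file real tickets) could be assisted by even a very naive similarity check before a human looks at anything. A minimal sketch, with all data and the threshold value purely hypothetical:

```python
from difflib import SequenceMatcher

def likely_duplicates(new_summary, existing_titles, threshold=0.6):
    """Return existing ticket titles that look similar to a new report.

    A naive similarity check so a triager can glance at probable
    DUPLICATE candidates before creating a real Bugzilla ticket.
    """
    new_norm = new_summary.lower()
    hits = []
    for title in existing_titles:
        ratio = SequenceMatcher(None, new_norm, title.lower()).ratio()
        if ratio >= threshold:
            hits.append((ratio, title))
    # Best matches first.
    return [t for _, t in sorted(hits, reverse=True)]

existing = [
    "XSS in search suggestions",
    "CSRF token missing on logout",
    "Login CAPTCHA bypass",
]
print(likely_duplicates("xss in search suggestions endpoint", existing))
```

Anything scoring above the threshold would go to the triager as a probable DUPLICATE candidate rather than straight into Bugzilla; everything else still needs human eyes, of course.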
On Wed, Jun 25, 2014 at 4:28 PM, Tyler Romeo tylerromeo@gmail.com wrote:
<snip>
I've been thinking of at least putting up a list of top contributors on mediawiki.org for a while, and just hadn't had the time to do it. If anyone wants to compile that list from the list of closed security bugs, I'd be very supportive.
As for a more official program, the downside that I predict we would quickly hit (from talking to a few people who have run these) is the high volume of very low quality reports that have to be investigated and triaged. Which is something that just takes time from a human... so my evil_plans.txt towards this was (I really had almost this exactly in my todo list):
* Get more volunteers access to security bugs
** {{done}} get list of top contributors
** Find out from Philippe how to get a bunch of volunteers identified
*** Doh, we're probably changing our identification process soon. On hold.
So, I was planning to wait until we have a more streamlined process for getting volunteers access to data that could potentially be covered by our privacy policy, then invite some people who have contributed significantly to MediaWiki's security in the past to get access to those bugs and help triage/assign/fix bugs, then look into starting something official or semi-official. But if a few of you would be willing to deal with our current identification/NDA process and are willing to help investigate reports, I'm happy to start working on it sooner.
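The "compile that list from the list of closed security bugs" step mentioned above is tiny once the data is exported somehow. A sketch assuming the export is a plain list of (reporter, bug id) pairs, which is a hypothetical format, not anything Bugzilla produces directly:

```python
from collections import Counter

def top_reporters(closed_security_bugs, limit=10):
    """Rank reporters by the number of closed security bugs credited to them."""
    counts = Counter(reporter for reporter, _bug_id in closed_security_bugs)
    return counts.most_common(limit)

# Toy data standing in for the real export.
bugs = [
    ("alice", 101), ("bob", 102), ("alice", 103),
    ("carol", 104), ("alice", 105), ("bob", 106),
]
print(top_reporters(bugs, limit=3))
```

The result is already in the shape a hall-of-fame wiki page would want: name plus report count, most prolific first.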
Chris, why don't we leave privacy policy compliance to the users posting on the bug? Wikimedia personal user data shouldn't be going to the security product.
Why does WMF get the right to control access to MediaWiki security bugs anyway? Could we not simply host MediaWiki stuff externally? Perhaps on the servers of any other major MediaWiki user.
Alex Sent from phone
<snip>
_______________________________________________ Wikitech-l mailing list Wikitech-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikitech-l
On Wed, Jun 25, 2014 at 5:49 PM, Alex Monk krenair@gmail.com wrote:
Chris, why don't we leave privacy policy compliance to the users posting on the bug? Wikimedia personal user data shouldn't be going to the security product.
There are a few cases where there may be legitimate private data in a security bug ("look, sql injection, and here are some rows from the user table!", "Hey, this was supposed to be suppressed, and I can see it", "This user circumvented the block on this IP"). But there might be ways to flag or categorize a report as also including private data? Someone with more bugzilla experience would need to comment.
Why does WMF get the right to control access to MediaWiki security bugs anyway? Could we not simply host MediaWiki stuff externally? Perhaps on the servers of any other major MediaWiki user.
This certainly could be done. That "other major MediaWiki user" would have to be someone everyone trusts, and preferably with a strong track record of being able to keep their infrastructure secure. If there's a legitimate proposal to try it, let's definitely discuss.
<snip>
On 6/26/14, Chris Steipp csteipp@wikimedia.org wrote:
<snip>
Personally I'd prefer that MediaWiki-related support software stay hosted by WMF (at least for the foreseeable future). WMF just seems like the logical people to host it, and I don't see any harm in MediaWiki being a "Wikimedia project" in a similar sense as Wikipedia is a Wikimedia project.

What I would like to see, though, is a MediaWiki world where WMF is not special. What I mean by that is that being a WMF employee/contractor wouldn't get you any special treatment - trusted people would get special access where needed because they're trusted and have demonstrated their competence. A WMF staffer would have to go through the same procedure as anyone else to get any sort of special access. Many of the people with special access would still be WMF employees, since WMF employs most senior developers, but it wouldn't be "you're a WMF employee = here's access to everything even if you don't need it", "you're not a WMF employee = have to jump through a million hoops plus sign something in blood plus bribe someone to get access to things that would be extremely helpful to your work".
--bawolff
OK, so really the process that we need here is:
1) Get more people on the security team via NDA and whatnot (sign me up, by the way, obviously)
2) Develop a triage system to quickly investigate and handle invalid and duplicate bugs
3) Determine when and how we’re going to do the program
4) Do it.
-- Tyler Romeo 0xC86B42DF
From: Brian Wolff bawolff@gmail.com Reply: Wikimedia developers wikitech-l@lists.wikimedia.org> Date: June 26, 2014 at 0:34:54 To: Wikimedia developers wikitech-l@lists.wikimedia.org> Subject: Re: [Wikitech-l] MediaWiki Bug Bounty Program
<snip>
Tyler Romeo wrote:
OK, so really the process that we need here is:
- Get more people on the security team via NDA and whatnot (sign me up, by the way, obviously)
Any process that involves volunteers signing non-public, indefinite vows of secrecy and silence is antithetical to Wikimedia's values and mission. This isn't a cult. Our bedrock principles are open access and transparency.
MZMcBride
On Jun 26, 2014 9:44 AM, "MZMcBride" z@mzmcbride.com wrote:
Any process that involves volunteers signing non-public, indefinite vows of secrecy and silence is antithetical to Wikimedia's values and mission. This isn't a cult. Our bedrock principles are open access and transparency.
To clarify, I think Max wants more docs listed at https://meta.wikimedia.org/wiki/Non-disclosure_agreements
Maybe Max is unaware about https://wikitech.wikimedia.org/wiki/Volunteer_NDA
-Jeremy
On 26 June 2014 15:02, Jeremy Baron jeremy@tuxmachine.com wrote:
<snip>
To clarify, I think Max wants more docs listed at https://meta.wikimedia.org/wiki/Non-disclosure_agreements
Maybe Max is unaware about https://wikitech.wikimedia.org/wiki/Volunteer_NDA
I wouldn't be surprised. I've never seen this page before and, according to the history, it was only created a week ago.
I’ll be frank. I care a lot more about the security of MediaWiki as a software product, as well as the security of its customers (both WMF and third-party) than I do about some made-up notion of “open access” to security bugs.
I think it makes complete sense to have people with access to security bugs sign an agreement saying they will not release said bugs to the public until they have been fixed, released, and announced properly.
-- Tyler Romeo 0xC86B42DF
From: MZMcBride z@mzmcbride.com Reply: Wikimedia developers wikitech-l@lists.wikimedia.org> Date: June 26, 2014 at 9:44:25 To: Wikimedia developers wikitech-l@lists.wikimedia.org> Subject: Re: [Wikitech-l] MediaWiki Bug Bounty Program
Any process that involves volunteers signing non-public, indefinite vows of secrecy and silence are antithetical to Wikimedia's values and mission. This isn't a cult. Our bedrock principles are open access and transparency.
As a third-party user: I completely concur. NDAs for security bug access are pretty much standard, aren't they?
- d.
On 26 June 2014 15:08, Tyler Romeo tylerromeo@gmail.com wrote:
<snip>
On 06/26/2014 10:15 AM, David Gerard wrote:
NDAs for security bug access are pretty much standard, aren't they?
I don't know about "standard" but they are certainly common in cases where said software has a large installed base and early disclosure of a vulnerability would place them at risk without being able to protect themselves. It's not about avoidance of being "transparent" but to give a bit of protection to third parties - note how fixed security issues are moved from security back to their "real" components when being closed.
-- Marc
Marc A. Pelletier wrote:
On 06/26/2014 10:15 AM, David Gerard wrote:
NDAs for security bug access are pretty much standard, aren't they?
I don't know about "standard" but they are certainly common in cases where said software has a large installed base and early disclosure of a vulnerability would place them at risk without being able to protect themselves. It's not about avoidance of being "transparent" but to give a bit of protection to third parties - note how fixed security issues are moved from security back to their "real" components when being closed.
If you know of any non-disclosure agreements for large, open-source projects, it'd be interesting and helpful to collect a list of links to them for reference. If they're standard/common, it shouldn't be too difficult to find a lot of examples to look over and learn from.
A very brief search turned up https://wiki.mozilla.org/Legal/Confidential_Information, which outlines some of the issues that Wikimedia similarly faces with respect to non-disclosure agreements and volunteers.
Jeremy Baron wrote:
Maybe Max is unaware about https://wikitech.wikimedia.org/wiki/Volunteer_NDA
Err, thanks for the link. As pointed out, that page is less than a week old and had not been advertised or linked from anywhere, as far as I can tell. I don't think there's a reasonable expectation that anybody would have known about it. I'm also not sure any volunteer is following that page... i.e., I'm not sure it's active or authoritative (yet?).
MZMcBride
P.S. Who's Max?
On Thu, Jun 26, 2014 at 12:57 PM, MZMcBride z@mzmcbride.com wrote:
Jeremy Baron wrote:
Maybe Max is unaware about https://wikitech.wikimedia.org/wiki/Volunteer_NDA
Err, thanks for the link. As pointed out, that page is less than a week old and had not been advertised or linked from anywhere, as far as I can tell. I don't think there's a reasonable expectation that anybody would have known about it. I'm also not sure any volunteer is following that page... i.e., I'm not sure it's active or authoritative (yet?).
Yeah, it's not authoritative yet - we're still figuring out the kinks.
Luis
On Thu, Jun 26, 2014 at 12:33 AM, Brian Wolff bawolff@gmail.com wrote:
What I mean by that is that being a WMF employee/contractor wouldn't get you any special treatment - trusted people would get special access where needed because they're trusted and have demonstrated their competence. A WMF staffer would have to go through the same procedure as anyone else would have to to get any sort of special access. Much of the people who have special access would still be WMF employees, since WMF employs most senior developers, but it wouldn't be "you're a wmf employee = here's access to everything even if you don't need it", "you're not a WMF employee = have to jump through a million hoops plus sign something in blood plus bribe someone to get access to things that would be extremely helpful to your work".
Note this is my own personal view as a WMF software developer, and not any sort of official statement.
The situation already isn't "you're a wmf employee = here's access to everything even if you don't need it". For example, for a good while after I was hired I didn't have access to security bugs. Eventually there were enough "hey, look at this" "sure, CC me so I can see it?" conversations that we realized I should be given the access.
There's a security IRC channel (existence of which is publicly acknowledged), which again you need a reason better than "WMF pays me" to get access to.
Or consider root access to anything outside of a few labs projects: I don't have it and if I were to ask for it there'd be a discussion that I'm sure would rightly conclude that I don't need it. Probably an extremely short discussion.
And don't forget that at least some of the hoops to jump through (e.g. demonstration of competence, NDA, identification, privacy policy agreement) are also part of being hired in the first place. It's like how international travel is easier for someone who already has their passport than for someone who needs to get one.
+2, for example, works this way: getting hired as a developer usually gets you +2 on the repos that you've been hired to work on, yes. But if you can't convince the hiring managers that you are competent, can contribute high-quality patches, and have good judgment in knowing when to merge something (i.e. the sort of things that are listed at [[mw:Gerrit/+2#Granting]] for community members), why are they going to hire you?
A general and boring explanation of how access restrictions are currently handled/configured in Bugzilla. No opinions involved.
On Wed, 2014-06-25 at 21:18 -0700, Chris Steipp wrote:
There are a few cases where there may be legitimate private data in a security bug ("look, sql injection, and here are some rows from the user table!", "Hey, this was supposed to be suppressed, and I can see it", "This user circumvented the block on this IP"). But there might be ways to flag or categorize a report as also including private data? Someone with more bugzilla experience would need to comment.
I'm not aware of any "standardized" way to do this. Current practice is described in item 2 below.
In general, Bugzilla offers two things:
1) Access restriction to all tickets in a certain product by default (like all tickets under "Security"). Only Bugzilla admins, members of the security group, the bug reporter, and people explicitly CC'ed on such a ticket can access such a ticket in such a product.
2) Separate from that, marking both attachments and specific comments in a ticket as "private". It's configured so that these can be set and seen by Bugzilla admins and members of the security group. There is a practice (tradition?) of setting the 'private' flag if somebody finds or notifies us about private data being exposed (IPs, passwords, SSIDs), insults / personal attacks, or spam. We don't have an explicit policy defined for setting that flag.
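For reference, both mechanisms are also scriptable through Bugzilla's WebService API: Bug.create accepts a `groups` list to restrict a new ticket, and Bug.add_comment accepts `is_private` for comments. A sketch that only builds the request payloads; the product, component, and group names here are hypothetical stand-ins, not necessarily our actual configuration:

```python
def restricted_bug_payload(summary, description):
    """Payload for Bug.create: file a ticket restricted to a group.

    The 'groups' field limits visibility to members of the named
    Bugzilla group (plus the reporter and explicit CCs), matching
    item 1 above.
    """
    return {
        "product": "Security",       # hypothetical product name
        "component": "General",      # hypothetical component
        "version": "unspecified",
        "summary": summary,
        "description": description,
        "groups": ["security"],      # hypothetical group name
    }

def private_comment_payload(text):
    """Payload for Bug.add_comment: a comment that only Bugzilla
    admins and security-group members can read (item 2 above)."""
    return {"comment": text, "is_private": True}

bug = restricted_bug_payload("SQL injection in search", "Steps to reproduce...")
print(bug["groups"], private_comment_payload("Leaked rows attached")["is_private"])
```

So a report that arrives with private data in it could be filed restricted, with the sensitive bits pushed into a private comment or attachment rather than the public-once-fixed description.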
A while ago I was told that people who by default have access to Security tickets in Bugzilla need to have an NDA [1] in place.
andre
[1] https://en.wikipedia.org/wiki/Non-disclosure_agreement
I feel like this would result in a ton of reports that say "YOU CAN DEFACE THE MAIN PAGE!!!" which is editable, if not protected, because it's a wiki.
On Thu, 2014-06-26 at 16:17 +0200, Bartosz Dziewoński wrote:
I feel like this would result in a ton of reports that say "YOU CAN DEFACE THE MAIN PAGE!!!" which is editable, if not protected, because it's a wiki.
This. I have seen several 'bug reports' in Mozilla Bugzilla by 'security researchers' about source code of projects being exposed on Mozilla's servers. Clearly a security breach. What does "FOSS" stand for?
So it boils down to "how to keep clueless people out", to be rough.
andre
On Thu, Jun 26, 2014 at 8:03 AM, Andre Klapper aklapper@wikimedia.org wrote:
On Thu, 2014-06-26 at 16:17 +0200, Bartosz Dziewoński wrote:
I feel like this would result in a ton of reports that say "YOU CAN DEFACE THE MAIN PAGE!!!" which is editable, if not protected, because it's a wiki.
This. I have seen several 'bug reports' in Mozilla Bugzilla by 'security researchers' about source code of projects being exposed on Mozilla's servers. Clearly a security breach. What does "FOSS" stand for?
So it boils down to "how to keep clueless people out", to be rough.
Heck, we get this at security@ pretty often. Just had one a few weeks ago saying "If I append a ?title=foo param it changes the page title!"
-Chad
Le 26/06/2014 17:03, Andre Klapper a écrit :
I have seen several 'bug reports' in Mozilla Bugzilla by 'security researchers' about source code of projects being exposed on Mozilla's servers. Clearly a security breach. What does "FOSS" stand for?
So it boils down to "how to keep clueless people out", to be rough.
Eons ago, a couple of security experts paid us a visit in the then very young #mediawiki.
They were willing to help us by auditing the code for security issues, and they found a pretty nasty bug that could be a vector of attacks on other websites.
It was possible to inject arbitrary code such as JavaScript (enclosed in <script>) into an uploaded image, then embed that image on another site and point a victim at it.
Damn. Wikipedia, only a few years old, was a serious threat to the internet. We were shocked and took the matter very "seriously".
Then it was either Brion or Tim that showed up and wrote something like:
Your attack vector is too complicated. Just paste the JavaScript to any page by pressing [edit].
The two security experts promptly disappeared.
Le 26/06/2014 01:28, Tyler Romeo a écrit : <snip>
Therefore, I thought it may be beneficial to take that over to Wikipedia and start our own bug bounty program. Most likely, it would be strictly a hall of fame like structure where people would be recognized for submitting bug reports (maybe we could even use the OpenBadges extension *wink* *wink*). It would help by increasing the number of bugs (both security and non-security) that are found and reported to us.
Hello,
I would like us to have our own instance of Google Code-in to list tasks that could be fulfilled by volunteers. Kind of like the +easy bugs we have in Bugzilla, but with a nicer interface that only shows those tasks.
I would totally use such an interface to request documentation updates, code reformatting, simple command line utilities and so on. Maybe we can figure out a way to have them filed in Phabricator.
For the bounty system, a task could have some kind of score attached that would provide folks a bounty in an OpenBadges system.
I could totally imagine granting points for CSS edits or proposed tests, or having the ability to grant badges/rewards to folks proposing patches. I often mail folks when they do their first Jenkins job addition or create a new test in MediaWiki core.
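A sketch of what that score-to-badge mapping might look like; the thresholds and badge names here are entirely made up and nothing in the OpenBadges extension defines them:

```python
# Hypothetical score thresholds for awarding badges, checked from
# highest to lowest so earned badges come out most prestigious first.
BADGE_THRESHOLDS = [
    (100, "Security Sleuth"),
    (25, "Bug Hunter"),
    (5, "First Responder"),
]

def badges_for_score(score):
    """Return every badge a contributor's accumulated points have earned."""
    return [name for threshold, name in BADGE_THRESHOLDS if score >= threshold]

print(badges_for_score(30))
```

Points for a merged patch, a new test, or a triaged report would just bump the score, and the badge list follows from it.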
Doesn't WMF have a plan to provide badges in MediaWiki itself? Kind of like WikiLove, which lets you distribute barnstars on talk pages, but a bit more robust?
On Fri, 2014-06-27 at 15:06 +0200, Antoine Musso wrote:
I would like us to have our own instance of Google Code-in to list tasks that could be fulfilled by volunteers. Kind of the +easy bugs we have in Bugzilla but with a nicer interface that only has those tasks.
https://openhatch.org/search/?q=&project=MediaWiki sounds close.
andre
On Fri, Jun 27, 2014 at 9:06 AM, Antoine Musso hashar+wmf@free.fr wrote:
Doesn't WMF have a plan to provide badges in MediaWiki itself? Kind of like WikiLove, which lets you distribute barnstars on talk pages, but a bit more robust?
Well we made an OpenBadges extension for Facebook OpenAcademy, but it's in an infant state, and needs more attention (including from myself) if we actually plan on using it more extensively.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science