Hi devs,
I've been investigating MediaWiki as part of my Bachelor's thesis, "Application of security test tools in open source", at the Free University of Berlin (FU Berlin) [1]. Essentially, I am looking for the security measures that have been taken to prevent security leaks/vulnerabilities, especially measures involving security test tools.
MediaWiki is one of the most popular applications on the web, so its attack surface may be quite large.
I have searched the wiki itself, the mailing list, and the repository, and I'd like to remark on a few things I noticed:
You advise, as most projects do, turning "register_globals" off to reduce the attack surface [3]. A security response team [2] handles security vulnerabilities and patches them immediately, and most releases include security fixes.
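(For the record, that is the following directive; it can only be set in php.ini or per directory, not at runtime with ini_set(), since it is a per-directory setting:)

    ; php.ini
    register_globals = Off

    # or per directory, in .htaccess under Apache's mod_php
    php_flag register_globals off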
I am sure that you do everything you can to ensure security.
Despite the recommendations and the security team: does this team, or any other group or person, take any measures to assure security with testing tools, a special test plan, or functional requirements?
Thanks in advance,
Michael
[1] https://www.inf.fu-berlin.de/w/SE/ThesisFOSSSecurityTools
[2] http://www.mediawiki.org/wiki/Security
[3] http://meta.wikimedia.org/wiki/Documentation:Security#General_PHP_recommenda...
On Wed, Apr 30, 2008 at 7:48 AM, Michael Osipov ossipov@inf.fu-berlin.de wrote:
Despite the recommendations and the security team: does this team, or any other group or person, take any measures to assure security with testing tools, a special test plan, or functional requirements?
Well, first of all, I think our security team consists of Brion, although maybe some other people receive the security@wikimedia.org mailings as well. Since he's also the lead developer, it's not so much a question of recommendations as mandates, which he usually implements personally (either fixing it himself, or reverting whatever broke it).
Nick Jenkins has done some fuzz-testing on MediaWiki in the past. As far as I'm aware, that's about the end of specific security testing that's done on MediaWiki, at least by the developers. The rest is covered by general code review: checking new code to make sure everything is escaped properly, and looking over old code as it's being maintained.
Simetrical wrote:
Well, first of all, I think our security team consists of Brion, although maybe some other people receive the security@wikimedia.org mailings as well. Since he's also the lead developer, it's not so much a question of recommendations as mandates, which he usually implements personally (either fixing it himself, or reverting whatever broke it).
Nick Jenkins has done some fuzz-testing on MediaWiki in the past. As far as I'm aware, that's about the end of specific security testing that's done on MediaWiki, at least by the developers. The rest is covered by general code review: checking new code to make sure everything is escaped properly, and looking over old code as it's being maintained.
Indeed, there's not a lot of organized testing, though the fuzz testing tools get pulled out from time to time to look for HTML injection bugs and other such surprises.
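To give a flavour of what such a run looks like, here's a toy sketch (this is just an illustration, not Nick's actual fuzz tester, and the Parser/Title/ParserOptions usage is from memory, so treat the API details as approximate): glue together random wikitext fragments, run them through the parser, and flag any output that still contains live script.

    <?php
    // Toy fuzzer sketch: feed semi-random wikitext to the parser and
    // complain if anything resembling executable script survives.
    $fragments = array( '<script>alert(1)</script>', '"><img src=x>',
        'javascript:alert(1)', '[[', ']]', '{{', '}}', '<', '>', '&', "'" );

    $title   = Title::newFromText( 'Fuzz test' );
    $options = new ParserOptions;
    $parser  = new Parser;

    for ( $i = 0; $i < 1000; $i++ ) {
        // Build one input from a random handful of fragments.
        $input = '';
        $n = mt_rand( 1, 20 );
        for ( $j = 0; $j < $n; $j++ ) {
            $input .= $fragments[ mt_rand( 0, count( $fragments ) - 1 ) ];
        }

        $html = $parser->parse( $input, $title, $options )->getText();

        // Anything that still looks like a live <script> is a red flag.
        if ( stripos( $html, '<script' ) !== false ) {
            echo "Possible HTML injection for input: $input\n";
        }
    }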
Generally, we try to maintain safe programming practices to ensure the borders are patrolled, as it were (a rough sketch of these in code follows the list):
* Don't construct SQL by hand; use query-building abstractions which ensure proper encoding
* Don't construct HTML output by hand; use wiki parser where suitable or XML-building abstractions which ensure proper encoding
* Don't use $_GET, $_POST, $_REQUEST etc values straight; use abstractions which provide some basic data type validation
* Don't use explicit include() or require()s with configured paths; use class autoloader (when an explicit include is needed, always precede it with a constant check to avoid remote include vulnerabilities)
etc
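To make those concrete, here is what the four patterns look like side by side. The calls (wfGetDB(), Xml::element(), WebRequest, $wgAutoloadClasses) are written from memory, so take the details as illustrative rather than canonical:

    <?php
    // Entry-point guard: WebStart.php defines MEDIAWIKI, so a file
    // that is requested directly bails out before anything runs.
    if ( !defined( 'MEDIAWIKI' ) ) {
        exit( 1 );
    }

    // 1. SQL through the query builder: conditions are escaped for us.
    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select(
        'page',                                // table
        array( 'page_id', 'page_title' ),      // fields
        array( 'page_namespace' => NS_MAIN ),  // WHERE, values auto-quoted
        __METHOD__
    );

    // 2. HTML through the XML helpers: attributes and text get encoded.
    $link = Xml::element( 'a',
        array( 'href' => 'http://example.com/', 'title' => 'Example' ),
        'a safely escaped <label>'
    );

    // 3. Request data through WebRequest instead of raw $_GET/$_POST.
    global $wgRequest;
    $limit  = $wgRequest->getInt( 'limit', 50 );  // cast to int, with default
    $target = $wgRequest->getText( 'target' );    // normalized text

    // 4. Classes through the autoloader rather than explicit include()s.
    $wgAutoloadClasses['MyExtensionHooks'] =
        dirname( __FILE__ ) . '/MyExtensionHooks.php';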
It's not always perfect, and there's going to be lazy code here and there, but working within a safe framework at input/output points is always a big help in combating many of the traditional web app vulnerabilities.
When it comes to ensuring that private data in the wiki stays private, there's perhaps less of an automatic guarantee, as you have to decide what is or isn't private and ensure that the visibility is properly restricted.
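For the simple case of an entirely private wiki there are at least the standard permission knobs in LocalSettings.php, something like this (quoting the settings from memory):

    <?php
    # LocalSettings.php: deny reading to anonymous visitors, allow it
    # for logged-in users, and whitelist the login page so that people
    # can still authenticate.
    $wgGroupPermissions['*']['read']    = false;
    $wgGroupPermissions['user']['read'] = true;
    $wgWhitelistRead = array( 'Special:Userlogin' );

Anything finer-grained than that is exactly where you're back to deciding case by case what is private.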
-- brion vibber (brion @ wikimedia.org)
On Wed, Apr 30, 2008 at 11:10:13AM -0700, Brion Vibber wrote:
[ snip ]
Generally, we try to maintain safe programming practices to ensure the borders are patrolled, as it were:
[ snip ]
Brion -- I just wanted to let you know that I took your name in vain:
http://blogs.techrepublic.com.com/security/
Brion Vibber wrote:
Simetrical wrote:
Well, first of all, I think our security team consists of Brion, although maybe some other people receive the security@wikimedia.org mailings as well. Since he's also the lead developer, it's not so much a question of recommendations as mandates, which he usually implements personally (either fixing it himself, or reverting whatever broke it).
Nick Jenkins has done some fuzz-testing on MediaWiki in the past. As far as I'm aware, that's about the end of specific security testing that's done on MediaWiki, at least by the developers. The rest is covered by general code review: checking new code to make sure everything is escaped properly, and looking over old code as it's being maintained.
Hi Brion,
Thanks for your input!
Indeed, there's not a lot of organized testing, though the fuzz testing tools get pulled out from time to time to look for HTML injection bugs and other such surprises.
Good to hear that you found some suitable tools to fuzz with. Could you name those tools?
etc
It's not always perfect, and there's going to be lazy code here and there, but working within a safe framework at input/output points is always a big help in combating many of the traditional web app vulnerabilities.
When it comes to ensuring that private data in the wiki stays private, there's perhaps less of an automatic guarantee, as you have to decide what is or isn't private and ensure that the visibility is properly restricted.
Is there any multi-tier patch review? The folks at Apache Tomcat do a three-person review of patches before they get committed.
Thanks
Michael Osipov wrote:
Is there any multi-tier patch review? The folks at Apache Tomcat do a three-person review of patches before they get committed.
Well, patches submitted from the outside (i.e. by folks who don't have commit access) are reviewed by the person who commits them and by Brion, who reviews everything (and no, they're never the same person, because Brion is generally too busy to review and apply submitted patches). Also, some people are experts on a certain part of the code and will review all changes to that part. For instance, Tim Starling reviews all changes to the parser, and I review all changes to the API.
Roan Kattouw (Catrope)
On Thu, May 1, 2008 at 5:56 AM, Michael Osipov ossipov@inf.fu-berlin.de wrote:
Is there any multi-tier patch review? The folks at Apache Tomcat do a three-person review of patches before they get committed.
We have no formal process at the moment, except that Brion reviews everything after it's committed but before it's synced to the servers. People with commit access basically commit whatever they want, and if someone spots that it's broken or otherwise objectionable, they either revert it immediately or post a note to some development forum (this list, #mediawiki on FreeNode, etc.) asking for people's opinions on whether to revert it. In the event of a dispute, Brion resolves it as lead developer. People other than Brion can review whatever they feel like. I at least glance at all commits to core code or extensions used by Wikimedia, and sometimes look them over more closely. It's likely that most interesting commits get at least two other people looking them over.
Bad changes do occasionally go live on Wikipedia (I broke it within hours of getting commit access, woo), but rarely for long. They tend to be spotted quickly by editors, and since changes go live every couple of days on average, it's easy to quickly figure out what must have caused the breakage and fix it.
Michael Osipov wrote: <snip>
Good to hear that you found some suitable tools to fuzz with. Could you name those tools?
The tool was written by Nick Jenkins and is named fuzz-tester.php. It is available as fuzz-tester in the Subversion repository: http://tinyurl.com/5fduhx
You can get some history from his website http://nickj.org/MediaWiki or dig through the archives of this mailing list <:o)
cheers,
Simetrical wrote:
On Wed, Apr 30, 2008 at 7:48 AM, Michael Osipov ossipov@inf.fu-berlin.de wrote:
Despite the recommendations and the security team: does this team, or any other group or person, take any measures to assure security with testing tools, a special test plan, or functional requirements?
Hi,
Nick Jenkins has done some fuzz-testing on MediaWiki in the past. As far as I'm aware, that's about the end of specific security testing that's done on MediaWiki, at least by the developers. The rest is covered by general code review: checking new code to make sure everything is escaped properly, and looking over old code as it's being maintained.
Do you think it's worth trying to contact Nick? I know that he fuzzes JAMWiki too. Seems like he's into it.
Mike
You can find some interesting security measures in WebStart.php: it checks whether register_globals could have been used to overwrite variables, and it uses a define for the entry point so that undefined variables can't be abused even with register_globals on. Some other security checks are forbidding action=raw for IE, detected via the user agent, when not going through index.php (IE content-sniffs and may treat the output as HTML when it isn't; this has since been changed to apply to any browser), and manually verifying user variables instead of blindly trusting $_SERVER when getting the user's IP... As MediaWiki grows, it learns new tricks, sometimes when attacked ;)
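The idea behind the IP handling is roughly the following simplified sketch; the real code is more involved, with the configured Squid/proxy lists, and the function and variable names here are made up for illustration:

    <?php
    // Simplified sketch of trusted-proxy IP resolution.
    function getClientIP( array $trustedProxies ) {
        $ip = $_SERVER['REMOTE_ADDR'];

        // Only believe X-Forwarded-For when the direct peer is one of
        // our own proxies; the header is client-supplied and trivially
        // forged otherwise.
        if ( in_array( $ip, $trustedProxies ) &&
            isset( $_SERVER['HTTP_X_FORWARDED_FOR'] )
        ) {
            $hops = array_map( 'trim',
                explode( ',', $_SERVER['HTTP_X_FORWARDED_FOR'] ) );
            $ip = array_pop( $hops );  // last hop, added by our own proxy
        }

        // Reject anything that is not a well-formed IPv4 address.
        if ( !preg_match( '/^\d{1,3}(\.\d{1,3}){3}$/', $ip ) ) {
            $ip = $_SERVER['REMOTE_ADDR'];
        }
        return $ip;
    }

The point is simply that a raw $_SERVER value is never handed around as "the" client address.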
And then, obviously, there are our developers, ready to detect bugs. Brion reviews all MediaWiki changes, and no extension goes live on the Wikimedia servers unless it has been verified by a sysadmin. Which, OTOH, produces a bottleneck for some requests, or delays for big code changes.