Domas Mituzas wrote:
Now with the reverse proxy we are not disabling XSS entirely; we are just allowing remotely controlled access to pages on one single server: the toolserver (plus we enable XHR, which is very useful).
That remotely controlled access hands the session data of Wikipedia users to any toolserver account.
That could already happen. Currently there are handy scripts on the Wikimedia projects, used even by sysops, that are vulnerable to XSS. The peer review is quite limited, because people need to understand JavaScript, take the time to fully understand what that JavaScript is doing, and then check it for vulnerabilities. XSS bugs are subtle. People happily write "SELECT page_content WHERE page_name = '" + getParam('title') + "'".
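To make the criticized pattern concrete, here is a minimal sketch of that kind of unsanitized concatenation. Everything in it is invented for illustration (the getParam helper, the query shape); it is not taken from any actual tool:

```javascript
// Hypothetical helper: read a parameter straight from the page URL's
// query string. No escaping, no validation -- exactly the problem.
function getParam(name, search) {
  var m = new RegExp('[?&]' + name + '=([^&]*)').exec(search);
  return m ? decodeURIComponent(m[1].replace(/\+/g, ' ')) : '';
}

// The criticized pattern: a user-supplied title is spliced into a
// query string (the same mistake applies to innerHTML for XSS).
function buildQueryUnsafe(search) {
  return "SELECT page_content WHERE page_name = '" +
         getParam('title', search) + "'";
}

// A crafted ?title=... value breaks out of the string literal:
var evil = '?title=' + encodeURIComponent("x'; DROP TABLE page; --");
// buildQueryUnsafe(evil) yields:
// SELECT page_content WHERE page_name = 'x'; DROP TABLE page; --'
```

Nothing malicious is needed in the tool itself; the reviewer has to notice that the parameter is never escaped, which is exactly why the peer review is so thin.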
It's not that hard to add a JS vulnerability. You only need to convince a local sysop that your tool is really nice for doing X. He will blindly copy the line you provide into his monobook.js (the global one, if you're lucky).
If you set the toolserver to be non-blocked, the log is exactly the same: he who adds <script src="/tools/... What do you want in order to have it working? Have it point to the svn repository (so anything running there is versioned)?
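For context, the pasted one-liner amounts to injecting a remote <script> tag into every page view. The helper and the path below are made up (the real path in this mail is truncated), but the expansion looks roughly like this:

```javascript
// Hypothetical illustration of what the copied monobook.js line
// expands to. The /tools/ path is an assumption, not the real one.
function scriptInclude(src) {
  return '<script type="text/javascript" src="' + src + '"></script>';
}

// A sysop pasting document.write(scriptInclude(...)) into monobook.js
// runs that remote code in the session of every user who loads the page.
var line = scriptInclude('/tools/example-tool.js');
```

The point stands either way: whether the src is blocked or not, the audit trail is the same line in the same page history; serving it from an svn-backed path would at least make the remote code versioned.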
Having a JS audit would be nice too, but that's another matter (and some users may get a bit angry). If we showed you X currently running vulnerabilities, would you accept it? (And yes, I do think an attacker could put "huge genitalia images on front pages".)