Tomasz Wegrzanowski wrote:

>Toby Bartels wrote:

>>As for exponential computing effort, most TeX installations
>>normally stop working after a small memory limit has been reached.
>>Again, we can check with the above sites if we want to check
>>that this is really enough.

>That's fine for a cracker. To DoS Wikipedia, you'd need to send
>only N requests per minute, for some rather small N, with each request
>smaller than 1 kB. "Computational complexity DoS" isn't anything new.

Do you mean N individual requests to TeX something?
Sure, that's a DoS attack, but how is texvc safe from that?
How is the regular wiki parser safe from that, for that matter?
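
Neither message spells out a countermeasure, so, just to be concrete, here is a
minimal sketch of the standard defence against this kind of complexity DoS: put
a hard time limit on each rendering job, so that a pathological formula costs
the server a few seconds at most, and rate-limit requests per client on top of
that. The command line below is hypothetical, not texvc's real interface.

    import subprocess

    # Hypothetical renderer invocation; the real texvc arguments differ.
    RENDER_CMD = ["texvc", "/tmp/texvc-work", "/tmp/texvc-out"]
    TIMEOUT_SECONDS = 5  # assumption: a few seconds is plenty for any sane formula

    def render(formula):
        """Render one formula, killing the job if it runs too long."""
        try:
            result = subprocess.run(
                RENDER_CMD + [formula],
                capture_output=True,
                timeout=TIMEOUT_SECONDS,  # runaway (exponential) jobs get killed here
            )
        except subprocess.TimeoutExpired:
            raise ValueError("rejected: rendering took too long")
        if result.returncode != 0:
            raise ValueError("rejected: renderer reported an error")
        return result.stdout

With that in place, the attacker is back to needing a flood of requests to
cause trouble, which is the ordinary problem every CGI script already has.
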
>>The moral: when Donald E. Knuth writes a program instead of Microsoft,
>>then it is not riddled with security flaws. ^_^

>You're wrong, because there are so many modules available, and each of them
>may introduce new dangerous functions in some new version.
>You'd need to check every single function to see whether it's safe or not,
>and ban all functions not checked, thus getting a nice whitelist.
>Otherwise, you're just asking for trouble.

None of the many modules for TeX is capable of introducing new I/O commands.
So you check each relevant module for its use of these commands.
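
Both of those checks are just string-level scans over TeX source, so here is a
hedged sketch of what they could look like. The whitelist contents and the list
of I/O primitives are illustrative only, not an audited set, and this is not
the code texvc actually uses.

    import re

    # Illustrative whitelist; a real one would be much longer and would have
    # to be built by vetting commands one at a time, as Tomasz suggests.
    ALLOWED_COMMANDS = {
        "frac", "sqrt", "sum", "int", "alpha", "beta", "left", "right",
    }

    # TeX primitives that touch files; a module (style file) that uses any of
    # them is the kind you would want to look at more closely.
    IO_PRIMITIVES = {
        "input", "openin", "openout", "read", "write", "closein", "closeout",
    }

    CONTROL_SEQUENCE = re.compile(r"\\([a-zA-Z]+)")

    def check_formula(formula):
        """Reject a user-submitted formula containing any unlisted command."""
        for name in CONTROL_SEQUENCE.findall(formula):
            if name not in ALLOWED_COMMANDS:
                raise ValueError("\\%s is not on the whitelist" % name)

    def io_commands_used(module_source):
        """Report which file-I/O primitives a module's source mentions."""
        return set(name for name in CONTROL_SEQUENCE.findall(module_source)
                   if name in IO_PRIMITIVES)
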
>TeX is extremely complex and it wasn't written with security in mind.
>Putting it on CGI isn't a very wise idea.
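
For what it's worth, if TeX does end up behind a CGI script, the usual
hardening is to forbid shell escapes and restrict where it may read and write.
A hedged sketch, assuming a web2c-based installation (the -no-shell-escape flag
and the kpathsea openin_any/openout_any settings; check that your installation
honours them):

    import os
    import subprocess
    import tempfile

    def run_latex(tex_source):
        """Typeset untrusted TeX in a throwaway directory with file I/O restricted."""
        env = dict(os.environ,
                   openin_any="p",   # kpathsea "paranoid": no absolute or parent paths
                   openout_any="p")  # never write outside the working directory
        with tempfile.TemporaryDirectory() as workdir:
            with open(os.path.join(workdir, "job.tex"), "w") as f:
                f.write(tex_source)
            subprocess.run(
                ["latex", "-no-shell-escape",        # forbid \write18
                 "-interaction=batchmode", "job.tex"],
                cwd=workdir, env=env, timeout=10, check=True,
            )
            with open(os.path.join(workdir, "job.dvi"), "rb") as f:
                return f.read()
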
Then DoS PlanetMath or arXiv and see how well you do.
I mean, state specifically, if it is possible, what you would do.
You never explained this the last time we had this conversation,
to me or anybody else.
-- Toby