Charlie Reams wrote:
Timwi wrote:
Simetrical wrote:
Any web spider *automatically* sends *millions* of arbitrary GET requests, and has to for the Internet as we know it to function. There is no way that sending arbitrary GET requests can hurt *anything*.
That is simply not true. Web spiders only follow links.
The kinds of webmasters we are talking about here will assume that a given GET URL can never be fired if no page ever links to it.
Timwi
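
[For anyone unfamiliar with the attack being alluded to, here is a minimal, purely hypothetical sketch; the endpoint and names are made up, not taken from any real site. It assumes a badly-designed site that performs a destructive action on a plain GET. Once such an endpoint exists, any client that can be induced to issue an arbitrary GET (a spider, an upload-by-URL fetcher, or an <img> tag on a malicious page) triggers the side effect, whether or not a link to that URL exists anywhere.]

    # Hypothetical badly-designed site: destructive side effect on GET.
    # Standard library only.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    records = {"5": "some page"}  # stand-in for a database

    class BadHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The flaw at issue: a plain GET mutates server state.
            query = parse_qs(urlparse(self.path).query)
            for rec_id in query.get("delete", []):
                records.pop(rec_id, None)
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"done\n")

    if __name__ == "__main__":
        # Any client that requests /?delete=5 wipes the record,
        # no link to the URL required anywhere:
        #   import urllib.request
        #   urllib.request.urlopen("http://localhost:8000/?delete=5")
        HTTPServer(("localhost", 8000), BadHandler).serve_forever()
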
Still, it would be a shame to lose such an obviously useful feature just to protect badly-programmed websites from a weak attack that can already be launched in myriad other ways. From my own experience with the Commons, uploading even a medium-sized file can be really tedious, and this feature would solve that completely.
Well yes, I completely agree with that. I am merely pointing out a caveat; I am not against the feature.