On Sat, 10 Jul 2004 13:10:38 -0700, Brion Vibber brion@pobox.com wrote:
We have over 300 wikis, each with a virtual subdomain. Each "major" project which supports all languages will add about 150 wikis: right now that's Wikipedia and Wiktionary.
Oh right, I forgot that every language has its own subdomain.
Yeah, I guess IP-based virtualhosts aren't really an option then, are they? :)
Can you give some pointers on setting this up with an Apache server, and providing a sane failure mode for clients that don't support it?
I've never actually used TLS myself, but this seems as good an excuse as any to look into it. I'll get back to you on this.
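In the meantime, here's roughly what I'd expect the Apache side to look like. This is an untested sketch, not something I've deployed: it assumes mod_ssl is loaded and that you have a wildcard certificate covering all the language subdomains (the hostnames and file paths below are made up). Since clients can't tell the server which hostname they want before the handshake, a single wildcard cert on one IP seems like the only way to cover hundreds of subdomains without an IP per wiki:

```apache
# Sketch only -- assumes mod_ssl and a wildcard cert for
# *.wikipedia.org; cert/key paths are illustrative.
Listen 443

<VirtualHost *:443>
    # One vhost answers for every language subdomain; the
    # wildcard cert matches whatever Host the client sends.
    ServerName   en.wikipedia.org
    ServerAlias  *.wikipedia.org
    SSLEngine on
    SSLCertificateFile    /etc/apache/ssl/wildcard.wikipedia.org.crt
    SSLCertificateKeyFile /etc/apache/ssl/wildcard.wikipedia.org.key
    DocumentRoot /var/www/wiki
</VirtualHost>
```

For clients that don't support SSL at all, the sane failure mode is presumably just to keep serving plain HTTP on port 80 alongside this, rather than redirecting everything.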
Can't squid be reconfigured to handle the SSL portion itself? In other words, can it simply treat all requests to the backend as if they were HTTP, and simply serve out cached/fresh copies of pages via SSL?
I don't know, can it?
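From a quick look at the docs it appears squid 2.5 grew an https_port directive that does exactly this: terminate SSL itself and speak plain HTTP to the backend. I haven't tried it, so take this as a guess at the shape of the config, with all hostnames and paths invented:

```
# Untested sketch: squid as an SSL-terminating accelerator
# (squid 2.5-era syntax; cert paths and backend name illustrative).
https_port 443 cert=/etc/squid/wildcard.crt key=/etc/squid/wildcard.key
httpd_accel_host backend.internal
httpd_accel_port 80
httpd_accel_uses_host_header on
```

Whether it does so reliably is another question entirely, given what follows.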
I'm not sure, and honestly I look for any excuse I can find NOT to play with squid. IMHO that software is simply too flaky for production use, and I'm frankly astonished you have it working as well as you (apparently) do. When it works, it's great... but when it doesn't...
I once watched squid flat-out lie to me about checking for a more recent copy of a requested file. I was sitting there watching the webserver log and the squid log simultaneously, and squid claimed it had contacted the webserver when I could see for myself that it was completely full of shit. I uninstalled it immediately afterwards.
I play with squid every couple years, hoping that it will surprise me with how stable and reliable it's become... but I keep finding myself disappointed instead.
-Bill Clark