Hey,
I have a wiki that is accessible both via HTTP and HTTPS which has the file cache enabled. When someone loads a page over HTTP and it gets cached, it will have HTTP resource URLs in it. When someone then loads it over HTTPS and gets this cached page, they'll end up with mixed content, which causes the resources to not be loaded on recent FF versions. Is there a way to split the cache based on protocol used?
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
The standard practice is not to split the cache but to use protocol-relative urls.
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
On 2013-10-01 12:45 AM, Jeroen De Dauw wrote:
Hey,
I have a wiki that is accessible both via HTTP and HTTPS which has the file cache enabled. When someone loads a page over HTTP and it gets cached, it will have HTTP resource URLs in it. When someone then loads it over HTTPS and gets this cached page, they'll end up with mixed content, which causes the resources to not be loaded on recent FF versions. Is there a way to split the cache based on protocol used?
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
Hey,
The standard practice is not to split the cache but to use
protocol-relative urls.
This wiki is running MediaWiki 1.21.2 with the Vector skin. The resources with protocol-specific URLs are ResourceLoader modules. Does that mean I have to modify the Vector skin? I hope not.
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
On 2013-10-01 12:57 AM, Jeroen De Dauw wrote:
Hey,
The standard practice is not to split the cache but to use protocol-relative urls.
This wiki is running MediaWiki 1.21.2 with the Vector skin. The resources with protocol-specific URLs are ResourceLoader modules. Does that mean I have to modify the Vector skin? I hope not.
Cheers
Oh right, the file cache works that way. That's probably a bug we'll have to fix. Most sites with HTTPS are more likely using a real front end cache rather than the file cache.
That said, even if we can make the file cache split by protocol, that's probably not the right solution. I can't think of any valid reason for RL to ever output fully absolute URLs. The proper solution would probably be to deal with RL instead.
Could you give some more info for debugging?
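For readers following along: the core helper relevant here is wfExpandUrl(), which can be asked for protocol-relative output explicitly. A minimal sketch (MediaWiki 1.18+; the module name and host are placeholders):

    // PROTO_RELATIVE yields "//host/path"-style URLs, which is what avoids
    // mixed content when the same cached HTML is served over HTTP and HTTPS.
    $relative  = wfExpandUrl( '/load.php?modules=site', PROTO_RELATIVE );
    // PROTO_CANONICAL expands against $wgCanonicalServer instead, for places
    // that need a full protocol-specific URL (emails, feeds, etc.).
    $canonical = wfExpandUrl( '/load.php?modules=site', PROTO_CANONICAL );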
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
On 01/10/13 17:57, Jeroen De Dauw wrote:
Hey,
The standard practice is not to split the cache but to use
protocol-relative urls.
This wiki is running MediaWiki 1.21.2 with the Vector skin. The resources with protocol-specific URLs are ResourceLoader modules. Does that mean I have to modify the Vector skin? I hope not.
No, just set $wgServer to the protocol-relative URL (e.g. "//wiki.example.com") and set $wgCanonicalServer to the protocol-specific URL (e.g. https://wiki.example.com). This is how we do it at WMF. I have tested this setup just now with the HTML file cache, and it works just fine.
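A minimal LocalSettings.php sketch of the setup Tim describes, with wiki.example.com standing in for the real host name:

    $wgServer          = "//wiki.example.com";        // protocol-relative: cached pages work over both HTTP and HTTPS
    $wgCanonicalServer = "https://wiki.example.com";  // protocol-specific: used where a full URL is required (emails, feeds, etc.)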
On 01/10/13 18:07, Daniel Friesen wrote:
Oh right, the file cache works that way. That's probably a bug we'll have to fix. Most sites with HTTPS are more likely using a real front end cache rather than the file cache.
You are incorrect. There is no bug.
Honestly, I don't think I have ever seen a flame war started for such a stupid reason.
-- Tim Starling
Hey,
No, just set $wgServer to the protocol-relative URL (e.g.
"//wiki.example.com") and set $wgCanonicalServer to the protocol-specific URL (e.g. https://wiki.example.com). This is how we do it at WMF. I have tested this setup just now with the HTML file cache, and it works just fine.
Thanks Tim! I had pretty much given up hope of getting a useful answer in this thread after 50 mails, so this was a happy surprise :) Problem solved.
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
Le 01/10/13 09:45, Jeroen De Dauw a écrit :
I have a wiki that is accessible both via HTTP and HTTPS which has the file cache enabled. When someone loads a page over HTTP and it gets cached, it will have HTTP resource URLs in it. When someone then loads it over HTTPS and gets this cached page, they'll end up with mixed content, which causes the resources to not be loaded on recent FF versions. Is there a way to split the cache based on protocol used?
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
On 1 October 2013 12:44, Antoine Musso hashar+wmf@free.fr wrote:
Le 01/10/13 09:45, Jeroen De Dauw a écrit :
I have a wiki that is accessible both via HTTP and HTTPS which has the file cache enabled. When someone loads a page over HTTP and it gets cached, it will have HTTP resource URLs in it. When someone then loads it over HTTPS and gets this cached page, they'll end up with mixed content, which causes the resources to not be loaded on recent FF versions. Is there a way to split the cache based on protocol used?
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
I used it for a few days on rationalwiki.org and it was *great*! Then we got a coupla Squids and they're just ridiculously better.
- d.
Hey,
The file cache is barely maintained and I am not sure whether anyone is
still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
This means we say "you no can has cache" to people using shared hosts. Do we really want to do that? If so, the docs ought to be updated, and the deprecation announced.
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
On 01.10.2013, 16:21 Jeroen wrote:
This means we say "you no can has cache" to people using shared hosts. Do we really want to do that? If so, the docs ought to be updated, and the deprecation announced.
Some shared hosts have PHP 5.0 or even 4 - should we care enough to support them at this point?
I think the policy has been "we make basic stuff work on shared installations, but if you want performance, VPS/dedicated is a must" for a few years now, and this deprecation aligns nicely with it.
On Tue, Oct 1, 2013 at 8:21 AM, Jeroen De Dauw jeroendedauw@gmail.com wrote:
The file cache is barely maintained and I am not sure whether anyone is
still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
This means we say "you no can has cache" to people using shared hosts. Do we really want to do that? If so, the docs ought to be updated, and the deprecation announced.
In my opinion we should completely drop support for shared hosts. It's 2013 and virtual hosts are cheap and superior in every way. Supporting shared hosting severely limits what we can do in the software reasonably.
- Ryan
On 10/01/2013 07:44 AM, Antoine Musso wrote:
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
While this is a nice thought, it is completely unreasonable for the majority of wikis out there who use shared hosting.
We should find a way to maintain these bits that WMF doesn't use.
Mark.
On Tue, Oct 1, 2013 at 6:07 AM, Mark A. Hershberger mah@nichework.com wrote:
On 10/01/2013 07:44 AM, Antoine Musso wrote:
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
While this is a nice thought, it is completely unreasonable for the majority of wikis out there who use shared hosting.
We should find a way to maintain these bits that WMF doesn't use.
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
* Lua templates
* VE and Parsoid
* Math next-generation version migrating to using a web service instead of a shell-out
* ???
I'd recommend starting to deprecate support for 'shared PHP web host' environments altogether in favor of either custom installations or virtual machines -- VMs are both more flexible and easier to install.
That might not be something we can do immediately, but we should strongly think about planning that way for the future.
(I also strongly recommend having an official MediaWiki hosting & support service, with ad-supported and ad-free options, various degrees of customization vs automated service, etc. This'd cover a huge portion of the "I just need to stick a wiki somewhere" cases that are probably ending up on shared hosting because people don't have money for better hosting and/or the experience to run their own VM.)
-- brion
On Tue, Oct 1, 2013 at 6:25 AM, Brion Vibber bvibber@wikimedia.org wrote:
On Tue, Oct 1, 2013 at 6:07 AM, Mark A. Hershberger mah@nichework.com wrote:
On 10/01/2013 07:44 AM, Antoine Musso wrote:
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
While this is a nice thought, it is completely unreasonable for the majority of wikis out there who use shared hosting.
We should find a way to maintain these bits that WMF doesn't use.
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
- Lua templates
- VE and Parsoid
- Math next-generation version migrating to using a web service instead of a shell-out
- ???
I'd recommend starting to deprecate support for 'shared PHP web host' environments altogether in favor of either custom installations or virtual machines -- VMs are both more flexible and easier to install.
That might not be something we can do immediately, but we should strongly think about planning that way for the future.
I'd like to echo everything Brion said here. I think we were just talking about this last week or the week before, right?
-Chad
On 10/01/2013 09:25 AM, Brion Vibber wrote:
(I also strongly recommend having an official MediaWiki hosting & support service, with ad-supported and ad-free options, various degrees of customization vs automated service, etc. This'd cover a huge portion of the "I just need to stick a wiki somewhere" cases that are probably ending up on shared hosting because people don't have money for better hosting and/or the experience to run their own VM.)
Interestingly enough, that circles back to something I've been planning for Labs for a while now. "Our" use case is to allow easy deploy of MW installs with an arbitrary set of extensions and easy configuration for core and extension dev work (almost certainly based around vagrant), but that same setup/infrastructure should be reusable to do something like you describe.
-- Marc
On 10/01/2013 09:25 AM, Brion Vibber wrote:
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
Fair enough.
If WMF were the only user of MW, you could freely decide to take MW in whatever direction you choose. But doing that now without considering the needs of other MW users isn't responsible.
Not everyone needs to run Wikipedia and it is a worthwhile effort to make sure MediaWiki remains scalable down to the shared-hosting level.
On 10/01/2013 09:56 AM, Chad wrote:
I'd like to echo everything Brion said here. I think we were just talking about this last week or the week before, right?
You and Brion talked about this? I don't recall any conversation on wikitech-l about abandoning support for the "little guy".
Before a decision like this is made (and I would like people like David Gerard of RationalWiki to continue to weigh in, not just WMF employees), I would like to get some actual statistics on the number of shared hosting users. How are they running their sites? Do larger wikis have the time and the budget to support this sort of move?
Perhaps WMF should just fork off their own "enterprise" branch of MW?
It would be good to discuss this on mediawiki-l, too.
Last I checked I'm making recommendations in an early stage of discussion on an open mailing list, not making decisions for everybody by myself.
-- brion
On Tue, Oct 1, 2013 at 7:27 AM, Mark A. Hershberger mah@nichework.com wrote:
On 10/01/2013 09:25 AM, Brion Vibber wrote:
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
Fair enough.
If WMF were the only user of MW, you could freely decide to take MW in whatever direction you choose. But doing that now without considering the needs of other MW users isn't responsible.
Not everyone needs to run Wikipedia and it is a worthwhile effort to make sure MediaWiki remains scalable down to the shared-hosting level.
On 10/01/2013 09:56 AM, Chad wrote:
I'd like to echo everything Brion said here. I think we were just talking about this last week or the week before, right?
You and Brion talked about this? I don't recall any conversation on wikitech-l about abandoning support for the "little guy".
Before a decision like this is made (and I would like people like David Gerard of RationalWiki to continue to weigh in, not just WMF employees), I would like to get some actual statistics on the number of shared hosting users. How are they running their sites? Do larger wikis have the time and the budget to support this sort of move?
Perhaps WMF should just fork off their own "enterprise" branch of MW?
It would be good to discuss this on mediawiki-l, too.
-- Mark A. Hershberger NicheWork LLC 717-271-1084
Indeed. Please don't read into me speaking with Brion. It was just idle chit-chat over lunch which ended with me saying something about bringing it up on-list. Obviously any decisions on such an issue need to be widely discussed and advertised.
-Chad
On Tue, Oct 1, 2013 at 7:47 AM, Brion Vibber bvibber@wikimedia.org wrote:
Last I checked I'm making recommendations in an early stage of discussion on an open mailing list, not making decisions for everybody by myself.
-- brion
On 10/01/2013 10:50 AM, Chad wrote:
Indeed. Please don't read into me speaking with Brion. It was just idle chit-chat over lunch which ended with me saying something about bringing it up on-list.
Chad, Brion,
I apologize for making it sound like you speak for the Foundation's direction when it comes to MediaWiki. You don't, and I understand that.
You both have important voices that have a strong influence on the direction of the conversation, though, which is why I responded the way I did.
More than anything, I would like some real data from non-WMF users of MW. Wikitech-l isn't the right place to get that data, so I want to avoid giving the idea that the decision to drop support for shared hosting has been made.
I think I've accomplished that, so I'll stand down now while I look for a way to get the data I want.
On 02/10/13 01:19, Mark A. Hershberger wrote:
More than anything, I would like some real data from non-WMF users of MW. Wikitech-l isn't the right place to get that data, so I want to avoid giving the idea that the decision to drop support for shared hosting has been made.
You could have the installer send environment details to mediawiki.org for aggregation.
-- Tim Starling
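Purely as a hypothetical sketch of Tim's suggestion (no such endpoint or installer hook exists; the URL and field names below are invented), the "send environment details" step could look roughly like this in plain PHP:

    // Hypothetical opt-in report from the installer; the endpoint is a placeholder.
    $report = json_encode( array(
        'mediawiki' => '1.21.2',
        'php'       => PHP_VERSION,
        'sapi'      => php_sapi_name(),   // e.g. apache2handler, cgi-fcgi
        'os'        => PHP_OS,
    ) );
    $context = stream_context_create( array( 'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/json\r\n",
        'content' => $report,
    ) ) );
    // Suppress errors: telemetry must never break the install.
    @file_get_contents( 'https://www.mediawiki.org/installer-report', false, $context );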
On 10/01/2013 07:24 PM, Tim Starling wrote:
On 02/10/13 01:19, Mark A. Hershberger wrote:
More than anything, I would like some real data from non-WMF users of MW.
You could have the installer send environment details to mediawiki.org for aggregation.
As you are probably aware, I *do* have an RFC for this[1].
But now we have REAL data on the sorts of hosting people use to run their wikis.
Yesterday, I went to Jamie of WikiApiary and talked to him about ways to get this sort of data from his bot. The first idea I had was the reverse DNS for the IP that hosts a wiki.
Jamie added [[Property:Has IP address]] and [[Property:Has reverse lookup]] [2] to his data set and used it to come up with a preliminary list of hosting providers[3].
Dreamhost is the overwhelming favorite.
There are, of course, problems with his data. For example, we're still missing a *lot* of wikis, but I'm working on addressing this and other problems.
Mark.
[1] https://www.mediawiki.org/wiki/Requests_for_comment/Opt-in_site_registration... [2] http://wikiapiary.com/wiki/WikiApiary_talk:Operations/2013/October#New_Prope... [3] http://wikiapiary.com/wiki/User:Thingles/Hosting_providers
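For the curious, the reverse-DNS part of what Mark describes is just standard PHP (the whois/netblock lookup WikiApiary added later is not shown; example-wiki.org is a placeholder):

    $host = parse_url( 'http://example-wiki.org/wiki/Main_Page', PHP_URL_HOST );
    $ip   = gethostbyname( $host );   // forward lookup: host name -> IP
    $rdns = gethostbyaddr( $ip );     // reverse lookup: IP -> PTR record, which often names the hosting provider
    echo "$host -> $ip -> $rdns\n";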
On 10/02/2013 06:42 AM, Mark A. Hershberger wrote:
But now we have REAL data on the sorts of hosting people use to run their wikis.
Yesterday, I went to Jamie of WikiApiary and talked to him about ways to get this sort of data from his bot. The first idea I had was the reverse DNS for the IP that hosts a wiki.
A VPS owner would often set the reverse DNS to match their own domain, so I think reverse DNS data is not very useful for gauging the relative use of shared vs. VPS hosting.
The question we are interested in is mainly whether a wiki is on a host with support for the installation of wiki-specific software. In general that is hard to figure out in an automated way, especially for less common providers / reverse DNS entries.
Gabriel
On 10/02/2013 03:09 PM, Gabriel Wicke wrote:
A VPS owner would often set the reverse DNS to match their own domain, so I think reverse DNS data is not very useful for gauging the relative use of shared vs. VPS hosting.
But the VPSes would all be in the same netblock. Jamie has added that to his bot and it is finding those on EC2, Linode, Dreamhost, etc.
For example, here is an Amazon netblock: http://wikiapiary.com/wiki/Special:SearchByProperty/Has-20netblock-20organiz...
Compare with the Dreamhost netblock: http://wikiapiary.com/w/index.php?title=Special:SearchByProperty&offset=...
The question we are interested in is mainly whether a wiki is on a host with support for the installation of wiki-specific software. In general that is hard to figure out in an automated way, especially for less common providers / reverse DNS entries.
I will not dispute this ("hard to figure out"), but I will point out that using the above information gives us a better idea than simply relying on reverse DNS.
Mark.
On Tue, Oct 1, 2013 at 7:27 AM, Mark A. Hershberger mah@nichework.com wrote:
On 10/01/2013 09:25 AM, Brion Vibber wrote:
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
Fair enough.
If WMF were the only user of MW, you could freely decide to take MW in whatever direction you choose. But doing that now without considering the needs of other MW users isn't responsible.
We are considering other users here. Isn't that the point of this thread?
Not everyone needs to run Wikipedia and it is a worthwhile effort to make sure MediaWiki remains scalable down to the shared-hosting level.
I disagree. I think making it scale down to the VPS level is acceptable. Should you always be able to at least *install* the bare MediaWiki on some dinky shared host? Sure. But we don't have to make promises about scaling. It's never going to scale, ever.
Before a decision like this is made (and I would like people like David Gerard of RationalWiki to continue to weigh in, not just WMF employees), I would like to get some actual statistics on the number of shared hosting users. How are they running their sites? Do larger wikis have the time and the budget to support this sort of move?
If they're large they're not on shared hosting. It's impossible to run a large wiki on shared hosting.
Also: moving? Who said anything about moving? We're talking about possibly just not caring about shared hosts so much, not actively breaking them.
Perhaps WMF should just fork off their own "enterprise" branch of MW?
We do branch MediaWiki every release cycle. A full blown fork would be a bad idea.
-Chad
On 10/01/2013 07:27 AM, Mark A. Hershberger wrote:
On 10/01/2013 09:25 AM, Brion Vibber wrote:
We've been moving away from being friendly to old-style shared-hosting servers for some time with key features that people are going to expect to replicate on their MediaWikis in the future...
Fair enough.
If WMF were the only user of MW, you could freely decide to take MW in whatever direction you choose. But doing that now without considering the needs of other MW users isn't responsible.
Not everyone needs to run Wikipedia and it is a worthwhile effort to make sure MediaWiki remains scalable down to the shared-hosting level.
With VPS prices starting in the $2-$5 a month range [1][2][3][4] there are not many reasons left for using less secure and less predictable shared hosting. People have been migrating away from shared hosting for a while, and this trend is set to continue in the future.
Given limited resources, it seems to be wiser to start focusing our efforts on packaging rather than shared hosting work-arounds. Using the best tools for different parts of the system rather than limiting ourselves to what is available on $.99 shared hosts lets us make MediaWiki more efficient and cleaner. This benefits users both at the low and high end.
I'm looking forward to the days when "apt-get install mediawiki" installs and configures a fully-featured MediaWiki system with proper caching, VE, Parsoid, Lua and so on. And performs well on a $2 a month VPS.
Gabriel
[1]: http://ramnode.com/ [2]: http://lowendbox.com/ [3]: https://www.digitalocean.com/ [4]: http://www.vpscolo.com/
On 1 October 2013 17:10, Gabriel Wicke gwicke@wikimedia.org wrote:
I'm looking forward to the days when "apt-get install mediawiki" installs and configures a fully-featured MediaWiki system with proper caching, VE, Parsoid, Lua and so on. And performs well on a $2 a month VPS.
That would be lovely! So that stuff will be stable and present for the next LTS, as is likely to be used for Debian's package?
- d.
On Tue, Oct 1, 2013 at 12:13 PM, David Gerard dgerard@gmail.com wrote:
That would be lovely! So that stuff will be stable and present for the next LTS, as is likely to be used for Debian's package?
I know you're trolling, but: presumably wikimedia would start by maintaining their own apt repository. Integration with upstream release cycles would be the responsibility of upstream. --scott
On 1 October 2013 22:34, C. Scott Ananian cananian@wikimedia.org wrote:
On Tue, Oct 1, 2013 at 12:13 PM, David Gerard dgerard@gmail.com wrote:
That would be lovely! So that stuff will be stable and present for the next LTS, as is likely to be used for Debian's package?
I know you're trolling, but: presumably wikimedia would start by maintaining their own apt repository. Integration with upstream release cycles would be the responsibility of upstream.
Well, I wasn't actually. A mediawiki.org PPA is an idea that sounds utterly wonderful and bypasses the Debian packaging effort completely, which does seem quite worth it.
- d.
<quote name="David Gerard" date="2013-10-01" time="22:51:46 +0100">
Well, I wasn't actually. A mediawiki.org PPA is an idea that sounds utterly wonderful and bypasses the Debian packaging effort completely, which does seem quite worth it.
OFFTOPIC:
now if only Launchpad would support building packages against Debian... :(
https://bugs.launchpad.net/launchpad/+bug/188564
Given that LP is in 'maintenance mode' and that's a low priority bug...
Le 02/10/13 00:52, Greg Grossmeier a écrit :
now if only Launchpad would support building packages against Debian... :(
We could build the packages ourselves using a Debian image in Labs, then upload the resulting deb to Launchpad / whatever place.
On 10/02/2013 04:36 AM, Antoine Musso wrote:
Le 02/10/13 00:52, Greg Grossmeier a écrit :
now if only Launchpad would support building packages against Debian... :(
We could build the packages ourselves using a Debian image in Labs, then upload the resulting deb to Launchpad / whatever place.
We already have very good relationships with Debian and Fedora packagers. There isn't any need for us to do this work.
I worked with the Debian packagers to get the LTS in their latest stable release[1] and Fedora has recently started regular updates to their Mediawiki package[2].
Instead, I would suggest submitting patches to the existing packages or joining the mediawiki-distributors mailing list and talking about how you think the packages could be improved.
Mark.
[1] http://packages.debian.org/wheezy/mediawiki [2] https://apps.fedoraproject.org/packages/mediawiki
On 2 October 2013 14:53, Mark A. Hershberger mah@nichework.com wrote:
We already have very good relationships with Debian and Fedora packagers. There isn't any need for us to do this work. I worked with the Debian packagers to get the LTS in their latest stable release[1] and Fedora has recently started regular updates to their Mediawiki package[2]. Instead, I would suggest submitting patches to the existing packages or joining the mediawiki-distributors mailing list and talking about how you think the packages could be improved.
I'm assuming Thorsten doesn't scale, and a PPA with releases other than LTS might be of interest.
- d.
On 10/02/2013 10:14 AM, David Gerard wrote:
Instead, I would suggest submitting patches to the existing packages or joining the mediawiki-distributors mailing list and talking about how you think the packages could be improved.
I'm assuming Thorsten doesn't scale, and a PPA with releases other than LTS might be of interest.
Absolutely.
But Thorsten isn't the only person working on the Debian packages. And even a PPA isn't going to address the needs of Fedora and RedHat users.
I would still recommend working with Kartik (a Debian Developer working with the WMF) to get a newer version in Debian unstable.
I think the issue isn't really "improve the existing packages" so much as "package more extensions with useful configurations" so that, for example, I could install mediawiki-visualeditor and have mediawiki-parsoid installed and properly configured as well.
But point taken: this (part of the) discussion really belongs on mediawiki-distributors (or #mediawiki-visualeditor, etc). --scott
On 10/01/2013 02:51 PM, David Gerard wrote:
On 1 October 2013 22:34, C. Scott Ananian cananian@wikimedia.org wrote:
I know you're trolling, but: presumably wikimedia would start by maintaining their own apt repository.
A mediawiki.org PPA is an idea that sounds utterly wonderful and bypasses the Debian packaging effort completely, which does seem quite worth it.
We have a group of potential Debian packagers (and a few who actually know what they are doing) here at the foundation, and we discussed this during the all-staff a few weeks ago.
The will is there, we just need to make it happen.
Gabriel
PS: Parsoid packaging is tracked at https://bugzilla.wikimedia.org/show_bug.cgi?id=53723
On Tue, Oct 1, 2013 at 9:40 PM, Gabriel Wicke gwicke@wikimedia.org wrote:
I'm looking forward to the days when "apt-get install mediawiki" installs and configures a fully-featured MediaWiki system with proper caching, VE, Parsoid, Lua and so on. And performs well on a $2 a month VPS.
Not in the same vein, but there exists Labs-vagrant now - https://wikitech.wikimedia.org/wiki/Labs-vagrant, and setting up VE, Parsoid, Lua, Caching is about 2 commands in total there :) Third parties who wish to use this can also use it without too much trouble (provided they're already using puppet...)
On 10/01/2013 12:10 PM, Gabriel Wicke wrote:
With VPS prices starting in the $2-$5 a month range there are not many reasons left for using less secure and less predictable shared hosting.
People have their own reasons for using shared hosting. This doesn't mean we should support them no matter what, but their needs can't be dismissed automatically. Using a VPS requires more knowledge than your average shared hosting account.
Telling people "You can't use Dreamhost or GoDaddy to run your wiki" is supercilious.
Again, I think the best route to go here is to try and gather some real data from the users of MediaWiki instead of trying to come to a decision on a mailing list that is populated mostly with developers.
On Tue, Oct 1, 2013 at 9:29 AM, Mark A. Hershberger mah@nichework.com wrote:
Again, I think the best route to go here is to try and gather some real data from the users of MediaWiki instead of trying to come to a decision on a mailing list that is populated mostly with developers.
Seems you've volunteered for the job! Please let us know when you have some data to share with the rest of the group.
In the meantime, please don't tell people that they can't share their professional opinions and experiences with the limitations and problems of old-fashioned shared hosting and the promises of modern VPS-based hosting. That's valuable input, too.
-- brion
I'm usually just an observer on these lists, but I'll weigh in as a user who runs MediaWiki on a shared host. The host *is* a VPS, but our wiki is used by the environmental department of a large international non-profit. As such it lives on the "enviro" server along with some WordPress sites and other minor things.
If we have to give the wiki its own dedicated VPS, it will likely not survive, or we will move to another platform. I see some REALLY low costs for VPSes being tossed around here, but honestly, we'd probably be looking at a minimum of $50/month to give the site its own dedicated VPS. I realize that in the grand scheme of things, that's not a huge cost, but at that point, management will probably insist on making the site pay for itself rather than the current situation of letting it exist on shared resources.
-Chris
On Tue, Oct 1, 2013 at 9:36 AM, Brion Vibber bvibber@wikimedia.org wrote:
On Tue, Oct 1, 2013 at 9:29 AM, Mark A. Hershberger mah@nichework.com wrote:
Again, I think the best route to go here is to try and gather some real data from the users of MediaWiki instead of trying to come to a decision on a mailing list that is populated mostly with developers.
Seems you've volunteered for the job! Please let us know when you have some data to share with the rest of the group.
In the meantime, please don't tell people that they can't share their professional opinions and experiences with the limitations and problems of old-fashioned shared hosting and the promises of modern VPS-based hosting. That's valuable input, too.
-- brion
On Tue, Oct 1, 2013 at 9:46 AM, Christopher Wilson gwsuperfan@gmail.com wrote:
I'm usually just an observer on these lists, but I'll weigh in as a user who runs MediaWiki on a shared host. The host *is* a VPS, but our wiki is used by the environmental department of a large international non-profit. As such it lives on the "enviro" server along with some WordPress sites and other minor things.
If we have to give the wiki its own dedicated VPS, it will likely not survive, or we will move to another platform. I see some REALLY low costs for VPSes being tossed around here, but honestly, we'd probably be looking at a minimum of $50/month to give the site its own dedicated VPS. I realize that in the grand scheme of things, that's not a huge cost, but at that point, management will probably insist on making the site pay for itself rather than the current situation of letting it exist on shared resources.
As long as you're getting the performance you need from your wiki I wouldn't see any reason to worry. What we're talking about here is truly shared hosting, where you've got maybe FTP access and a single mysql database at your disposal.
-Chad
On 10/01/2013 09:46 AM, Christopher Wilson wrote:
I'm usually just an observer on these lists, but I'll weigh in as a user who runs MediaWiki on a shared host. The host *is* a VPS, but our wiki is used by the environmental department of a large international non-profit. As such it lives on the "enviro" server along with some WordPress sites and other minor things.
If we have to give the wiki its own dedicated VPS,
I see no reason why the wiki would need its own VPS. What really matters is the ability to install and run software, which is normally the case in a VPS.
Gabriel
On Tue, Oct 1, 2013 at 12:46 PM, Christopher Wilson gwsuperfan@gmail.com wrote:
I'm usually just an observer on these lists, but I'll weigh in as a user who runs MediaWiki on a shared host. The host *is* a VPS, but our wiki is used by the environmental department of a large international non-profit. As such it lives on the "enviro" server along with some WordPress sites and other minor things.
If we have to give the wiki its own dedicated VPS, it will likely not survive, or we will move to another platform. I see some REALLY low costs for VPSes being tossed around here, but honestly, we'd probably be looking at a minimum of $50/month to give the site its own dedicated VPS. I realize that in the grand scheme of things, that's not a huge cost, but at that point, management will probably insist on making the site pay for itself rather than the current situation of letting it exist on shared resources.
We aren't discussing dropping support for running MediaWiki along with other applications, but we're discussing dropping support for shared hosting services, which run hundreds of applications on the same host as different customers. It's horribly insecure and doesn't allow the user to install anything at the system-level.
- Ryan
On 10/01/2013 12:36 PM, Brion Vibber wrote:
Seems you've volunteered for the job! Please let us know when you have some data to share with the rest of the group.
I did say I was going to try and get the data. :)
In the meantime, please don't tell people that they can't share their professional opinions and experiences with the limitations and problems of old-fashioned shared hosting and the promises of modern VPS-based hosting. That's valuable input, too.
I agree.
If I came across as saying "Your opinion doesn't matter", that was not intended. Deciding that MediaWiki shouldn't worry about shared hosting is a major change, though, and it needs input from more than just developers and users with substantial technical skill.
Hey,
Again, I think the best route to go here is to try and gather some real
data from the users of MediaWiki instead of trying to come to a decision on a mailing list that is populated mostly with developers.
Seems you've volunteered for the job! Please let us know when you have some data to share with the rest of the group.
In the meantime, please don't tell people that they can't share their professional opinions and experiences with the limitations and problems of old-fashioned shared hosting and the promises of modern VPS-based hosting. That's valuable input, too.
Brion, I appreciate your and others input, and am sure the same is true for Mark.
On the one side it would be convenient to not support a certain group of users; on the other, this means we no longer support that group of users. It is quite clear there are people that simply do not care about supporting these users, and thus prefer dropping the support, and people that want to retain this support.
I'm not using any shared hosting and am not really a user of MW to begin with, so it does not matter all that much to me personally. However as an involved developer I think we should not ignore the arguments on either side.
Seems you've volunteered for the job! Please let us know when you have some
data to share with the rest of the group.
If the proposal here is to no longer support these users, then it seems logical to make analysis of the impact of such a change a requirement for the change to happen. Making it a requirement for the change to not happen is obviously appealing to people on one side of the argument and definitely not to those on the other. Preference aside, it does not seem to be the most logical approach.
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
On Tue, Oct 1, 2013 at 9:10 AM, Gabriel Wicke gwicke@wikimedia.org wrote:
With VPS prices starting in the $2-$5 a month range [1][2][3][4] there are not many reasons left for using less secure and less predictable shared hosting. People have been migrating away from shared hosting for a while, and this trend is set to continue in the future.
Sadly, most VPSes are less secure, because big hosting companies patch their shared hosting environment, but most users don't set up any patching on their VPS.
Has anybody ever considered the possibility that maybe people don't know (or want to know) how to set up a caching proxy? One of the nice things about MediaWiki is that it's extraordinarily easy to set up. All you have to do is dump a tar.gz file into a directory, run the web installer and call it a day. No sysadmin experience required.
The file cache allows simple and easy caching for wiki administrators who aren't system administrators and just want their site to be more performant without having to learn how to configure their web server as well as an additional caching daemon.
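For reference, "simple and easy caching" here really is just a couple of LocalSettings.php lines (settings as of MediaWiki 1.21; the directory is an example and must be writable by the web server):

    $wgUseFileCache       = true;
    $wgFileCacheDirectory = "$IP/cache";   // where the rendered HTML is written
    $wgShowIPinHeader     = false;         // per-user output like this defeats a shared page cache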
Also, like Mark mentioned, I'd like to see some statistics on how many people use shared hosting for MediaWiki before dropping support for them out of principle.
*-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science
On Tue, Oct 1, 2013 at 1:18 PM, Chris Steipp csteipp@wikimedia.org wrote:
On Tue, Oct 1, 2013 at 9:10 AM, Gabriel Wicke gwicke@wikimedia.org wrote:
With VPS prices starting in the $2-$5 a month range [1][2][3][4] there are not many reasons left for using less secure and less predictable shared hosting. People have been migrating away from shared hosting for a while, and this trend is set to continue in the future.
Sadly, most VPS'es are less secure, because big hosting companies patch their shared hosting environment, but most users don't setup any patching on their vps.
On Tue, Oct 1, 2013 at 1:43 PM, Tyler Romeo tylerromeo@gmail.com wrote:
Has anybody ever considered the possibility that maybe people don't know (or want to know) how to set up a caching proxy? One of the nice things about MediaWiki is that it's extraordinarily easy to set up. All you have to do is dump a tar.gz file into a directory, run the web installer and call it a day. No sysadmin experience required.
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow. MediaWiki is incredibly difficult to properly run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
When we get to WordPress's ease of use, then we can assume this.
- Ryan
On Tue, Oct 1, 2013 at 2:46 PM, Ryan Lane rlane32@gmail.com wrote:
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow. MediaWiki is incredibly difficult to properly run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
Do you have some specific examples?
Also, if that's the case then removing file caching would be a step backwards.
*-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science
On Tue, Oct 1, 2013 at 2:53 PM, Tyler Romeo tylerromeo@gmail.com wrote:
On Tue, Oct 1, 2013 at 2:46 PM, Ryan Lane rlane32@gmail.com wrote:
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow. MediaWiki is incredibly difficult to properly run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
Do you have some specific examples?
Extension management, upgrades, proper backend caching, proper localization cache, proper job running, running any maintenance script, using a *lot* of different extensions, etc. etc..
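To make one item on that list concrete, "proper job running" usually means taking jobs off web requests and running them from cron instead; a sketch (the schedule and paths are up to the admin):

    $wgJobRunRate = 0;   // in LocalSettings.php: don't run queued jobs during page views
    // ...and instead run the queue periodically from the command line:
    //   php maintenance/runJobs.php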
Also, if that's the case then removing file caching would be a step backwards.
Pretending that we support the lowest common denominator while not actually doing it is the worst of all worlds. We should either support shared hosts excellently, which is very difficult, or we should just stop acting like we support them.
I'm not saying we should make the software unusable for shared hosts, but we also shouldn't worry about supporting them for new features or maintaining often broken features (like file cache) just because they are useful on shared hosting. It makes the software needlessly complex for a dying concept.
- Ryan
On 10/01/2013 04:43 PM, Ryan Lane wrote:
I'm not saying we should make the software unusable for shared hosts, but we also shouldn't worry about supporting them for new features or maintaining often broken features (like file cache) just because they are useful on shared hosting. It makes the software needlessly complex for a dying concept.
For anyone who missed Tim's post, file cache is not broken (at least not in this regard).
It was just a simple configuration change to handle multiple protocols correctly.
Matt Flaschen
Hey,
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow.
There are quite a few people running MW without a cache or other magic config who find it quite suitable for their needs, which can be quite non-trivial.
MediaWiki is incredibly difficult to properly
run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
So because MediaWiki already sucks in some regards, it's fine to have it suck more in others as well? Is that really the point made here?
Cheers
-- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3 --
On Tue, Oct 1, 2013 at 3:00 PM, Jeroen De Dauw jeroendedauw@gmail.com wrote:
Hey,
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow.
There are quite a few people running MW without a cache or other magic config and find it quite suitable for their needs, which can be quite non-trivial.
MediaWiki is incredibly difficult to properly run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
So because MediaWiki already sucks in some regards, its fine to have it suck more in others as well? Is that really the point made here?
I'm actually arguing that we should prioritize fixing the things that suck for everyone over the things that suck for shared hosts. We should especially not harm the large infrastructures just so that we can support the barely usable ones. If we keep supporting shared hosts we likely can't break portions of MediaWiki into services without a lot of duplication of code and effort (and bugs!).
- Ryan
On Tue, Oct 1, 2013 at 4:47 PM, Ryan Lane rlane32@gmail.com wrote:
I'm actually arguing that we should prioritize fixing the things that suck for everyone over the things that suck for shared hosts. We should especially not harm the large infrastructures just so that we can support the barely usable ones. If we keep supporting shared hosts we likely can't break portions of MediaWiki into services without a lot of duplication of code and effort (and bugs!).
But that's not the question at hand. I agree that fixing more important features should be prioritized, but the problem is that Antoine suggested earlier in this thread that file caching be removed completely.
*-- * *Tyler Romeo* Stevens Institute of Technology, Class of 2016 Major in Computer Science
On 2013-10-01 3:46 PM, "Ryan Lane" rlane32@gmail.com wrote:
On Tue, Oct 1, 2013 at 1:43 PM, Tyler Romeo tylerromeo@gmail.com wrote:
Has anybody ever considered the possibility that maybe people don't know (or want to know) how to set up a caching proxy? One of the nice things about MediaWiki is that it's extraordinarily easy to set up. All you have to do is dump a tar.gz file into a directory, run the web installer and call it a day. No sysadmin experience required.
This is only true if you want almost no functionality out of MediaWiki and you want it to be very slow. MediaWiki is incredibly difficult to properly run and requires at least some minor sysadmin experience to do so. There's a reason that almost every MediaWiki install in existence is completely out of date.
When we get to Wordpress's ease of use, then we can assume this.
- Ryan
I disagree. For a low traffic site, it's probably performant enough with just APC caching, which the installer sets up for you. Vanilla MediaWiki does the things you expect a wiki to do. I doubt these types of users want/need complex things like AbuseFilter and Lua. (Unless you are copying templates from Wikipedia.)
The major thing out-of-the-box MW is missing is ConfirmEdit. The other stuff is cool but non-essential imo.
-bawolff
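For reference, the APC setup Brian mentions boils down to a single LocalSettings.php line (the installer offers this when a PHP accelerator cache is detected):

    $wgMainCacheType = CACHE_ACCEL;   // use the local PHP object cache (APC/XCache/WinCache) for MediaWiki's object caching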
On 1 October 2013 20:01, Brian Wolff bawolff@gmail.com wrote:
I disagree. For a low traffic site, its probably performant enough with just APC caching, which the installer sets up for you. Vanilla mediawiki does the things you expect a wiki to do. I doubt these types of users want/need complex things like abuse filter and lua. (Unless you are copying templates from wikipedia) The major thing out of the box mw is missing is confirm edit. The other stuff is cool but non-essential imo.
For my use on intranets, the thing I'm desperate for is a usable visual editor. (Testing and reporting bugs on VE as fast as I can ...)
- d.
I disagree. For a low traffic site, its probably performant enough with just APC caching, which the installer sets up for you. Vanilla mediawiki does the things you expect a wiki to do. I doubt these types of users want/need complex things like abuse filter and lua. (Unless you are copying templates from wikipedia)
The major thing out of the box mw is missing is confirm edit. The other stuff is cool but non-essential imo.
Just want to throw in my hat here as well. I use MediaWiki on my personal website, my laptop, and my place of business. In each case it fulfils a different role. In the case of my laptop, I don't need caching; it just acts as a good place to take notes on a variety of projects that I'm working on. I went with it because I was used to how it worked from editing Wikipedia and it was easy to set up. My laptop runs Windows with a copy of Apache and PHP installed on it for development work.
On my personal website I have a wiki that requires an account to edit, with account creation disabled. I don't use caching there either, but if my site got more traffic I would. In this case I installed MediaWiki in order to be able to quickly create pages of text for documentation or when translating news articles for people for political advocacy. On my personal website MediaWiki is set up on a shared host, with all of the disadvantages that come with such a setup.
At my work we use MediaWiki as a knowledge base and to document standard operating procedures. Here MediaWiki was chosen for its ease of use for an end user and the audit trail that articles inherently leave. It was a bonus that I have experience writing MediaWiki extensions as well. Here we run it on a dedicated web server running Ubuntu Server Edition that I have root on.
Anyhow, I guess I didn't really make what I'm trying to say here terribly clear. Lots of people use MediaWiki for a lot of different reasons in a lot of different environments. I can't easily set up many of the items that are used by the WMF on my Windows laptop, nor on my shared hosting account. Personally, in those situations I would like MediaWiki to still work reasonably well.
By all means drop the file cache if no one really uses it, but I personally would be against dropping support for people who are in shared hosting environments or other environments that don't match up with the typical VPS or dedicated server.
Thank you, Derric Atzrott
...now back to lurking.
On 2013-10-01 2:18 PM, "Chris Steipp" csteipp@wikimedia.org wrote:
On Tue, Oct 1, 2013 at 9:10 AM, Gabriel Wicke gwicke@wikimedia.org wrote:
With VPS prices starting in the $2-$5 a month range [1][2][3][4] there are not many reasons left for using less secure and less predictable shared hosting. People have been migrating away from shared hosting for a while, and this trend is set to continue in the future.
Sadly, most VPS'es are less secure, because big hosting companies patch their shared hosting environment, but most users don't setup any patching on their vps.
VPS isn't the only reason people might want to use the file cache. The person may just not be very experienced, and actually setting up Squid could be beyond them.
I don't think a large amount of Foundation resources should be spent on features like the file cache. However, I still think it should be allowed to exist provided it doesn't cause problems. As far as I can tell, no one has pointed to an active problem it's causing other than the feature isn't "perfect".
-bawolff
As the last person to maintain that code, I tend to agree with this.
I'm not dead set against it, but there are some problems I see with it:
a) It's not well maintained nor documented, as people don't really consider it when changing anything. A concerned volunteer could probably manage this. For example, dealing with HTTPS could be documented better and the code could have some failsafe logic around it (just like with $wgShowIPinHeader).

b) It requires additional code paths and complexity everywhere there is already CDN (squid/varnish) purge logic. Bugs fixed in one may not carry over to the other. Because of (b), this makes it more vulnerable to bit rot.

c) Files are not LRU and don't even have an expiry mechanism. One could make a script and put it on a cron I guess (I rediscovered the existence of PruneFileCache.php, which I forgot I wrote). If one could do that, they probably also have the rights to install varnish/squid. Hacking around the lack of LRU requires MediaWiki to try to bound the worst case number of cache entries; the page cache is only the current version, and the resource loader cache uses a bunch of hit count and IP range uniqueness checks to determine if a load.php cluster of modules is worth caching the response for (you don't want to cache any combination of modules that happens to hit the server, only ones often hit by different sources).

d) It can only use filesystems and not object stores or anything else. This means you need to either only have one server, or use NFS, or if you want to be exotic use fuse with some DOS, or use cephfs/gluster (though if you can do all that you may as well use varnish/squid). I'd imagine people would just use NFS, which may do fine for lots of small to moderate traffic installs. Still, I'd rather someone set up a CDN than install NFS (either one takes a little work). People would use a CDN if it was made easier to do, I'd bet.

e) I'd rather consider investing time in documentation, packaging, and core changes to make a CDN as easy to set up as possible (for people with VMs or their own physical boxes). Bugs found by third parties and WMF could be fixed and both sides could benefit from it, since common code paths would be used. Encouraging squid/varnish usage fits nicely with the idea of encouraging other open source projects and libraries. Also, using tools heavily designed and optimized for certain usage is better than everyone inventing their own little hacky versions that do the same thing (e.g. file cache instead of a proper CDN).

f) Time spent keeping up hacks to do the work of CDNs to make MediaWiki faster could be spent on actually making origin requests to MediaWiki faster and making responses more cache friendly (e.g. ESI and such). For example, if good ESI support was added, would the file cache just lag behind and not be able to do something similar? One *could* do an analogous thing with the file cache reconstructing pages from file fragments... but that would seem like a waste of time and code if we can just make it easy to use a CDN.
In any case, I would not want to see the file cache removed until CDN support was evaluated, documented, and cleaned up, so people have an easy alternative in its place. For example, if a bunch of confusing VCLs are needed to use Varnish, then few will go through the effort.
On 10/03/2013 04:41 PM, Aaron Schulz wrote:
In any case, I would not want to see file cache removed until CDN support was evaluated, documented, and cleaned up, so people have an easy alternative in it's place.
If the file cache is a stand-in for CDN support, why not keep support for the file cache as well as documenting and cleaning up the CDN support?
It seems like you are telling people "MediaWiki isn't for you if you don't want to use a CDN." This seems to ignore the use cases of the majority of users (at least, those in the WikiApiary data) who don't have servers dedicated to their wiki and seem satisfied without them.
Yes, I know that some VPSes exist for less than $10/mo, but I've also been told by a developer whose clients wanted to use such a thing that it wasn't really suitable for production use. Even Dreamhost was better, he said.
Mark.
On 10/03/2013 06:25 PM, Mark A. Hershberger wrote:
On 10/03/2013 04:41 PM, Aaron Schulz wrote:
In any case, I would not want to see file cache removed until CDN support was evaluated, documented, and cleaned up, so people have an easy alternative in it's place.
If the file cache is a stand-in for CDN support, why not keep support for the file cache as well as documenting and cleaning up the CDN support?
I'm not saying we should remove the file cache (certainly not without careful consideration and hopefully an easy alternative).
However, the answer to "why not keep support" for both is pretty clear. It's another code path to maintain, and that means more work (meaning less time to work on other stuff), and potentially more bugs.
Matt Flaschen
On 10/04/2013 12:45 AM, Matthew Flaschen wrote:
However, the answer to "why not keep support" for both is pretty clear. It's another code path to maintain, and that means more work (meaning less time to work on other stuff), and potentially more bugs.
You're right. We need to make sure it is maintained. That is part of what the Release Management Team should be concerned with: Making sure those bits of the code that WMF doesn't use are maintained.
Mark.
On 04/10/13 06:41, Aaron Schulz wrote:
In any case, I would not want to see the file cache removed until CDN support was evaluated, documented, and cleaned up, so people have an easy alternative in its place. For example, if a bunch of confusing VCLs are needed to use Varnish, then few will go through the effort.
I don't think our Squid/Varnish support deserves to be called CDN support. A CDN is usually a third-party system, like Akamai or Limelight, and we don't have any support for those. For text, we don't even support stock Squid/Varnish. We require patches and custom configuration for Squid. We haven't even managed to migrate text to Varnish ourselves yet -- only Wikia have done that. It's a lot to ask of a newbie with no time to spend.
-- Tim Starling
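For anyone weighing this up: the MediaWiki side of a Squid/Varnish setup is only a few settings (as of 1.21; the proxy address is an example), which is part of Tim's point -- the hard part is the proxy's own configuration:

    $wgUseSquid     = true;
    $wgSquidServers = array( '127.0.0.1' );   // proxies trusted for X-Forwarded-For and sent purge requests
    $wgSquidMaxage  = 18000;                  // s-maxage MediaWiki advertises to the proxy, in seconds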
On 01/10/13 21:44, Antoine Musso wrote:
Le 01/10/13 09:45, Jeroen De Dauw a écrit :
I have a wiki that is accessible both via HTTP and HTTPS which has the file cache enabled. When someone loads a page over HTTP and it gets cached, it will have HTTP resource URLs in it. When someone then loads it over HTTPS and gets this cached page, they'll end up with mixed content, which causes the resources to not be loaded on recent FF versions. Is there a way to split the cache based on protocol used?
The file cache is barely maintained and I am not sure whether anyone is still relying on it. We should probably remove that feature entirely and instruct people to setup a real frontend cache instead.
Many people use it and rely on it. It is as useful as it was when it was introduced. Many of the bugs in the original implementation have now been fixed.
-- Tim Starling
I lost track of this thread at about the second message, but could someone look into this reproducible bug which seems similar to the issue reported by the original poster? https://bugzilla.wikimedia.org/show_bug.cgi?id=48133
Nemo