I notice that lately articles on Featured Article Candidates are often longer than the recommended size of 32k, sometimes almost twice that, and nobody complains about it anymore. Personally, I do not object to articles with a size <64k, but I do have a problem with the inconsistent enforcement of the size standards now in use. For example, some contributors oppose an article about [[Germany]] being larger than 32k, while a comparable article, [[United States]], has a size of 61k. I request consistent enforcement of this recommendation, or a more lenient recommendation, e.g. <64k. http://en.wikipedia.org/wiki/Talk:Germany http://en.wikipedia.org/wiki/Wikipedia:Article_size Andries K.D.
Andries Krugers Dagneaux (andrieskd@chello.nl) [050521 05:35]:
I notice that lately articles on Featured Article Candidates are often longer than the recommended size of 32k, sometimes almost twice that, and nobody complains about it anymore. Personally, I do not object to articles with a size <64k, but I do have a problem with the inconsistent enforcement of the size standards now in use. For example, some contributors oppose an article about [[Germany]] being larger than 32k, while a comparable article, [[United States]], has a size of 61k. I request consistent enforcement of this recommendation, or a more lenient recommendation, e.g. <64k. http://en.wikipedia.org/wiki/Talk:Germany http://en.wikipedia.org/wiki/Wikipedia:Article_size
The 32k limit used to be quite strongly agreed with on FAC, then 172 came along with several excellent articles that were very lengthy indeed. He argued that that was what it took to cover the subject properly, even with many breakouts into sub-articles, and enough people agreed that the articles got through.
The 32k limit was of course originally because of broken browser behaviour with text boxes over 32k. As it happens, some (including me) think it's a good recommended maximum length for an article for readability - there should be a compelling reason to go over six thousand words. But considerations like 172's (subject requires it, article is really good) still get past. So for the Germany article to get by even being over 32k, the nominator would probably have to convince people the subject required it for proper coverage. (Which I can well see being likely.)
FAC requires a ridiculously high and sometimes inconsistent standard, and is a tremendous amount of work and often very frustrating for nominators. I'm not sure what can be done about this without [[m:instruction creep]].
- d.
As it happens, some (including me) think it's a good recommended maximum length for an article for readability - there should be a compelling reason to go over six thousand words. But considerations like 172's (subject requires it, article is really good) still get past. So for the Germany article to get by even being over 32k, the nominator would probably have to convince people the subject required it for proper coverage. (Which I can well see being likely.)
In this case, the issue is more complex than just the 32k limit. Regardless, and without getting into extreme detail, brevity is being strongly encouraged on the article, which is already well over 32K, and seems to be getting larger with every edit. The argument from the person most strongly arguing against limiting the size seems to me to mostly be "These are facts, and the United States article is even bigger, that's not fair".
Jay.
Unless we're living in a world of 56K internet still, I think 32K could become at least 50K. Even articles on albums by Eminem and so forth are getting above 32K; given that, the limit is slowly limiting even the growth of good articles. If an article grows above that it should be allowed to grow, unless it is obviously repeating itself.
----- Original Message ----- From: "JAY JG" jayjg@hotmail.com To: wikien-l@Wikipedia.org Sent: Friday, May 20, 2005 9:06 PM Subject: Re: [WikiEN-l] Article size consistency 32k
As it happens, some (including me) think it's a good recommended maximum length for an article for readability - there should be a compelling reason to go over six thousand words. But considerations like 172's (subject requires it, article is really good) still get past. So for the Germany article to get by even being over 32k, the nominator would probably have to convince people the subject required it for proper coverage. (Which I can well see being likely.)
In this case, the issue is more complex than just the 32k limit. Regardless, and without getting into extreme detail, brevity is being strongly encouraged on the article, which is already well over 32K, and seems to be getting larger with every edit. The argument from the person most strongly arguing against limiting the size seems to me to mostly be "These are facts, and the United States article is even bigger, that's not fair".
Jay.
From: "David 'DJ' Hedley" spyders@btinternet.com Unless we're living in a world of 56K internet still, I think 32K could become at least 50K. Even articles on albums by Eminem and so forth are getting above 32K - Giving that limit is slowly even limiting the growth of good articles. If an article grows above that it should be allowed to grow, unless it is obviously repeating itself.
As far as I know the issue was never with download speeds, and currently most of the concerns are stylistic, rather than technical.
Jay.
The issue is always download speed when we serve a diverse international audience and at least make noises about serving the poor and the third world. Serving up articles over 100kb long, with several images each over 200kb, will basically stop in its tracks a slower computer with limited memory operating over a modem, sometimes even requiring a reboot. Essentially the site becomes unusable.
Fred
From: "JAY JG" jayjg@hotmail.com Reply-To: English Wikipedia wikien-l@Wikipedia.org Date: Fri, 20 May 2005 17:07:56 -0400 To: wikien-l@Wikipedia.org Subject: Re: [WikiEN-l] Article size consistency 32k
From: "David 'DJ' Hedley" spyders@btinternet.com Unless we're living in a world of 56K internet still, I think 32K could become at least 50K. Even articles on albums by Eminem and so forth are getting above 32K - Giving that limit is slowly even limiting the growth of good articles. If an article grows above that it should be allowed to grow, unless it is obviously repeating itself.
As far as I know the issue was never with download speeds, and currently most of the concerns are stylistic, rather than technical.
Jay.
On Fri, 20 May 2005, Fred Bauder wrote:
The issue is always download speed when we serve a diverse international audience and at least make noises about serving the poor and the third world. Serving up articles over 100kb long, with several images each over 200kb, will basically stop in its tracks a slower computer with limited memory operating over a modem, sometimes even requiring a reboot. Essentially the site becomes unusable.
It's not only the third world that needs to rely on dialup; the majority of Internet users in the US -- which, last I checked, was still counted as one of the developed countries -- still have dialup connections (like yours truly).
And the primary problem I have with accessing information from Wikipedia is not the article length (although a concise well-organized article is always better than a long discursive one, no matter the amount of detail contained), but the size & number of images.
Maybe I ought to squawk more about those pages where someone has stuffed into it as many images as they can find about the subject. After all, EN-Wikipedia isn't the only archive available for usage-free graphics -- even on Wikimedia.
Geoff
Geoff Burling (llywrch@agora.rdrop.com) [050522 07:28]:
It's not only the third world that needs to rely on dialup; the majority of Internet users in the US -- which, last I checked, was still counted as one of the developed countries -- still have dialup connections (like yours truly). And the primary problem I have with accessing information from Wikipedia is not the article length (although a concise well-organized article is always better than a long discursive one, no matter the amount of detail contained), but the size & number of images.
Then you need to get MediaWiki 1.5 installed locally and write the user preference code to switch all images off ;-)
- d.
On 5/21/05, Geoff Burling llywrch@agora.rdrop.com wrote:
Maybe I ought to squawk more about those pages where someone has stuffed into it as many images as they can find about the subject. After all, EN-Wikipedia isn't the only archive available for usage-free graphics -- even on Wikimedia.
There are also the alternatives of having a separate image gallery page, or using {{Commons|pagename}} to link to an image page on Commons.
-Matt
Geoff Burling wrote:
And the primary problem I have with accessing information from Wikipedia is not the article length (although a concise well-organized article is always better than a long discursive one, no matter the amount of detail contained), but the size & number of images.
What browser are you using?? Firefox (and IIRC Opera) doesn't wait for all the images to load before it displays the text.
On Sun, 22 May 2005, Timwi wrote:
Geoff Burling wrote:
And the primary problem I have with accessing information from Wikipedia is not the article length (although a concise well-organized article is always better than a long discursive one, no matter the amount of detail contained), but the size & number of images.
What browser are you using?? Firefox (and IIRC Opera) doesn't wait for all the images to load before it displays the text.
It's an ancient version of Mozilla that I probably compiled incorrectly when I installed it years ago -- but text/image display was my point. (And note: because I know it is ancient, I haven't mentioned any of the problems I have using it with Wikipedia, which have been many; I wouldn't be surprised if I'm the only one experiencing them.)
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem. Google is an amazing -- & very rare exception. (I've had this discussion with a web designer friend several times, who at least understands this issue -- although he's still a bit hobbled with the "I want them to see the site how I choose, not how they may want to choose" attitude.)
And turning images off is not the solution. Much of the time, I want to see some of the images in an article, such as a map, or specific photographs of a person or a place; I'm not interested in seeing every known image with the proper license that could be related to the subject. Which is why I mentioned commons: not only does it support a competition for the best images in a given category by allowing a practically unlimited number of images to be uploaded, it does not require the losers to be deleted because they are unused -- & allows them to be available to compete in other categories.
Geoff
This is the typical type of problem you run into with folks using old equipment and obsolete software. Like, for example, folks in Mexico who use castoff US equipment. Given our geek editor base, most of whom have reasonably good equipment, it is hard to put ourselves in the position of those who are trying to access the internet under less than optimal conditions, but it is good if we do so to the extent we are able, and accommodate them to the extent we can.
Fred
From: Geoff Burling llywrch@agora.rdrop.com Reply-To: Geoff Burling llywrch@agora.rdrop.com, English Wikipedia wikien-l@Wikipedia.org Date: Sun, 22 May 2005 11:48:25 -0700 (PDT) To: English Wikipedia wikien-l@Wikipedia.org Subject: Re: [WikiEN-l] Re: Article size consistency 32k
On Sun, 22 May 2005, Timwi wrote:
Geoff Burling wrote:
And the primary problem I have with accessing information from Wikipedia is not the article length (although a concise well-organized article is always better than a long discursive one, no matter the amount of detail contained), but the size & number of images.
What browser are you using?? Firefox (and IIRC Opera) doesn't wait for all the images to load before it displays the text.
It's an ancient version of Mozilla that I probably compiled incorrectly when I installed it years ago -- but text/image display was my point. (And note: because I know it is ancient, I haven't mentioned any of the problems I have using it with Wikipedia, which have been many; I wouldn't be surprised if I'm the only one experiencing them.)
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem. Google is an amazing -- & very rare exception. (I've had this discussion with a web designer friend several times, who at least understands this issue -- although he's still a bit hobbled with the "I want them to see the site how I choose, not how they may want to choose" attitude.)
And turning images off is not the solution. Much of the time, I want to see some of the images in an article, such as a map, or specific photographs of a person or a place; I'm not interested in seeing every known image with the proper license that could be related to the subject. Which is why I mentioned commons: not only does it support a competition for the best images in a given category by allowing a practically unlimited number of images to be uploaded, it does not require the losers to be deleted because they are unused -- & allows them to be available to compete in other categories.
Geoff
Fred Bauder (fredbaud@ctelco.net) [050523 06:07]:
This is the typical type of problem you run into with folks using old equipment and obsolete software. Like, for example, folks in Mexico who use castoff US equipment. Given our geek editor base, most of whom have reasonably good equipment, it is hard to put ourselves in the position of those who are trying to access the internet under less than optimal conditions, but it is good if we do so to the extent we are able, and accommodate them to the extent we can.
I was recently on dialup (waiting for BT to switch on my new DSL) and using my lovely but aging Thinkpad 560X - Pentium MMX 233MHz, 96MB RAM. Firefox is horribly slow with 96MB RAM ;-) I managed to work on Wikipedia and even do arbitration (a real test of the usability of a tabbed browser) - so we're doing okay for equipment of that level. It'd be interesting to try it in Netscape 3 on a 486 with 12MB RAM like I had up to 2001 ;-)
- d.
On Sun, 22 May 2005, Fred Bauder wrote:
This is the typical type of problem you run into with folks using old equipment and obsolete software. Like, for example, folks in Mexico who use castoff US equipment. Given our geek editor base, most of whom have reasonably good equipment, it is hard to put ourselves in the position of those who are trying to access the internet under less than optimal conditions, but it is good if we do so to the extent we are able, and accommodate them to the extent we can.
If anyone out there is interested in the issues around this problem, let me suggest checking the links in the RULE project. These people are working at the coal face of making ancient hardware work with current Linux distros, & last time I looked in on the list, they had a fair number of questions from folks working with old 486s & low-end Pentiums in third-world villages many miles from where the asphalt ends.
Geoff
Geoff Burling wrote: <snip>
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem. Google is an amazing -- & very rare exception. (I've had this discussion with a web designer friend several times, who at least understands this issue -- although he's still a bit hobbled with the "I want them to see the site how I choose, not how they may want to choose" attitude.)
And turning images off is not the solution. Much of the time, I want to see some of the images in an article, such as a map, or specific photographs of a person or a place; I'm not interested in seeing every known image with the proper license that could be related to the subject. Which is why I mentioned commons: not only does it support a competition for the best images in a given category by allowing a practically unlimited number of images to be uploaded, it does not require the losers to be deleted because they are unused -- & allows them to be available to compete in other categories.
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB". There should probably be a limited number of selections, so that the images could be cached the same way they are for regular thumbnailing. And of course, you probably wouldn't want the rule applied to Image: namespace pages. How hard would this be to implement, and would it strain the servers much to have to serve up even more different-size versions of the same picture? It seems to me it could certainly speed up the page loads for those on <56kb/s connections. But then again, you might also end up with weird formatting, when the author/editor inserting an image into a page doesn't know how large it ends up being displayed.
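A minimal sketch of how such a preference might be applied when picking a thumbnail size, assuming a hypothetical per-user cap and a small fixed set of cacheable widths (the names are illustrative, not anything from MediaWiki itself):

# Hypothetical sketch: clamp a requested thumbnail width to a per-user
# maximum, snapping down to a small set of bucket widths so the scaled
# copies can be cached and shared between users.

BUCKET_WIDTHS = [100, 200, 400, 600]  # a limited set keeps the cache small


def effective_width(requested_width, user_max_width=None):
    """Return the width to actually render, honouring the user's cap."""
    width = requested_width
    if user_max_width is not None:
        width = min(width, user_max_width)
    # Snap down to the largest bucket that does not exceed the target width.
    candidates = [b for b in BUCKET_WIDTHS if b <= width]
    return candidates[-1] if candidates else BUCKET_WIDTHS[0]


# An editor asked for a 350px thumbnail, but the reader capped images at 200px:
print(effective_width(350, 200))   # -> 200
# No cap set: 350px snaps to the largest bucket not exceeding the request.
print(effective_width(350))        # -> 200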
This has been a problem on Wikinfo. We have set a limit of 200kb for images (Wikipedia has a 2mb limit). Some images simply can't be used but reducing the size and generally using thumbs produces difficulties in formatting which have to be corrected by hand which is quite laborious. A Wikipedia solution would have to be expressed as policy and have general support among editors.
Fred
From: "John R. Owens" jowens.wiki@ghiapet.homeip.net Reply-To: English Wikipedia wikien-l@Wikipedia.org Date: Sun, 22 May 2005 16:11:26 -0500 To: English Wikipedia wikien-l@Wikipedia.org Subject: Re: [WikiEN-l] Re: Article size consistency 32k
Geoff Burling wrote:
<snip>
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem. Google is an amazing -- & very rare exception. (I've had this discussion with a web designer friend several times, who at least understands this issue -- although he's still a bit hobbled with the "I want them to see the site how I choose, not how they may want to choose" attitude.)
And turning images off is not the solution. Much of the time, I want to see some of the images in an article, such as a map, or specific photographs of a person or a place; I'm not interested in seeing every known image with the proper license that could be related to the subject. Which is why I mentioned commons: not only does it support a competition for the best images in a given category by allowing a practically unlimited number of images to be uploaded, it does not require the losers to be deleted because they are unused -- & allows them to be available to compete in other categories.
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB". There should probably be a limited number of selections, so that the images could be cached the same way they are for regular thumbnailing. And of course, you probably wouldn't want the rule applied to Image: namespace pages. How hard would this be to implement, and would it strain the servers much to have to serve up even more different-size versions of the same picture? It seems to me it could certainly speed up the page loads for those on <56kb/s connections. But then again, you might also end up with weird formatting, when the author/editor inserting an image into a page doesn't know how large it ends up being displayed.
-- John R. Owens ProofReading Markup Language : http://prml.sourceforge.net/
On 5/22/05, Fred Bauder fredbaud@ctelco.net wrote:
This has been a problem on Wikinfo. We have set a limit of 200kb for images (Wikipedia has a 2mb limit). Some images simply can't be used but reducing the size and generally using thumbs produces difficulties in formatting which have to be corrected by hand which is quite laborious. A Wikipedia solution would have to be expressed as policy and have general support among editors.
A 200kb image isn't likely to be of a suitable quality for print, so I would recommend not imposing such limits on Wikipedia, where many people do have the goal of producing print versions. If anything, we ought to be encouraging higher resolution images with this aim in mind.
Angela.
This discussion is about how to improve the usability of the website. Looking forward to a print edition the 2mb limit makes sense, but these very large images ought not to make it difficult for the less advantaged to use the website.
Some way needs to be found to not impose them on users whose equipment is unable to handle them.
Fred
From: Angela beesley@gmail.com Reply-To: Angela beesley@gmail.com, English Wikipedia wikien-l@Wikipedia.org Date: Mon, 23 May 2005 00:57:23 +0200 To: English Wikipedia wikien-l@wikipedia.org Subject: Re: [WikiEN-l] Re: Article size consistency 32k
On 5/22/05, Fred Bauder fredbaud@ctelco.net wrote:
This has been a problem on Wikinfo. We have set a limit of 200kb for images (Wikipedia has a 2mb limit). Some images simply can't be used but reducing the size and generally using thumbs produces difficulties in formatting which have to be corrected by hand which is quite laborious. A Wikipedia solution would have to be expressed as policy and have general support among editors.
A 200kb image isn't likely to be of a suitable quality for print, so I would recommend not imposing such limits on Wikipedia, where many people do have the goal of producing print versions. If anything, we ought to be encouraging higher resolution images with this aim in mind.
Angela.
Fred Bauder (fredbaud@ctelco.net) [050523 10:39]:
This discussion is about how to improve the usability of the website. Looking forward to a print edition the 2mb limit makes sense, but these very large images ought not to make it difficult for the less advantaged to use the website. Some way needs to be found to not impose them on users whose equipment is unable to handle them.
The thumbnails in articles are usually much smaller - JPEGs, a few tens of kilobytes; the sort of size that's fine on dialup (or was when dialup was all there was).
- d.
John R. Owens wrote:
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB".
This is absolutely brilliant. I love and support this idea.
Not because I have a slow connection (which I do, but I don't mind), but rather because I'm getting annoyed when people make images so huge because they have a 1280x1024 resolution and they think everyone else must have that too. (I even had one replying back to me saying "Not everyone has a resolution as low as yours, so please leave the image this big!")
And of course, you probably wouldn't want the rule applied to Image: namespace pages.
Yes, you do want it applied to Image:-namespace pages. The actual image above it is not part of the page. :)
But then again, you might also end up with weird formatting, when the author/editor inserting into a page doesn't know how large the image ends up being displayed.
People should learn that formatting is more than just looking at what it looks like on *your* screen. Pages must be formatted in such a way that they are readable on any resolution (OK, I guess we can assume a minimum resolution of 800x600, but not more!). Resizing your window is all it takes to check. Once this is done properly, varying sizes of images shouldn't have any adverse effect on the formatting.
Of course, if it was entirely up to me, all images and taxo-/infoboxes would be one big right-floating element and at the top of the article source text. They would form a neat column on the right, with the text entirely unaffected. This would solve pretty much all layout issues instantly, especially the one with the section edit links. There are a few special cases where this doesn't do, and in some circumstances you really do want left-floating images, but they can always be handled on a case-by-case basis.
Timwi
Timwi wrote:
I'm getting annoyed when people make images so huge because they have a 1280x1024 resolution and they think everyone else must have that too. (I even had one replying back to me saying "Not everyone has a resolution as low as yours, so please leave the image this big!")
If you don't want to be criticised for having a low resolution, then may I suggest that you refrain from criticising people for having older browsers:
It's an ancient version of Mozilla that I probably compiled incorrectly when I installed it years ago
In that case, I'm very afraid you have no grounds to complain.
Stephen Bain wrote:
If you don't want to be criticised for having a low resolution, then may I suggest that you refrain from criticising people for having older browsers
OK, so if I give you the $0 you need to download and install the newest Mozilla, Firefox, Konqueror, or Opera-with-Ads, whichever you prefer, then will you give me the money I need for a new computer?
Please think about these things before you make insinuations like that. Of all the people I know I have the slowest connection and yet it doesn't take me more than half an hour to download and install the newest Firefox. I'm afraid a computer with a better resolution takes a bit more than just patience.
Timwi
Timwi (timwi@gmx.net) [050523 11:37]:
John R. Owens wrote:
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB".
This is absolutely brilliant. I love and support this idea.
Sounds good to me too.
Not because I have a slow connection (which I do, but I don't mind), but rather because I'm getting annoyed when people make images so huge because they have a 1280x1024 resolution and they think everyone else must have that too. (I even had one replying back to me saying "Not everyone has a resolution as low as yours, so please leave the image this big!")
We have thumbnailing code, so it does make some sense to have images be huge on the page and reduced as needed.
And of course, you probably wouldn't want the rule applied to Image: namespace pages.
Yes, you do want it applied to Image:-namespace pages. The actual image above it is not part of the page. :)
We already have such a size limitation for the Image: pages as an option in preferences. Perhaps it just needs some even smaller settings.
But then again, you might also end up with weird formatting, when the author/editor inserting into a page doesn't know how large the image ends up being displayed.
People should learn that formatting is more than just looking at what it looks like on *your* screen. Pages must be formatted in such a way that they are readable on any resolution (OK, I guess we can assume a minimum resolution of 800x600, but not more!). Resizing your window is all it takes to check. Once this is done properly, varying sizes of images shouldn't have any adverse effect on the formatting.
This is a matter of editorial judgement. "Image reshuffle" is an occasional edit summary of mine.
- d.
David Gerard wrote:
Timwi (timwi@gmx.net) [050523 11:37]:
Not because I have a slow connection (which I do, but I don't mind), but rather because I'm getting annoyed when people make images so huge because they have a 1280x1024 resolution and they think everyone else must have that too. (I even had one replying back to me saying "Not everyone has a resolution as low as yours, so please leave the image this big!")
We have thumbnailing code, so it does make some sense to have images be huge on the page and reduced as needed.
I *am* talking about thumbnails. The person I was referring to made the thumbnail so big that it was more than 70% the width of the article text area on my screen.
Timwi (timwi@gmx.net) [050524 05:06]:
David Gerard wrote:
Timwi (timwi@gmx.net) [050523 11:37]:
Not because I have a slow connection (which I do, but I don't mind), but rather because I'm getting annoyed when people make images so huge because they have a 1280x1024 resolution and they think everyone else must have that too. (I even had one replying back to me saying "Not everyone has a resolution as low as yours, so please leave the image this big!")
We have thumbnailing code, so it does make some sense to have images be huge on the page and reduced as needed.
I *am* talking about thumbnails. The person I was referring to made the thumbnail so big that it was more than 70% the width of the article text area on my screen.
Isn't there something in the style guide which strongly suggests 250px as a good image width? Mention it on talk - the other editor is simply incorrect, precisely for people with old machines.
I have occasionally used pictures going across the width of the article, but I've then laid them out centred without text wrapping around them - e.g. [[Telstra Dome]] (although I see the current version has the text wrapping around the rather wide image).
In any case, if you write a pref for max thumbnail size and Brion likes the idea I expect it'll go in :-)
- d.
John R. Owens wrote:
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB".
Unfortunately I don't think this would be a workable solution, unless a thumbnail duplicate of every image was created at the time of uploading. That would be fairly straightforward to do, although it would be very resource intensive on the image server(s). (Is there more than one yet?) To shrink each image after every request before sending to the browser would literally kill the servers.
A better way would be to allow for a user preference to have images either on or off by default, but have a link on the page to view a version with images. So if a user browsing with images off wanted to see an image, they could then re-load the page with images included. (If it's good enough for Outlook, then it's good enough for Wikipedia.)
Stephen Bain wrote:
John R. Owens wrote:
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB".
Unfortunately I don't think this would be a workable solution, unless a thumbnail duplicate of every image was created at the time of uploading. That would be fairly straightforward to do, although it would be very resource intensive on the image server(s). (Is there more than one yet?) To shrink each image after every request before sending to the browser would literally kill the servers.
This is wrong in a few ways. Image scaling works as follows. On render, the apache server checks if a thumbnail image already exists, via NFS. If it doesn't, or if the full-sized image is newer than the thumbnail, it retrieves the image over NFS, scales it using ImageMagick, and then saves it back to the image server over NFS. Although it's not done on upload, it's only ever done once for a particular image/size combination. Image scaling is not a major strain on the cluster, and I don't think this feature would make it one. However, there is a minor performance problem with it, which is that different HTML would have to be generated for users with this feature enabled. That would reduce the parser cache hit ratio.
And you can't literally kill something that's not literally alive :)
A better way would be to allow for a user preference to have images either on or off by default, but have a link on the page to view a version with images. So if a user browsing with images off wanted to see an image, they could then re-load the page with images included. (If it's good enough for Outlook, then it's good enough for Wikipedia.)
I think that would be better done on the client side. Internet Explorer is better in this respect than Firefox or Mozilla, with a "show image" context menu item and, in some versions, a "show images" toolbar icon. I'm sure there are other browsers with similar features to IE. To do it on the server side would require reducing the parser cache hit ratio.
-- Tim Starling
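A rough sketch of the generate-once flow described above, with ordinary directories standing in for the NFS-mounted image store and Pillow standing in for ImageMagick; both substitutions are assumptions of the sketch, not MediaWiki's real code:

# Rough sketch of the "scale once, then reuse" thumbnail flow: generate a
# thumbnail only if it is missing or older than the full-sized original.
import os
from PIL import Image

ORIGINALS = "/mnt/images"      # stand-in for the image server share
THUMBS = "/mnt/images/thumb"   # stand-in for the thumbnail store


def thumbnail_path(name, width):
    return os.path.join(THUMBS, f"{width}px-{name}")


def get_thumbnail(name, width):
    """Return a path to a thumbnail, generating it only if missing or stale."""
    src = os.path.join(ORIGINALS, name)
    dst = thumbnail_path(name, width)
    needs_scaling = (
        not os.path.exists(dst)
        or os.path.getmtime(src) > os.path.getmtime(dst)  # original is newer
    )
    if needs_scaling:
        with Image.open(src) as img:
            ratio = width / img.width
            img.resize((width, max(1, int(img.height * ratio)))).save(dst)
    return dst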
Stephen Bain wrote:
John R. Owens wrote:
I don't know if this has been suggested before, but perhaps a change in software could allow us to set a maximum image size in our user preferences? Either in width/height, e.g. "always shrink images to less than 200 pixels wide or 150 pixels high, whichever is smaller", or in kB, e.g. "always shrink images to less than 10 kB".
Unfortunately I don't think this would be a workable solution, unless a thumbnail duplicate of every image was created at the time of uploading. That would be fairly straightforward to do, although it would be very resource intensive on the image server(s). (Is there more than one yet?) To shrink each image after every request before sending to the browser would literally kill the servers.
A better way would be to allow for a user preference to have images either on or off by default, but have a link on the page to view a version with images. So if a user browsing with images off wanted to see an image, they could then re-load the page with images included. (If it's good enough for Outlook, then it's good enough for Wikipedia.)
That's why I was suggesting it would probably be desirable to have a limited number of options for maximum width/height/filesize. If that were the case, then a limited number of different-sized copies could be cached, the same way, if I understand correctly, that the thumbnail images are cached after the first time they're viewed in the page. So, for instance, you might have four width/height category choices, e.g. 100x75, 200x150, 400x300, and 600x400, and then you "just" need disk space for four or fewer copies of each image (obviously, any images that are, say, 250x200 would only need two extra copies, and those rather small ones). I suppose, though, that this would be applied to pretty much all images, instead of only those formatted to use thumbnails. So what you might end up with, for the hypothetical picture and choices given above (call the picture example.jpg), would be something like creating files corresponding to the following URLs (I haven't figured out just which name hash directory it should actually belong in, especially since I notice it does take the "XXXpx-" part into account):
http://upload.wikimedia.org/wikipedia/en/thumb/a/ab/Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/c/cd/200px-Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/e/ef/100px-Example.jpg
and perhaps something like
http://upload.wikimedia.org/wikipedia/en/thumb/1/12/5kb-Example.jpg
http://upload.wikimedia.org/wikipedia/en/thumb/3/34/10kb-Example.jpg
And then, if you've set your preferences accordingly, you get the 200px or 100px version, and if not, you get Example.jpg itself in the viewed page (assuming it hasn't been thumbnailed, either). If this were to be done, I'd expect there should be a script to create these thumbnails, run at a low priority, before the option were enabled, so that the servers wouldn't bog down if many people using the option started browsing image-laden pages. But I'd definitely want to see what kind of demands it would make on disk space first.
And mind you, I'm not really pushing for the idea myself, just trying to present an optimal solution to a problem that was brought up. I'm on broadband, so it doesn't affect me much personally. I just thought I'd throw the idea out there.
-- John R. Owens ProofReading Markup Language : http://prml.sourceforge.net/
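As an illustration of the limited-bucket scheme sketched above (my own example, not code from the thread), the following works out which pre-scaled copies a given upload would actually need, so the 250x200 case ends up with only the two smallest extra copies:

# Sketch of the "limited set of cached sizes" idea: compute which pre-scaled
# copies an uploaded image would need, skipping any bucket that is not
# smaller than the original. Bucket choices follow the example above.

SIZE_BUCKETS = [(100, 75), (200, 150), (400, 300), (600, 400)]


def needed_copies(orig_width, orig_height):
    """Buckets strictly smaller than the original; larger ones just reuse it."""
    return [(w, h) for (w, h) in SIZE_BUCKETS if w < orig_width and h < orig_height]


# The 250x200 example from the message above needs only two extra copies:
assert needed_copies(250, 200) == [(100, 75), (200, 150)]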
Geoff Burling wrote:
It's an ancient version of Mozilla that I probably compiled incorrectly when I installed it years ago
In that case, I'm very afraid you have no grounds to complain.
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem.
Unfortunately, web surfers often do injustice to web designers who do. Websites I make may not *finish* loading instantly the way Google does, because I tend to use more images, but I make it a very important point that the loading of the images (and CSS and other data) must not impair the speed of appearance of the text. Some of the Wikipedia skins do that too and you can happily read away the text as soon as it has been transferred. Unfortunately, Monobook isn't one of them, because it uses several CSS files which themselves include yet other CSS files...
And turning images off is not the solution.
I never suggested that. That wouldn't make the text appear any sooner anyway.
I'm not interested in seeing every known image with the proper license that could be related to the subject.
Then don't look at them.
Timwi
On Mon, 23 May 2005, Timwi wrote:
Geoff Burling wrote:
It's an ancient version of Mozilla that I probably compiled incorrectly when I installed it years ago
In that case, I'm very afraid you have no grounds to complain.
Did I say that *was* the grounds for my original complaint? You asked, I answered.
Download times are a very off-putting experience whenever one deals with the Web, & very few web developers bother to optimize for speed -- or even consider it a problem.
[snip]
Some of the Wikipedia skins do that too and you can happily read away the text as soon as it has been transferred. Unfortunately, Monobook isn't one of them, because it uses several CSS files which themselves include yet other CSS files...
For the record, I don't use the Monobook skin.
And turning images off is not the solution.
I never suggested that. That wouldn't make the text appear any sooner anyway.
Did I say you had?
I'm not interested in seeing every known image with the proper license that could be related to the subject.
Then don't look at them.
So how am I to selectively view images? Last time this question was raised (for reasons other than viewing convenience), the conclusion was that this was an all or nothing situation: either every image linked in a page gets downloaded -- or none do.
Should I start removing images from pages where I feel there are too many? Or would this be an example of disrupting Wikipedia to prove a point?
Timwi, I'm not clear where this exchange between us is going. My original complaint is with contributors who want to insert more images than I believe are necessary; at the moment I feel this is an issue best solved by education, rather than policy or programming. You seem to hear my generalized complaints about Web design philosophy as attacks on how Wikipedia is designed; if I had specific complaints about Wikipedia, I'd be filing detailed bug reports in the proper manner.
If you think I'm picking a fight with you, I'm not; I'm just trying to express my belief that the average contributor to Wikipedia sometimes lacks a clear idea what constraints many end users endure when they view Wikipedia. And this erroneous impression is related to a number of other issues I have with Web design -- & with the assumptions software designers far too often make about their intended audience.
I could expound a little about those, too, but it's probably better to concede the floor to someone else at this point.
Geoff
Geoff Burling wrote:
Timwi, I'm not clear where this exchange between us is going.
Nor am I. You are going on about things that I have never contested or even mentioned, and you have not addressed any of what I've actually been trying to tell you.
I'll try again.
I'm trying to tell you that if you are not interested in the images, the presence or absence of images should not make any difference to you, because it does not make any difference to the time taken for the *text* to load and display -- unless you have a browser which artificially waits until all images are loaded, in which case I would simply suggest upgrading, rather than trying to re-educate everyone in the Wikipedia community on what *you* think is the right amount of images in articles.
Timwi
On Mon, 23 May 2005, Timwi wrote:
I'm trying to tell you that if you are not interested in the images, the presence or absence of images should not make any difference to you, because it does not make any difference to the time taken for the *text* to load and display -- unless you have a browser which artificially waits until all images are loaded, in which case I would simply suggest upgrading, rather than trying to re-educate everyone in the Wikipedia community on what *you* think is the right amount of images in articles.
Which has never been my problem: I am not aware of any browser that requires all of the images to be downloaded before it shows the page. I have no idea how you concluded that was my point; I was talking about HOW MUCH TIME THE PAGE TOOK TO DOWNLOAD IN TOTAL.
And because images sometimes do not download in a predictable order, a user must then DOWNLOAD ALL OF THEM to determine which ones are useful, & which ones are useless. Even the ones that a contributor insists ought to be displayed in Wikipedia at 1280x1024 -- I assume that is an example of the kind of image you have said you object to? Or would complaining about it be an attempt to "re-educate everyone in the Wikipedia community on what *you* think is ... right"?
I'll make this easy on both of us: I'll stop reading or answering any more of your posts to this list. Obviously, that will make us both much happier.
Geoff
Fred Bauder wrote:
The issue is always download speed when we serve a diverse international audience and at least make noises about serving the poor and the third world. Serving up articles over 100kb long, with several images each over 200kb, will basically stop in its tracks a slower computer with limited memory operating over a modem, sometimes even requiring a reboot. Essentially the site becomes unusable.
The size of the article source text has little correlation with the size of the HTML output, so that argument is not relevant to this discussion.
That is kind of a Borg answer. It just doesn't comport with the reality on the receiving end. When I did use a modem and an older computer, long articles were troublesome. It is difficult to tell now as almost everything loads.
Fred
From: Timwi timwi@gmx.net Reply-To: English Wikipedia wikien-l@Wikipedia.org Date: Sat, 21 May 2005 22:40:03 +0100 To: wikien-l@wikipedia.org Subject: [WikiEN-l] Re: Article size consistency 32k
Fred Bauder wrote:
The issue is always download speed when we serve a diverse international audience and at least make noises about serving the poor and the third world. Serving up articles over 100kb long, with several images each over 200kb, will basically stop in its tracks a slower computer with limited memory operating over a modem, sometimes even requiring a reboot. Essentially the site becomes unusable.
The size of the article source text has little correlation with the size of the HTML output, so that argument is not relevant to this discussion.
Fred Bauder wrote:
That is kind of a Borg answer. It just doesn't comport with the reality on the receiving end. When I did use a modem and an older computer, long articles were troublesome.
I have no idea what you are referring to. I didn't say anything about the "troublesome"-ness of articles on "an older computer", and I don't see how anything I said "doesn't comport with the reality on the receiving end".