--- Richard Rabinowitz <rickyrab(a)eden.rutgers.edu> wrote:
> College term papers generally are double-spaced. Internet printouts tend
> to be single spaced. Sure you're not taking that into account?
Hm. Good point. I went to a uni whose administration thought writing was very
important. Thus single space. Last time I was asked to double space was high
school.
-- mav
I see now that it's been renamed, but most new or not-so-experienced
contributors wouldn't naturally look to IfD to bring up a .ogg for deletion.
It may be a good idea to note it on the 'Deletion Tools' template, or
wherever it is appropriate.
On 5/21/05, David 'DJ' Hedley <spyders(a)btinternet.com> wrote:
> With the increasing amount of audio and media other than text and images
> being uploaded I think it could be worth considering the creation of an MfD
> (Media for Deletion), to undergo the deletion of other media in a similar
> way to RfD, CfD and IfD.
Are you talking about the English Wikipedia? If so, this should be on
wikien-l rather than wikipedia-l since non-English Wikipedias don't
generally have the same deletion procedures as English does.
"IfD" was actually renamed "Images and media for deletion"
<http://en.wikipedia.org/wiki/Wikipedia:Images_and_media_for_deletion>
last November so already covers audio and video. This is a fairly low
traffic page compared to the main VfD, so I don't see an imminent need
to split it at this stage.
Angela.
--- Fred Bauder <fredbaud(a)ctelco.net> wrote:
> The issue is always download speed when we serve a diverse international
> audience and at least make noises about serving the poor and the third
> world. Serving up articles over 100kb long with several images each over
> 200kb will basically stop a slower computer with limited memory operating
> with a modem in its tracks, sometimes even requiring a reboot. Essentially
> the site becomes unusable.
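As a rough sanity check of the quoted scenario, here is the arithmetic for a 100KB article carrying three 200KB images over a dial-up line. The page sizes and the idealized 56k line speed are illustrative assumptions, not measurements, and real transfers would be slower still.

```python
# Rough download-time arithmetic for the scenario quoted above:
# a 100 KB article with three 200 KB images, over a 56k modem.
# All numbers are illustrative assumptions, not measurements.

def download_seconds(total_kb: float, modem_kbps: float = 56.0) -> float:
    """Seconds to fetch total_kb kilobytes at modem_kbps kilobits/second."""
    bytes_total = total_kb * 1024
    bytes_per_second = modem_kbps * 1000 / 8  # 56 kbit/s is about 7,000 bytes/s
    return bytes_total / bytes_per_second

page_kb = 100 + 3 * 200  # article text plus three large images
print(round(download_seconds(page_kb)))  # about 102 seconds at ideal line speed
```

In practice, protocol overhead and line noise push that past two minutes per page view, which is the "unusable" case described above.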
There is also the issue of print. So while Wikipedia is not paper, it would be
nice to include articles on every major topic without having to cut down more
trees than needed. So we should summarize much more often and move more
detailed text to daughter articles (we'd likely only print selected parent
articles in a print version).
Readers also very often have limited time and/or patience (especially on the
Internet). More condensed treatments should be available to serve those readers
while at the same time links to related articles can provide the detail for
those who need that. By 'more condensed' I mean articles that are in the size
range of a normal college term paper (10-15 printed 8.5x11 pages of prose with
standard font).
-- mav
--- David 'DJ' Hedley <spyders(a)btinternet.com> wrote:
> Unless we're living in a world of 56K internet still, I think 32K could
> become at least 50K. Even articles on albums by Eminem and so forth are
> getting above 32K, so that limit is slowly coming to constrain the growth of
> good articles. If an article grows above that, it should be allowed to grow,
> unless it is obviously repeating itself.
Yeah, let it grow. But at some point some sections will need to be summarized
and the detail put into a daughter article. We need not burden readers with
more detail than is necessary - those that want more detail on the sub topic
covered by that section can skip ahead directly to the main article for that
section.
My own opinion is that almost all articles are probably too long once they
enter the 30 to 45KB range but some topics are so expansive that they are not
too long at even 50KB. At the same time some other topics are so narrowly
focused that they could be considered too detailed even if they don't trigger a
page size warning.
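One reason the raw size warning is a blunt measure is that it counts markup, references, and tables along with prose. Below is a minimal sketch, not any official Wikipedia tool, of estimating the readable prose size of an article from its wikitext; the regexes are simplifications and will miss nested templates and other edge cases.

```python
# A minimal sketch (an assumption, not an official tool) of estimating an
# article's "readable prose" size from wikitext, since the page size warning
# counts raw bytes including markup, references, and tables.
import re

def prose_kb(wikitext: str) -> float:
    """Very rough readable-prose size in KB; simple regexes, not a full parser."""
    text = re.sub(r"\{\{[^{}]*\}\}", "", wikitext)               # drop simple templates
    text = re.sub(r"<ref[^>]*>.*?</ref>", "", text, flags=re.S)  # drop footnotes
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]*)\]\]", r"\1", text)  # keep link labels
    text = re.sub(r"^[=|!*#:].*$", "", text, flags=re.M)         # drop headings/tables/lists
    return len(text.encode("utf-8")) / 1024
```

By a measure like this, an article near the 30-45KB raw threshold may hold far less actual prose than the warning suggests.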
We had a long discussion about this a month ago on the featured article
criteria talk page. The general consensus was against specific numerical limits
while at the same time not having articles be longer than needed to pass the
other FAC criteria. The only outstanding matter is finding wording that
can be agreed upon so that it can be added as a FAC criterion.
Part of good writing is prioritizing what information should be presented up
front to readers vs what can be best dealt with in other related articles. Some
points:
* Readers should have the ability to zoom to the level of detail they need.
Having all the detail on one page does not allow for that.
* Do this by summarizing and providing links to more detailed treatments of
those sub-topics.
* What matters is that Wikipedia has lots of detail on a topic, not that all
of that detail is on the same page.
-- mav
> From: "Andries Krugers Dagneaux" <andrieskd(a)chello.nl>
> Subject: [WikiEN-l] Article size consistency 32k
> To: <wikien-l(a)Wikipedia.org>
>
> I notice that lately, articles on Featured Article Candidates are
> often longer than the recommended size of 32k, sometimes almost twice
> as long, and nobody complains about it anymore.
See http://en.wikipedia.org/wiki/Wikipedia:Article_size for a full
discussion.
It begins:
"Limits on article size are set by a) technical issues, and b)
considerations of readability and organization.
In the past, technical considerations with some now-seldom-used
browsers prompted a firm recommendation that articles be limited to a
maximum size of precisely 32KB. With the advent of section editing, and
the availability of upgrades for the affected browsers, this no longer
applies. Thus, there is presently no firm policy dictating any precise
limit on article length."
There is no law, so there is nothing to enforce. Article size is a
matter of style, taste, and judgement and must be negotiated by
consensus on each individual article.
If you think an article is too long, try to convince your colleagues
(for example, by reminding them of how long it takes you to download
the article, if that is an issue for you).
--
Daniel P. B. Smith, dpbsmith(a)verizon.net
Magnus Manske (magnus.manske(a)web.de) [050521 07:42]:
> and shall we start discussing what aspects need rating?
> I just started
> http://meta.wikimedia.org/wiki/En_validation_topics
Ah good! I just mentioned the validation feature on wikien-l. People will
want to go to that page.
- d.
Brion's switched it on, and Magnus' article validation feature is now
running on http://test.leuksman.com/ - the MediaWiki 1.5 test wiki.
So far it only works in Monobook, but that's enough to get the idea.
Basically, you can quality-rate a given revision of an article on various
criteria. There's a tab at the top marked "Validate".
Create an account and play with it. Then think hard about how to implement
this on en: and what criteria you think would be good.
The plan for now is just to gather data, not release it, then release it in
one go and see what sense people can make of it. Then we can decide what to
actually do with the numbers.
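To make the gather-then-release plan concrete, here is a hypothetical sketch of what aggregating the collected data might look like: per-revision ratings on several criteria, averaged once enough data is in. The schema and criterion names are invented for illustration and are not the feature's actual format.

```python
# Hypothetical sketch of aggregating validation data: per-revision ratings
# on several criteria, averaged after the gathering phase.
# The schema and criterion names here are invented, not the real format.
from collections import defaultdict
from statistics import mean

ratings = [  # (article, revision_id, criterion, score 1-5)
    ("Eminem", 101, "accuracy", 4),
    ("Eminem", 101, "style", 3),
    ("Eminem", 101, "accuracy", 5),
]

by_key = defaultdict(list)
for article, rev, criterion, score in ratings:
    by_key[(article, rev, criterion)].append(score)

summary = {key: mean(scores) for key, scores in by_key.items()}
print(summary[("Eminem", 101, "accuracy")])  # 4.5
```

Releasing only aggregates like these in one go, rather than raw votes as they arrive, avoids people gaming the numbers mid-collection.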
- d.
Stirling Newberry (stirling.newberry(a)xigenics.net) [050519 03:48]:
> The size problem will be solved by a distribution system, probably one
> that includes rating of articles. The local geography articles will get
> low ratings and thus not be in the "short" version of wikipedia.
Yes. VFD is supposed to be an immune system, to keep out the *shit*. It's
far too blunt a tool to try to use as a general quality-control system,
which is why people get so very upset when articles that are NPOV and
verifiable get nominated for VFD on grounds that are, per the deletion
policy, spurious.
(pointing to wikien-l - this is really en: related)
- d.