I recently bookmarked Wikipedia at my local library. They have brand new
terminals with TFT screens. The resolution is fairly low.
Sadly, the main page of Wikipedia suffers from several big problems:
* it's overloaded. There is too much going on and no clear indication of
what can be ignored and what must be read
* the page title comes nearly halfway down the screen. The newcomer just
sees a lot of "scary" links and form elements before any sort of content.
Compared to, say, a Certain Non-Free Encyclopedia Beginning With B...
well, we suck. Show both to the average library visitor who just
wants to look something up -- he'll go for the Brit.
Someone posted a really nice new layout some time ago -- what's the
status on that?
Does it need more work before it's ready? I'm happy to help; I have some
free time at the moment. My only problem, still, is getting CVS access.
(anyone have the link?)
I think we need to consider moving most of the top bar stuff down to the
sidebar. (For example, the streamlining I did here:
http://wiki.beyondunreal.com/ )
Of course, the fact that the language links are at the top is a key
argument in favour of keeping the status quo regarding the multilingual
portal. (i.e., people who don't read English see their language
immediately). And so the Domino rally of committee debate rolls on...
-- tarquin
>Hi,
>
>I get
>
>Host 'larousse.wikipedia.org' is blocked because of many
>connection errors. Unblock with 'mysqladmin flush-hosts'
>
>error messages in about 2 out of 3 tries to contact Wikipedia.
>
> JeLuF
Brion goes on vacation and everything starts to fall apart. First order of
business of the Wikimedia Foundation is to set up a fund to clone Brion. :-)
--- mav
The last batch of fun upgrades included use of gzip compression on
cached pages where browsers accept it. I'm happy to report that this
seems to have decreased the bandwidth usage of the English-language
Wikipedia by up to roughly 25%.
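For anyone curious, the mechanism is roughly the following (a minimal sketch, not the actual wiki code; the helper names and file layout are invented for illustration). The page is gzipped once when it is written to the file cache, and the compressed copy is sent only when the browser's Accept-Encoding header says it can handle gzip:

<?php
// Minimal sketch of gzip-at-cache-time; names and paths are invented.

function clientAcceptsGzip() {
    return isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false;
}

// Compress once, when the rendered page is saved to the cache, so the
// CPU cost is paid per edit rather than per page view.
function saveToCache($file, $html) {
    file_put_contents($file, $html);
    file_put_contents($file . '.gz', gzencode($html, 9));
}

// On a cache hit, send the precompressed copy if the browser accepts it.
function serveFromCache($file) {
    if (clientAcceptsGzip() && file_exists($file . '.gz')) {
        header('Content-Encoding: gzip');
        readfile($file . '.gz');
    } else {
        readfile($file);
    }
}
?>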
Data from http://www.wikipedia.org/stats/usage_200306.html
kilobytes / hits = kb/hit
2003-06-01 6392951 / 619534 = 10.319 \
2003-06-02 7908928 / 793065 = 9.973 |
2003-06-03 8267879 / 822025 = 10.058 | mean 10.1
2003-06-04 7513917 / 755482 = 9.946 | range 0.37
2003-06-05 7347843 / 723717 = 10.153 |
2003-06-06 6300476 / 614552 = 10.252 |
2003-06-07 5159151 / 503097 = 10.255 |
2003-06-08 5732741 / 566484 = 10.120 /
-- gzip cache activated: --
2003-06-09 5376987 / 726971 = 7.396 \
2003-06-10 5442685 / 732897 = 7.426 | mean 7.6
2003-06-11 5735325 / 765204 = 7.495 | range 0.85
2003-06-12 6362049 / 772002 = 8.241 /
These counts include pages, images, css, everything. (But not the other
languages, mailing list, or database dump downloads.)
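As a quick check on the 25% figure above: going from a mean of 10.1 KB per hit to 7.6 KB per hit is a drop of (10.1 - 7.6) / 10.1 ≈ 0.25, i.e. about 25% fewer bytes served per hit.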
The bandwidth usage did go up a bit today, so it remains to be seen just
how stable the effect is. A number of things can affect it:
* Since caching, and thus sending of gzipped cached pages, is currently
only done for anonymous users, an increase in activity by registered
users would tend to reduce the overall percentage of savings
* Lots of editing and loading of dynamic pages, which are not
compressed, would do the same
* A large increase in brief visits by newbies drawn in by a link or news
mention would increase the relative bandwidth used by linked items
(logo, style sheet, etc) which are not additionally compressed
* Lots of work with new images might increase bandwidth usage.
Other thoughts:
- So far, no one's complained about being unable to read pages due to
pages being sent compressed when they shouldn't be. (There was in fact a
bug to this effect which made the wiki unusable with Safari, but I don't
know if anyone but me noticed before I fixed it. :)
- Since gzipping is done only at cache time, this should use very little
CPU. IIRC the gzip was one of the faster steps when I was test profiling
this ;) and the number of times gzipping is done should not generally
exceed the number of edits * some factor regarding creation/deletion
rates and number of links per page. (More of course when the cache has
to be regenerated en masse.)
- The page cache presently stores both uncompressed and compressed
copies of pages, which is space-inefficient, though we're not presently
hurting for space on larousse. Someone suggested storing just the
compressed pages; then, in the relatively rare case that a browser won't
accept gzipped pages, we can decompress them on the fly (see the sketch
below).
- We could offer, either as the default or as an option, compression of
dynamically generated pages as well, which could shave a few more
percentage points off the bandwidth usage. It might be a help for the
modem folks who do log in. :) However, I'm not sure how much this would
affect CPU usage; in any case there's no urgency for this -- it's just
something we might do if we have the cycles to burn (I don't think we do
just now, but we might one day).
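To make the last two points concrete, here is a rough sketch of the store-only-compressed idea (again not real code; the cache-path handling is invented): keep only the .gz copy, send it verbatim to browsers that accept gzip, and decompress on the fly for the rest.

<?php
// Sketch: the file cache holds only the gzipped copy of each page.
// $gzFile stands in for however the cache actually names its files.
function serveCachedPage($gzFile) {
    $acceptsGzip = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
        && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false;

    if ($acceptsGzip) {
        // Common case: ship the stored .gz bytes untouched.
        header('Content-Encoding: gzip');
        readfile($gzFile);
    } else {
        // Rare case: readgzfile() decompresses the file and outputs it.
        readgzfile($gzFile);
    }
}
?>

For the dynamic-page case, PHP's ob_gzhandler output-buffer callback already does the Accept-Encoding negotiation and compression, so ob_start('ob_gzhandler') would probably be the cheapest thing to try, with the per-request CPU cost noted above.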
-- brion vibber (brion @ pobox.com)
I rebooted both *.197 and *.199, mostly because I'm clueless these
days.
I don't know the password for the root mysql account, for example. Well,
I used to know it.
This is the second time today that the same thing has happened. Any clues
as to what the problem is?
> >Hi,
> >
> >I get
> >
> >Host 'larousse.wikipedia.org' is blocked because of many
> >connection errors. Unblock with 'mysqladmin flush-hosts'
> >
> >error messages in about 2 out of 3 tries to contact Wikipedia.
> >
> > JeLuF
>
>Brion goes on vacation and everything starts to fall apart. First order of
>business of the Wikimedia Foundation is to set up a fund to clone Brion.
>:-)
>
>--- mav
>
I'm having the same problem. I got on in three of ten attempts. When I got
on, I tried to save but got the message above. When I finally got on again,
I found my change had been saved. I tried again to work on an article, saved,
and again got the message. But this time, when I finally got on a third time,
the second save had not gone through. And when I tried to mail the wiki list
I got error messages there too.
JT
When I mailed wikidown(a)wikipedia.org about the current problems on the
server, I got the following (note that the mail went to pliny, although
Larousse is the server reporting problems):
The original message was received at Fri, 20 Jun 2003 16:19:39 -0500
from 210-86-88-89.jetstream.xtra.co.nz [210.86.88.89]
----- The following addresses had permanent fatal errors -----
<wikidown(a)wikipedia.org>
(reason: 550 5.1.1 <wikidown(a)wikipedia.org>... User unknown)
----- Transcript of session follows -----
... while talking to pliny.wikipedia.org.:
>>> RCPT To:<wikidown(a)wikipedia.org>
<<< 550 5.1.1 <wikidown(a)wikipedia.org>... User unknown
550 5.1.1 <wikidown(a)wikipedia.org>... User unknown
Reporting-MTA: dns; pollux.host4u.net
Received-From-MTA: DNS; 210-86-88-89.jetstream.xtra.co.nz
Arrival-Date: Fri, 20 Jun 2003 16:19:39 -0500
Final-Recipient: RFC822; wikidown(a)wikipedia.org
Action: failed
Status: 5.1.1
Remote-MTA: DNS; pliny.wikipedia.org
Diagnostic-Code: SMTP; 550 5.1.1 <wikidown(a)wikipedia.org>... User unknown
Last-Attempt-Date: Fri, 20 Jun 2003 16:19:46 -0500
--
Richard Grevers
Eschew obfuscation.
I just got the following message:
Could not connect to DB on 130.94.122.197
Host 'larousse.wikipedia.org' is blocked because of many connection errors.
Unblock with 'mysqladmin flush-hosts'
If this error persists after reloading and clearing your browser cache,
please notify the Wikipedia developers.
--
Richard Grevers
Eschew obfuscation.
Hi,
I get
Host 'larousse.wikipedia.org' is blocked because of many connection errors. Unblock with 'mysqladmin flush-hosts'
error messages in about 2 out of 3 tries to contact Wikipedia.
JeLuF
The error message given is the database syntax error message:
A syntax error has occurred in a database query. This may be
due to an illegal search query (see Buscando en Wikipedia), or
it may indicate a bug in the software. The last query attempted
was:
SELECT cur_id,cur_namespace,cur_title,cur_text FROM
cur,searchindex WHERE cur_id=si_page AND ( (MATCH
(si_title) AGAINST ('astronomo')) ) AND cur_namespace
IN (0) LIMIT 0, 20
The MySQL error returned was "1016: Can't open
file: 'searchindex.MYI'. (errno: 145)".
AstroNomer
In our current setup, several parts of the code are making changes to
CSS to create space in the layout for the Quickbar:
* skin.php line 242 -- puts an empty TD in both the top and bottom tables
* skinstandard.php line 40 -- sets the width of div.article
It would be much simpler to have only skinstandard.php set the margin
on div.content, since that contains both tables.
That way we lose the daft empty spacer cells in both the header and
footer tables.
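Roughly what I have in mind, as a sketch only (the width value is a placeholder, and this is not the real skinstandard.php code):

<?php
// Sketch: a single margin rule on div.content replaces both the empty
// spacer TDs and the explicit width on div.article. $quickbarWidth is a
// placeholder for whatever width the skin actually reserves.
function quickbarSpaceRule($quickbarOnLeft, $quickbarWidth = '152px') {
    $side = $quickbarOnLeft ? 'left' : 'right';
    // div.content wraps the header table, the article and the footer
    // table, so one margin clears room for the Quickbar everywhere.
    return "div.content { margin-{$side}: {$quickbarWidth}; }\n";
}
?>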
Thoughts before I make this change?