Currently we are using a 16 port 10/100 switch on loan from (and
shared with) Bomis. It is full. I have here another 8 port 10/100
switch that I can use when the new servers arrive, but sooner or
later I will want to buy a wikipedia-only switch.
We currently have 9 servers live, and we will soon have 5 more. At
some point, Jason will send me pliny and larousse, and we will
probably use them for something as well. (I think Wikipedia owns
three "old" machines? I would have to review email to see which of
those I said we gave to the foundation versus loaned.)
So that's maybe 17 ports plus the port for the remote power
controller. Anyway, a 16 port will not be enough.
A 24 port would be enough for now, but at our current rate of growth,
it seems very likely to me that a 32 port would be needed as well.
I am interested in getting feedback on the issue of 10/100 versus
gigabit. The answer of which to get depends, of course, on the total
price -- if the price difference is large, we can be happy with
10/100. If the price difference is small, we might as well get the
gigabit.
--Jimbo
>-----Original Message-----
>From: wikitech-l-bounces(a)Wikipedia.org
>[mailto:wikitech-l-bounces@Wikipedia.org] On behalf of Jimmy Wales
>Sent: Thursday, 6 May 2004 23:37
>To: Wikimedia developers
>Subject: Re: [Wikitech-l] Switch query
>
>
>My research reveals that 24 port gigabit switches are available
>for under $600. Managed switches are a bit more, but still under $800.
>
>That seems worthwhile to me.
Go for it then :) I thought it was more expensive.
Shaihulud
On account of various troubles, most particularly rejection of all mail
to wikipl-l for the last couple weeks, I've upgraded mailman from 2.1.2
to 2.1.5rc2. It seems to work so far, but give a holler if it does
something unexpected.
-- brion vibber (brion @ pobox.com)
In the last 20 minutes or so, this error pops up everywhere on test:
A database query syntax error has occurred. This could be because of an
illegal search query (see Searching TestWikipedia), or it may indicate a
bug in the software. The last attempted database query was:
REPLACE INTO linkscc(lcc_pageid,lcc_cacheobj) VALUES(14312,
'x^ŽM‚@†ÿËÜGûœ=ePºÆ¦C.®.¸Kñ¿7¶`бëÃó~´#°¦kJ]Ö
´¥ÁO¨=Xíý{P†PyÂDàɹê\"²¥ \'×y¯ ‡{ç‚Æu úŠ\0—¦ ÆØ—ëo2¡AèFè¾
æÅ q[ýäy$ªŸñkÏGc-W³ž -lõç+Lcü÷ÙøFVi')
from within function "". MySQL returned error "1205: Lock wait timeout
exceeded; Try restarting transaction".
Any ideas?
--
Gabriel Wicke
If the real name field really is something that Wikipedians want, so
that it will be activated in MediaWiki 1.3, could it be moved nearer
the bottom of the page, instead of being the first thing asked for in
the preferences?
Also, could we indicate that this field is optional?
Anthere
To make sure the markup the MediaWiki parser produces is valid
xhtml/xml, I've added a tidy (http://tidy.sf.net) function to
Parser.php that pipes the generated body html through the tidy tool as
a last step. The function is disabled by default; set $wgUseTidy =
true in LocalSettings to enable it. You can test the output at
http://test.wikipedia.org.
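Concretely, the switch is a single line (the variable name is from the
announcement; the rest of LocalSettings.php stays as in your existing
install):

```php
<?php
// LocalSettings.php: pipe the parser's html output through tidy.
// Disabled by default; this one line turns it on.
$wgUseTidy = true;
```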
I've done some benchmarks using ab with tidy enabled vs. disabled; the
performance impact seems to be negligible.
PHP5 has tidy built in as a module (see
http://www.php.net/manual/en/ref.tidy.php). It might be possible to
improve the overall parser performance with the PHP5 version, as it
seems possible to tell it which tags to allow (currently done in PHP).
--
Gabriel Wicke
How can I concatenate the data split into files?
For example, the old table of de.wikipedia.org was split into
files xaa (part one) and xab (part two).
I concatenated the files with cat:
cat xaa xab > 20040501_old_table.sql.bz2
bunzip2 doesn't uncompress the generated file. I checked the md5 sum:
md5sum 20040501_old_table.sql.bz2 => 7669369a13df1b48ff4cad5c02a5c0fd
Wikipedia's md5 sum is e96a44216a3fa546d2299647f6f6eb45
The same happens with the English data (en.wikipedia.org).
Does somebody know how to concatenate the files?
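For comparison, here is the round trip I would expect to work, tried
on a small throwaway file (the file names are just examples); if the
pieces are complete and concatenated in order, the reassembled archive
should match the original checksum and pass bunzip2's integrity test:

```shell
# Make a tiny bzip2 archive, split it into pieces, and rejoin them.
printf 'CREATE TABLE old (id INT);\n' > sample.sql
bzip2 -f sample.sql                  # produces sample.sql.bz2
md5sum sample.sql.bz2                # checksum of the intact archive
split -b 16 sample.sql.bz2 x         # produces pieces xaa, xab, ...
cat x?? > rejoined.sql.bz2           # glob expands in sorted (correct) order
md5sum rejoined.sql.bz2              # should match the checksum above
bunzip2 -t rejoined.sql.bz2          # -t tests integrity without extracting
```

A checksum mismatch after this procedure would point at an incomplete
or corrupted download of one of the pieces rather than at cat itself.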
Thanks!
Luciana
I've commented out the following lines from LanguageWiktionaryPl.php:
# 10 => "Indeks",
# 11 => "Dyskusja_indeksu",
# 12 => "Aneks",
# 13 => "Dyskusja_aneksu"
since a) they caused the hundred+ pages with 'Indeks:' and 'Aneks:'
prefixes to be completely inaccessible and b) the numbers conflict with
the Template and Help namespaces in MediaWiki 1.3.
In the future, we really should have two things:
* An automated script for migrating pages safely from pseudo-namespaces
into newly created namespaces
* A safe 'user namespace' area where specific MediaWikis can set up
local namespaces that won't conflict with MediaWiki upgrades.
-- brion vibber (brion @ pobox.com)
I've installed MediaWiki on an ISP's servers and had no trouble. Recently
I built a RedHat box for an internal wiki at work, and URLs of the form
http://internal.domain/index.php/Main_Page
result in a 404 error. I've applied the suggested RewriteEngine rules
specified at http://meta.wikipedia.org/wiki/Apache_config#mod_rewrite to
a .htaccess file where index.php is, but that didn't help. Any suggestions?
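One cause I have seen for exactly this symptom (an assumption on my
part, since the meta page only covers mod_rewrite): Apache 2 can
refuse the trailing path-info part of URLs like /index.php/Main_Page
for some handler configurations, returning a 404 before any rewrite
rules are consulted. A minimal .htaccess sketch to test, placed
alongside index.php:

```apache
# Hypothetical .htaccess next to index.php.
# Allow PATH_INFO-style URLs (/index.php/Main_Page) on Apache 2.
AcceptPathInfo On
```

This requires AllowOverride to permit FileInfo directives; if the 404
persists, checking whether PHP runs as CGI on that box would be my
next step.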