Jason has made a hotel reservation for Friday night in San Diego, on the
assumption that he'll head down there Friday to decommission
geoffrin and gunther and pliny and whatever else.
He's most likely going to send geoffrin back to Penguin for them to
either repair or replace it. He's sending gunther (which is on loan from
Bomis) to me so that we can start migrating some Bomis stuff here.
And Bomis is going to borrow whatever else belongs to Wikimedia for a
little bit before sending it along, also as part of that migration.
Should I ask him to wait? Can we get moved by then?
--Jimbo
Sheldon and Geoff suggest:
> > But anons and new users do have a tendency to save without first
> > previewing. I would therefore support the removal of the Save button
> > from the initial edit screens presented to anons. Only after they
> > previewed once would they be presented with the save button.
> That is such a simple & brilliant solution that I wonder why no
> one else has thought of it until now. Such a change would not be a
> serious barrier to someone who wanted to make the contribution
> without logging in: the forum on Linux Weekly News requires a preview
> step for its members to submit input, & I'd say that place has a high
> signal-to-noise ratio. It would also encourage the creation of more
> accounts, which would draw more people in.
Developers, please hide the "Save" button for newbies (i.e. anonymous IP
users and recently registered accounts) -- thus requiring them to PREVIEW
each page before submitting it. (Un-hide the button on the preview page,
of course. :-)
We can try the experiment for a couple of weeks, and see if it has good
effects. What do you say?
Ed Poor
Wikien-l Admin
Hi,
I just tried to go to test.wikipedia.org, and I was forwarded to
http://larousse.wikimedia.org/wiki/Main_Page and received the following
error message:
ERROR
The requested URL could not be retrieved
While trying to retrieve the URL:
http://larousse.wikimedia.org/wiki/Main_Page
The following error was encountered:
* Access Denied.
Access control configuration prevents your request from
being allowed at this time. Please contact your service
provider if you feel this is incorrect.
Your cache administrator is webmaster.
Generated Mon, 09 Feb 2004 09:32:33 GMT by wikipedia.org
(squid/2.5.STABLE4)
Thanks,
Timwi
Hi,
I'm having the following problem. I know it's not a configuration
problem because the same machine can access Wikipedia using Lynx.
I'm trying to access Wikipedia using the Perl module LWP::UserAgent:
perl -e 'use LWP::UserAgent; use HTTP::Request;
my $ua = LWP::UserAgent->new;
my $request = HTTP::Request->new (GET =>
"http://en.wikipedia.org/");
my $response = $ua->request ($request);
print $response->content;
'
The response I get is an HTML document which I've plaintexted here for
your viewing pleasure:
ERROR
The requested URL could not be retrieved
While trying to retrieve the URL: http://en.wikipedia.org/
The following error was encountered:
* Access Denied.
Access control configuration prevents your request from
being allowed at this time. Please contact your service provider
if you feel this is incorrect.
Your cache administrator is webmaster.
Generated Mon, 09 Feb 2004 22:27:15 GMT by wikipedia.org
(squid/2.5.STABLE4)
Looks like the same error message as the one I get on
test.wikipedia.org, except that this one occurs only when I use
LWP::UserAgent.
Any ideas?
Thanks,
Timwi
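(A guess at the culprit, sketched under the assumption that the squid is
filtering on the User-Agent header: LWP::UserAgent identifies itself as
"libwww-perl/x.xx" by default, and proxy ACLs commonly block exactly that
string, which would explain why Lynx gets through but LWP doesn't. Calling
$ua->agent("...") with some other string before the request should work
around it. The same idea, shown self-contained in Python rather than Perl;
the agent string is a made-up example:)

```python
import urllib.request

# Build the same GET request, but with an explicit User-Agent header
# instead of the library's default -- the equivalent of calling
# $ua->agent("...") on an LWP::UserAgent object in Perl.
# The agent string below is a hypothetical example, not a recommendation.
req = urllib.request.Request(
    "http://en.wikipedia.org/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; ExampleFetcher/0.1)"},
)

# urllib.request.urlopen(req) would now send the custom header; we just
# print what would go on the wire instead of actually fetching the page.
# (urllib stores header keys capitalized, hence "User-agent".)
print(req.get_header("User-agent"))
```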
I've been reading around a little bit more on some of the other
MediaWiki-powered websites that are out there, since I've been starting
a new wiki with a few other people for "Star Trek"-related topics,
Memory Alpha. So far, we've frequently been copying the documentation
pages from Wikipedia (for example, [[Wikipedia:Help]] and
[[Wikipedia:How_to_start_a_page]]) -- and we've kept the proper
attributions, of course!
However, after some discussion among ourselves and with a couple of
Wikipedia contributors on the side, we decided a month or so ago to
convert Memory Alpha to a Creative Commons License instead. It was my
understanding that this was compatible with the Wikipedia GFDL, but
then I read a WikiTravel page which suggests otherwise
(http://www.wikitravel.org/en/article/Wikitravel:Why_Wikitravel_isn%27t_GFDL).
What options are currently available, or could be made available, given
Wikipedia's current copyright terms and the likelihood that other
MediaWiki-powered websites will use different licenses but still want
to share the same documentation of the site's functionality?
(I apologize if this isn't entirely the proper mailing list to post
this message on, but I feel it could/should be addressed here because
it's a good idea to provide some documentation that can be available
for everyone who uses the software -- and you guys make the software!)
Thanks,
Dan Carlson, Administrator
Memory Alpha: A Star Trek WikiWiki
http://memoryalpha.st-minutiae.com/
Suda's got a full database up and seems to be maintaining
synchronization with the master. Logins for the developers should be
set up, and most of the infrastructure is there or ready to be put in
place.
Unless anyone's got a better idea I'm planning to switch the master
database server and the web servers to Florida on Monday night /
Tuesday morning. I'll be off work Tuesday so I can sit and tweak things
all day if necessary.
We should be able to switch the squid cache on larousse to point at the
new web servers and make it transparent to end users except for a few
minutes of downtime as the database is reconfigured from slave to
master. Once we're sure it's working, we can finish setting up the DNS
to point at the new squids in Florida and they'll update when they
update.
-- brion vibber (brion @ pobox.com)
So, I've been thinking about the compressed storage of old versions of
MediaWiki articles, and I have some related ideas. I'll probably
scribble them on meta soon; I wanted to float them here first.
Some (most?) version control systems use a reverse diff storage
system to keep storage requirements down to a minimum, at the expense
of retrieval time for old versions.
The idea is pretty simple: you always keep a full copy of the current
version of a file -- let's call it version N. But instead of keeping a
full copy of version N - 1, you just keep the difference between that
copy and version N. Similarly, for version N - 2, you keep the
difference between N - 2 and N - 1.
If differences between versions are significantly smaller than the
versions themselves, there's a great reduction in storage requirements
for the entire system.
To generate a full copy of version N - 2, you have to first retrieve
version N, then the diff between version N - 1 and version N, and
apply that patch to make version N - 1, then retrieve the diff between
version N - 2 and version N - 1, and apply that patch to make version
N - 2. The idea, though, is that you retrieve old versions much, much
less frequently than the current version.
You could conceivably store every old version as a diff against the
current version -- making retrieval faster -- but that makes saving
slower, since you have to recompute all the diffs each time you save.
Another optimization is keeping an LRU cache of old versions that have
been retrieved.
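The scheme above can be sketched in a few lines (a toy Python
illustration, not MediaWiki's actual storage code -- the version strings
and function names here are made up): keep the full text of the current
version plus a chain of reverse diffs, and walk the chain to reconstruct
older versions; retrieval is then memoized with an LRU cache as
suggested.

```python
import difflib
from functools import lru_cache

def make_rdiff(new_text, old_text):
    """Opcode list that rebuilds old_text from new_text (a reverse diff)."""
    sm = difflib.SequenceMatcher(a=new_text, b=old_text)
    ops = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))            # reuse a slice of the newer text
        else:
            ops.append(("data", old_text[j1:j2]))   # literal text from the older version
    return ops

def apply_rdiff(new_text, ops):
    """Apply a reverse diff to the newer text, yielding the older version."""
    parts = []
    for op in ops:
        if op[0] == "copy":
            parts.append(new_text[op[1]:op[2]])
        else:
            parts.append(op[1])
    return "".join(parts)

# Toy history: version N is stored in full, older versions only as diffs.
versions = ["first draft",
            "first draft, with a second sentence",
            "final text, heavily rewritten"]
current = versions[-1]
# rdiffs[0] turns version N into N-1, rdiffs[1] turns N-1 into N-2, ...
rdiffs = [make_rdiff(versions[i + 1], versions[i])
          for i in range(len(versions) - 2, -1, -1)]

@lru_cache(maxsize=32)          # the LRU cache of reconstructed old versions
def get_version(steps_back):
    """Reconstruct the version steps_back revisions before the current one."""
    text = current
    for ops in rdiffs[:steps_back]:
        text = apply_rdiff(text, ops)
    return text
```

Note that reconstructing version N - k costs k patch applications, which is
exactly the time-for-space trade described above.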
Anyway, I think this would make storage much smaller for MediaWiki
databases. Even if compression reduces old versions to, say, 10% of
their original size, storing a diff that's just 1% of the old version's
size is a much bigger improvement. If storage is at a premium, you could
even compress the diffs... although this probably wouldn't pay off for
very small ones.
~ESP
--
Evan Prodromou <evan(a)wikitravel.org>
Wikitravel - http://www.wikitravel.org/
The free, complete, up-to-date and reliable world-wide travel guide
1. I did get the RAID array up and running. RAID-5, 3x36G, for a 72G
logical volume and real usable space of a bit less than that. Should
be fast and safe, hoorah!
The bad news is, I either inevitably or stupidly (can't decide which
just yet) managed to wipe the existing hard drive in the process.
Almost surely someone a bit more clueful than me could have avoided
that, but I will say that I did my best and chose an option for raid
creation that promised me no data loss. I probably screwed something
up.
So, if a lot of work had been done prepping that machine, then I owe
apologies. I had to reinstall RH9 from scratch.
2. I ran out of time, so only some of the machines are on the power
port. The big issue with the power port is that there are now a *lot*
of wires in the cage, and I am trying to keep the whole thing neat and
professional, for my future sanity, and so it's a bit slow to feed the
wires just so. This is a minor issue, though, since (a) we do have
the ability to phone in for a manual reboot if necessary and (b) I'll
fix it at the earliest opportunity, i.e. early next week.
In my opinion, the db server probably needs an ssh upgrade, but from a
hardware perspective we're ready to roll. All the machines are looking
good.
--Jimbo
Hi,
Since nobody answered my question, I went ahead and committed my change
(which removes self-links and displays the link text in bold instead),
but I have no real idea which branch I committed it to.
If I shouldn't have done this, then hopefully someone will tell me now
what I should have done instead.
Greetings,
Timwi