Pakaran made a comment on IRC which really makes a lot of sense to me.
Why couldn't bureaucrats be allowed to grant bot status on their local
projects? They are more likely to know the people as well as the rules.
And generally, it is not a sensitive issue such as removing sysop status
or granting CheckUser. It is not easily abused either.
Of course, it still makes sense to ask stewards for help with global bots.
But frankly, what about giving bureaucrats the ability to give users a
bot flag?
Anthere
Some sort of massive network outage at Level3 has thrown the net into a
bit of chaos and generally broken Wikipedia for the last couple of
hours. We're now at least mostly accessible, but still have
connectivity problems between Florida and Amsterdam, which are breaking
the European squid service.
Hopefully this will get resolved soon.
-- brion vibber (brion @ pobox.com)
I just wanted to announce the release of two new widgets which make our
wikis run much more smoothly.
http://sourceforge.net/project/mediawiki-agora
The categoryeditor widget allows users to view the existing categories
while editing the page, and automatically inserts/removes the Category
tags into the text area when the add/remove buttons are clicked.
http://sourceforge.net/project/screenshots.php?group_id=130663&ssid=20074
The categorysearch special page allows users to query the db for any/all
articles associated with those categories.
http://sourceforge.net/project/screenshots.php?group_id=130663&ssid=20075
Neither of these widgets is very useful for Wikipedia itself, since it
has far too many categories and the in-out widget does not currently
scale that well, but for small-to-medium-sized wikis we have found
these widgets to be very helpful.
Is this the best place to announce these tools?
BTW - the mediawiki-agora is a SourceForge project modeled on the Plone
Collective. It is intended as a repository for an assortment of useful
MediaWiki extensions that have no permanent home elsewhere. If anyone
else has any projects they would like to throw into the mix, please
contact me and I will add you as a developer.
Enjoy,
/Jonah
Good afternoon,
I would like to know whether it is possible to obtain the source code
of the Wikimedia Commons project?
If so, please indicate on which site I can find it.
I would also like to request, if possible, that messages be sent to me
in my language, which is Brazilian Portuguese.
--
Regards,
Lidiane Cristina
After a very informative exchange with Tim Starling, I've thought a bit
more about my proposal last night about making Wikipedia cacheable,
which in the light of day seems excessively complex. Here's a simpler
version:
At the moment, according to Tim, all Wikipedia pages are served as
uncacheable pages, thus preventing any intermediate proxy caches from
caching them -- a quick packet dump shows that they are served with
Cache-Control: private, s-maxage=0, max-age=0, must-revalidate
whether or not I am logged in. Clearly, if Wikipedia content were
cacheable, there would be massive bandwidth gains, but the current
policy is designed to prevent caches serving out-of-date content, or the
same page to anons and logged-in users.
I'd be interested in the effects if we were to serve most ordinary pages
with
Cache-Control: public, must-revalidate
with (say) a max-age of a week, except for the three following
exceptions which need different or rapidly-changing data (I'll call it
"dynamic content") served to different users for the same URL:
(a) pages for logged-in users
(b) pages for anon users who have a pending message
(c) pages with auto-generated dynamic content (Special: pages, and any
others with similar behaviour)
which would be served with the anti-caching cache control header as before.
Since all pages would be must-revalidate, the Wikimedia cluster would
still get a conditional GET request per hit, so that it could check
freshness, then decide which header to generate, based on source IP and
any user cookies. The twist would be that the page would be reported as
outdated by the server if _either_ it had been changed since the cache
stored it, _or_ dynamic content was needed, thus serving the desired
dynamic content to those users who need it, whilst preventing that
content from being cached for other users.
Since 95%+ of all hits are presumably from anons without pending
messages, this should, in an ideal world, result in a very large number
of pages being successfully served by hits on ISPs' proxy caches,
without stopping dynamic content from being served to those users who
need it, or affecting the freshness of pages for anons.
The hit rate would not be quite as high as possible, since every hit
from a dynamic-content user would "wash out" any static version of the
page in question from the cache, but since these users would only
account for about 1 in 20 of page accesses, the remaining 19 out of 20
times there will still be a hit.
I'd be interested to hear what others think. Is there an obvious flaw in
my reasoning? Is this worth a try?
-- Neil
-------------------------------------------------------------
Pseudo-code:
if (logged_in_user) or (user_has_messages) or (special_page):
    say has changed; serve with Cache-Control: private, must-revalidate
else:
    if (modification_date > if_modified_since_date):
        say has changed; serve with Cache-Control: public, must-revalidate
    else:
        say has not changed
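The decision above can be sketched in Python. The function and field
names are hypothetical stand-ins for illustration, not actual MediaWiki
code; the logic simply mirrors the pseudo-code.

```python
# Sketch of the proposed cache-header decision for a conditional GET.
# All names here are illustrative, not taken from MediaWiki.

def cache_response(logged_in, has_messages, special_page,
                   modified_at, if_modified_since):
    """Return (HTTP status, Cache-Control value) for a conditional GET."""
    if logged_in or has_messages or special_page:
        # Dynamic content: always regenerate, and forbid shared caching.
        return 200, "private, must-revalidate"
    if if_modified_since is None or modified_at > if_modified_since:
        # Page changed since the cache stored it: serve a fresh cacheable copy.
        return 200, "public, must-revalidate"
    # Unchanged static page: tell the proxy its copy is still fresh.
    return 304, "public, must-revalidate"

# An anon with no messages requesting an unchanged article gets a 304:
print(cache_response(False, False, False, 100, 200))
# (304, 'public, must-revalidate')
```

A logged-in user requesting the same URL would fall into the first
branch and always receive freshly generated, privately cached content.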
When someone uploaded a 200 megapixel PNG of a fractal to de, it caused
the hard drives on 4 apache servers to fill up, and caused the site to
slow down. The action was not malicious, I hate to think what might have
happened if someone tried to actively exploit it.
I've now limited the thumbnailing code to only attempt to thumbnail
images less than 12.5 megapixels, or about 3500x3500. The problem is
that in ImageMagick's scaling code, the entire image needs to be
decompressed and stored in RAM. For a 200 megapixel image, that means
800 megabytes of working space.
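As a quick sanity check on those figures, assuming 4 bytes per pixel
(RGBA) of decompressed image data -- an assumption implied by the
200 MP to 800 MB ratio above:

```python
# Working-memory estimate for decompressing an image fully into RAM.
# 4 bytes/pixel (RGBA) is an assumption, inferred from the figures above.
BYTES_PER_PIXEL = 4

def working_memory_mb(megapixels):
    # 1e6 pixels * 4 bytes = 4 MB per megapixel
    return megapixels * BYTES_PER_PIXEL

print(working_memory_mb(200))   # 800 MB for the 200-megapixel fractal
print(working_memory_mb(12.5))  # 50.0 MB at the new 12.5 MP limit
```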
The standard JPEG library has the ability to decompress directly to a
thumbnail. ImageMagick uses this feature. So I haven't restricted JPEG
sizes in any way. You'll still be able to upload large PNGs and link to
them with [[media:]] links.
There are probably still a few DoS avenues in the image handling code,
if anyone's really keen to crash the site. This change should at least
take care of the accidental problems.
In case anyone is looking for a fun project, it is theoretically
possible to make small thumbnails of large PNGs with very little working
memory, using libpng's low-level interface. Bicubic interpolation only
needs 4 rows, so non-progressive PNGs could be thumbnailed in a single
pass with 4 rows of working memory. Progressive PNGs could be
thumbnailed using the first few passes.
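A toy sketch of the row-streaming idea, in Python rather than C/libpng,
and with a simple box filter instead of bicubic for brevity: only
`factor` rows are held in memory at a time, regardless of image height.

```python
# Downscale a grayscale image by an integer factor while streaming rows,
# keeping at most `factor` rows in memory. A real implementation would
# pull rows one at a time from libpng's low-level read interface and use
# a 4-row window for bicubic interpolation.

def stream_thumbnail(rows, factor):
    """rows: iterable of equal-length lists of pixel values (0-255)."""
    out = []
    window = []
    for row in rows:
        window.append(row)
        if len(window) == factor:
            width = len(window[0]) // factor
            out_row = []
            for x in range(width):
                # Average the factor x factor block of source pixels.
                block = [window[y][x * factor + dx]
                         for y in range(factor) for dx in range(factor)]
                out_row.append(sum(block) // len(block))
            out.append(out_row)
            window = []  # discard processed rows; memory stays O(factor * width)
    return out

# A 4x4 checkerboard reduced by factor 2 yields a 2x2 thumbnail:
img = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [255, 255, 0, 0],
       [255, 255, 0, 0]]
print(stream_thumbnail(img, 2))  # [[0, 255], [255, 0]]
```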
-- Tim Starling
Hi everybody.
My name is Felipe Ortega. I'm a Telecommunications
Engineer and Associate Professor in the InfoTec
Department of the Alfonso X El Sabio University in
Madrid, Spain.
I'm currently working on a research project for my
Ph.D. program in Computer Science at the Rey Juan
Carlos University. I would like to analyze and work
towards better solutions for massive distribution of
media content among a widespread community of
volunteer users.
I think Wikimedia projects are a very good example of
initiatives that could benefit from this research. To
start with, I would like to concentrate on the
specific problem of massive distribution of, let's
say, Wikipedia content. Every volunteer could
contribute a bit of disk space in order to build some
kind of grid computing infrastructure for global
access to its contents.
To start this work, I would be very grateful if you
could send me as much information as you can about
the present Wikipedia server configuration. I've
retrieved a lot of data from meta.wikimedia.org, and
I've updated the overall system scheme, but I haven't
found anything about important details like, for
example, the Squid cache configuration and content
load balancing.
Thanks a lot for your help.
Regards,
Felipe Ortega.
I am unable to locate any code that would allow me to create an
internal link represented by an internal image. I am also unable to
disable the wiki code to allow plain HTML, which would circumvent this
problem, as nowiki does not work on this code. Here is an example in
plain HTML which I would like to implement in my wiki:
<a href="[[Internal Link]]"> <img src="[[Image:Internal_Image]]"> </a>
I just want to be able to click on a JPEG image in my wiki and have it
take me to a specific wiki entry, like a menu. Internal image, internal
link: why is this an issue? Further, I cannot understand why the link
and image syntax help pages not only fail to explain how to do this,
but do not even mention whether it is possible.
Help.
Best Regards, Steve