Sorry if this message has already been sent; I can't see it in the
gmane interface.
Hello
I apologise if this question has already been asked or is
elementary, but is there any possibility that the discussions
about a specific topic could be mirrored in a mailing list or
newsgroup? For example http://en.wikipedia.org/wiki/Talk:Sumer
It seems that I can contribute to that discussion after logging
in, but I would prefer a more familiar interface, such as the Emacs
editor. If such discussions were mirrored in a mailing list, I could
use that editor for this purpose.
I see, for example, the daily-article mailing list, which is mirrored
in gmane as gmane.org:gmane.science.linguistics.wikipedia, but no
responses seem to be posted there.
Uwe Brauer
Moin,
I haven't posted for a long time, but some of you may remember my <graph>
extension, which generates HTML and ASCII output. Over time the software
has evolved quite a bit, and I am now proud to announce a new release.
(Graph::Easy 0.30 and wikimedia-graph 0.12 are the latest versions.)
It can now generate HTML, ASCII, SVG and graphviz code. The latter
allows .png, .ps etc. output via external programs like dot.
You can now also specify the output format in the graph code itself,
forcing a particular graph to render as ASCII, HTML etc. from the wiki
page itself.
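As a rough sketch of what that could look like on a wiki page (the node and
edge syntax follows Graph::Easy; the "output" attribute name is my assumption
here, so please check the manual for the exact spelling):

```
<graph>
graph { output: ascii; }   # assumed attribute name - see the manual
[ Bonn ] -> [ Berlin ]
[ Berlin ] -> [ Frankfurt ]
</graph>
```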
The software is much faster and more stable, and needs fewer
prerequisites and less memory. It also now has many of the planned
features implemented, and a few more. The syntax has settled down and
the attribute set is strictly enforced (catching those dang typos :-D).
There is a comprehensive online manual (with examples etc.) and a website
about the project on my server. I also created a little demo page
where you can play with it live:
http://bloodgate.com/perl/graph/manual/ Manual
http://bloodgate.com/perl/graph/ Project Site
http://bloodgate.com/graph-demo/ Live demo
The live demo shows it without MediaWiki integration, since I do not have
a wiki on my server yet. (It also does not render PNG since dot/graphviz
is not installed - if somebody knows how to compile graphviz on FreeBSD
4.8 please contact me off-list).
Hope you enjoy it.
Please send me criticism/praise/feedback/feature wishes etc. I will have
only limited email access over the next two weeks, but rest assured I
will read all your responses and answer them :)
Best wishes,
Tels
--
Signed on Fri Sep 23 17:38:38 2005 with key 0x93B84C15.
Visit my photo gallery at http://bloodgate.com/photos/
PGP key on http://bloodgate.com/tels.asc or per email.
"I can imagine what you're thinking though: this girl keeps her brains
in her backside! But actually only the more primitive parts of my brain
are in my butt. The more interesting parts are kept in a PC - my spinal
cord is actually an RS232 lead!" - Lucy, the OrangUtan Robot Girl
http://tinyurl.com/3fv6z
> (file name generated through article and revision ID). That would save
> lots of space in the database, and not interfere with important ongoing
> operations like revert wars :-) and still keep the "really old" versions accessible.
Welcome, Magnus, to Wikimedia. We've been holding our
archives on ExternalStore servers, which are cheapo Apache
boxes. Migration is done by compressOld.
Domas
{{localurl:}} and {{localurle:}} were originally intended for use only
in the HTML sections of the language files. But since they were
introduced, they've become popular in wikitext, for example to generate
an edit URL for an arbitrary page, the title of which is specified in a
template parameter. Unfortunately, they're poorly suited for this use,
since wikitext requires absolute URLs. It doesn't understand local URLs,
like anchor tags in HTML do. The standard workaround is to use this:
{{SERVER}}{{localurl:page}}
but as Mormegil pointed out to me on IRC, that breaks if page is an
interwiki link. Interwiki URLs are always absolute. The standard
solution used in the MediaWiki API is to use the Title::getFullURL()
function. Accordingly, I've implemented an interface to this function,
{{fullurl:}}. So instead of the above, you can use:
{{fullurl:page}}
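For instance, in a template that takes a page title as parameter {{{1}}}
(an illustrative sketch of my own, assuming {{fullurl:}} accepts a query
string the way {{localurl:}} does), an edit link could be built as:

```
[{{fullurl:{{{1}}}|action=edit}} edit this page]
```

whereas the old {{SERVER}}{{localurl:{{{1}}}|action=edit}} form would break
whenever {{{1}}} is an interwiki title.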
For completeness, I also implemented {{fullurle:}}, but it has no
practical applications that I can think of. Please use {{fullurl:}} and
{{localurl:}} in wikitext, not the escaped counterparts. Escaped URLs
only work in wikitext because of an ugly backwards-compatibility hack.
-- Tim Starling
Hello, all!
As you know, not all languages are represented in Wikipedia
by a comparable number of articles.
There are "rich" languages such as English and German,
and there are not-so-rich ones, such as Russian.
That is neither good nor bad; it is just a fact.
Some language sections lack terms already defined in many others.
Also, in Russian there are many notions,
mostly very modern and Hi-Tech-related,
which lack a "canonical" Russian translation
but have a well-established form in English.
I am sure the problem exists in other languages too.
As far as I know, the wiki offers no means of dealing with a notion
that has no canonical term to represent it in some language.
Even the existing workarounds known to me
are not convenient (see below).
Let us consider some not-so-rich language,
denoted "current" from here on.
Suppose that its wiki section is actively developing.
What should an article editor do if he has to use a notion
that ought to be defined, but no article on that notion exists
in the _current_ language section?
The editor may have no time to write a new article;
he may be unsure of the desired article's title
because he does not know the most correct term;
or, even if he is sure of the word used in that place,
he may be unsure of the title of the _nest_
to which this word should belong
(there are situations where creating a separate article per word is bad).
There are currently only three options:
use the word but skip the definition (which is poor),
create a new stub article (which is time-consuming
and could cause future conflicts),
or add an "external" link, at best to another wiki language section.
I now propose a new feature that exploits
the existence of other wiki languages,
but not just via a [[:lang:term]] link.
As with [[:lang:term]], we select a "default" language to fall back on,
but this default is used more subtly
than merely redirecting the reader to the named article.
Suppose a construction similar to [[:lang:term]], like
[[:&en:some rare term|non-English term]]
(with English as the "default" in this case),
which, when browsing, is to be interpreted
in the following manner:
* Compose a special URL for a "lingual fallback" href;
* On a GET of such a URL, retrieve the specified article in the "default" language;
* Extract the trailing list of interwiki links from it;
* If a link to the current language is found, consider the fallback obsolete,
start some automated process that replaces such fallbacks with "normal" links,
and finish by redirecting the browser to the "normal" article;
* Otherwise, give the browser
the list of interwiki links plus the link to the "default" language article,
i.e. the list of languages in which the desired notion is defined so far;
* Also prompt the user to create the missing article
under a _suggested_ title (e.g. the link text the fallback originates from),
rather than a _fixed_ title chosen earlier;
* Let the user make his choice manually;
* Maybe add a delayed browser redirect to one of these links,
possibly to the default language's article,
but generally respecting the user's preferences.
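The resolution steps above can be sketched roughly as follows (a hypothetical
illustration; the function name and the data structures are mine, not part of
MediaWiki):

```python
# Hypothetical sketch of resolving a "lingual fallback" link
# like [[:&en:some rare term|...]] for a reader of another language.

def resolve_fallback(default_lang, title, current_lang, interwiki):
    """Resolve a fallback link pointing at `title` in `default_lang`.

    `interwiki` maps (lang, title) -> {lang: title} of the interwiki
    links found at the bottom of that article.
    """
    links = interwiki.get((default_lang, title), {})
    if current_lang in links:
        # A translation now exists: the fallback is obsolete, so
        # redirect straight to the local article (and a background
        # job could rewrite the fallback into a normal link).
        return ("redirect", links[current_lang])
    # Otherwise offer the list of languages that do cover the notion,
    # including the default language itself, plus a prompt to create
    # the missing article under a suggested title.
    choices = dict(links)
    choices[default_lang] = title
    return ("choose_or_create", choices)

interwiki = {("en", "Sumer"): {"de": "Sumer", "fr": "Sumer"}}
print(resolve_fallback("en", "Sumer", "ru", interwiki))
print(resolve_fallback("en", "Sumer", "de", interwiki))
```

The first call offers a language menu (no Russian article exists yet); the
second redirects, because a German article is already listed.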
I'm quite new to the Wikipedia project as a contributor.
I tried to read the wiki's FAQs about multi-language coordination,
but found only trivial instructions there.
Maybe some technique I am not aware of already exists,
but the proposed scheme has the following advantages:
* The fallback creator is not forced to choose the *future* name
of the missing article in the current language
(compared to leaving a link to an unborn article);
* The user does not have to load texts in foreign languages,
which he may be unable to display or to understand,
and which may be unnecessary if an article in the current language exists
(compared to plain foreign links [[:lang:term]]);
* The user is prompted, but not obliged,
to start the missing article, probably reusing the existing foreign analogues,
and already equipped with interwiki links;
* The fallback creator can select, among the article's versions in many languages,
a "default" fallback language for the link,
possibly based on its proximity to the current language,
its spread in current-language-speaking countries,
content quality and relevance, or some weighted combination of these factors.
Regards,
--
qq~~~~\ [ ЗА IP БЕЗ ЦЕНЗУРЫ ]
/ /\ \ [ inCTV News ] news://news1.inCTV.ru/
\ /_/ /
\____/
Dear Rowan,
Thank you very much for your answer.
I took a look and found several changes; we use MediaWiki version 1.4.5.
- The file Monobook.css doesn't exist. I think it has been replaced by skins/monobook/main.css.
- I don't know how and where to place instructions like those in this file:
.ns-4 * #content { background : #F4F4F4; }
.ns-4 * #p-cactions li { background : #F4F4F4; }
.ns-4 * #p-cactions li a { background : #F4F4F4; }
I found some selectors like #contentSub {}, #p-cactions li {} and #p-cactions li a {}.
Is it in these that I need to place the instructions? How?
Many thanks in advance for your kind help and have a nice day,
Proth
>The ns-x are already there, as are the namespaces - Talk:, User:,
>Wikipedia: etc. Each has a number used inside the software (0 for the "main"/"article" namespace, 1 for "talk", etc), and
>these appear in the source of pages as CSS selectors. See http://meta.wikimedia.org/wiki/Help:Namespace for more.
>
>Perhaps the best explanation is by example:
>
>http://en.wikipedia.org/wiki/MediaWiki:Monobook.css contains code to make
>everything except articles have a blue background; it starts by making
>everything blue, and then changes "ns-0" (articles) back to white:
>
>#content {
> background: #F8FCFF; /* a light blue */
>}
>
>.ns-0 * #content {
> background: white;
>}
>
>[it goes on to do this for things like the background of the tabs at the top, so they fit better]
>
>http://fr.wikipedia.org/wiki/MediaWiki:Monobook.css takes a more long-winded approach, and defines a default "sky blue"
>background but then overrides it for each namespace one by one:
>
>/* Default color */
>#content { background : #F8FCFF; } /* sky blue */
>#p-cactions li { background : #F8FCFF; }
>#p-cactions li a { background : #F8FCFF; }
>
>/* Background color of articles */
>.ns-0 * #content { background : white; }
>.ns-0 * #p-cactions li { background : white; }
>.ns-0 * #p-cactions li a { background : white; }
>
>[and so on for ns-1, ns-2, ns-3, etc]
>
>HTH
>--
>Rowan Collins BSc
>[IMSoP]
Are cookies enabled in your MediaWiki configuration, or more
precisely, in the local PHP configuration? Since the message occurs
when connecting to your installation, I assume the issue lies with
your local setup.
Rob Church
On 23/09/05, bugzilla-daemon(a)mail.wikimedia.org
<bugzilla-daemon(a)mail.wikimedia.org> wrote:
> http://bugzilla.wikimedia.org/show_bug.cgi?id=3535
>
> Summary: Can not login, claims cookies are disabled
> Product: MediaWiki
> Version: 1.5rc4
> Platform: Macintosh
> OS/Version: Mac OS X 10.4
> Status: NEW
> Severity: normal
> Priority: Normal
> Component: User login/settings
> AssignedTo: wikibugs-l(a)wikipedia.org
> ReportedBy: cabi(a)mit.edu
>
>
> I am running MediaWiki 1.5rc4 with PHP 5.0.5. I can create accounts but cannot log in. The error message is "XXX uses cookies to log in
> users. You have cookies disabled. Please enable them, then log in with your new username and password." But I am pretty sure that my
> cookies are enabled: I can log in to other MediaWiki sites. The error persists on both Mac and Windows machines with several different browsers.
>
Hi All,
I was trying to download the SQL-format data (full_pages.sql, June 23, 2005) from the following link, but was unable to do so:
http://download.wikimedia.org/wikipedia/en/
I have installed the MediaWiki software on Windows and am trying to connect the database to it, but in the first place I wasn't able to download more than 4 GB of the total data. Please guide me in doing this; I have even tried Windows Server 2003 and it didn't help.
Any help from you is appreciated and will help the progress of my project.
Thanks,
Sravanthi
A few questions about database tables:
- Is it safe to assume that, after a successful upgrade to 1.5, the
following tables are no longer needed and can be dropped: blobs,
brokenlinks, cur, links, linkscc and user_rights?
- maintenance/tables.sql specifically singles out searchindex as
having to be MyISAM. What about trackbacks, transcache and
user_newtalk, whose CREATE statements do not specify a storage
engine?
- And speaking of transcache: is it needed for day-to-day
operations, or is it an artifact of the updating procedure? (It is not
listed in maintenance/tables.sql.)
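For reference, inspecting and changing a table's storage engine in MySQL can
be done as below. This is a generic sketch; whether converting (or dropping)
these particular tables is safe is exactly the question above, so back up
before altering anything.

```sql
-- Show the current storage engine (the Engine column)
SHOW TABLE STATUS LIKE 'searchindex';

-- Convert a table to a given engine
ALTER TABLE searchindex ENGINE = MyISAM;
```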
Thanks,
/L/e/k/t/u
Hello,
Thank you for your message, but I do not understand exactly...
I know where the CSS is but:
- Where can I find those namespaces?
- Where should I insert or modify those 'ns-x' selectors?
Many thanks in advance and have a nice day,
PRoh
>Bonjour,
>Using CSS, you can change the page color according to the namespace. The
>content div has a class of 'ns-x', where 'x' is the namespace number.
>Using JS you can do many more things, like changing the color according to
>the page name and so on (but don't forget that not everybody uses JS).
>
>Aoineko
>