Thomas, does your SVN access allow you to create namespaces on wikis?
If so, then he.wikisource will need a "Page" namespace in order to implement
ProofreadPage.
The local name for the namespace would be "עמוד" (four characters).
If you have the access to do it, please confirm. If not, please let me know
so that I can file a bug for it.
Dovi
---------------------------------
This extension is outstanding.
A very big thank-you to ThomasV,
who both helped to develop it and implemented it on Wikisource.
In other news, he.wikisource just passed 2,000 texts. A very high percentage
of them were typed in by hand. In the future, I hope that ProofreadPage can
help with this process.
Dovi
---------------------------------
I am pleased to announce that the ProofreadPage extension has been enabled
on all subdomains.
Local admins need to set the Proofreadpage_namespace variable before using
it.
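For admins unfamiliar with namespace setup, a custom namespace in MediaWiki is normally declared in LocalSettings.php. The fragment below is only an illustration of that general mechanism, not the exact ProofreadPage setup: the namespace numbers (100/101) and constant names are assumptions, and the way this extension reads the Proofreadpage_namespace setting should be checked against its documentation for your version.

```php
// LocalSettings.php -- illustrative sketch only.
// Namespace numbers 100 and 101 are assumed to be unused on this wiki.
define( 'NS_PROOFREAD_PAGE', 100 );
define( 'NS_PROOFREAD_PAGE_TALK', 101 );

// Declare the "Page" namespace and its talk namespace.
// On he.wikisource the local name would be "עמוד" instead of "Page".
$wgExtraNamespaces[NS_PROOFREAD_PAGE] = 'Page';
$wgExtraNamespaces[NS_PROOFREAD_PAGE_TALK] = 'Page_talk';
```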
Thomas
> ----- Original Message -----
> From: Yann Forget
> Sent: 07.06.07 20:58
> To: discussion list for Wikisource, the free library
> Subject: Re: [Wikisource-l] Following the conventions: separating
> Wikisource
>
> Hello,
>
> Birgitte SB wrote:
> > --- Mohamed Magdy <mohamed.m.k(a)gmail.com> wrote:
> >
> >> Other ideas say that Wikisource project is special
> >> somehow that some of the languages will never be big enough (text
> >> collections, contributors
> >> etc...) to deserve its own domain or there will
> >> never be enough community for it..and because of that they are
> >> better placed in one
> >> place (I don't really see the objection of making
> >> new sub-domains or wikis, does it cost?)
> >
> > Yes, it costs. It costs a large amount of labor for
> > each separate subdomain to be monitored by admins.
> > Currently, texts in languages which have not gathered a
> > community around them (and some never will) are
> > monitored from a single list of recent changes. Any
> > problems readers encounter can be answered by the
> > community that has developed to look over these
> > texts. Subdomains are only good when a community to
> > inhabit them exists.
>
> Yes, that's the most important point. The community should be the main
> factor for a decision to separate a language into a subdomain. Technical
> issues should not.
>
> > BirgitteSB
>
> Best regards,
>
> Yann
> --
> http://www.non-violence.org/ | Site collaboratif sur la non-violence
> http://www.forget-me.net/ | Alternatives sur le Net
> http://fr.wikipedia.org/ | Encyclopédie libre
> http://fr.wikisource.org/ | Bibliothèque libre
> http://wikilivres.info | Documents libres
>
> _______________________________________________
> Wikisource-l mailing list
> Wikisource-l(a)lists.wikimedia.org
> http://lists.wikimedia.org/mailman/listinfo/wikisource-l
>
The issue:
http://wikisource.org is supposed to be a portal, with links to the
different versions of the project, just like all the other projects: the
main domain is a portal and the subdomains contain the different language
versions. The exceptions are the multilingual projects (Commons and Meta),
which have no subdomains: their main pages default to English, with main
pages in many other languages. So commons.wikimedia.org and
meta.wikimedia.org are themselves the "portals", or rather they have no
portals at all (all content is on one wiki); there is no metawiki.org or
commonswiki.org. That is of course intentional: they don't need that ball
surrounded by languages.
When we look at Wikisource, we find it somewhat disorganized. First, new
languages are created inside the wikisource.org wiki (which, as far as I
know, was once "oldwikisource") until they accumulate some unspecified
number of pages, and then they move to their own subdomains. Isn't that
the exact purpose of the Incubator: to develop a language to some extent
and test its potential?
Other ideas hold that the Wikisource project is somehow special, in that
some of its languages will never be big enough (text collections,
contributors, etc.) to deserve their own domains, or that there will never
be enough of a community for them, and that they are therefore better kept
in one place. (I don't really see the objection to creating new subdomains
or wikis; does it cost anything?) So, following the conventions, since
Wikisource does have subdomains, the address http://wikisource.org should
be made a portal only, not a wiki, and the multilingual collections (which
didn't get a subdomain and probably never will in the near future) should
be placed on a subdomain of wikisource.org, perhaps multi.wikisource.org,
instead of the current mix of having both the portal and the multilingual
wiki in the same place. That is one solution. Another would be to move all
the content that shouldn't exist on wikisource.org to the Incubator,
remove that wiki, and just add a portal like the other projects; but I
seriously doubt those languages would ever get out of the Incubator.
Whatever the solution, these languages and cultures should be featured; my
own ignorance of a certain language doesn't make it lesser or bad.
Now I feel that I wrote much to describe ;)
Hello,
I agree entirely with Dovi and I could not have said it better.
In addition, it would be quite proper to discuss this with the community
there first. There is also a mailing list for Wikisource.
Regards,
Yann
Dovi Jacobs wrote:
> Hi. The model that Wikisource follows here is similar to Wikiversity:
>
> Just as at Wikiversity, the Wikisource "incubator" is within Wikisource itself. We consider this to be a much more supportive (and better monitored) environment for new languages (rather than the generic incubator.wikimedia.org) for a number of reasons. When such languages are ready, they can then receive their own subdomains. Until then, they always have a proper place to build their content.
>
> In fact, my personal suggestion is that new test languages for all existing Wikimedia projects (Wikipedia, Wikibooks, etc.) should be hosted by those projects themselves, rather than at a generic incubator. The Wikiversity/Wikisource model works very nicely indeed, providing a closer sense of a project-wide environment for new test-languages, with a common logo and framework for parallel new languages in the project, while the generic incubator is rather cold and unfriendly (take a look at its main page). There is no way that a single separate wiki for all new languages in all projects at once can provide proper guidance, supervision, and monitoring. Perhaps the incubator would be better left for testing entirely new Wikimedia projects.
>
> As for the Wikisource portal, because it is at wikisource.org rather than on Meta, you will find that it is much better supported than the portals for other Wikimedia projects, which are often out-of-date ("out-of-site" >> "out-of-mind"), and often have aesthetic or other problems that take longer to fix. People go to Wikisource and make direct suggestions for Portal updates right there at the talk page, and Wikisource admins take care of things immediately because they are always around at the wiki. Here too, this may be a better model than the convention for other projects.
>
> Dovi Jacobs
--
I'd like to invite you to participate in a survey about Wikimedia's
brands, their uses, and possible changes to our brand strategy:
http://meta.wikimedia.org/wiki/Wikimedia_brand_survey
Thank you.
--
Peace & Love,
Erik
DISCLAIMER: This message does not represent an official position of
the Wikimedia Foundation or its Board of Trustees.
"An old, rigid civilization is reluctantly dying. Something new, open,
free and exciting is waking up." -- Ming the Mechanic
Sorry for the French. In short, it is about the release of archives from
Bad Arolsen on the Shoah. It could be interesting for Wikisource.
Yann
=======
Hello,
There may be some documents of interest for Wikisource.
Yann
*Tuesday, May 15, 2007, 6:44 p.m.*
*A commission of 11 countries decides to begin the transfer of the Nazi
archives*
Diplomats from 11 countries agreed on Tuesday to begin distributing
electronic copies of documents from the Nazi archives kept at Bad
Arolsen, Germany <http://fr.fc.yahoo.com/a/allemagne.html>, in order to
make them available, for the first time in more than half a century, to
researchers working on the Shoah.
The eleven countries overseeing the International Tracing Service (ITS),
a branch of the International Committee of the Red Cross that manages
the Bad Arolsen archives, decided to begin the transfer of the scanned
documents as soon as possible.
The move circumvents the legal obligation to keep the documents
confidential until all 11 countries have ratified the 2006 agreement on
opening the archives. Four countries have not yet done so. Tuesday's
decision should speed up the distribution of the documents by a few
months.
The Yad Vashem memorial in Jerusalem welcomed the move. "I am delighted
to see this project moving forward," said its director, Avner Shalev.
Until now, the voluminous archives on the death, enslavement
<http://fr.news.yahoo.com/colonisation.html> or oppression of 17 million
Jews, Roma and other victims of the death camps were accessible only to
victims and certain members of their families, and were used to search
for missing persons or to support compensation claims.
The archives contain 30 million documents, arranged on 25 kilometers of
shelving. The decision to hasten their transfer to the national archives
of the 11 countries is intended to avoid further delays in opening them
to victims' relatives and to historians.
Under the protocol concluded last year, a single copy of the documents
will be available in each country and may be consulted "on the premises
of an appropriate archival repository". Each government should also take
into account "the sensitive nature of certain information" contained in
the documents.
The seven countries that have ratified the 2006 protocol, which amends a
1955 treaty, are the United States <http://fr.fc.yahoo.com/u/usa.html>,
Israel <http://fr.fc.yahoo.com/p/proche-orient.html>, Poland
<http://fr.news.yahoo.com/p/pologne.html>, Germany, Belgium
<http://fr.fc.yahoo.com/b/belgique.html>, the Netherlands and Great
Britain <http://fr.fc.yahoo.com/r/ru.html>. France, Greece, Italy
<http://fr.fc.yahoo.com/i/italie.html> and Luxembourg, however, have not
yet ratified it. AP
--
Indeed, interesting for Wikisource.
Yann
============
Hi friends,
A few of you know that I'm very interested in having a technical way
to print one article, or a set of articles, as a PDF document.
There are two technical approaches:
* a wiki2pdf tool
* a html2pdf tool
Neither really exists yet; more precisely, tools exist but they are not
satisfactory. The arguments behind this opinion are much the same as
those I have used against people who wanted to build an offline reader
without an HTML renderer. So I believe the first approach will never be
satisfactory, and that's why I am following the latest developments of
the second one closely.
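To make the html2pdf idea concrete, here is a toy sketch in Python (standard library only, and purely illustrative: it is not a real renderer). It extracts the visible text from an HTML page and writes it by hand into a minimal single-page PDF, ignoring all layout, CSS and images; producing a faithful PDF would require a full HTML engine such as Gecko, which is exactly the argument made here.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text of an HTML document, one entry per text node."""
    def __init__(self):
        super().__init__()
        self.lines = []
        self._skip = 0          # depth inside <script>/<style>
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip and data.strip():
            self.lines.append(data.strip())

def pdf_escape(text):
    # Parentheses and backslashes are special inside PDF string literals.
    return text.replace("\\", r"\\").replace("(", r"\(").replace(")", r"\)")

def html_to_pdf(html, out_path):
    parser = TextExtractor()
    parser.feed(html)
    # Build a content stream: 12 pt Helvetica, one text node per 14 pt line.
    ops = ["BT", "/F1 12 Tf", "14 TL", "72 720 Td"]
    for line in parser.lines:
        ops.append("(%s) Tj T*" % pdf_escape(line))
    ops.append("ET")
    stream = "\n".join(ops).encode("latin-1", "replace")
    objects = [
        b"<< /Type /Catalog /Pages 2 0 R >>",
        b"<< /Type /Pages /Kids [3 0 R] /Count 1 >>",
        b"<< /Type /Page /Parent 2 0 R /MediaBox [0 0 612 792] "
        b"/Resources << /Font << /F1 5 0 R >> >> /Contents 4 0 R >>",
        b"<< /Length %d >>\nstream\n" % len(stream) + stream + b"\nendstream",
        b"<< /Type /Font /Subtype /Type1 /BaseFont /Helvetica >>",
    ]
    buf = b"%PDF-1.4\n"
    offsets = []
    for num, body in enumerate(objects, start=1):
        offsets.append(len(buf))
        buf += b"%d 0 obj\n" % num + body + b"\nendobj\n"
    # Cross-reference table: byte offset of every object, then the trailer.
    xref_pos = len(buf)
    buf += b"xref\n0 %d\n0000000000 65535 f \n" % (len(objects) + 1)
    for off in offsets:
        buf += b"%010d 00000 n \n" % off
    buf += b"trailer\n<< /Size %d /Root 1 0 R >>\nstartxref\n%d\n%%%%EOF\n" % (
        len(objects) + 1, xref_pos)
    with open(out_path, "wb") as f:
        f.write(buf)

html_to_pdf("<html><body><h1>Hello</h1><p>A test page.</p></body></html>",
            "out.pdf")
```

Running it produces out.pdf with the two text lines of the sample page; the interesting part is how little of the HTML survives without a real layout engine.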
The most interesting project is Cairo:
http://cairographics.org/news/cairo-1.4.6/
This is the new graphics rendering library behind Gecko 1.9, which will
itself be used in the upcoming Firefox 3. If we are lucky, it will also
be used in the next releases of Kiwix.
http://www.mozilla.org/roadmap/gecko-1.9-roadmap.html
Both projects (Cairo and Gecko 1.9) have made a lot of progress in the
last months, and it's pretty certain that the next Firefox will have a
"save as PDF" feature.
A private company has built a web page to test the rendering of web
pages as PDF using the latest unstable version of Gecko (still in
development):
http://gecko.dynalivery.com/
I have done a test with one of our articles :
http://195.221.21.162/enwiki/dvd/art/1.html
The result is attached.
IMHO, it will certainly be possible for us to integrate that feature in
the middle term.
Best regards
Emmanuel
--