Hello,
I am interested in using the Wiktionary API (located at
en.wiktionary.org/w/api.php) and was having trouble finding any information
on what is acceptable commercial use. If there are any controlling
documents on the subject, can you please direct me to them? In particular,
I would like to know if there are any restrictions on the number of requests
allowed in a given time period, and if there are any other restrictions on
volume or frequency of use that I should keep in mind.
If determining acceptable use of the API remains a subjective exercise, let
me explain how I would like to use it and perhaps you can tell me if my
intended use is acceptable.
I am starting a new language translation service bureau that will use online
tools to make the translation process more accurate and less expensive for
the end customer. We also intend to offer free access to our tools to any
open source project or non-profit organization (in such a case, they could
use our project management, version control, and translator tools free of
charge, but they would have to find their own volunteer translators to do
the actual translation work).
As part of our translation tool set, we would like to provide access to
monolingual and bilingual dictionaries. Wiktionary appears to be the
perfect choice for this. We would like to use the Wiktionary API to fetch
words that are requested by our users (translators) and then render them on
our own servers for viewing in our translator tool. We would keep a local
cache of fetched documents to minimize the number of API calls that we need
to make. We will, of course, give proper attribution, etc., but it is
possible that we will eventually be making quite a large number of requests,
so we thought we should check with you first.
Is that acceptable use?
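(For concreteness, the fetch-and-cache flow described above could be sketched
roughly as follows. This is only an illustrative sketch using the public
MediaWiki query API; the endpoint and parameters are the standard ones, but
the cache here is just an in-memory dict, and a production service would
persist it and respect whatever rate limits you advise.)

```python
# Rough sketch only: fetch a Wiktionary entry's wikitext via the
# standard MediaWiki API and cache it locally to minimize API calls.
import json
import urllib.parse
import urllib.request

API_URL = "https://en.wiktionary.org/w/api.php"

def fetch_wikitext(title):
    """Fetch the current wikitext of one Wiktionary page."""
    params = urllib.parse.urlencode({
        "action": "query",
        "titles": title,
        "prop": "revisions",
        "rvprop": "content",
        "format": "json",
    })
    with urllib.request.urlopen(f"{API_URL}?{params}") as resp:
        data = json.load(resp)
    page = next(iter(data["query"]["pages"].values()))
    return page["revisions"][0]["*"]

class CachedDictionary:
    """Serve repeated lookups from a local cache instead of the API."""
    def __init__(self, fetch=fetch_wikitext):
        self._fetch = fetch   # injectable, e.g. for testing
        self._cache = {}      # title -> wikitext

    def lookup(self, title):
        if title not in self._cache:
            self._cache[title] = self._fetch(title)
        return self._cache[title]
```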
Cheers,
James
Hello!
You are receiving this email because your project has been selected to
take part in a new effort by the PHP QA Team to make sure that your
project still works with to-be-released PHP versions. With this we
hope to make sure that you are aware of things that might break, and
that we don't introduce any strange regressions.
With this effort we hope to build a better relationship between the
PHP Team and the major projects.
If you do not want to receive these heads-up emails, please reply to
me personally and I will remove you from the list; but, we hope that
you want to actively help us make PHP a better and more stable tool.
The second release candidate of PHP 5.2.11 was just released and can
be downloaded from http://downloads.php.net/ilia/; the win32 binaries
are available at http://windows.php.net/qa/. Please try this release
candidate against your code and let us know if you find any regressions.
The goal is to have 5.2.11 out within two to three weeks, so timely
testing would be extremely helpful.
In case you think that other projects should also receive these kinds
of emails, please let me know privately, and I will add them to the
list of projects to contact.
Best Regards,
Ilia Alshanetsky
5.2 Release Master
I want to add that Okawix uses code from the pre-ZIM GPL'ed ZenoReader and
ZenoWriter, which were developed by the openZIM team before we started ZIM,
but they changed the code to be incompatible with Zeno and ZIM.
So Okawix can be regarded as proprietary as well as a GPL violation.
We have been contacted by Linterweb (the company behind Okawix) several times
and we also invited them to the developers meeting, but they do not seem to
be capable of collaborating with an open source community.
The Wikimedia Foundation had a similar experience when trying to work with
them.
/Manuel
On Wednesday, 2 September 2009, Gerard Meijssen wrote:
> Hoi,
> For your information, Okawix is localised at translatewiki.net.
> Thanks,
> GerardM
>
> http://translatewiki.net/wiki/Translating:Okawix
>
> 2009/9/2 Manuel Schneider <manuel.schneider(a)wikimedia.ch>
>
> > Hi Chengbin, hi list,
> >
> > static.wikimedia.org is currently not being updated and while the dumps
> > processing has been assigned to and completely rewritten by Tomasz Finc
> > (developer at WMF), no assignment has been made concerning the HTML
> > dumps.
> >
> > We had a Wikipedia Offline meeting at Wikimania last week and discussed
> > several issues. One issue is the fact that WMF wants to see the ZIM file
> > format used for offline dumps and has suggested including it in the
> > regular dumping process.
> > So one question was: When will that happen, what is the status of WMF ZIM
> > dumping?
> > As ZIM uses HTML extracts, Tomasz clarified that once
> > static.wikimedia.org has been rebuilt to be stable and sustainable,
> > integrating ZIM would be trivial. But he also informed us that this task
> > has not yet been assigned.
> >
> > As Brion Vibber and Erik Möller were at the meeting as well, we hope
> > that this assignment will be made soon and that this task gets higher
> > priority.
> >
> > That said, I would also advise you to use the ZIM files rather than the
> > pure HTML dumps for your Archos, because that's what they are meant for.
> > A ZIM file containing all German Wikipedia articles (>900,000) is 1.4 GB;
> > an additional full-text search index takes another 1 GB.
> >
> > Greets,
> >
> >
> > Manuel
> >
> > On Wednesday, 2 September 2009, Chengbin Zheng wrote:
> > > I bring this old issue up because I want to know whether any progress
> > > has been made (or plans exist) to update the static HTML version of
> > > Wikipedia. B&H photos just leaked the next generation of Archos
> > > portable media players. Unbelievably, the rumors of a 500GB version are
> > > true! This is already tempting (especially the price at $420). Just
> > > waiting for specs on September 15, the Archos event. I really hope it
> > > will support NTFS so I can use the compression feature.
> > >
> > > It would be really cool and convenient to have an offline copy of
> > > Wikipedia anywhere I go without the need of Wi-Fi. What am I gonna do
> > > with 500GB?
> > >
> > > BTW, does anyone know what is the size of the current static HTML
> > > English Wikipedia version uncompressed? Thanks.
> > > _______________________________________________
> > > Wikitech-l mailing list
> > > Wikitech-l(a)lists.wikimedia.org
> > > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> >
> > --
> > Regards
> > Manuel Schneider
> >
> > Wikimedia CH - Verein zur Förderung Freien Wissens
> > Wikimedia CH - Association for the advancement of free knowledge
> > www.wikimedia.ch
> >
--
Regards
Manuel Schneider
Wikimedia CH - Verein zur Förderung Freien Wissens
Wikimedia CH - Association for the advancement of free knowledge
www.wikimedia.ch
> [...] at this time Okawix was NOT GPL and NOT available in source code
> (but yet published and being sold on DVD).
If it is about enwiki's Version 0.5, it was using Kiwix:
http://suggestusability.blogspot.com/2007/04/wikipedia-05.html
I'm not convinced the software was already forked back then,
actually.
Wolfgang "Darkoneko" ten Weges
---------- Forwarded message ----------
From: Brion Vibber <brion(a)wikimedia.org>
Date: Tue, Sep 1, 2009 at 11:48 AM
Subject: Re: [Wikitech-l] how to chang {{SITENAME}}
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
[snip]
I'd like to ask that folks leave this thread aside for the moment other
than useful replies to the original poster's request about how and where
to propose changing $wgSitename for ja.wikipedia.org.
If the code paths for setting up and running the parser to do brace
substitution in messages were significantly faster, we wouldn't bother
optimizing a few '$1 - {{SITENAME}}'s to '$1 - Wikipedia' on a few of
our high-traffic sites with stable names.
Brace substitution in messages is done using a limited mode of the
parser, much as it is in templates and wiki pages; avoiding braces in
messages that appear on parser-cached pages means that we avoid having
to initialize the parser, which can be a very noticeable win on a
high-traffic site.
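A toy illustration of the shape of this optimization (not MediaWiki's actual
code path): a message containing braces forces a parse step, while a literal
message can be emitted as-is.

```python
# Toy model only: treat brace expansion as the expensive step
# (standing in for full parser setup) and skip it entirely for
# messages that contain no braces.
SITENAME = "Wikipedia"  # assumed stable site name

def expand_braces(msg):
    # Stand-in for initializing and running the parser's limited mode.
    return msg.replace("{{SITENAME}}", SITENAME)

def render_message(msg):
    # Only pay the parse cost when braces are actually present.
    if "{{" in msg:
        return expand_braces(msg)
    return msg
```

Hard-coding '$1 - Wikipedia' in the message means render_message() takes the
cheap branch on every page view.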
If anyone is interested in actually looking into the costs of Parser
setup and invocation for brace replacement in messages and optimizing
this code path, that would be great, but please follow up in a new
thread and only post _new_ information or questions, not repeats of
what's already been said.
Thanks.
-- brion vibber (brion @ wikimedia.org)
Let me be the one to fork this into a new thread. We obviously need
to speed up these kinds of things and a nice roundtable discussion
is a great place to start.
-Chad
Greetings.
Can anyone provide a status update regarding flaggedrevs.labs.wikimedia.org ?
In the future perhaps it would be better to import Simple English
Wikipedia for enwp testing: the lack of templates makes the site look
extensively vandalized already. I'm guessing that an alternative
English-language project would be more useful than a subset of enwp.
:)
Hi Chengbin,
ZIM is an upcoming standard for using HTML content offline. It is derived
from the Zeno file format used on the German Wikipedia DVDs since 2006 (ZIM =
Zeno IMproved).
There are currently several reader applications for it, for instance the
zimreader made by the openZIM project or Kiwix.
There are some ports around like Kiwix on Windows and zimreader on openmoko /
ARM.
The zimreader by openZIM works like a small webserver: it serves the
contents of the ZIM file locally.
Once the HTML dump on static.wikimedia.org is fixed and ZIM file creation
has been integrated, you will be able to download fresh ZIM files of all
Wikimedia projects directly from download.wikimedia.org.
Currently the Kiwix team has created some ZIM files, and we are trying to
build a ZIM file directory:
http://openzim.org/ZIM_File_Archive
ZIM actually stores the article-text portion of the wiki's HTML output in
compressed clusters. It can also hold content of all other MIME types, such
as images, CSS files, etc.
http://openzim.org/ZIM_File_Format
It is an open standard and is currently developed and implemented by the
openZIM team (sponsored by Wikimedia CH) in C++. There is a library (zimlib)
that can be integrated into other reader or dumping applications to make
them ZIM-aware.
Using the open documentation ZIM can be implemented in any other language as
well.
The idea of ZIM is to make the data files freely interchangeable with any
reader application. It is also flexible enough to store works other than
data from Wikipedia/MediaWiki. It also tries to keep the reader application
as simple as possible: only decompression and HTML rendering need to be
done, and an HTML renderer should be available on nearly all devices.
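The reader-as-local-webserver idea can be sketched in a few lines. This is
only an illustration of the concept: a ZIP archive stands in for the ZIM
container here, since reading real ZIM files requires zimlib; the handler
simply streams stored HTML back to a local browser.

```python
# Concept sketch: serve pages out of a local archive over HTTP,
# the way zimreader serves a ZIM file to a local browser.
# A ZIP archive stands in for the ZIM container (illustration only).
import io
import zipfile
from http.server import BaseHTTPRequestHandler

def build_archive(pages):
    """Pack a dict of path -> HTML into an in-memory ZIP archive."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for path, html in pages.items():
            zf.writestr(path, html)
    buf.seek(0)
    return zipfile.ZipFile(buf)

def make_handler(archive):
    """Build a request handler that serves articles from the archive."""
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            try:
                body = archive.read(self.path.lstrip("/"))
            except KeyError:
                self.send_error(404)
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)
    return Handler

# Serving would then be, e.g.:
# http.server.HTTPServer(("localhost", 8080),
#                        make_handler(archive)).serve_forever()
```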
Greets,
Manuel
On Wednesday, 2 September 2009, Chengbin Zheng wrote:
> On Wed, Sep 2, 2009 at 8:13 AM, Manuel Schneider <
>
> manuel.schneider(a)wikimedia.ch> wrote:
>
> I'm not familiar with the file extension .zim. What is that? Some sort of
> compressed HTML format like .chm? Where can I get a .zim file? I need to
> check whether this format is compatible with my Archos's Opera browser.
--
Regards
Manuel Schneider
Wikimedia CH - Verein zur Förderung Freien Wissens
Wikimedia CH - Association for the advancement of free knowledge
www.wikimedia.ch
Hello,
we are trying to exclude, from
- RSS/Atom feeds of RC and the New Pages list generally, and
- the RC and New Pages special pages, for users not in a custom user group,
all entries, or at least the diffs and new-page content, for all pages
in a custom namespace.
Where is the right place to hook in and check which entries (or which parts
of their content) should be excluded?
Where can we configure the feed content/structure?
Uwe (Baumbach)
U.Baumbach(a)web.de
Besuchen Sie den 61. Deutschen Genealogentag!
11.-14. September 2009
http://www.genealogentag.de
I bring this old issue up because I want to know whether any progress has
been made (or plans exist) to update the static HTML version of Wikipedia.
B&H Photo just leaked the next generation of Archos portable media players.
Unbelievably, the rumors of a 500GB version are true! This is already
tempting (especially the price at $420). Just waiting for specs on September
15, the Archos event. I really hope it will support NTFS so I can use the
compression feature.
It would be really cool and convenient to have an offline copy of Wikipedia
anywhere I go without the need of Wi-Fi. What am I gonna do with 500GB?
BTW, does anyone know the size of the current static HTML English
Wikipedia version, uncompressed? Thanks.
Where can I see the lucene-search configurations (lsearch.conf,
lsearch-global.conf) used by Wikimedia?
Could you publish them at http://noc.wikimedia.org/conf/?