Hi,
Is it possible to transfer large DjVu files (more than 100 MB) from the
Internet Archive to Commons? I have tried the IA-upload and url2commons
tools but have not succeeded.
Regards,
--
Bodhisattwa
Hi Wikisourcers,
We at Cascadia Wikimedians have been contacted about Wikimedia resources
that could be used to host materials related to Alaska Native Elder
stories. I'm familiar with Wikisource largely from what I've heard about
it; I've only made one edit. Is there an experienced Wikisourcer who
might be available to meet with me via IRC or Hangouts sometime, so that
I can get familiar with the basics and provide appropriate guidance on
the possibility of uploading Alaska Native Elder stories to Wikisource?
Thanks!
Pine
Hi,
For a long time, Indic-language Wikisource projects have depended
entirely on manual proofreading, which wastes a great deal of time and
energy. Recently, Google released OCR software for more than 20 Indic
languages, along with other Asian languages. This software is far more
accurate than the previous OCRs, but it has many limitations. Uploading
the same large file twice (once for Google OCR and again at Commons) is
not practical for most contributors, as Internet connections in India
are slow. A tool that could feed PDF or DjVu files already uploaded to
Commons directly to Google's OCR would let contributors avoid uploading
them twice.
This was proposed in the 2015 Community Wishlist Survey. Now that voting
on the wishlist has started, the proposal needs your support. Please
follow the link:
https://meta.wikimedia.org/wiki/2015_Community_Wishlist_Survey/Wikisource#T…
FYI, this proposal was also identified as a highest-priority need at the
2015 Wikisource Conference in Vienna.
(https://etherpad.wikimedia.org/p/wscon2015needs)
Regards
--
Bodhisattwa Mandal
Administrator, Bengali Wikipedia
''Imagine a world in which every single person on the planet is given
free access to the sum of all human knowledge.''
I stumbled upon a PDF from a National Library of Naples scan:
https://github.com/rohit-dua/BUB/files/132335/Comedy_of_the_Mistake_By_Sir_…
The usual ugly menace on the first page (arguably copyfraud) has
disappeared; instead I see only one sentence: «This is a reproduction
of a library book that was digitized by Google as part of an ongoing
effort to preserve the information in books and make it universally
accessible», followed by a logo, domain, and QR code.
By the way, I'm still not sure of the answer to
http://webapps.stackexchange.com/q/76234/55370
Nemo
Call for Papers and Participation
Paper Submission Deadline: February 16, 2016
=================================
The International Conference on Digital Information Processing,
Electronics, and Wireless Communications (DIPEWC2016)
URL: http://www.sdiwc.net/conferences/dipewc2016/
The International Conference on Computing Technology, Information
Security and Risk Management (CTISRM2016)
URL: http://sdiwc.net/conferences/ctisrm2016/
=================================
Venue: Islamic Azad University (IAU), UAE Branch, Academic City, Dubai,
UAE
Conference Dates: March 2-4, 2016
Hello,
Please give feedback on this topic on the Phabricator ticket about
integrating translation tools within Wikisource
<https://phabricator.wikimedia.org/T119326>.
Bye
Hey Wikisourcians,
The WMAT team has been working on the PEG report on our conference for a
while. We are almost finished with our input, and now we would like to
ask for your help: while we could cover all the organizational and
budget-related tasks and add some learnings as well as a rough
description of what happened during our time together in Vienna, we
cannot sum up the content and topics covered in the various sessions, as
we mostly were not present.
So please help us make this report a valuable resource for yourselves as
well as for future hosts and organizers, and show other interested
Wikimedians the awesome things you achieved during and after the event
and what the next steps will be. In particular, the "Activities" section
at the beginning could use a more detailed record of the work done
during the three days. *The deadline for the report is February 11.*
For your input: link to the report
https://meta.wikimedia.org/wiki/Grants:PEG/WCUG_Wikisource/Wikisource_Confe…
For your information: link to the detailed results from the participants
survey
https://meta.wikimedia.org/wiki/Wikisource_Community_User_Group/Wikisource_…
Thank you!
Claudia
--
Claudia Garád
Executive Director
Wikimedia Österreich - Association for the Promotion of Free Knowledge
www.wikimedia.at
Hi,
Aubrey and Sam would like to try Discourse as an alternative for the
wikisource-l mailing list. A pilot installation is up and running at
https://discourse.wmflabs.org/. There is a separate category for
wikisource-l related discussions on the Discourse site. Members can create
new posts by mailing to wmflabsdiscourse+wikisource-l(a)gmail.com.
The pilot installation is to test things out. Users of Discourse can
download their posts as a .csv file. Admins can download backups, which
are Postgres database tables. I'm trying to figure out how to
export/download all posts of all users. Discourse uses JavaScript, as do
Facebook, Twitter, and many other sites.
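For exporting all posts, one option besides backups is Discourse's JSON
API. A minimal sketch, assuming an admin API key and the public
`/posts.json` endpoint with `before=` pagination (the instance URL and
credentials here are placeholders, not anything configured on the pilot):

```python
# Hypothetical sketch: pulling the site-wide post stream from a Discourse
# instance via its JSON API. Endpoint and header names follow the public
# Discourse API docs; base URL, key, and username are placeholders.
import urllib.request

BASE_URL = "https://discourse.wmflabs.org"  # the pilot instance

def build_request(path, api_key, api_username):
    """Build an authenticated request for the Discourse JSON API."""
    req = urllib.request.Request(BASE_URL + path)
    req.add_header("Api-Key", api_key)
    req.add_header("Api-Username", api_username)
    return req

def page_path(before=None):
    """Path for one page of the post stream; pass the lowest post id
    seen so far to fetch the next (older) page."""
    return "/posts.json" if before is None else "/posts.json?before=%d" % before

def collect_posts(pages):
    """Flatten decoded /posts.json responses into one list of posts."""
    posts = []
    for page in pages:
        posts.extend(page.get("latest_posts", []))
    return posts
```

Fetching each page with `urllib.request.urlopen(build_request(...))` and
decoding the JSON, then repeating with `before=` set to the smallest post
id returned, would walk the whole stream oldest-to-newest.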
For the Wikimedia Discourse installation to be a successful pilot, it
has to be a real alternative to a mailing list like wikisource-l. Users
want to read posts in their mail client and to be able to reply to posts
by mail, without having to visit the site. Users who have signed up at
https://discourse.wmflabs.org/ will receive new posts in their mailbox
by default. Mail preferences can be set individually. Wikimedia
Discourse users can reply by mail to posts received by mail, and members
can mail in new posts. Please let me know if anything the current
wikisource-l mailing list offers is missing from Wikimedia Discourse.
Sign up for an account at https://discourse.wmflabs.org/ to discover the
extras Discourse offers. Wikimedia Discourse users can earn barnstars,
for example; isn't that nice?
Please note that your wikisource-l mailing list subscription isn't
linked to any account at any Wikimedia wiki, and neither is your mail
address. For the wikisource-l mailing list you provide a mail address,
and after confirmation mailman mails you a password. If you want to sign
up at https://discourse.wmflabs.org/ you provide a mail address and pick
a password yourself. Currently there is no link between
https://discourse.wmflabs.org/ accounts and any account at any Wikimedia
wiki, nor is there any link with mail addresses, just as with the
wikisource-l mailing list. A Phabricator ticket requesting SSO, OAuth
1.0, or OAuth 2.0 can be found at
https://phabricator.wikimedia.org/T124691. Currently there are no
resources available for writing an OAuth 1.0 plugin for Wikimedia
Discourse, nor for moving MediaWiki authentication from OAuth 1.0 to
OAuth 2.0. As said, SSO or OAuth is a 'nice to have' and doesn't have to
block a successful pilot.
Secure communication is a must. The Discourse installation is now
configured to use SSL by default, so you will always see https://.
Here is a short list of places where the Discourse installation can be
discussed:
- https://meta.wikimedia.org/wiki/Discourse
- https://www.mediawiki.org/wiki/Discourse
- https://phabricator.wikimedia.org/T124690
and of course at the wikisource-l mailing list and at
https://discourse.wmflabs.org/.
Best regards,
Ad Huikeshoven