Hello,
Mockups for new Commons file pages featuring structured copyright and
licensing statements have been posted for community review. [1] These
potential additions to the file page will affect Commons only; file pages
on other wikis will remain as they are. Please have a look; questions,
comments, and concerns can be left on the talk page. [2]
Thank you, see you on the wiki.
1.
https://commons.wikimedia.org/wiki/Commons:Structured_data/Get_involved/Fee…
2.
https://commons.wikimedia.org/wiki/Commons_talk:Structured_data/Get_involve…
--
Keegan Peterzell
Community Relations Specialist
Wikimedia Foundation
Hello all,
In order to allow better gadget integration, the JavaScript hooks documented
in the hooks-js.txt
<https://phabricator.wikimedia.org/diffusion/EWBA/browse/master/docs/hooks-j…>
file delivered together with the Wikibase source code are now considered
stable. We have added this information to the stable interface policy
<https://www.wikidata.org/w/index.php?title=Wikidata%3AStable_Interface_Poli…>.
If you have any questions about this, feel free to ping me.
Cheers,
--
Léa Lacroix
Project Manager Community Communication for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
<tl;dr>: Read https://www.mediawiki.org/wiki/Google_Code-in/Mentors, then
add your name to the mentors table and start tagging #GCI-2018 tasks.
We'll need MANY mentors and MANY tasks; otherwise we cannot make it.
Google Code-in is an annual contest for 13-17 year old students. It
will take place from October 23 to December 13. It's not only about coding:
we also need tasks about design, documentation, outreach/research, and QA.
Last year, 300 students worked on 760 tasks supported by 51 mentors.
For some achievements from last round, see
https://blog.wikimedia.org/2018/03/20/wikimedia-google-code-in-2017/
While we wait to hear whether Wikimedia will be accepted:
* You have small, self-contained bugs you'd like to see fixed?
* Your documentation needs specific improvements?
* Your user interface has some smaller design issues?
* Your Outreachy/Summer of Code project welcomes small tweaks?
* You'd enjoy helping someone port your template to Lua?
* Your gadget code uses some deprecated API calls?
* You have tasks in mind that welcome some research?
Note that "beginner tasks" (e.g. "Set up Vagrant") and generic
tasks are very welcome (like "Choose and fix 2 PHP7 issues from
the list in https://phabricator.wikimedia.org/T120336" style).
We also have more than 400 unassigned open #easy tasks listed:
https://phabricator.wikimedia.org/maniphest/query/HCyOonSbFn.z/#R
Can you mentor some of those tasks in your area?
Please take a moment to find / update [Phabricator etc.] tasks in your
project(s) which would take an experienced contributor 2-3 hours. Read
https://www.mediawiki.org/wiki/Google_Code-in/Mentors,
ask if you have any questions, and add your name to
https://www.mediawiki.org/wiki/Google_Code-in/2018#List_of_Wikimedia_mentors
(If you have mentored before and have a good overview of our
infrastructure: We also need more organization admins! See
https://www.mediawiki.org/wiki/Google_Code-in/Admins )
Thanks (as we cannot run this without your help),
andre
--
Andre Klapper | Bugwrangler / Developer Advocate
https://blogs.gnome.org/aklapper/
Wikimedia Sverige is proud to be the recipient of $324,500 in support from
the Swedish Postcode Foundation for our project FindingGLAMs
<https://meta.wikimedia.org/wiki/FindingGLAMs>.
We will work in collaboration with UNESCO and the Wikimedia Foundation to
achieve a number of ambitious goals over the 15-month project. We will
work to:
- Collect and include data about GLAM[1] institutions around the world
on Wikidata (e.g. where they are located). This will be done both through a
campaign and from batch uploads of datasets;
- Batch upload collections of media files, in collaboration with the
Structured Data on Commons program;
- Organize networks of experts to discuss issues with disseminating
different types of material through the Wikimedia projects;
- Create a white paper consisting of a number of case studies based on
the work and discussions outlined above;
- Communicate about the Wikimedia movement’s work with GLAMs at
conferences;
   - Organize a number of activities both on- and offline to use the
   material. This includes some work around Wikimania 2019, which will take
   place in Stockholm, Sweden.
We hope many of you would like to get involved in the project and work with
us to connect to GLAM partners in your countries. At this point we are
looking for your help to identify institutions that maintain lists of GLAM
institutions (similar to the collection of monuments lists for WLM) and
later to identify GLAM institutions with specific types of collections that
would like to work with the Wikimedia movement.
Please contact the project manager John Andersson (
john.andersson(a)wikimedia.se) if you have any questions. If you are
interested in taking part, please sign up on the project portal on Meta:
https://meta.wikimedia.org/wiki/FindingGLAMs
As always, you can find the full application on our wiki (in Swedish):
https://se.wikimedia.org/wiki/Projekt:FindingGLAMs_2018/Ansökan
[1] GLAM is an acronym for Galleries, Libraries, Archives and Museums.
We look forward to working with you!
Best regards,
John
- - - -
John Andersson
Executive Director
Wikimedia Sverige
Phone: +46(0)73-3965189
Email: john.andersson(a)wikimedia.se
Visiting address: Goto10, Hammarby Kaj 10D, 120 32 Stockholm
Dear Wikidata community,
We're working on a project called Wikibabel to machine-translate parts of
Wikipedia into underserved languages, starting with Swahili.
In hopes that some of our ideas can be helpful to machine translation
projects, we wrote a blogpost about how we prioritized which pages to
translate, and what categories need a human in the loop:
https://medium.com/@oirzak/wikibabel-equalizing-information-access-on-a-bud…
Rumor has it that the Wikidata community has thought deeply about
information access. We'd love your feedback on our work. Please let us know
about past or ongoing machine-translation projects so we can learn from and
collaborate with them.
Best regards,
Olya & the Wikibabel crew
Hi Wikidata Toolkit users and developers,
I am testing the sample FetchOnlineDataExample, but I get this Java
network-related error:
*Could not retrive data: java.io.IOException: Unable to tunnel through
proxy. Proxy returns "HTTP/1.1 407 Proxy Authentication Required"*
Suspecting that this issue is related to Java's proxy settings, I have
tried my best to no avail. I now want to confirm whether the Wikidata
Toolkit is capable of establishing an HTTPS connection and managing proxy
authentication, given that I pass the following JVM arguments at runtime:
-Dhttps.proxyHost=xxx.xxx.xx.xx -Dhttps.proxyPort=xxxx
-Dhttps.proxyUser=UserId -Dhttps.proxyPassword=Password
-Djdk.http.auth.tunneling.disabledSchemes=""
The last parameter is something I picked up from various forums, where it
is noted that from Java 8u111 onwards the Basic authentication scheme is
disabled by default for HTTPS tunneling. (
http://www.oracle.com/technetwork/java/javase/8u111-relnotes-3124969.html )
However, passing these JVM arguments did not resolve the error. Hence I
wanted to clarify whether there are any limitations on the Wikidata Toolkit
side:
****************Error logs
****************************************************
*** Wikidata Toolkit: FetchOnlineDataExample
------------------------------
*** This program fetches individual data using the wikidata.org API.
*** It does not download any dump files.
------------------------------
*** Fetching data for one entity:
2018-09-04 18:23:04 ERROR - Could not retrive data: java.io.IOException:
Unable to tunnel through proxy. Proxy returns "HTTP/1.1 407 Proxy
Authentication Required"
Exception in thread "main" java.lang.NullPointerException
at examples.FetchOnlineDataExample.main(FetchOnlineDataExample.java:100)
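In case it helps anyone hitting the same 407: assuming the toolkit goes
through the standard java.net stack, note that HttpURLConnection does not
read -Dhttps.proxyUser/-Dhttps.proxyPassword at all; proxy credentials have
to be supplied via a default java.net.Authenticator. A minimal sketch of
that setup (the class name ProxyAuthSetup and the proxy host/port are
illustrative placeholders, not anything from the toolkit):

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class ProxyAuthSetup {
    public static void main(String[] args) {
        // Hypothetical proxy host/port; substitute your real values.
        System.setProperty("https.proxyHost", "proxy.example.com");
        System.setProperty("https.proxyPort", "8080");
        // Since Java 8u111, Basic auth is excluded from the schemes allowed
        // for HTTPS tunneling; clearing this property re-enables it.
        System.setProperty("jdk.http.auth.tunneling.disabledSchemes", "");

        // HttpURLConnection ignores https.proxyUser/https.proxyPassword.
        // Proxy credentials must come from a registered default Authenticator.
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                if (getRequestorType() == RequestorType.PROXY) {
                    return new PasswordAuthentication(
                            "UserId", "Password".toCharArray());
                }
                return null;
            }
        });
        System.out.println("Proxy configured: "
                + System.getProperty("https.proxyHost") + ":"
                + System.getProperty("https.proxyPort"));
    }
}
```

If this runs before the FetchOnlineDataExample code makes its first request,
the tunnel CONNECT should be answered with Basic credentials instead of
failing with 407.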