Hello developers,
I would like to gauge the community's interest in taking part in a one-day
hackathon in London on February 3. It's a bit short notice, but we have
been offered a day during a more general hackathon on the future of wikis.
It's going under the general title of Darvoz (Portuguese for 'to give
voice'), and Katherine Maher will be in London that evening, giving a talk
at the same venue from 7 to 9 pm.
So I would like to see how many developers would be interested in coming
down to join the hackathon and attend the talk in the evening. The focus
of the hackathon is still undecided, which is why I would like to see
whether any community members could come and take part, and potentially
help me organise the day. Please let me know if you are interested and
whether you could come to London for the day (and whether you would need
somewhere to stay the night, or travel expenses to get there).
Here's the darvoz.org site.
John Lubbock
Communications Coordinator
Wikimedia UK
+44 (0) 203 372 0767
Wikimedia UK is a Company Limited by Guarantee registered in England and
Wales, Registered No. 6741827. Registered Charity No. 1144513. Office 1,
Ground Floor, Europoint, 5-11 Lavington Street, London SE1 0NZ.
Wikimedia UK is the UK chapter of the global Wikimedia movement. The
Wikimedia projects are run by the Wikimedia Foundation (which operates
Wikipedia, amongst other projects). *Wikimedia UK is an independent
non-profit charity with no legal control over Wikipedia nor responsibility
for its contents.*
Hi all!
TechCom decided to use this week's IRC discussion slot for a brainstorming
session about Evolving the MediaWiki Architecture. "MediaWiki Architecture"
is to be interpreted broadly here: anything that helps us manage and serve
content is in scope.
The idea is to provide input for the corresponding workshop session at the
summit, in which we will identify focus areas for technical development for
the years to come. You can find more information about the session on
Phabricator; please provide input there as well:
<https://phabricator.wikimedia.org/T183313>.
The IRC session will take place in #wikimedia-office on Wednesday January 17,
21:00 UTC (1 pm PST, 22:00 CET).
--
Daniel Kinzler
Principal Platform Engineer
Wikimedia Deutschland
Gesellschaft zur Förderung Freien Wissens e.V.
Hi,
MediaWiki code search is a fully free software tool that lets you
easily search through all of MediaWiki core, extensions, and skins
hosted on Gerrit. You can limit your search to specific repositories,
or to types of repositories. Regular expressions are supported both in
the search string and when filtering by path.
Try it out: https://codesearch.wmflabs.org/search/
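For scripted use, the backend can also answer JSON queries. A minimal
sketch, assuming the endpoint and parameter names match upstream Hound's
API (the exact path on codesearch.wmflabs.org may differ):

    # Regex search across all indexed repos, restricted to PHP files.
    # Endpoint and parameters are assumptions based on upstream Hound.
    curl 'https://codesearch.wmflabs.org/search/api/v1/search?q=function%20wf&repos=*&files=\.php$'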
I started working on this because the only other options for searching
the entire MediaWiki codebase were either cloning everything locally
(which takes up space and must be kept up to date manually) or using
GitHub (not free software, and it includes extraneous repositories).
The backend is powered by Hound, a code search tool written by Etsy,
based on Google's Code Search.
Please let me know what you think! More documentation and links are
at: <https://www.mediawiki.org/wiki/Codesearch>.
--
Legoktm
NPM v5 supports shrinkwrapping dependencies
<http://blog.npmjs.org/post/161081169345/v500> via a "package-lock.json"
lockfile (example
<https://gerrit.wikimedia.org/r/#/c/403724/1/package-lock.json>). Should we
generally be committing these lockfiles or ignoring them in Wikimedia repos?
The downsides of lockfile usage that I'm aware of are:
1. No CI support yet. (CI simply ignores this file when installing
dependencies because it uses an older version of NPM.)
2. Platform differences can create different lockfiles for optional
dependencies, but this is a known issue
<https://github.com/npm/npm/issues/17722>.
3. Developers must remember to use NPM v5 or greater when adding or
updating dependencies.
4. The format is a bit verbose.
The pros of lockfiles are:
1. Considered best practice and the default behavior of NPM.
2. The officially supported use case for reproducible builds.
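For anyone trying this locally, a minimal sketch of the intended workflow
(assuming npm v5 or later is installed):

    # Only npm v5+ reads and writes package-lock.json, so check first.
    npm --version
    # A normal install generates or updates the lockfile automatically.
    npm install
    # Commit the lockfile so other developers (and eventually CI) resolve
    # exactly the same dependency tree.
    git add package-lock.json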
The topic is tracked in T179229 <https://phabricator.wikimedia.org/T179229> but
there's lots of activity around lockfiles
<https://phabricator.wikimedia.org/search/query/NFhYM5EmMLlB/#R> outside of
it.
Stephen
Sorry for cross-posting!
Reminder: Technical Advice IRC meeting again **tomorrow, Wednesday 4-5 pm
UTC** on #wikimedia-tech.
The Technical Advice IRC meeting is open to all volunteer developers,
topics, and questions. These can be anything from "how to get started"
and "who would be the best contact for X" to specific questions about your
project.
If you already know what you would like to discuss or ask, please add your
topic for the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
Hope to see you there!
Michi (for WMDE’s tech team)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us make it happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognised as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
I have just merged the "stretch-migration" feature branch to master
for MediaWiki-Vagrant.
Major changes:
* Debian Stretch (Debian 9) base image
* Default PHP runtime is Zend PHP 7.0 (HHVM available via role)
* Database is MariaDB 10.1
* Puppet 4
Once you update your local MediaWiki-Vagrant clone to 59e3b49c or
later, you will need to create a new VM based on the Debian Stretch
base image in order to use `vagrant provision`. Upgrading your local
VM may be as easy as using `vagrant destroy` to delete the current VM,
followed by `vagrant up` to make a new one. Note that this will *not*
save the contents of any local wikis in the VM. You will need to
manually back up and restore the databases, or export and import pages
you have created.
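As a rough sketch of that backup/restore cycle (assuming the default
'wiki' database and that root can reach MySQL inside the VM; adjust
database names and credentials to your setup):

    # On the host: dump the wiki database out of the old Jessie VM.
    vagrant ssh -c 'sudo mysqldump wiki' > wiki-backup.sql
    # Rebuild the VM on the Debian Stretch base image.
    vagrant destroy -f
    vagrant up
    # Restore the dump into the freshly provisioned VM.
    vagrant ssh -c 'sudo mysql wiki' < wiki-backup.sql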
See <https://phabricator.wikimedia.org/T181353> for more information
and a few known open bugs.
I have also created a jessie-compat branch that can be used by users
who are not ready to destroy their current Jessie-based virtual
machines and start over with Stretch. A simple `git checkout
jessie-compat` (which creates a local branch tracking the remote one)
should be all that is needed to switch your local MediaWiki-Vagrant
clone to the compatibility branch. This branch will probably receive
few updates, so you are encouraged to create new Stretch-based VMs soon.
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Manager, Cloud Services Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hi all,
I am forwarding you this email because we have a specific technical
question. With DBpedia as middleware, we can create a global view of all
the data in Wikipedia's infoboxes and in Wikidata, and compare them
(for details, see the email below and the proposal).
We were wondering what the latest and most appropriate technology is for
interfacing with the editors of infoboxes.
VisualEditor seems appropriate, but I checked it here, for example:
https://en.wikipedia.org/wiki/Fulda
It seems possible to edit some values, but there is no Wikidata support,
and the population figure has a reference that does not show up in
VisualEditor.
Do you think VisualEditor would be a good place to present comparative
facts from other language versions? Or would you choose something else?
All the best,
Sebastian
-------- Forwarded Message --------
Subject: [Wikidata] GlobalFactSync
Date: Mon, 15 Jan 2018 19:57:04 +0100
From: Magnus Knuth <knuth(a)informatik.uni-leipzig.de>
Reply-To: Discussion list for the Wikidata project.
<wikidata(a)lists.wikimedia.org>
To: wikidata(a)lists.wikimedia.org
Dear all,
last year, we applied for a Wikimedia grant to feed qualified data from Wikipedia infoboxes (i.e. missing statements with references) via the DBpedia software into Wikidata. The evaluation was already quite good, but some parts were still missing, and we would like to ask for your help and feedback for the next round. The new application is here: https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync
The main purpose of the grant is:
- Wikipedia infoboxes are quite rich, are manually curated, and have references. DBpedia already extracts that data quite well (i.e. there is no other software that does it better). However, extracting references has not been a priority on our agenda; they would be very useful to Wikidata, but there are no requests for them from DBpedia users.
- DBpedia also has all the information from all infoboxes of all Wikipedia editions (>10k pages), so we also know quite well where Wikidata is already used, and where information is available in Wikidata or in one language version but missing in another.
- side goal: bring the Wikidata, Wikipedia, and DBpedia communities closer together
Here is a diff between the old and the new proposal:
- extraction of infobox references will still be a goal of the reworked proposal
- we have been working on the fusion and data-comparison engine (the part of the budget that came from us) for a while now, and there are first results:
6823 birthDate_gain_wiki.nt
3549 deathDate_gain_wiki.nt
362541 populationTotal_gain_wiki.nt
372913 total
We took only three properties for now and measured the gain where no Wikidata statement was available; birthDate/deathDate coverage is already quite good. Details here: https://drive.google.com/file/d/1j5GojhzFJxLYTXerLJYz3Ih-K6UtpnG_/view?usp=…
Our plan here is to map all Wikidata properties to the DBpedia Ontology, and then have the information needed to compare Wikidata's coverage with all infoboxes across languages.
- we will remove the text-extraction part from the old proposal (which is here for your reference: https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact). This will still be a focus of our work in 2018, together with Diffbot and the new DBpedia NLP department, but we think it distracted from the core of the proposal. Results from the Wikipedia article text extraction can be added later, once they are available, and discussed separately.
- we proposed to make an extra website that helps to synchronise all Wikipedias and Wikidata, with DBpedia as its backend. While an external website is not an ideal solution, we lack alternatives. The Primary Sources Tool is mainly for importing data into Wikidata, not so much for synchronisation, and the MediaWiki instances of the Wikipedias do not seem to have any good interfaces for providing suggestions and pinpointing missing information. It is especially for this part that we would like to ask for your help and suggestions, either by mail to the list or on the talk page: https://meta.wikimedia.org/wiki/Grants_talk:Project/DBpedia/GlobalFactSync
We are looking forward to a fruitful collaboration with you and we thank you for your feedback!
All the best
Magnus
--
Magnus Knuth
Universität Leipzig
Institut für Informatik
Abt. Betriebliche Informationssysteme, AKSW/KILT
Augustusplatz 10
04109 Leipzig DE
mail: knuth(a)informatik.uni-leipzig.de
tel: +49 177 3277537
webID: http://magnus.13mm.de/
Hello, and thank you for your answer. Yes, local aliases are indeed what
he wants. They can't be done for all wikis at once, because each alias is
in the wiki's own language. As for templates: we could create them, of
course, but the question is about wikilinks. Isn't there anywhere to
request such aliases, even on Phabricator, as was done for the namespace
aliases?
Igal
On Jan 15, 2018 09:32, "MZMcBride" <z(a)mzmcbride.com> wrote:
יגאל חיטרון wrote:
>We just added a lot of aliases for namespaces, for example WP: for
>Wikipedia: and U: for User:. Is there a way to do the same thing for the
>sister projects? For example, adding a local name for n: or wict:.
Are you familiar with <https://meta.wikimedia.org/wiki/Interwiki_map>? It
sounds similar to what you want, except interwiki prefixes defined on that
page apply to all public Wikimedia wikis. Do you want local-only prefixes?
Would templates (i.e., {{wict|hello}} instead of [[wict:hello]]) work?
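For illustration, such a template could be a one-line wrapper. A minimal
sketch, assuming a hypothetical [[Template:Wict]] built on the existing
global 'wikt:' prefix (or a locally defined 'wict:'):

    <!-- Hypothetical Template:Wict: first parameter is the target page,
         optional second parameter is the display text. -->
    [[wikt:{{{1}}}|{{{2|{{{1}}}}}}]]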
MZMcBride
Hello. One of the tech experts on our wiki asked me to check this. He is
a thousand times cleverer than I am, but has a thousand times less time,
so I am doing it for him.
We just added a lot of aliases for namespaces, for example WP: for
Wikipedia: and U: for User:. Is there a way to do the same thing for the
sister projects? For example, adding a local name for n: or wict:. Thank
you.
Igal (User:IKhitron)