Dear all,
I thank you for your efforts. I am honoured to inform you that our latest research paper about Wikidata and health has been published in the Journal of Biomedical Informatics (IF = 2.9). The paper is available at https://doi.org/10.1016/j.jbi.2019.103292. This paper is the first in our series of research publications about the Medical Wikidata project. If you would like a copy of the paper, please contact me and I will send you the PDF. Our next papers will explore methods to improve the coverage and quality of medical information in Wikidata. We are quite sure that our work is important, as it will provide trustworthy reference medical information that can be used by physicians and computer programs to process medical data and enhance the efficiency of health care.
Yours Sincerely,
Houcemeddine Turki (he/him)
Medical Student, Faculty of Medicine of Sfax, University of Sfax, Tunisia
GLAM, Research and Education Coordinator, Wikimedia TN User Group
Member, Wiki Project Med
Member, WikiIndaba Steering Committee
Member, Wikimedia and Library User Group Steering Committee
____________________
+21629499418
The deployment of 1.34.0-wmf.24 to group0 was delayed until late Tuesday
night. More specifically, I promoted the branch to group0 just about 20
minutes ago, at approximately 03:30 UTC / 8:30 PM San Francisco time.
Group 1 is scheduled for tomorrow between 19:00 - 21:00 UTC.
There are currently several open tasks with the status Unbreak Now! [1];
however, none are marked as train blockers. From a brief review of the
comments, it appears that all of them had a fix merged prior to the
branch cut and remain open pending QA review. If any issues remain,
please let me know, ideally by adding them as blockers under T220749 [2]
sometime before 19:00 UTC. If you need more time to test prior to the
rollout of group1, you can reach me via IRC or email to coordinate.
[1] Tasks marked Unbreak now!
https://phabricator.wikimedia.org/maniphest/query/AtbLhf4ZUBGj/
[2] Train blockers for 1.34.0-wmf.24
https://phabricator.wikimedia.org/T220749
Happy testing!
Your humble train conductor, Mukunda
Phabricator: @mmodell
IRC: twentyafterfour
Hello everyone,
I would like to share a quick summary of Small Wiki Toolkits, which was one
of the focus areas at this year’s Wikimania Hackathon.
As part of this focus area, five workshops and two sessions were conducted
that covered a wide range of topics – developing user scripts and gadgets,
working with Wikimedia APIs, writing templates in Lua, generating Wikidata
infoboxes, leveraging Wikimedia cloud services, etc. Participants also
engaged in other activities such as the Wikidata documentation translation
sprint and cleaning MediaWiki:Common.css with TemplateStyles.
All toolkits are now available on the Meta page:
https://meta.wikimedia.org/wiki/Small_wiki_toolkits. You can also add this
page to your Watchlist to keep up to date with the next steps on this
project!
To learn more, read the complete summary on Wikimedia Space:
https://space.wmflabs.org/2019/09/24/round-up-of-small-wiki-toolkits-at-wik…
Cheers,
Srishti
*Srishti Sethi*
Developer Advocate
Wikimedia Foundation <https://wikimediafoundation.org/>
Hello all,
On my own MediaWiki install, I am trying to add another checkbox field to the Special:CreateAccount page. I have found the code responsible for the form, but for some reason the checkbox does not show up. As a test, I then tried copying and pasting one of the existing text boxes (with its IDs etc. changed, of course) to see if that would work. Nothing shows up other than the fields already present.
Does anyone have any ideas what could be blocking it and/or what I am missing? Below is the diff of the change that doesn’t show.
https://github.com/TheSandDoctor/misc-code-bits/commit/4f2f6221c64095777622…
Thanks!
TheSandDoctor
The code review working group has been discussing ideas for how to
encourage more / better code reviews for Wikimedia code.
One idea that we are exploring[1] is something we tried previously which
was called "Code review office hours." This was a weekly scheduled IRC
meeting attended by code reviewers. The previous incarnation was a minor
success but eventually interest petered out and we canceled the scheduled
meetings.
So in bringing back the code review meetings, we want to try something a
bit different. A couple of ideas proposed so far:
* Have more focused meetings around specific parts of the code base or
specific features.
* Rebrand as "Patch triage" and focus more on an initial code
review/feedback on new patches rather than focusing on merging as we did
previously.
I'm writing this to raise awareness and encourage participation, as well
as to gather feedback on the current proposals. We won't be successful
without participation from code reviewers as well as code contributors,
so your feedback and participation are important and appreciated. If this
interests you, please join the discussion on the Phabricator task [1].
[1] T229512 "Review and refine the Code Review Office Hours model of
engagement" https://phabricator.wikimedia.org/T229512
Hello,
The startup module is the JavaScript code [1] that is served with almost
every request to Wikimedia sites as part of <head> (non-blocking, though)
to load other ResourceLoader (RL) modules. It does important things; for
example, it checks whether the requested modules already exist with the
given hash in the localStorage cache.
For these reasons, the startup module needs to include a list of all
registered modules with their version hashes. As a result, if you
register a module, even if you never load it, it adds a rather small
overhead (around 19 bytes after minification and compression) to every
request to any wiki where the code is enabled. So if you register a new
module in core or in one of the extensions deployed on all wikis (CX,
WikibaseClient, etc.), it is added to every page request, meaning roughly
30 GB more traffic to our users every day, even if you don't use it
anywhere.
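To put the 19-byte figure in perspective, here is a back-of-the-envelope calculation (a sketch only; the daily request count below is my own assumption, chosen to be consistent with the ~30 GB/day figure above, and is not an official Wikimedia statistic):

```python
# Rough estimate of the daily traffic cost of one unused registered module.
# REQUESTS_PER_DAY is an assumption picked to match the ~30 GB/day figure
# cited in this email, not a measured Wikimedia number.
BYTES_PER_MODULE = 19     # registry entry size after minification + compression
REQUESTS_PER_DAY = 1.6e9  # assumed page requests per day across all wikis

extra_bytes_per_day = BYTES_PER_MODULE * REQUESTS_PER_DAY
extra_gb_per_day = extra_bytes_per_day / 1e9

print(f"{extra_gb_per_day:.1f} GB/day per registered module")
print(f"{20 * extra_gb_per_day:.0f} GB/day for 20 modules")
```

With these assumed inputs, one module costs about 30 GB/day and twenty modules about 600 GB/day, matching the figures below.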
If you're adding a feature, 30 GB/day is acceptable, but sometimes
developers use ResourceLoader for dependency management or class loading
(yours truly used to do that) and introduce 20 modules instead of one,
together causing an extra 600 GB/day. This big overhead is bad for our
users for three reasons: 1) RL has to handle the dependency management,
making every page view slightly slower and the UX worse; 2) the extra
network traffic is especially harmful in places without access to
broadband connections, where Wikipedia is the most likely place for
people to learn and grow [2]; 3) the scale we are talking about is so
massive (petabytes a year) that it has an environmental impact.
Let me give you an example: a couple of weeks ago we dropped 137 modules
from WikibaseClient. After deployment, that removed 4 TB/day from our
network (= 1.5 PB/year), and synthetic data shows that in an average case
we shave 40 ms off every response time [3].
We now have a dashboard to track the size of the RL registry [4], plus
weekly metrics for changes [5][6].
If you're looking for ways to help: I wrote a tool [7] that does some
graph analysis and produces a list of extensions that have modules that
can be merged. The extensions that, according to the analysis (which can
have false positives), could improve are TimedMediaHandler, PageTriage,
Graph, RevisionSlider, CodeMirror, Citoid, TemplateData, TwoColConflict,
Collection, CentralNotice, AdvancedSearch, 3D, MobileFrontend, and many
more, including some bits and pieces in core. I put the graphs of modules
that can be merged at [8], and I invite you to have fun with those
modules. Modules can be merged using package modules [9].
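For illustration, merging several small modules into one package module looks roughly like this in an extension.json (a hypothetical sketch with made-up module and file names, following the package-module format described at [9]):

```json
{
    "ResourceModules": {
        "ext.myExtension": {
            "packageFiles": [
                "init.js",
                "util.js",
                "config.json"
            ],
            "styles": [ "styles.css" ],
            "localBasePath": "resources",
            "remoteExtPath": "MyExtension/resources"
        }
    }
}
```

The files that used to be separate modules become entries in packageFiles, so only a single registry entry ("ext.myExtension") remains in the startup manifest instead of one per file.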
Most of this work was done by the performance team [10] and by volunteers
and developers on lots of teams. I joined the party later as
volunteer/WMDE staff, and I'm mostly sharing the results and writing long
emails. Big kudos to Krinkle, James Forrester, Santhosh Thottingal, Jon
Robson, Alaa Sarhaan, Alexandros Kosiaris, and so many others who helped
and whom I forgot.
[1] An example from English Wikipedia:
https://en.wikipedia.org/w/load.php?lang=en&modules=startup&only=scripts&ra…
[2] https://arxiv.org/abs/1812.00474
[3] https://phabricator.wikimedia.org/T203696#5387672
[4] https://grafana.wikimedia.org/d/BvWJlaDWk/startup-module-size?orgId=1
[5] https://gist.github.com/Krinkle/f76229f512fead79fb4868824b5bee07
[6]
https://docs.google.com/document/d/1SESOADAH9phJTeLo4lqipAjYUMaLpGsQTAUqdgy…
[7] https://phabricator.wikimedia.org/T232728
[8] https://phabricator.wikimedia.org/T233048
[9] https://www.mediawiki.org/wiki/ResourceLoader/Package_modules
[10] https://phabricator.wikimedia.org/T202154
Best
--
Amir (he/him)
Well, I'm thrilled about this, especially after having had a look
through https://www.slideshare.net/lucidworks/searching-for-better-code-presented-b…
Honestly, though, it's only the third best thing that happened this
week after Valerie Plame entering politics and the UC system divesting
from fossil fuels.
Grant, welcome! My advice is to make a long list of concrete KPIs
for contributor (e.g. editor) support, reach, and cloud support, in a
way that can be used for fundraising. The fundraising messaging has
been stuck for years on this thing about, "if everyone reading this
contributed the cost of a cup of coffee, then _some goal here_," which
is okay, but could be so much better flipped with the KPIs as the ask,
e.g., "Each $CURRENCY you donate will pay to support N additional
$CONTENTS," where the wikipedias can use ops measurements of the
resources typically needed to, e.g., take an article from Start to B
class, or how much time, server electricity (including idle time), and
other resources it takes to get a new word added to Wiktionary to some
level of proficiency. If these units relate to the potential donor's
language or geography, all the better. People geolocated in the
developed world using languages with highly developed wikipedias and
wiktionaries can be told how much it would cost to, for example,
eliminate units of the various WP:BACKLOG items you find suitable in
multivariate e.g. Latin squares donation message testing. (Or add new
technology projects like an intelligibility- and natural spoken
feedback version of https://www.speechace.co/api_sample/ hint
https://phabricator.wikimedia.org/T166929#5473028 hint.)
Also please take Curecoin instead of Bitcoin, even if that means
paying the extra transaction fee before converting the Curecoin to
cash. It is the height of folly to be as close to endorsing wasted
electricity-based cryptocurrency as we already do, when alternatives
with a benefit are less commonly known. The only other blockchain idea
I like is that a long-term mitigation program against state-sponsored
censorship could be based on copying the dumps to IPFS, but please also
support the CDN efforts like Encrypted-SNI:
https://twitter.com/jsalsman/status/1142172682751864832
https://twitter.com/jsalsman/status/1142940652851695616
https://twitter.com/jsalsman/status/1053786384463355905
Please let me know your thoughts.
Best regards,
Jim
Dear all,
For our deployment procedure, we're setting up a clean database with
some prefilled content, using MW 1.32.3. When I did the fresh install
(minimal installation method), I noticed that the 'content_models'
database table contains only one row, with model_id = 1 and
model_name = wikitext.
I then install extensions, run update.php, and import a couple of pages,
templates, etc. When setting up Cargo's _pageData table using the
setCargoPageData.php script at a later stage, I get the following error
(see the backtrace at the very bottom):
Failed to access name from content_models using id = 2
I fixed this error by inserting the core content models
<https://www.mediawiki.org/wiki/Content_handlers> into the
'content_models' table manually. Accessing the pages afterwards works
fine (Widgets, in this case). The widgets themselves work fine, too.
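For reference, the manual fix was along these lines (a sketch only: the model_id values and the exact set of model names are illustrative, since MediaWiki normally assigns these IDs lazily on first use; adjust them to match whatever your error messages and existing rows expect):

```sql
-- Hypothetical example: pre-seeding the core content models.
-- IDs are normally assigned by MediaWiki's NameTableStore on demand;
-- these values are examples, not canonical.
INSERT INTO content_models (model_id, model_name) VALUES
    (2, 'javascript'),
    (3, 'css'),
    (4, 'text'),
    (5, 'json');
```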
Am I missing something, given that the core content models are not in
the initial 'content_models' table? Is this table simply not filled
during the installation procedure?
Is it actually bad advice to insert the rows manually?
Kind regards and thanks for your support,
Tom
[12081d263c4cc289d8a144d8] [no req]
MediaWiki\Storage\NameTableAccessException from line 42 of
/var/www/mediawiki/lib/includes/Storage/NameTableAccessException.php:
Failed to access name from content_models using id = 2
Backtrace:
#0 /var/www/mediawiki/lib/includes/Storage/NameTableStore.php(308):
MediaWiki\Storage\NameTableAccessException::newFromDetails(string,
string, integer)
#1 /var/www/mediawiki/lib/includes/Revision/RevisionStore.php(1625):
MediaWiki\Storage\NameTableStore->getName(integer)
#2 /var/www/mediawiki/lib/includes/Revision/RevisionStore.php(1671):
MediaWiki\Revision\RevisionStore->loadSlotRecords(string, integer)
#3 [internal function]:
MediaWiki\Revision\RevisionStore->MediaWiki\Revision\{closure}()
#4 /var/www/mediawiki/lib/includes/Revision/RevisionSlots.php(165):
call_user_func(Closure)
#5 /var/www/mediawiki/lib/includes/Revision/RevisionSlots.php(107):
MediaWiki\Revision\RevisionSlots->getSlots()
#6 /var/www/mediawiki/lib/includes/Revision/RevisionRecord.php(192):
MediaWiki\Revision\RevisionSlots->getSlot(string)
#7 /var/www/mediawiki/lib/includes/Revision/RevisionRecord.php(175):
MediaWiki\Revision\RevisionRecord->getSlot(string, integer, NULL)
#8 /var/www/mediawiki/lib/includes/Revision.php(932):
MediaWiki\Revision\RevisionRecord->getContent(string, integer, NULL)
#9 /var/www/mediawiki/lib/includes/page/WikiPage.php(801):
Revision->getContent(integer, NULL)
#10
/var/www/mediawiki/lib/extensions/Cargo/includes/CargoPageData.php(107):
WikiPage->getContent()
#11
/var/www/mediawiki/lib/extensions/Cargo/maintenance/setCargoPageData.php(88):
CargoPageData::storeValuesForPage(Title)
#12 /var/www/mediawiki/lib/maintenance/doMaintenance.php(94):
SetCargoPageData->execute()
#13
/var/www/mediawiki/lib/extensions/Cargo/maintenance/setCargoPageData.php(103):
require_once(string)
#14 {main}
--
Tom Schulze
energypedia consult GmbH
König-Adolf-Str. 12
65191 Wiesbaden
Phone: +49 0611 18195031
Email: t.schulze(a)energypedia-consult.com
Web: www.energypedia-consult.com | www.webmo.info
Registergericht: Frankfurt, Eintragungs-Nr. HRB 93412 | Sitz: Eschborn | Geschäftsführung: Robert Heine
Forwarding to Wikitech-l and MediaWiki-l.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Katherine Maher <kmaher(a)wikimedia.org>
Date: Wed, Sep 18, 2019, 10:44
Subject: [Wikimedia Announcements] Welcoming Wikimedia Foundation’s new
CTO, Grant Ingersoll
To: wikimediaannounce-l <WikimediaAnnounce-l(a)lists.wikimedia.org>
Hi all,
I’m excited to officially welcome Grant Ingersoll as our Chief Technology
Officer! Grant will be starting September 23. He’ll be based in Charlotte,
North Carolina.
Grant comes from a long history of working on open source projects. Most
recently, he served as the Chief Technical Officer of Lucidworks, an
AI-powered open-source search services company which he co-founded. He is a
Lucene and Solr committer, co-founder of the Apache Mahout machine learning
project, and a long-standing member of the Apache Software Foundation. He’s
an author, having written a book for Java developers on how to wrangle
unstructured text for search, text-mining, and the like. He’s also a
long-time, committed remotee, having worked with the distributed Lucidworks
team from his home in North Carolina for nearly a decade.
In Grant’s own words, “The power of learning and knowledge have always
stood as key pillars in my career. I'm a big believer that access to free,
trusted knowledge is of vital importance as society looks to tackle large
scale challenges. My wife, Robin, and I recently moved from Chapel Hill,
North Carolina, our home for 10 years, to Charlotte, North Carolina, where
our daughter lives. We also just dropped our son off at college for his
freshman year. We're adjusting to the empty nest life with our dog Allie (a
black lab mix). When not traveling or exploring Charlotte, I can usually be
found on a bike, out kayaking, or writing code.”
Grant will be working with me, Erika Bjune (who is transitioning to VP,
Technology), and others in the leadership of the technology department to
determine a set of priorities for his first year. I expect these will
likely focus around evaluating our current capacities and co-creating a
vision for the continued evolution of our technical platforms, supporting
the staff of the department, working with the finance and operations folks
on planning and budgeting for the future, and of course, getting to know
our technical community and the broader movement. Under Grant’s leadership,
we will continue the work of improving and modernizing our technical
ecosystem to respond to our future needs, as laid out in the movement
strategy.
I’m thrilled to have the CTO role filled, and to bring Grant in at a time
when the movement is digging into the question of what it means to “become
the essential infrastructure of free knowledge.” From our first meeting, I
was struck by his curiosity. He was genuinely interested in how Wikimedia
works, and was willing to get into what I often think of as "Wiki PhD"-level
conversations about the nuances of our community and values. I have
generally found that the folks who bring that sort of openness and humility
to their work are the folks who thrive in the challenges of our mission and
movement. As our movement expands, I’m glad to have Grant’s experience,
curiosity, and passion for building things on board here at the Foundation.
I want to also thank Erika Bjune for her work as interim CTO. She
passionately advocated for the importance of our platforms, embodied
partnership and cooperation, and her expertise on the interview panel was
invaluable. I’m thrilled she’ll be working so closely with Grant as we move
forward.
Please join me in welcoming Grant to the Wikimedia Foundation!
Katherine
--
Katherine Maher (she/her)
Executive Director
Wikimedia Foundation <https://wikimediafoundation.org/>
_______________________________________________
Please note: all replies sent to this mailing list will be immediately
directed to Wikimedia-l, the public mailing list of the Wikimedia
community. For more information about Wikimedia-l:
https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
_______________________________________________
WikimediaAnnounce-l mailing list
WikimediaAnnounce-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikimediaannounce-l