Hi,
I'd like to discuss potential improvements to www.mediawiki.org (the
front page itself), both regarding content and presentation, as I see
several issues with the current page.
I propose to discuss this in several stages:
* Defining audiences,
* defining content per group,
* defining presentation and layout.
My main intention is to improve how our audiences find the relevant
information that they are looking for.
We might not be able to make everything perfect but we can certainly
try to make it better?
If this sounds interesting, please take a look at
https://www.mediawiki.org/wiki/MediaWiki/Homepage_improvements_2018
and let's discuss on the corresponding discussion page and work on
this?
Thank you!
andre
--
Andre Klapper | Bugwrangler / Developer Advocate
https://blogs.gnome.org/aklapper/
Forwarding because this (ambitious!) proposal may be of interest to people
on other lists. I'm not endorsing the proposal at this time, but I'm
curious about it.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Denny Vrandečić <vrandecic(a)gmail.com>
Date: Sat, Sep 29, 2018 at 6:32 PM
Subject: [Wikimedia-l] Wikipedia in an abstract language
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Semantic Web languages allow us to express ontologies and knowledge bases in
a way meant to be particularly amenable to the Web. Ontologies formalize the
shared understanding of a domain. But the most expressive and widespread
languages that we know of are human natural languages, and the largest
knowledge base we have is the wealth of text written in human languages.
We look for a path to bridge the gap between knowledge representation
languages such as OWL and human natural languages such as English. We
propose a project that simultaneously exposes that gap, allows collaboration
on closing it, makes progress widely visible, and is highly attractive and
valuable in its own right: a Wikipedia written in an abstract language to
be rendered into any natural language on request. This would make current
Wikipedia editors about 100x more productive, and increase the content of
Wikipedia by 10x. For billions of users, this would unlock knowledge they
currently do not have access to.
My first talk on this topic will be on October 10, 2018, 16:45-17:00, at
the Asilomar in Monterey, CA during the Blue Sky track of ISWC. My second,
longer talk on the topic will be at the DL workshop in Tempe, AZ, October
27-29. Comments are very welcome as I prepare the slides and the talk.
Link to the paper: http://simia.net/download/abstractwikipedia.pdf
Cheers,
Denny
Dear users, developers and all people interested in semantic wikis,
We are happy to announce SMWCon Fall 2018 - the 15th Semantic MediaWiki Conference:
Dates: December 12th to December 14th 2018 (Wednesday to Friday).
Location: TechBase Regensburg, Franz-Mayer-Straße 1, 93053 Regensburg, Germany
Conference page: https://www.semantic-mediawiki.org/wiki/SMWCon_Fall_2018
Ticket Shop: https://en.xing-events.com/SMWCon_Fall_2018
SMWCon Fall 2018 will be supported by gesinn.it GmbH & Co. KG [0].
Due to a challenging conference room situation, SMWCon Fall will take place very late this year.
What's good about it: participants will have the opportunity to enjoy winter atmosphere (including "Christkindlmarkt") in Regensburg :-)
This time, we start Wednesday with two conference days (including a short introduction to SMW) followed by a Tutorial / Hackathon Day on Friday.
TICKET SHOP IS ALREADY OPEN
Contributing to the conference: If you want to present your work at the conference, please go to the conference page and add your talk there.
To create an attractive program for the conference, we will later ask you to provide further information about your proposals.
Looking forward to meeting you in Regensburg!
/Alexander Gesinn
on behalf of the gesinn.it organization team
[0] http://gesinn.it, http://semantic.wiki
Hello,
This update adds some additional information we are now able to share
regarding why translation updates are on hold.
The Security team and others at the Wikimedia Foundation are engaged in a
security event involving our translation services. No Wikimedia users or
their data are currently affected. We made the decision to temporarily
disable translation updates until suitable countermeasures can be applied;
when translation updates will be reinstated is still to be determined.
At the resolution of this event the Security team will publish a summary
blog post (https://phabricator.wikimedia.org/phame/blog/view/13/) including
additional details as appropriate.
Thank you for your patience and understanding while we work to better
protect the community.
John Bennett
Director of Security, Wikimedia Foundation
Hi All,
A friendly reminder that TechCom is hosting an IRC meeting tomorrow
(Wednesday 26 September) regarding RFC: Modern Event Platform: Scalable
Event Intake Service. <https://phabricator.wikimedia.org/T201963>
This RFC outlines the current state of our event intake systems and some
possible ways forward. The outcome of this RFC should be to choose an
implementation option. This RFC is not intended as a design document.
The meeting is scheduled for Wednesday 26 September at 2pm PDT (21:00
UTC, 23:00 CEST) in #wikimedia-office.
If you haven't joined a #wikimedia-office meeting before, more
information can be found here:
<https://meta.wikimedia.org/wiki/IRC_office_hours#How_to_participate>
More information regarding the TechCom RFC process is available here:
<https://www.mediawiki.org/wiki/Wikimedia_Technical_Committee/Processes#RFC_…>
Thanks,
Kate
--
Kate Chapman TechCom Facilitator (Contractor)
📘 Read this post on Phabricator at
https://phabricator.wikimedia.org/phame/live/1/post/119/
________________________________
How’d we do in our pursuit of operational excellence last month? Read on to
find out!
- Month in numbers.
- Current problems.
- Highlighted stories.
## Month in numbers
* 1 documented incident since August 9. [1]
* 113 Wikimedia-prod-error tasks closed since August 9. [2]
* 99 Wikimedia-prod-error tasks created since August 9. [3]
## Current problems
Frequent:
* [MediaWiki-Logging] Exception from Special:Log (public GET). –
https://phabricator.wikimedia.org/T201411
* [Graph] Warning "data error" from ApiGraph in gzdecode. –
https://phabricator.wikimedia.org/T184128
* [RemexHtml] Exception "backtrack_limit exhausted" from search index jobs.
– https://phabricator.wikimedia.org/T201184
Other:
* [MediaWiki-Redirects] Exception from NS_MEDIA redirect (public GET). –
https://phabricator.wikimedia.org/T203942
This is an oldie: (Well..., it's an oldie where I come from... 🎸)
* [FlaggedRevs] Exception from Special:ProblemChanges (since 2011). –
https://phabricator.wikimedia.org/T176232
Terminology:
* An Exception (or fatal) causes user actions to be aborted. For example, a
page would display "Exception: Unable to render page" instead of the article
content.
* A Warning (or non-fatal, or error) can produce page views that are
technically unaware of a problem, but may show corrupt or incomplete
information. For example, an article would display the word "null" instead
of the actual content. Or, a user may be told "You have (null) new
messages."
The combined volume of infrequent non-fatal errors is high. This limits our
ability to automatically detect whether a deployment caused problems. The
“public GET” risks in particular can cause (and have caused) alerts to fire
that notify Operations of wikis potentially being down. Such exceptions must
not be publicly exposed.
With that behind us... Let’s celebrate this month’s highlights!
## *️⃣ Quiz defect – "0" is not nothing!
Tyler Cipriani (Release Engineering) reported an error in Quiz. Wikiversity
uses Quiz for interactive learning. Editors define quizzes in the source
text (wikitext). The Quiz program processes this text, creates checkboxes
with labels, and sends them to the user. When the sending part failed, "Error:
Undefined index" appeared in the logs. Volunteer Umherirrender investigated.
A line in the source text can: define a question, or an answer, or nothing
at all. The code that creates checkboxes needs to decide between
"something" and "nothing". The code utilised the PHP "if" statement for
this, which compares a value to True and False. The answers to a quiz can
be any text, which means PHP first transforms the text to one of True or
False. In doing so, values like "0" became False. This meant the code
thought "0" was not an answer. The code responsible for sending checkboxes
did not have this problem. When the code tried to access the checkbox to
send, it did not exist. Hence, "Error: Undefined index".
Umherirrender fixed the problem by using a strict comparison. A strict
comparison doesn't transform the value first; it only compares.
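A minimal sketch of both behaviours in plain PHP (simplified; not the actual
Quiz extension code):

```php
<?php
$lines = [ 'Yes', '0', 'No' ]; // "0" is a legitimate answer text

// Buggy: "if ( $text )" casts the string to a boolean, and PHP casts
// "0" to false, so no checkbox is created for that answer.
$checkboxes = [];
foreach ( $lines as $i => $text ) {
	if ( $text ) {
		$checkboxes[$i] = "[ ] $text";
	}
}

// The sending code assumes every answer produced a checkbox; for "0"
// the index was never set, hence "Error: Undefined index".
foreach ( $lines as $i => $text ) {
	echo $checkboxes[$i], "\n";
}

// The fix: a strict comparison compares without casting first, so
// only a truly empty line is skipped.
foreach ( $lines as $i => $text ) {
	if ( $text !== '' ) {
		$checkboxes[$i] = "[ ] $text";
	}
}
```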
– https://phabricator.wikimedia.org/T196684
## *️⃣ PageTriage enters JobQueue for better performance
Kosta Harlan (from the Audiences department's Growth team) investigated a warning for
PageTriage. This extension provides the New Pages Feed tool on the English
Wikipedia. Each page in the feed has metadata, usually calculated when an
editor creates a page. Sometimes, this is not available. Then, it must be
calculated on-demand, when a user triages pages. So far, so good. The
information was then saved to the database for re-use by other triagers.
This last part caused the serious performance warning: "Unexpected database
writes".
Database changes must not happen on page views. The database has many
replicas for reading, but only one "master" for all writing. We avoid using
the master during page views to make our systems independent. This is a key
design principle for MediaWiki performance. [5] It lets a secondary data
centre build pages without connecting to the primary (which can be far
away).
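As a hedged sketch of that principle, using the MediaWiki load balancer
conventions of the time (the table and field names below are illustrative):

```php
<?php
$pageId = 12345; // example page ID

// Page views may read from one of the many replicas:
$dbr = wfGetDB( DB_REPLICA );
$row = $dbr->selectRow(
	'pagetriage_page',             // illustrative table name
	'*',
	[ 'ptrp_page_id' => $pageId ], // illustrative field name
	__METHOD__
);

// Writes must go through the single master connection, which code
// running during a page view should never request:
$dbw = wfGetDB( DB_MASTER );
$dbw->update(
	'pagetriage_page',
	[ 'ptrp_reviewed' => 1 ],
	[ 'ptrp_page_id' => $pageId ],
	__METHOD__
);
```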
Kosta addressed the warning by improving the code that saves the calculated
information. Instead of saving it immediately, an instruction is now sent
via a job queue, after the page view is ready. This job queue then
calculates and saves the information to the master database. The master
synchronises it to replicas, and then page views can use it.
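A hedged sketch of that pattern, modelled on the MediaWiki JobQueue API of
the time (the job class, table, and parameters are illustrative, not
PageTriage's actual code):

```php
<?php
// Assume $title (a Title object) and $metadata (an array of rows) exist.

class CompileArticleMetadataJob extends Job {
	public function __construct( Title $title, array $params ) {
		parent::__construct( 'compileArticleMetadata', $title, $params );
	}

	// Runs later on a job runner, where writing to the master is allowed.
	public function run() {
		$dbw = wfGetDB( DB_MASTER );
		$dbw->insert(
			'pagetriage_page_tags', // illustrative table name
			$this->params['metadata'],
			__METHOD__
		);
		return true;
	}
}

// During the page view we only enqueue the instruction; no write happens:
JobQueueGroup::singleton()->push(
	new CompileArticleMetadataJob( $title, [ 'metadata' => $metadata ] )
);
```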
– https://phabricator.wikimedia.org/T199699 /
https://gerrit.wikimedia.org/r/455870
## *️⃣ Tomorrow may be sooner than you think
After developers submit code to Gerrit, they eagerly await the result from
Jenkins, an automated test runner. It sometimes incorrectly reported a
problem with the MergeHistory feature. The code assumed that the tests
would finish by "tomorrow".
It might be safe to assume our tests will not take one day to finish.
Unfortunately, the programming utility "strtotime" does not interpret
"tomorrow" as "this time tomorrow". Instead, it means "the start of
tomorrow". In other words, the next strike of midnight! The tests use UTC
as the neutral timezone.
Every day, in the 15 minutes before 5 PM in San Francisco (which is midnight
UTC), code submitted to Code Review could have mysteriously failing tests.
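The difference in plain PHP:

```php
<?php
date_default_timezone_set( 'UTC' );

// "tomorrow" means the *start* of tomorrow, i.e. the next midnight:
echo date( 'Y-m-d H:i:s', strtotime( 'tomorrow' ) ), "\n";
// e.g. 2018-09-27 00:00:00, even if it is 23:50 right now

// "+1 day" keeps the current time of day, which is what the test meant:
echo date( 'Y-m-d H:i:s', strtotime( '+1 day' ) ), "\n";
// e.g. 2018-09-27 23:50:00
```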
– Continue at https://gerrit.wikimedia.org/r/452873
## *️⃣ Continuous Whac-A-Mole
In August, developers started to notice rare and mysterious failures from
Jenkins. No obvious cause or solution was known at that time.
Later that month, Dan Duvall (Release Engineering team) started exploring
ways to run our tests faster. Before, we had many small virtual servers,
where each server ran only one test at a time. The idea: Have a smaller
group of much larger virtual servers where each server could run many tests
at the same time. We hope that during busier times this will better share
the resources between tests. And, during less busy times, allow a single
test to use more resources.
As implementation of this idea began, the mysterious test failures became
commonplace. "No space left on device", was a common error. The test
servers had their hard disks full. This was surprising. The new (larger)
servers seemed to have enough space to accommodate the number of tests they
ran at the same time. Together with Antoine Musso and Tyler Cipriani, Dan
identified and resolved two problems:
1) Some automated tests did not clean up after themselves.
2) The test-templates were stored on the "root disk" (the hard drive for
the operating system), instead of the hard drive with space reserved for
tests. This root disk is quite small, and is the same size on small servers
and large servers.
– https://phabricator.wikimedia.org/T202160 /
https://phabricator.wikimedia.org/T202457
## 🎉 Thanks!
Thank you to everyone who has helped report, investigate, or resolve
production errors. Including:
Tpt
Ankry
Daimona
Legoktm
Volker_E
Pchelolo
Dan Duvall
Gilles Dubuc
Daniel Kinzler
Umherirrender
Greg Grossmeier
Gergő Tisza (Tgr)
Sam Reed (Reedy)
Giuseppe Lavagetto
Brad Jorsch (Anomie)
Tim Starling (tstarling)
Kosta Harlan (kostajh)
Jaime Crespo (jcrespo)
Antoine Musso (hashar)
Roan Kattouw (Catrope)
Adam WMDE (Addshore)
Stephane Bisson (SBisson)
Niklas Laxström (Nikerabbit)
Thiemo Kreuz (thiemowmde)
Subramanya Sastry (ssastry)
This, that and the other (TTO)
Manuel Aróstegui (Marostegui)
Bartosz Dziewoński (matmarex)
James D. Forrester (Jdforrester-WMF)
Thanks!
Until next time,
– Timo Tijhof
________________________________
Further reading:
* August 2018 edition. –
https://lists.wikimedia.org/pipermail/wikitech-l/2018-August/090594.html
* July 2018 edition. –
https://lists.wikimedia.org/pipermail/wikitech-l/2018-July/090363.html
Footnotes:
[1] Incidents. –
https://wikitech.wikimedia.org/wiki/Special:AllPages?from=Incident+document…
[2] Tasks closed. –
https://phabricator.wikimedia.org/maniphest/query/wOuWkMNsZheu/#R
[3] Tasks opened. –
https://phabricator.wikimedia.org/maniphest/query/6HpdI76rfuDg/#R
[4] Quiz on Wikiversity. –
https://en.wikiversity.org/wiki/How_things_work_college_course/Conceptual_p…
[5] Operate multiple datacenters. –
https://www.mediawiki.org/wiki/Requests_for_comment/Master-slave_datacenter…
All that red makes the page look bad, and I would like to point out the
abuse factor here: all those red links start edit wars, and should be added,
if at all, by people.
The creation of the Wikidata page also creates a problem, because the save
process does not establish a label, which should be mandatory and in English.
And there is this problem: https://www.wikidata.org/wiki/Wikidata:WikiProject_Labels_and_descriptions#…
>Tuesday, September 25, 2018 2:58 AM -05:00 from Sergey Leschina <mail(a)putnik.ws>:
>
>I want to draw your attention to the problem from the other side. On a newly created page, which can be opened via the red link, there is no binding to Wikidata. This means that after creation, the page will not automatically be linked to Wikidata. And if the project has templates that use information from Wikidata, they will not fully work until the page has been saved at least once and linked to an item. I have already suggested adding a parameter for this: https://phabricator.wikimedia.org/T178249
>
>If something like this is implemented, it will be possible to make a template for red links (with Lua and TemplateStyles) that connects to Wikidata. Although I agree that it would be better to have a syntax that allows making such links without these difficulties.
>пн, 24 сент. 2018 г. в 20:50, Maarten Dammers < maarten(a)mdammers.nl >:
>>Hi everyone,
>>
>>According to https://www.youtube.com/watch?v=TLuM4E6IE5U : "Semantic
>>annotation is the process of attaching additional information to various
>>concepts (e.g. people, things, places, organizations etc) in a given
>>text or any other content. Unlike classic text annotations for reader's
>>reference, semantic annotations are used by machines to refer to."
>>(more at
>>https://ontotext.com/knowledgehub/fundamentals/semantic-annotation/ )
>>
>>On Wikipedia a red link is a link to an article that hasn't been created
>>(yet) in that language. Often another language does have an article
>>about the subject or at least we have a Wikidata item about the subject.
>>Take for example
>>https://nl.wikipedia.org/w/index.php?title=Friedrich_Ris . It has over
>>250 incoming links, but the person doesn't have an article in Dutch. We
>>have a Wikidata item with links to 7 Wikipedia's at
>>https://www.wikidata.org/wiki/Q116510 , but no way to relate
>>https://nl.wikipedia.org/w/index.php?title=Friedrich_Ris with
>>https://www.wikidata.org/wiki/Q116510 .
>>
>>Wouldn't it be nice to be able to make a connection between the red link
>>on Wikipedia and the Wikidata item?
>>
>>Let's assume we have this list somewhere. We would be able to offer all
>>sorts of nice features to our users like:
>>* Hover over the link to get a hovercard in your favorite backup language
>>* Generate an article placeholder for the user with basic information in
>>the local language
>>* Pre-populate the translate extension so you can translate the article
>>from another language
>>(probably plenty of other good uses)
>>
>>Where to store this link? I'm not sure about that. On some Wikipedias,
>>people have experimented with local templates around the red links. That's
>>not structured data, it clutters up the wikitext, it doesn't scale, and the
>>local communities generally don't seem to like the approach. That's not
>>the way to go. Maybe a better option would be to create a new property
>>on Wikidata to store the name of the future article. Something like
>>Q116510: Pxxx -> (nl)"Friedrich Ris". That would be easiest because the
>>infrastructure is there and you can just build tools on top of it, but
>>I'm afraid this would cause a lot of noise on items. A couple of
>>suggestions wouldn't be a problem, but what is keeping people from
>>adding suggestions in 100 languages? Or maybe restrict usage so that
>>the red link must have at least 1 (or n) incoming links on a Wikipedia
>>before people are allowed to add it?
>>We could create a new project on the Wikimedia Cloud to store the
>>links, but that would be quite the extra time investment in setting up
>>everything.
>>
>>What do you think?
>>
>>Maarten
>
>
>--
>Sergey Leschina
Reminder: Technical Advice IRC meeting again **Wednesday 3-4 pm UTC** on
#wikimedia-tech.
Questions can be asked in English, Persian & German.
The Technical Advice IRC Meeting is a weekly support event for volunteer
developers. Every Wednesday, two full-time developers are available to help
you with all your questions about MediaWiki, gadgets, tools and more! This
can be anything from "how to get started", through "who would be the best
contact for X", to specific questions on your project.
If you know already what you would like to discuss or ask, please add your
topic to the next meeting:
https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
Hope to see you there!
Michi (for the Technical Advice IRC Meeting crew)
--
Michael F. Schönitzer
Wikimedia Deutschland e.V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us achieve this!
http://spenden.wikimedia.de/
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by the
Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hello everyone,
In 2016, I did some work on a project [
https://phabricator.wikimedia.org/T118463] (under the mentorship of Stephen
LaPorte, Lydia Pintscher, Marius Hoch, Tim Starling and Bene*) to write
automated unit tests for the IFTTT app for Wikimedia (the Wikipedia Channel
on IFTTT), to add a few Wikidata-related triggers (with queries done using
the WDQS), and to build an interface [4] that allows users who do not want
to use the IFTTT website to get updates on various Wikipedia feeds (AotD
etc.).
I documented the work in a progress report [1] and also developed a v1.0 of
the User Manual [2]. Hoo (Marius) suggested transferring the manual's
content to a wiki page so it can easily be improved by others (through
translation etc.) while using the tool, and we now have it on mediawiki
[dot] org [3].
I would like to share the work that has been done over the years with the
community; fans of RSS should have a look at it and give some feedback for
improvements. Thanks!
Note: Some triggers (like the hash-tag RSS trigger) will not work due to
database changes on the Toolforge replicas, but this is on the radar. Some
parts of the work can't be tested by users or seen directly since they are
mostly backend; only the RSS UI can be used by the community, as it's
user-facing.
[1] https://www.mediawiki.org/wiki/User:Alangi_Derick/IFTTT_GSoC_Report_2016
[2] https://commons.wikimedia.org/wiki/File:Wikipedia_RSS_User_Manual.pdf
[3] https://www.mediawiki.org/wiki/Help:Wikipedia_RSS_Feed_User_Manual
[4] https://tools.wmflabs.org/ifttt-testing/ifttt/v1/rss-feeds
--
Regards, Derick .N.,
V. Developer {at} Wikimedia
See also: https://bit.ly/2xUjuLT