Hi,
On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560
I'm not familiar with Bluejeans and may have missed a transition because
I wasn't paying enough attention. Is this some kind of experiment? Have
all meetings transitioned to this service?
Anyway, my immediate question at the moment is: how do you join without
sharing your microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (non-cloud)
hosting option? Are we paying for this?
-Jeremy
Hello,
Could someone please update the list at https://phabricator.wikimedia.org/P10500,
which contains repositories that don't use mediawiki/mediawiki-codesniffer?
I found that many repositories in the list are empty, and some aren't
available on Gerrit at all.
So, could someone please regenerate this list of repositories (in
mediawiki/extensions) that don't use mediawiki/mediawiki-codesniffer but
contain at least one PHP file? Alternatively, could you provide me with a
command I can run myself, so I don't need to request an update every time.
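For what it's worth, such a scan can be sketched in a few lines of Python, assuming the extensions are already cloned locally (the directory layout and function names here are illustrative, not an existing tool):

```python
import json
from pathlib import Path

def lacks_codesniffer(repo: Path) -> bool:
    """True if the repo's composer.json does not pull in
    mediawiki/mediawiki-codesniffer (or there is no composer.json)."""
    composer = repo / "composer.json"
    if not composer.is_file():
        return True
    try:
        data = json.loads(composer.read_text())
    except ValueError:
        return True
    deps = {**data.get("require", {}), **data.get("require-dev", {})}
    return "mediawiki/mediawiki-codesniffer" not in deps

def candidates(extensions_dir):
    """Yield extension checkouts that contain at least one PHP file
    but do not depend on the codesniffer package."""
    for repo in sorted(Path(extensions_dir).iterdir()):
        if repo.is_dir() and any(repo.rglob("*.php")) and lacks_codesniffer(repo):
            yield repo.name
```

Running `print(list(candidates("extensions")))` over a full clone of mediawiki/extensions would regenerate such a list on demand.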
Best regards,
Zoran.
P. S.: Happy weekend! :)
Hi there,
I am investigating a breakage in my extension that has occurred in MW 1.34
but which didn't seem to be a problem on MW 1.29. (I have not tested
interim versions to see where the issue first arises.)
One of the parser hooks in the extension needs to perform variable
expansion. What is happening is a lot more complicated than this example,
but effectively
<my_hook Foo="What the foo!">{{{Foo}}}</my_hook>
should end up generating the following output, using variable expansion:
What the foo!
The semantics of variable handling need to follow the MW semantics,
including default values (possibly nested), parser functions, etc.;
therefore it needs to use the MW parser to perform the expansion.
Assuming the arguments that MW passes into the parser hook are named $Text,
$Vars, $Parser and $Frame, the relevant code looks something like this
(again, a bit more complicated in practice):
    $NewFrame = new PPTemplateFrame_DOM($Frame->preprocessor, $Frame,
                                        array(), $Vars, $Frame->title);
    return $Parser->replaceVariables($Text, $NewFrame);
(I have included a more detailed listing of the code that I am using for
doing the parse at the end of this message.)
My code was working fine on MW 1.29 and earlier, but after upgrading to 1.34
I find that a fatal exception is thrown when my tag is encountered:
/index.php?title=Main_Page MWException
from line 373 of ~\includes\parser\PPFrame_DOM.php:
PPFrame_DOM::expand: Invalid parameter type
I have generated a backtrace and the top of the stack is as follows:
#0 ~\includes\parser\Parser.php(3330): PPFrame_DOM->expand(PPNode_Hash_Tree,
integer)
#1 MyExtension.php (434): Parser->replaceVariables(string,
PPTemplateFrame_DOM)
#2 ~\includes\parser\Parser.php(4293): MyExtensionParserHook(string, array,
Parser, PPTemplateFrame_Hash)
(The subsequent call stack entries are the parent functions you would expect
to see in that situation.)
Can anyone see why the above code would no longer work as it did on previous
versions? What is the current recommended method for manually expanding
template variables from within a parser hook?
Kind regards,
- Mark Clements (HappyDog)
----------------------------------
Full example (with extension-specific code omitted):
----------------------------------
function MyExtensionParserHook($Text, $Vars, $Parser, $Frame) {
    // 1) Manipulate $Text and $Vars
    //    (omitted)

    // 2) Expand variables in the resulting text.
    // Set up a new frame which mirrors the existing one but which also has the
    // field values as arguments.
    // If we are already in a template frame, merge the field arguments with the
    // existing template arguments first.
    if ($Frame instanceof PPTemplateFrame_DOM) {
        $NumberedArgs = $Frame->numberedArgs;
        $NamedArgs = array_merge($Frame->namedArgs, $Vars);
    }
    else {
        $NumberedArgs = array();
        $NamedArgs = $Vars;
    }
    $NewFrame = new PPTemplateFrame_DOM($Frame->preprocessor, $Frame,
                                        $NumberedArgs, $NamedArgs,
                                        $Frame->title);

    // Perform a recursive parse on the input, using our newly created frame.
    return $Parser->replaceVariables($Text, $NewFrame);
}
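One observation from the backtrace: the frame the parser now passes in is a PPTemplateFrame_Hash, while the code constructs a PPTemplateFrame_DOM, and mixing the two preprocessor families is a plausible source of the "Invalid parameter type" error. A hedged, untested sketch of a class-agnostic variant (assuming the Hash frame class mirrors the DOM constructor signature, as it does in MW 1.34):

```php
// Pick the frame class matching the preprocessor actually in use,
// instead of hard-coding the DOM implementation.
$FrameClass = ( $Frame instanceof PPFrame_Hash )
    ? PPTemplateFrame_Hash::class
    : PPTemplateFrame_DOM::class;
$NewFrame = new $FrameClass( $Frame->preprocessor, $Frame,
                             $NumberedArgs, $NamedArgs, $Frame->title );
return $Parser->replaceVariables( $Text, $NewFrame );
```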
Last week, we had the Indic Wikisource Proofreadthon 2020 event. See
here for full details:
https://meta.wikimedia.org/wiki/Indic_Wikisource_Proofreadthon_2020
Though I did not participate in this event (which I feel sad about;
life is too messy nowadays), I thought I would build a small tool to
report any Wikipedia user's contributions on a given wiki site for a
given date range.
It may help to calculate, measure, and decide on contributions for
such competitions.
MediaWiki has a good API to fetch user contributions ("Get all edits
by a user"):
https://www.mediawiki.org/wiki/API:Usercontribs
To my surprise, there was sample Python code on that very page.
The code gave only 500 results, so I wrote a loop to get the data batch
by batch until all of it is received.
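The batching loop can be sketched like this (the wiki URL and parameter values are illustrative; the `get` hook is injectable so the loop can be exercised without network access):

```python
import json
import urllib.parse
import urllib.request

def fetch_user_contribs(api_url, user, start, end, get=None):
    """Fetch every contribution for `user` between two timestamps,
    following the API's `continue` tokens batch by batch."""
    if get is None:
        def get(params):
            # Default fetcher: plain HTTP GET against the MediaWiki API.
            url = api_url + "?" + urllib.parse.urlencode(params)
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)

    params = {
        "action": "query",
        "format": "json",
        "list": "usercontribs",
        "ucuser": user,
        "uclimit": "500",   # maximum batch size for normal users
        "ucstart": start,   # newer timestamp (listing runs backwards)
        "ucend": end,       # older timestamp
    }
    contribs = []
    while True:
        data = get(params)
        contribs.extend(data["query"]["usercontribs"])
        if "continue" not in data:
            break
        # Merge the continuation tokens (uccontinue etc.) into the next request.
        params.update(data["continue"])
    return contribs
```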
I published the tool here:
https://github.com/tshrinivasan/wiki_user_contributions_report
How to run:
python3 get_user_contributions.py <language> <wikisite> <username> <start_date> <end_date>
This gives the data as a CSV file. I used a csv-to-html converter
utility to turn it into a web page with all the data in a sortable
table.
To my delight, my friend Dinesh Karthik converted this into a nice web
application with Flask and Dash, hosted on Heroku:
https://wiki-user-contributions.herokuapp.com/
Source: https://github.com/Dineshkarthik/wiki-user-contributions
Thanks to Info-farmer for providing the idea, to Bartosz Dziewoński on
the wikitech-l mailing list for answering all my questions, and to
Dinesh for making a web application so quickly.
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
is a good place to ask any tech questions regarding Wikipedia.
Thanks to all wikisource contributors for the event and in general.
--
Regards,
T.Shrinivasan
My Life with GNU/Linux : http://goinggnu.wordpress.com
Free E-Magazine on Free Open Source Software in Tamil : http://kaniyam.com
Get Free Tamil Ebooks for Android, iOS, Kindle, Computer :
http://FreeTamilEbooks.com
Apologies for cross-posting
====
SEMANTiCS - 17th International Conference on Semantic Systems, September
6 - 9, 2021
Amsterdam, The Netherlands
https://2021-eu.semantics.cc/
====
= Important Dates (specific track dates are given below)
* Abstract Submission Deadline: March 22, 2021 (11:59 pm, Hawaii time)
* Paper Submission Deadline: March 29, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: May 17, 2021 (11:59 pm, Hawaii time)
* Camera-Ready Paper: June 06, 2021 (11:59 pm, Hawaii time)
= Read a detailed description of all available calls online:
https://2021-eu.semantics.cc/calls
= Submission via Easychair on
https://easychair.org/my/conference?conf=sem21eu#
Proceedings of SEMANTiCS 2021 EU are planned to be published by Springer
LNCS & CEUR. All proceedings will be made available open access.
SEMANTiCS 2021 EU particularly welcomes submissions on the following key
topics (including, but not limited to):
* Web Semantics & Linked (Open) Data
* Enterprise Knowledge Graphs, Graph Data Management, and Deep Semantics
* Machine Learning & Deep Learning Techniques
* Semantic Information Management & Knowledge Integration
* Terminology, Thesaurus & Ontology Management
* Data Mining and Knowledge Discovery
* Reasoning, Rules, and Policies
* Natural Language Processing
* Data Quality Management and Assurance
* Explainable Artificial Intelligence
* Semantics in Data Science
* Semantics in Blockchain environments
* Trust, Data Privacy, and Security with Semantic Technologies
* Economics of Data, Data Services, and Data Ecosystems
-------
* Special Sub-Topic: Digital Humanities and Cultural Heritage
* Special Sub-Topic: LegalTech
* Special Sub-Topic: Distributed and Decentralized Knowledge Graphs
We especially encourage contributions that illustrate the applicability
of the topics mentioned above for industrial purposes and/or demonstrate
their business relevance for specific industries.
We invite contributions to the following tracks:
= Read a detailed description of all available calls online:
https://2021-eu.semantics.cc/calls
== Research and Innovation Track ==
The Research and Innovation track at SEMANTiCS welcomes papers on novel
scientific research and/or innovations relevant to the topics of the
conference. Submissions must be original and must not have been
submitted for publication elsewhere. Papers must follow the guidelines
given in the author instructions, including references and optional
appendices. Each submission will be reviewed by several PC members who
will judge it based on its innovativeness, appropriateness, and impact
of results in terms of effectiveness at solving real problems.
= Important Dates:
* Abstract Submission Deadline: March 22, 2021 (11:59 pm, Hawaii time)
* Paper Submission Deadline: March 29, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: May 17, 2021 (11:59 pm, Hawaii time)
* Camera-Ready Paper: June 06, 2021 (11:59 pm, Hawaii time)
Author instructions: Reviews will be carried out in a single-blind mode.
Long papers should have a maximum length of 15 pages and short papers of
6 pages. Submissions should follow the guidelines of the Springer LNCS
format. The detailed Call for Research and Innovation papers is
available here: https://2021-eu.semantics.cc/calls
== Posters and Demos Track ==
The Posters and Demonstrations Track invites innovative work in
progress, late-breaking research and innovation results, and smaller
contributions in all fields related to the Semantic Web and Linked Data
in a broader sense. These include submissions on innovative applications
with impact on end users, such as demos of solutions that users may test
or that are still in the conceptual phase but are worth discussing, and
also applications or pieces of code that may attract developers and
potential research or business partners.
= Important Dates:
* Paper Submission Deadline: May 24, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: June 21, 2021 (11:59 pm, Hawaii time)
* Camera-Ready Paper: July 05, 2021 (11:59 pm, Hawaii time)
Author instructions: Proceedings are planned to be published via CEUR
Workshop Proceedings and should follow the guidelines of the Springer
LNCS format. The detailed Call for Poster and Demos papers is available
online.
== Industry and Use Case Track ==
Focusing strongly on industry needs and groundbreaking technology
trends, SEMANTiCS invites presentations on enterprise solutions that deal
with semantic processing of data and/or information. A special focus of
SEMANTiCS 2021 will be on the convergence of machine learning techniques
and knowledge graphs. Additional topics of interest are Enterprise
Knowledge Graphs, Semantic AI & Machine Learning, Enterprise Data
Integration, Linked Data & Data Publishing, Semantic Search,
Recommendation Services, Thesaurus and/or Ontology Management, Text
Mining, Data Mining, and related fields. All submissions should have
a strong focus on real-world applications beyond the prototypical stage
and demonstrate the power of semantic systems!
= Important Dates:
* Presentation Submission Deadline: April 26, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: May 17, 2021 (11:59 pm, Hawaii time)
* Camera-Ready Presentation: July 26, 2021 (11:59 pm, Hawaii time)
Submit your presentations here:
http://2021-eu.semantics.cc/submission-industry-presentations
== Workshops and Tutorials ==
Workshops and Tutorials at SEMANTiCS 2021 allow your organisation or
project to advance and promote your topics and gain increased
visibility. The workshops and tutorials will provide a forum for
presenting widely recognised contributions and findings to a diverse and
knowledgeable community. Furthermore, the event can be used as a
dissemination activity in the scope of large research projects or as a
closed format for research and commercial project consortia meetings.
= Important Dates for Workshops:
* Proposals WS Deadline: March 01, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: March 15, 2021 (11:59 pm, Hawaii time)
= Important Dates for Tutorials (and other meetings, e.g. seminars,
show-cases, etc., without call for papers):
* Proposals Tutorial Deadline: June 07, 2021 (11:59 pm, Hawaii time)
* Notification of Acceptance: June 21, 2021 (11:59 pm, Hawaii time)
== Special Calls ==
Special calls or sub-topics are dedicated to specific topics that
are of special interest to the SEMANTiCS community. In case we receive a
sufficient number of high-quality submissions, these topics will become
special tracks within the conference program. For 2021, SEMANTiCS
Amsterdam encourages submissions to the following sub-topics:
* Special Sub-Topic: Digital Humanities and Cultural Heritage
* Special Sub-Topic: LegalTech
* Special Sub-Topic: Distributed and Decentralized Knowledge Graphs
Each sub-topic is managed by a distinct committee and encourages
submissions from the scientific or industrial domain. Scientific
submissions will undergo a thorough review process and will be published
in the conference proceedings in case of acceptance. Industrial
submissions will be evaluated and selected according to the quality
criteria of the industry track. We are looking forward to your submissions!
== SEMANTiCS 2021 EU Organizing Committee ==
The program committee is announced on the conference website
https://2021-eu.semantics.cc/committee
= Read a detailed description of all available calls online:
https://2021-eu.semantics.cc/calls
== About SEMANTiCS ==
The annual SEMANTiCS EU conference is the meeting place for
professionals who make semantic computing work, understand its benefits,
and encounter its limitations. Every year, SEMANTiCS attracts
researchers and practitioners alike from a wide spectrum of
organisations ranging from universities, non-profit organisations,
public administration bodies, to SMEs and the largest companies in the
world.
SEMANTiCS is bound to continue a long tradition of building bridges
between like-minded but often separated communities of interest. To do
so, the conference aims to explore the intersections, benefits and
hurdles of various traditions in artificial intelligence, machine
learning and semantic processing of graph data and information.
SEMANTiCS invites the latest scientific research as well as presentations on
industry implementations, use case prototypes and best practices.
The SEMANTiCS program will provide a rich mix of technical talks, panel
discussions on emerging topics and presentations of practical systems by
people who make things work - just like you. In addition, attendees will
have a unique opportunity to network with experts in a variety of
fields. These relationships provide great value to organisations as they
encounter technical challenges in any stage of implementation. The
expertise gained by SEMANTiCS attendees has a long-term impact on their
careers and organisations. These factors make SEMANTiCS the key event
across Europe for a diverse community of industry leaders and academic
experts alike.
Hi. There has been a serious problem with the watchlist queue over the
last few hours. Edits that one has already read do not disappear from the
Watchlist. If I use the rcfilter "unseen changes" I can also see all the
edits I have already read, but not in bold, which should be impossible
with this filter (see the safe-mode screenshot below). When I use the
Global Watchlist, the edits I have read do not disappear from the list
for hours. This also means the problem exists on multiple wikis. I
believe an "Unbreak Now!" Phabricator ticket is needed, but I am not
filing it myself because I have no way to establish reproduction steps.
Hopefully the developers can check the queue and fix the problem. Thank you,
Igal (User:IKhitron)
https://imgur.com/a/zoEraFl
Hi everyone,
On the French Wikipedia we're currently reworking our help page about
redirects and some of us would like to include a section about common
misconceptions, especially those described at
<https://en.wikipedia.org/w/index.php?title=Wikipedia:Tools/Navigation_popup…>.
However, one user who is in the habit of "fixing" redirects is strongly
opposed, because "this page is from 2006" and it's "unsourced".[1]
So I would like to ask the Wikimedia sysadmins and the MediaWiki
developers who follow this mailing list: is this page, created in 2006,
still accurate and relevant in 2020?
Thank you.
--
Kind regards,
Thibaut Payet
[[User:Thibaut120094]]
[1]
<https://sigma.toolforge.org/usersearch.py?name=Alaspada&page=Discussion_aid…>
In 2005, at the first Wikimania in Frankfurt, Germany,
Magnus Manske asked me if I could open up my Scandinavian
book scanning website Project Runeberg to German and
other languages, or release the software as open source.
I refused, as my software is just a rapid prototype that
would need to be rewritten from scratch anyway. But I
said that Wikisource could be used for this purpose. At
the time, Wikisource was only a wiki for e-text. As a
proof of concept, I put up "Meyers Blitz-Lexikon" as
the first book with scanned page images in Wikisource,
https://de.wikisource.org/wiki/Seite:LA2-Blitz-0005.jpg
and soon after the "New Student's Reference Work",
https://en.wikisource.org/wiki/Page:LA2-NSRW-1-0013.jpg
This was the basic inspiration for the "Proofread Page"
extension, now used in Wikisource.
In 2010-2011 I tried to use Wikisource, but I thought
this extension was too hard to work with. From scanner
to finished presentation, Wikisource was much slower
to work with than my own system. My primary gripes are:
it is too hard to upload PDF files to Commons, it's too
hard to create the Index page, each page is not created
immediately (which would make the raw OCR text searchable),
and pages hidden in the Page: namespace are not always
indexed by search engines. Unfortunately, the system
hasn't improved much in the last decade.
(My criticism of my own website's system is a lot
harsher, but hits different targets.)
There is also a difference in how we view copyright,
as my own website can cut corners and scan some books
that are "most likely" out of copyright, which is
something Wikimedia's user communities never accept.
In 2012, I thought the time had finally come to rewrite
my software, but I failed to organize a project around
this, and instead I continued to use the existing system,
just adding volume. Indeed, Project Runeberg has grown
from 0.75 million book pages in 2012 to 3.1 million
pages today.
Now in 2020, I'm finally tired of my existing system's
limitations. What should I do? It's not 2005 or 2012
anymore. What has changed in that time?
I can't move everything over to Wikisource, because of
the copyright differences.
Should I start to use MediaWiki + ProofreadPage and
convert my collection to that format?
Should I develop my own modification of MediaWiki?
Is that stable ground to build on?
It seems to me that PHP, MariaDB and the architecture
of MediaWiki with extensions have now been the same for
a long time. Will this last for the next 20 years?
Or are there today other existing systems that
solve the same problem, which weren't available in 2005?
(And that Wikisource would have picked up, if it were
started today, instead of developing its own extension.)
--
Lars Aronsson (lars(a)aronsson.se)
Project Runeberg - free Nordic literature - http://runeberg.org/
tl;dr: If you have built skins, or use skins that are not listed on
MediaWiki.org, please list them [4] and check that they work with current
MediaWiki. If you have always wanted to build a skin, try the new tool
and give me feedback on how you get on! [3]
Longer version:
As part of my involvement in the desktop improvements project [1], I
have, with the help of many others, been trying to simplify the
development of skins.
As part of this, much-needed maintenance has occurred in MediaWiki core
with the intention of making skin development easier.
As a personal goal, I wanted to prototype a tool to showcase the skins
available in the ecosystem, and finally with the downtime of the holiday
period (no deploys!) I've finally done that. It allows showcasing [2] and
building skins [3].
While building this tool I was surprised to find that, excluding forks,
there are only __55 skins__ listed on MediaWiki.org. Of those, only 38
have been kept up to date.
I can't believe that, given the age of this project, there are only 38
usable skins, and I am writing to you in the hope that:
1) You know of other skins that can be added to MediaWiki.org in the
"Skin" namespace [4]. Note that any edits to MediaWiki.org will
automatically get picked up by the tool and listed.
2) If you build skins for closed-source wikis, you will consider
publishing them over the holiday period if you can!
3) You will fix skins that do not work with the latest MediaWiki, so I
can showcase them on the new tool.
4) If you are inspired to make a new skin, you will try out the starter
kit tool I have created [3], which will construct a working zip file that
can be added to your local MediaWiki (and eventually GitHub/Gerrit), and
give me feedback via Phabricator/email/GitHub on what can be improved.
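For anyone curious what a generated skin's registration looks like, here is a minimal skin.json of roughly the shape such a skeleton uses (the skin name and paths are illustrative, not the tool's exact output):

```json
{
	"name": "ExampleSkin",
	"ValidSkinNames": {
		"exampleskin": {
			"class": "SkinMustache",
			"args": [ {
				"name": "exampleskin",
				"templateDirectory": "templates"
			} ]
		}
	},
	"manifest_version": 2
}
```

Dropping a file like this (plus its templates and resources) into the skins/ directory and adding `wfLoadSkin( 'ExampleSkin' );` to LocalSettings.php is the usual registration path for modern skins.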
******
[1] https://www.mediawiki.org/wiki/Desktop_improvements
[2] https://skins.wmflabs.org/?
[3] https://skins.wmflabs.org/?#/add
[4]
https://www.mediawiki.org/w/index.php?action=edit&preload=Template%3ASkin%2…