I apologize for my continued confusion about what was written on which
mailing list. I can mostly blame Gmail for not handling Bccs the way I
think it should. I referred to an email from 80hnhtv4agou in my
previous post to Research-l, but that person sent their email to
Wikitech-l. I am forwarding 80hnhtv4agou's email (below) to Research-l, and
I invited 80hnhtv4agou to participate here.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: 80hnhtv4agou--- via Wikitech-l <wikitech-l(a)lists.wikimedia.org>
Date: Sat, Sep 15, 2018 at 1:34 PM
Subject: Re: [Wikitech-l] Results from 2018 global Wikimedia survey
are published!
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, <
egalvez(a)wikimedia.org>
Experience of harassment has not declined since 2017 and appears to
remain steady.
At what point does the Foundation step in, as some language editions
do, and stop the warring and abuse by users and administrators? Does
the board know what is going on here?
And it is not just EN Wikipedia, but all language editions as well.
Hi everybody,
On Thursday, Sept 13th (EU morning), I am planning to reboot the stat hosts
(stat1004, stat1005 and stat1006) and the notebook hosts (notebook1003,
notebook1004) for Linux kernel upgrades. Please let me know if this impacts
your work in https://phabricator.wikimedia.org/T203165 or on IRC (elukey -
#wikimedia-analytics).
Thanks!
Luca
Cross-posting to Research-l. Thank you Leila.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Leila Zia <leila(a)wikimedia.org>
Date: Tue, Sep 11, 2018 at 12:43 AM
Subject: Re: [Wikimedia-l] Why We Read Wikipedia in your language
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Hi all,
Update time.
Thank you all for your patience and support as we went through the
different stages of the analysis for this study. We have now concluded
the study based on the survey of the 14 Wikipedia languages [1]. Here
is what will happen next:
* We are doing some relatively major documentation at
https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Be…
. The goal is to have that page and the sub-pages in a way that can be
consumed more easily by audiences beyond researchers. I expect the
pages to come to life almost completely on or before 2018-09-14. We
will need the first couple of weeks of October for data and code
documentation to make sure you have all the data you need for your
languages to dig deeper if you choose to. By the end of October,
please expect all documentation to be completed.
* We are happy to try to give presentations about this research to
your language community if there is interest on your end and we can
make it work on our end. The priority will be given to languages that
already participated in the study. If you want to sign up for one,
please go to
https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Be…
* Our November Research Showcase [2] will most likely be on this
topic, so if you want to have a general overview of the results, keep
an eye on that.
* We have submitted a research paper to a peer-reviewed conference
based on this work. There is an anonymization process for the reviews
and in order to not break that we will wait until the results are out
(towards the end of October) and only then put the full paper on
arxiv, under CC BY-SA 4.0 or a more permissive license.
* We are discussing with our collaborators to potentially set up a
challenge for researchers to work with a subset of the data
(anonymized/aggregated/...) to answer interesting research
questions. If you want to brainstorm with us about this, please drop a
line at
https://meta.wikimedia.org/wiki/Research_talk:Characterizing_Wikipedia_Read…
* Do you have an idea about how to more effectively disseminate this
knowledge? Please call it out. There is quite a bit of knowledge to
share, and we're honestly not 100% sure what the best way to do it is
across a global movement. As a result, we're offering a mix of
documentation, pinging points of contact in each language so they're
aware of the results, general presentations, language-specific
presentations, and data documentation so you can dig deeper on your
own.
Best,
Leila, on behalf of the researchers (Florian Lemmerich, Diego Saez,
Bob West, and myself)
[1] ar, bn, de, en, es, he, hi, hu, ja, nl, ro, ru, uk, zh
[2] https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase
_______________________________________________
Wikimedia-l mailing list, guidelines at:
https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
https://meta.wikimedia.org/wiki/Wikimedia-l
New messages to: Wikimedia-l(a)lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
<mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
Hi. I'm writing to this mailing list because it is listed in the sidebar
of the wikipapers site.
Yesterday I created an account on wikipapers
<http://wikipapers.referata.com/> and I am trying to upload an image,
but when I try to upload a 10 KB file I get this error:
<You cannot upload this file, because you are currently using 110.26953125
MB of file space, and uploading it would exceed the allowed quota for your
site (100 MB).>
In fact, it happens with any file; even a 152-byte file gives me
this error.
I haven't uploaded any file yet, so I don't understand that error :S
--
Saludos,
Abel.
Forwarding to Analytics and Research in case this is of interest.
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Amir Sarabadani <amir.sarabadani(a)wikimedia.de>
Date: Thu, Sep 6, 2018 at 9:19 PM
Subject: [Wikitech-l] Change tag reading from the new column in the beta
cluster
To: Wikimedia developers <Wikitech-l(a)lists.wikimedia.org>
Hello,
As part of normalizing the change tag schema [1], I just switched on reading
from the new column (ct_tag_id in the change_tag table, a foreign key to
ctd_id in the change_tag_def table) in the beta cluster [2], which means new
rows will have an empty string as their value of ct_tag. [3]
We are not rushing to flip the switch in production, but I wanted to
send this email asking people who test in the beta cluster to file a
Phabricator ticket if they see anything unexpected that might be
related to change tags. This table is read whenever someone checks history,
recent changes, the watchlist, user contributions, or a whole lot of other
special pages, plus lots of API queries. I checked everything I could think
of, but I might have missed something. Any extra pairs of eyes would be
greatly appreciated.
[1]: https://phabricator.wikimedia.org/T185355
[2]: https://phabricator.wikimedia.org/T196671
[3]: For example:
MariaDB [enwiki]> select ct_id, ct_rc_id, ct_rev_id, ct_tag, ct_tag_id from
change_tag order by ct_id desc limit 10;
+--------+----------+-----------+-----------------------+-----------+
| ct_id | ct_rc_id | ct_rev_id | ct_tag | ct_tag_id |
+--------+----------+-----------+-----------------------+-----------+
| 217824 | 633991 | 384018 | | 3 |
| 217823 | 633990 | 384017 | | 3 |
| 217822 | 633989 | 384016 | | 3 |
| 217821 | 633988 | 384015 | | 3 |
| 217820 | 633987 | 384014 | | 3 |
| 217819 | 633986 | 384013 | mw-undo | 2 |
| 217818 | 633985 | 384012 | mw-undo | 2 |
| 217817 | 633984 | 384011 | visualeditor-wikitext | 29 |
| 217816 | 633983 | 384010 | mobile web edit | 16 |
| 217815 | 633983 | 384010 | mobile edit | 15 |
+--------+----------+-----------+-----------------------+-----------+
10 rows in set (0.00 sec)
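Since ct_tag is empty for new rows, scripts that read tag names directly from that column will need to resolve names through the new foreign key instead. A minimal sketch of such a query, assuming the standard change_tag_def columns (ctd_id, ctd_name):

```sql
-- Recover tag names for the most recent rows via the new ct_tag_id
-- foreign key, instead of reading the now-empty ct_tag field.
SELECT ct.ct_id, ct.ct_rc_id, ct.ct_rev_id, ctd.ctd_name
FROM change_tag ct
JOIN change_tag_def ctd ON ctd.ctd_id = ct.ct_tag_id
ORDER BY ct.ct_id DESC
LIMIT 10;
```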
Thank you!
Best
--
Amir Sarabadani
Software Engineer
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Tel. (030) 219 158 26-0
http://wikimedia.de
Imagine a world in which every human being can freely share in the sum
of all knowledge. Help us make it happen!
http://spenden.wikimedia.de/
Wikimedia Deutschland – Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/029/42207.
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Dear Sir or Madam,
Thank you for your efforts. We are planning to host, next November, an advanced Wikidata Query Service training for computer scientists from the University of Sfax, Tunisia. The purpose of this training is to give these scientists the advanced skills to develop SPARQL queries that can later be used in Wikidata-based web applications for various purposes, particularly medical ones. To run this training we need funding, which is why we have submitted a Rapid Grant application: https://meta.wikimedia.org/wiki/Grants:Project/Rapid/Csisc/SPARQL:_Be_conne…. We invite you to endorse it so that the training can take place.
Yours sincerely,
Houcemeddine Turki
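As an illustration of the kind of medical query such a training might cover (a hypothetical sketch, not taken from the grant proposal), a SPARQL query against the Wikidata Query Service could list diseases together with their recorded symptoms:

```sparql
# Hypothetical example: diseases and their symptoms on Wikidata.
# Q12136 is the item "disease"; P31 is "instance of";
# P780 is the "symptoms and signs" property.
SELECT ?disease ?diseaseLabel ?symptom ?symptomLabel WHERE {
  ?disease wdt:P31 wd:Q12136 .
  ?disease wdt:P780 ?symptom .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 10
```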
Hello Research, Mobile, and Design colleagues,
In case other people are interested who didn't attend the August Wikimedia
Activities Meeting, there was a design research presentation in the meeting
regarding personas of mobile Wikimedia users: https://youtu.be/yZPZmRQnkXU
On a related note, I would like to learn more about design research,
including about how design research interfaces with analytics and UX
design, and I would like to request that WMF have an office hour on this
topic.
Regards,
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
More changes are coming for dumps, this time for Hungarian Wikipedia
(approximately 436,000 articles) and Arabic Wikipedia (approximately
595,000 articles).
Pine
( https://meta.wikimedia.org/wiki/User:Pine )
---------- Forwarded message ---------
From: Ariel Glenn WMF <ariel(a)wikimedia.org>
Date: Mon, Aug 20, 2018 at 10:27 AM
Subject: [Wikitech-l] huwiki, arwiki to be treated as 'big wikis' and run
parallel jobs
To: Wikipedia Xmldatadumps-l <Xmldatadumps-l(a)lists.wikimedia.org>,
Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
Starting September 1, huwiki and arwiki, which both take several days to
complete the revision history content dumps, will be moved to the 'big
wikis' list, meaning that they will run jobs in parallel, as frwiki,
ptwiki, and others do now, for a speedup.
Please update your scripts accordingly. Thanks!
Task for this: https://phabricator.wikimedia.org/T202268
Ariel