Sorry for cross-posting.
The Technical Decision-Making Forum Retrospective
<https://www.mediawiki.org/wiki/Technical_decision_making> team invites you
to join one of our “listening sessions” about Wikimedia's technical
decision-making process.
We are running the listening sessions to provide a venue for people to tell
us about their experience, thoughts, and needs regarding the process of
making technical decisions across the Wikimedia technical spaces. This
complements the survey
<https://wikimediafoundation.limesurvey.net/885471?lang=en>, which closed
on August 7.
Who should participate in the listening sessions?
People who do technical work that relies on software maintained by the
Wikimedia Foundation (WMF) or affiliates. If you contribute code to
MediaWiki or extensions used by Wikimedia, or you maintain gadgets or tools
that rely on WMF infrastructure, and you want to tell us more than could be
expressed through the survey, the listening sessions are for you.
How can I take part in a listening session?
There will be four sessions on two days, to accommodate all time zones. The
first two sessions are scheduled:
- Wednesday, September 13, 14:00 – 14:50 UTC
- Wednesday, September 13, 20:00 – 20:50 UTC
The sessions will be held on the Zoom platform.
If you want to participate, please sign up for the one you want to attend: <
If none of the times work for you, please leave a message on the talk page.
It will help us schedule the last two sessions.
The sessions will be held in English. If you want to participate but you
are not comfortable speaking English, please say so when signing up so that
we can provide interpretation services.
The sessions will be recorded and transcribed so we can later go back and
extract all relevant information. The recordings and transcripts will not
be made public, except for anonymized summaries of the outcomes.
What will the Retrospective Team do with the information?
The retrospective team will collect the input provided through the survey,
the listening sessions, and other means, and will publish an anonymized
summary that will help leadership make decisions about the future of the
technical decision-making process.
In the listening sessions, we particularly hope to gather information on
the general needs and perceptions about decision-making in our technical
spaces. This will help us understand what kind of decisions happen in these
spaces, who is involved, who is impacted, and how to adjust our processes
accordingly.
Are the listening sessions the best way to participate?
The primary way for us to gather information about people’s needs and wants
with respect to technical decision making was the survey
<https://wikimediafoundation.limesurvey.net/885471?lang=en>. The listening
sessions are an important addition: they provide a venue for free-form
conversations, so we can learn about aspects that do not fit the
structure of the survey.
In addition to the listening sessions and the survey, there are two more
ways to share your thoughts about technical decision making: You can post
on the talk page
or you can send an email to <tdf-retro-2023(a)lists.wikimedia.org>.
Where can I find more information?
There are several places where you can find more information about the
Technical Decision-Making Process Retrospective:
The original announcement about the retrospective from Tajh Taylor:
The Technical Decision-Making Process general information page:
The Technical Decision-Making Process Retrospective page:
The Phabricator ticket: https://phabricator.wikimedia.org/T333235
Who is running the technical decision making retrospective?
The retrospective was initiated by Tajh Taylor. The core group running the
process consists of Moriel Schottlender (chair), Daniel Kinzler, Chris
Danis, Kosta Harlan, and Temilola Adeleye. You can contact us at <
Thank you for participating!
Benoît Evellin - Trizek (he/him)
Community Relations Specialist
Wikimedia Foundation <https://wikimediafoundation.org/>
My name is Anna Yuan and I am an undergraduate student working under the
supervision of Dr. Haiyi Zhu <https://haiyizhu.com/> in the HCI Department
at Carnegie Mellon University. Our team is currently conducting a research
study on Wikipedia's ORES system. Our research focuses on exploring
opportunities to better communicate the affordances of the ORES system and
thus help people effectively design and use ORES-based applications.
If you have developed or used any ORES-based application, we would love to
invite you to participate in this research. The research will be an
interview that takes approximately 45 minutes. During the research, I will
ask you about your background, your current experience of using ORES and
ORES-based applications, and your suggestions on how to improve the system.
All participants will be offered a $20 Amazon gift card. If you are
interested in taking part in this research or would like more information,
please reply to this email and let me know.
I am looking forward to your response.
Wiki Account: https://en.wikipedia.org/wiki/User:Bobo.03
-------- Forwarded message --------
Subject: [Xmldatadumps-l] Your comments needed (long term dumps rewrite?)
Date: Thu, 19 Feb 2015 12:30:01 +0200
From: Ariel Glenn WMF <ariel(a)wikimedia.org>
The MediaWiki Core team has opened a discussion about getting more
involved in and maybe redoing the dumps infrastructure. A good starting
point is to understand how folks use the dumps already or want to use
them but can't, and some questions about that are listed here:
I've added some notes but please go weigh in. Don't be shy about what
you do/what you need, this is the time to get it all on the table.
As brought to my attention by Max, through a post on his talk page:
== The problem ==
Basically, some user-defined JS out there has "http://" hard-coded, and
when you request insecure (non-HTTPS) resources during a secure session,
you get errors in many modern browsers.
This is a heads-up that this user-created code might need some special
attention before and after the switchover.
== How to fix ==
The best option is to use "protocol-relative" URLs: these are URLs that
start with "//someurl" instead of "http://", "https://", or "gopher://".
Here's a blog post about when we started using protocol-relative URLs in
MediaWiki and at WMF:
I've put up basic instructions on the HTTPS page on metawiki:
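For bot or tool maintainers fixing this in bulk, the substitution is mechanical. As a rough sketch (the host list and function name below are illustrative, not taken from the instructions on the wiki page; actual gadget code would of course be JavaScript rather than Python), a script could rewrite hard-coded scheme prefixes to protocol-relative form:

```python
import re

# Rewrite hard-coded "http://" or "https://" prefixes on Wikimedia hosts
# to protocol-relative "//", so the resource loads over whichever scheme
# the page itself uses. The host alternatives here are a small
# illustrative subset, not a complete list.
SCHEME_RE = re.compile(
    r'https?://(?=(?:upload|commons|en)\.wikimedia\.org'
    r'|[a-z-]+\.wikipedia\.org)'
)

def make_protocol_relative(source: str) -> str:
    """Replace scheme prefixes with '//' for known Wikimedia hosts only."""
    return SCHEME_RE.sub('//', source)

print(make_protocol_relative(
    'importScriptURI("http://en.wikipedia.org/w/index.php?title=X&action=raw");'
))
```

Non-Wikimedia URLs are deliberately left alone, since third-party hosts may not serve HTTPS at all.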
Thanks for getting this out to the appropriate channels!
cc'ing Wikibots-l(a)lists.wikimedia.org because there might be some
overlap of affected people on that list.
| Greg Grossmeier GPG: B2FA 27B1 F7EB D327 6B8E |
| identi.ca: @greg A18D 1138 8E47 FAC8 1C7D |
This has implications for bot owners (all users, including bot users,
will be forced to HTTPS instead of HTTP when logged in).
Please review your code to make sure it won't break on Wednesday :)
----- Forwarded message from Greg Grossmeier <greg(a)wikimedia.org> -----
> Date: Mon, 19 Aug 2013 17:00:09 -0700
> From: Greg Grossmeier <greg(a)wikimedia.org>
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, Wikitech Ambassadors
> Subject: HTTPS for logged in users on Wednesday August 21st
> Hello all,
> As we outlined in our blog post on the future of HTTPS at the Wikimedia
> Foundation, the plan is to enable HTTPS by default for logged in
> users on August 21st, this Wednesday.
> We are still on target for that rollout date.
> As this can have severe consequences for users where HTTPS is blocked by
> governments/network operators *in addition to* users who connect to
> Wikimedia sites via high latency connections, we've set up a page on
> MetaWiki describing what is going on and what it means for users and
> what they can do to report problems.
> Please help watch out for any unintended consequences on August 21st and
> report any negative issues to us as soon as you can. Bugzilla, IRC
> (#wikimedia-operations), or the (forthcoming) OTRS email are all fine.
> Also, feel free to email myself or ping me directly on IRC.
>  https://blog.wikimedia.org/2013/08/01/future-https-wikimedia-projects/
>  https://meta.wikimedia.org/wiki/HTTPS
>  https://bugzilla.wikimedia.org
----- End forwarded message -----
User:Sadads and I are currently working on an academic article related
to the coverage of historical information on Wikipedia (see the
outline of our methods below). We were thinking that some of the
existing bots or perhaps people in the bot-writing community might be
able to help us write some scripts that will pull the information we
need off wiki. We are not script writers, but we think what we want to
do is pretty easy. Please let us know if you can help us out! Thanks!
Adrianne (User:Wadewitz) and Alex (User:Sadads)
In this study, we analyze the ways in which Wikipedia articles approach
historiography. We approached our analysis in two ways: quantitatively
and qualitatively. In our quantitative approach, we followed these
procedures:
First, we looked at the number of different sources the article cites.
We determined this by running a script over the article that counted
the number of discrete citations in the footnotes and works cited.
Because many articles have a large number of sources but rely on a
small number of them for much of their information, we also looked at
how often each source is used and whether any one source is used
disproportionately. While there are reliable sources that could be
used in this way, we have found that this is a marker of an article
that presents only one historiographical viewpoint.
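The authors' actual script isn't reproduced here, but the counting step can be sketched roughly as follows, assuming citations appear as standard <ref> tags in the raw wikitext. Named refs are tallied once per use, so disproportionate reliance on a single source shows up directly in the counts:

```python
import re
from collections import Counter

def citation_stats(wikitext: str) -> Counter:
    """Tally citations: named refs count once per use; unnamed refs once each."""
    counts = Counter()
    # Full refs, optionally named: <ref name="Smith">...</ref> or <ref>...</ref>
    full_ref = r'<ref(?:\s+name\s*=\s*"?([^">/]+)"?)?\s*>.*?</ref>'
    for i, m in enumerate(re.finditer(full_ref, wikitext, re.DOTALL)):
        counts[m.group(1) or f'unnamed-{i}'] += 1
    # Reuses of a named ref: <ref name="Smith" />
    for m in re.finditer(r'<ref\s+name\s*=\s*"?([^">/]+?)"?\s*/>', wikitext):
        counts[m.group(1)] += 1
    return counts

stats = citation_stats(
    'a<ref name="Smith">Smith 1999</ref> b<ref>Jones 2001</ref> '
    'c<ref name="Smith"/>'
)
print(stats)  # 'Smith' counted twice, the unnamed Jones ref once
```

A count like this is only a proxy; bundled citations and citation templates outside <ref> tags would need extra handling.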
Second, we were also interested in the types of sources used. Using a
script to check the publication and template information of each
source, we analyzed the ratio of journal to book to newspaper
to web sources. Moreover, because articles that have a wide span of
publication dates tend to have a good representation of
historiography, we analyzed the publication dates of the sources.
Third, we searched the articles for the following words, based on a
preliminary survey of 25 articles we used as an initial sample. These words
indicated that the articles approached history and historiography from
an ambiguous or debatable position: “probably”, “possibly”, “on the
other hand”, “one view”, “bias”, “perspectives”. We also searched for
sections such as “Historiography”, “Modern view”, “Legacy”, and
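The word-search step above can be sketched like this; the matching rules (case-insensitive, whole-phrase matches) are our assumption rather than the authors' documented procedure:

```python
import re
from collections import Counter

# The hedge terms listed in the methods description above.
HEDGE_TERMS = ["probably", "possibly", "on the other hand",
               "one view", "bias", "perspectives"]

def hedge_term_counts(article_text: str) -> Counter:
    """Count case-insensitive, whole-phrase occurrences of each hedge term."""
    lowered = article_text.lower()
    return Counter({
        term: len(re.findall(r'\b' + re.escape(term) + r'\b', lowered))
        for term in HEDGE_TERMS
    })

sample = ("Historians disagree: one view holds X was decisive; "
          "on the other hand, Y probably mattered more.")
counts = hedge_term_counts(sample)
print(counts)
```

Whole-word matching matters here: without the `\b` anchors, "bias" would also match inside words like "biased", inflating the counts.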
We chose to analyze 19th-century FA, GA, and B articles. The GA and FA
articles have undergone a review process on Wikipedia and thus should
be better. We excluded any B article that had been through a peer
review on the site, as we wanted to contrast articles that had been
through Wikipedia's content review process with those that had not. We
wanted to know what the “best” articles Wikipedia had to offer looked
like before and after comment by the community. We also chose this
field as both of us have some
familiarity with the time period but neither of us had worked
extensively on the articles, so there was no conflict of interest. We
also excluded any military history articles because of the significant
difference in historiographic focus of the military history community.
Additionally, the Wikipedia community has significantly more coverage
of the topic of military history, both in the number of articles and
the level of commitment to that subtopic within the community, with
WikiProject Military History being one of the most active projects and
having a different standard of topic coverage.
Dr. Adrianne Wadewitz
Mellon Digital Scholarship Fellow
Center for Digital Learning + Research
Congratulations to everyone involved. I'd love to have a look at it and propose it for Tamil Wikipedia. Were the Swedish common names obtained by manual translation?
"That language is an instrument of human reason, and not merely a medium for the expression of thought, is a truth generally admitted."
- George Boole, quoted in Iverson's Turing Award Lecture
> From: "wikibots-l-request(a)lists.wikimedia.org" <wikibots-l-request(a)lists.wikimedia.org>
>Sent: Thursday, October 18, 2012 5:30 PM
>Subject: Wikibots-l Digest, Vol 40, Issue 2
> 1. A bot to create articles about species (Lars Aronsson)
> 2. Re: [Wikitech-l] A bot to create articles about species
> (Nikola Smolenski)
>Date: Thu, 18 Oct 2012 03:26:20 +0200
>From: Lars Aronsson <lars(a)aronsson.se>
>To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>, Wikimedia
> bot editors discussion <wikibots-l(a)lists.wikimedia.org>
>Subject: [Wikibots-l] A bot to create articles about species
>User:Lsj has written 4000 lines of C# source code on top
>of the DotNetWikiBot framework, to create 10,000 articles
>in Swedish about bird species in the spring of 2012 and
>recently even more articles in Swedish about fungi species.
>Some information about his Lsjbot is found here,
>The otherwise very reluctant/skeptical/picky Swedish Wikipedia
>community has gladly accepted these well-written articles.
>I think it would be interesting if a community of wikipedians
>in some other language would try to translate this bot.
>Some languages might have notability or relevance requirements
>that these species don't fulfill, others might think 1700
>bytes is too short for an article. But I think the citation of
>sources and correctness of fact would be generally accepted.
>Here is a blog post in Swedish about the bird articles,
>Some 3,600 birds are found in this category for articles
>that were bot-created and have not yet been inspected,
>Some 54,000 fungi species are found here,
>The birds more often have common names, which are preferred
>as article names instead of the Latin/scientific names,
>e.g. the blue-and-white swallow,
>where the Latin name is a bot-created redirect to the
>At the Swedish Wikipedia village pump there is now a
>discussion of whether to continue with species of animals,
>plants, bacteria, etc.
> Lars Aronsson (lars(a)aronsson.se)
> Aronsson Datateknik - http://aronsson.se
>Date: Thu, 18 Oct 2012 08:46:22 +0200
>From: Nikola Smolenski <smolensk(a)eunet.rs>
>To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
>Cc: Wikimedia bot editors discussion <wikibots-l(a)lists.wikimedia.org>
>Subject: Re: [Wikibots-l] [Wikitech-l] A bot to create articles about
>On 18/10/12 03:26, Lars Aronsson wrote:
>> User:Lsj has written 4000 lines of C# source code on top
>> of the DotNetWikiBot framework, to create 10,000 articles
>> in Swedish about bird species in the spring of 2012 and
>> recently even more articles in Swedish about fungi species.
>> Some information about his Lsjbot is found here,
>> The otherwise very reluctant/skeptical/picky Swedish Wikipedia
>> community has gladly accepted these well-written articles.
>> I think it would be interesting if a community of wikipedians
>> in some other language would try to translate this bot.
>> Some languages might have notability or relevance requirements
>> that these species don't fulfill, others might think 1700
>> bytes is too short for an article. But I think the citation of
>> sources and correctness of fact would be generally accepted.
>The need for such bots should cease after Wikidata is fully deployed. I
>suggest that interested programmers direct their efforts
>End of Wikibots-l Digest, Vol 40, Issue 2
Hi folks :)
As you might have heard we've been working on Wikidata
(http://meta.wikimedia.org/wiki/Wikidata) for a few months now. The
first phase of Wikidata is about centralizing language links in
Wikidata. We are getting close to a first deployment of this on the
Hungarian Wikipedia and then probably the Hebrew or Italian Wikipedia.
I wanted to give you a heads-up about this.
A few people have already started working on bots to migrate language
links to Wikidata where desired. (The current system will continue to
work.) This includes the PyWikidata library. You can find the current
state here: http://meta.wikimedia.org/wiki/Wikidata/Bots.
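PyWikidata's own API is not shown here, but the wikitext side of such a migration, reading the interwiki language links out of a page and stripping them once they live on Wikidata, might look roughly like this (the language-code set and function name are illustrative assumptions):

```python
import re

# Matches interwiki language links like [[de:Titel]] or [[hu:Cím]],
# optionally consuming the trailing newline. The language-code set is a
# small illustrative subset of the real list.
LANG_CODES = {'de', 'en', 'fr', 'he', 'hu', 'it'}
IW_RE = re.compile(r'\[\[([a-z-]+):([^\]|]+)\]\]\n?')

def extract_and_strip_langlinks(wikitext):
    """Return (links, cleaned_wikitext): language links found, text without them."""
    links = {}
    def repl(m):
        code, title = m.group(1), m.group(2)
        if code in LANG_CODES:
            links[code] = title
            return ''
        return m.group(0)  # lowercase prefix that isn't a language code: keep

    return links, IW_RE.sub(repl, wikitext)

links, cleaned = extract_and_strip_langlinks(
    "Szöveg.\n[[de:Beispiel]]\n[[en:Example]]\n"
)
print(links)  # {'de': 'Beispiel', 'en': 'Example'}
```

A real bot would only strip the links after confirming the corresponding Wikidata item already carries them, since the current link system keeps working during the transition.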
I also wanted to let you know that we'll unfortunately have to break
the API after deployment. I'll keep you updated on this. It might be
wise to wait for that to happen before transitioning existing bots.
Please let me know about any questions you have.
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata
Wikimedia Deutschland e.V.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations at the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi there, sorry if this may be off-topic. I'm an admin @ it.wp and I'm
also part of the WikiAfrica Project
We're in possession of updated data about the administrative divisions
of Botswana (down to the village level) and we'd like to upload it to
as many WP versions as possible. We're in contact with some bot owners
on it.wp, sv.wp, nl.wp, pl.wp, and tn.wp.
What we need is a bot owner who can help us program a script to create
these articles, and of course some help with the translations into
your native language.
If you are interested in it or you need any clarification, please feel
free to contact me.