Hi,
On Tue, Mar 1, 2016 at 3:36 PM, David Strine <dstrine(a)wikimedia.org> wrote:
> We will be holding this brownbag in 25 minutes. The Bluejeans link has
> changed:
>
> https://bluejeans.com/396234560
I'm not familiar with BlueJeans and may have missed a transition
because I wasn't paying enough attention. Is this some kind of
experiment? Have all meetings transitioned to this service?
Anyway, my immediate question is: how do you join without
sharing your microphone and camera?
Am I correct in thinking that this is an entirely proprietary stack
that's neither gratis nor libre and has no on-premise (non-cloud)
hosting option? Are we paying for this?
-Jeremy
Phabricator users,
this is to let you know that the "aphlict" service has been disabled on
Phabricator (for now) because it caused stability issues.
This means you will not get real-time (pop-up) notifications on
Phabricator (if you had them enabled in the first place).
Regular notifications (those that do not pop up) and emails are not
affected by this.
https://phabricator.wikimedia.org/T238593
--
Daniel Zahn <dzahn(a)wikimedia.org>
Operations Engineer
As of 950cf6016c, the mediawiki/core repo was updated to use DB_REPLICA
instead of DB_SLAVE, with the old constant left as an alias. This is part
of a string of commits that cleaned up the mixed use of "replica" and
"slave" by sticking to the former. Extensions have not been mass
converted. Please use the new constant in any new code.
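As a quick illustration, a minimal sketch of the new constant in use (assuming MediaWiki 1.28+, where DB_REPLICA was introduced; wfGetDB() was the usual accessor at the time, and the query below is only an example):

```php
// Reads go to a replica; writes go to the master.
$dbr = wfGetDB( DB_REPLICA ); // use this in new code instead of DB_SLAVE
$dbw = wfGetDB( DB_MASTER );

// DB_SLAVE remains as an alias for existing code, but new code
// should look like this:
$row = $dbr->selectRow(
	'page',
	'page_id',
	[ 'page_namespace' => NS_MAIN, 'page_title' => 'Example' ],
	__METHOD__
);
```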
The word "replica" is a bit more indicative of a broader range of DB
setups*, is used by a range of large companies**, and carries more
neutral connotations.
Drupal and Django made similar updates (even replacing the word "master"):
* https://www.drupal.org/node/2275877
* https://github.com/django/django/pull/2692/files &
https://github.com/django/django/commit/beec05686ccc3bee8461f9a5a02c607a023…
I don't plan on doing anything to DB_MASTER, since it seems fine by itself,
like "master copy", "master tape", or "master key". This is analogous to a
master RDBMS database. Even multi-master RDBMS systems tend to have
stronger consistency than classic RDBMS slave servers, and present
themselves as one logical "master" or "authoritative" copy. Even in its
personified form, a "master" database can readily be thought of as
analogous to "controller", "governor", "ruler", lead "officer", or such.***
* clusters using two-phase commit, Galera using certification-based
replication, multi-master circular replication, etc.
**
https://en.wikipedia.org/wiki/Master/slave_(technology)#Appropriateness_of_…
***
http://www.merriam-webster.com/dictionary/master?utm_campaign=sd&utm_medium…
--
-Aaron
A few days ago, Google Code-in 2019 ended.
194 students completed 715 Wikimedia tasks.[1]
Big thanks and congratulations to everybody!
Read about their GCI experience with Wikimedia in their blog posts:
https://www.mediawiki.org/wiki/Google_Code-in/2019#Wrap-up_blog_posts
Thanks to our 43 mentors for being available, also on weekends &
holidays. Thanks to everyone on IRC, Gerrit, Phabricator, mailing
lists, Github, etc. for your friendliness, patience, support and help.
Thanks to Wikimedia org admins for making this run really smoothly.
And thanks to Google for organizing this opportunity for young people
to learn about and contribute to free software and free knowledge.
Google will announce Grand Prize winners and finalists on February 10.
We also welcome everybody's feedback on what Wikimedia could improve:
https://www.mediawiki.org/wiki/Google_Code-in/Lessons_learned#2019
Sharing some of the students' achievements, to give you an impression:
* WMCZ's Tracker software received 40 fixes and improvements
* 37 fixes and improvements to the WatchTranslations tool
* 24 fixes and improvements to the VideoCutTool tool
* 12 fixes and improvements to the WikiContrib tool
* Tasks to learn programming in Lua were completed 218 times (such as:
Introduction to Lua in Wikipedia, Working with modules, Calculations
and tests, Loops and tables, Lua libraries, MediaWiki libraries,
Wikibase client, Name formats, Date formats, Using Wikidata)
* 13 extensions got extension.json converted to manifest_version 2
* 8 extensions got jshint and jscs replaced with eslint
* 5 extensions got jsonlint replaced with eslint
* Blockly, Huggle, Phabricator, video2commons, and Wiki Ed Dashboard
got migrated to the new Translate validator framework
* Flask-JSONLocale received three bug fixes
* Watchlist notifications can be delivered as web notifications
through Echo (but not enabled yet on Wikimedia sites)
* Kiwix Android App has a button to search for a new article
* Quarry has a button to toggle syntax highlighting
* Wikilink tool has a namespace filter and can filter bot edits
* Commons Android App has screenshots in 14 more languages, the
username can be clicked in the navigation drawer, an improved share
message, and videos for best media upload practices were created
* Commons Mass Description tool: /users does not throw a fatal error
* EasyTimeline has a system message to track category description
* MediaWiki Core's UploadFromUrl::isAllowedHost has tests
* Google Drive to Commons tool allows removing files after selection
before uploading, and has a redesigned 'upload complete' screen
* Hashtags tool allows charts to be downloaded as images
* MobileFrontend extension allows turning on the lead paragraph in
other namespaces and the nav menu drawer received a CSS shadow
* DynamicPageList extension uses page images as the image in gallery
mode for pages which are not in the File namespace
* MassMailer tool has improved internationalization, its alerts are
dismissable, and is integrated with Translatewiki.net
* Map of Monuments tool can filter by country and language
* Wiki Education Dashboard Android App has better screenshots and a
placeholder for empty screens
* WikiEduDashboard has a git pull hook to update gems and packages,
and its ArticleViewer adds appropriate styling for Wikidata
* Music on Wikimedia pages was transcribed to LilyPond 9 times
* 4 templates on English Wikipedia received documentation
* A web app to showcase a variety of tools hosted on Wikimedia
Toolforge was created
* meta:Research_on_open_source_team_communication_tools was expanded
* Hackathons of other FOSS orgs were researched, to improve ours
* 14 text files in MediaWiki core's /doc got converted to Markdown
* Logos designed for VideoCutTool tool, the MediaWiki FormWizard
Extension, the Friends of the Docs working group, and the Google
Drive to Wikimedia Commons Uploader tool
* Designs for "Personal Space Needed" and "Wikimedia Friends of the
Docs" stickers created
* HD logos were added for several Wikimedia project websites
Thanks for helping make free knowledge available to everybody. <3
On behalf of the org admins,
andre
[1] https://www.mediawiki.org/wiki/Google_Code-in/Statistics
--
Andre Klapper (he/him) | Bugwrangler / Developer Advocate
https://blogs.gnome.org/aklapper/
On behalf of Open Labs Albania <https://openlabs.cc/en/> and the Wikimedia
Foundation Community Events Team we are pleased to announce that registration
for the Wikimedia Hackathon 2020
<https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2020/Register> is now
open for both scholarships and regular attendees!
The hackathon will be held at OFIÇINA <http://www.oficina.al/> in Tirana,
Albania between 9-11 May 2020 (please note that this is a Saturday through
Monday instead of the usual Friday - Sunday). Additionally, we encourage
attendees to stay in Tirana and attend the 7th annual Open Source
Conference Albania (OSCAL) <https://oscal.openlabs.cc/> which will be in
the same venue on May 16th and 17th, and will have a Wikimedia Track and
mission aligned organizations participating.
We have identified two focus areas
<https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2020#Hackathon_Focus_Are…>
for the event which we will continue to refine and clarify over the next
months. As usual, we welcome participants who plan to work on or learn
about any project that they like related to any area of Wikimedia
Technology.
If you have any questions or comments, please contact
wiki-hackathon(a)openlabs.cc. Otherwise, please continue to follow our
program and organizational developments on Wikimedia Hackathon 2020 on
MediaWiki <https://www.mediawiki.org/wiki/Wikimedia_Hackathon_2020>.
Please help us by forwarding this email to relevant lists. We look
forward to seeing you in Tirana!
--
Rachel Farrand
Senior Program Manager
Events Team
Wikimedia Foundation
As research for a potential presentation topic at EMWCon 2020 [0], I'm
polling the community.
What algorithms do you program into your wikis? I'm curious to learn about
technical methods, policies, and criteria for measuring "success". I'm
interested in this concept because of how algorithms are used in social
media tools and many have been called out for being "bad". So I wonder if
wikis are any better. And don't just limit the analogy to social media.
There are things like Amazon's suggested purchases or the Stack Exchange
voting system selecting a "correct" answer by election. For example, the
extension Watch Analytics recommends which pages need review based on how
many people have reviewed each changed page. I think there are
voting/rating systems for wikis. I wonder how those influence decisions. At
one time I was working on an extension that built on the idea of
contribution score, but I feared it would incentivize the "wrong"
behavior.
I'm looking for how algorithms influence content presented to users, but
also if algorithms are used to make decisions.
Here are some other examples of what I would consider algorithms in wikis:
- Human driven voting systems like stack overflow
- Category and Property driven - Using Wikibase, Cargo, or SMW to
determine a priority or to identify a missing value
- Machine-learning - Antivandalism/spam tools
- Suggestions for which wiki pages to watch based on links between wiki
pages
- Suggestions for which wiki pages to review based on watchlists and how
many people have seen the page since it was last edited
- Suggestions for which users to collaborate with based on pages each
person has edited
- Contribution scores drive users to make more edits and to edit more
pages. Is this what we really want?
- Other?
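As a concrete sketch of the watchlist/review suggestion idea above (the scoring function, weighting, and field names here are hypothetical illustrations of mine, not any existing extension's API):

```python
from dataclasses import dataclass

@dataclass
class Page:
    title: str
    watchers: int          # users watching this page
    views_since_edit: int  # distinct users who viewed it since its last edit

def review_priority(page: Page) -> float:
    """Higher score = more in need of review.

    Fewer eyes since the last edit means the change is less vetted;
    more watchers means the page matters to more people.
    """
    return page.watchers / (1 + page.views_since_edit)

pages = [
    Page("Main_Page", watchers=50, views_since_edit=200),
    Page("Obscure_Topic", watchers=10, views_since_edit=0),
]
# List pages most in need of review first.
for p in sorted(pages, key=review_priority, reverse=True):
    print(p.title, round(review_priority(p), 2))
```

The weighting is arbitrary; the point is that even a one-line heuristic like this is an "algorithm", and a transparent one whose bias is easy to inspect.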
Of the algorithms you use in your wikis, how do you measure success? What
does it mean to say that your algorithm is "working" or "succeeding"?
How do you know your algorithm is not inducing bias in a bad way?
If you'd like to share any info on how you use algorithms and how you
measure success in using them, please reply directly to me (unless you want
to share with the list). I'd appreciate screenshots and other ways to help
me understand how your algorithm works and is used, so I can properly
represent it in my presentation. I hope to get enough responses to showcase
the flexibility of MediaWiki in how it can be used to drive decisions based
on knowledge.
Daren
[0] https://www.mediawiki.org/wiki/EMWCon_Spring_2020
__________________
https://www.mediawiki.org/wiki/User:Darenwelsh
http://mixcloud.com/darenwelsh
Hello,
In an effort to create a repeatable and streamlined process for consumption
of security services the Security Team has been working on changes and
improvements to our workflows. Much of this effort is an attempt to
consolidate work intake for our team in order to more effectively
communicate status, priority and scheduling. This is step 1 and we expect
future changes as our tooling, capabilities and processes mature.
*How to collaborate with the Security Team*
The Security Team works in an iterative manner to build new and mature
existing security services as we face new threats and identify new risks.
For a list of currently deployed services please review our services [1]
page.
The initial point of contact for the majority of our services is now a
consistent Request For Services [2] (RFS) form [3].
The two workflow exceptions to RFS are the Privacy Engineering [4] service
and Security Readiness Review [5] process which already had established
methods that are working well.
If the RFS forms are confusing or don't lead you to the answers you need,
try security-help(a)wikimedia.org to get assistance with finding the right
service, process, or person.
security(a)wikimedia.org will continue to be our primary external reporting
channel.
*Coming changes in Phabricator*
We will be disabling the workboard on the #Privacy [6] project. This
workboard is not actively or consistently cultivated and often confuses
those who interact with it. #Privacy is a legitimate tag to be used in
many cases, but the resourced privacy contingent within the Security Team
will be using the #privacy engineering [7] component.
We will be disabling the workboard for the #Security [8] project. Like the
#Privacy project this workboard is not actively or consistently cultivated
and is confusing. Tasks which are actively resourced should have an
associated group [9] tag such as #Security Team [10].
The #Security project will be broken up into subprojects [11] with
meaningful names that indicate user relation to the #Security landscape.
This is in service of #Security no longer doing double duty as an ACL and
a group project. An ACL*Security-Issues project will be created and
#Security will still be available to link cross-cutting issues, but will
also allow equal footing for membership for all Phabricator users.
*Other Changes*
A quick callout to the consistency [12] and Gerrit sections of our team
handbook [13]. As a team we have agreed that all changesets we interact on
need a linked task with the #security-team tag.
security@ will soon be managed as a Google group collaborative inbox [14]
as outlined in T243446.
Thanks
John
[1] Security Services
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Services
[2] Security RFS docs
https://www.mediawiki.org/wiki/Security/SOP/Requests_For_Service
[3] RFS form
https://phabricator.wikimedia.org/maniphest/task/edit/form/72/
[4] Privacy Engineering RFS
https://form.asana.com/?hash=554c8a8dbf8e96b2612c15eba479287f9ecce3cbaa09e2…
[5] Readiness Review SOP
https://www.mediawiki.org/wiki/Security/SOP/Security_Readiness_Reviews
[6] Phab Privacy tag
https://phabricator.wikimedia.org/tag/privacy/
[7] Privacy Engineering Project
https://phabricator.wikimedia.org/project/view/4425/
[8] Security Tag
https://phabricator.wikimedia.org/tag/security/
[9] Phab Project types
https://www.mediawiki.org/wiki/Phabricator/Project_management#Types_of_Proj…
[10] Security Team tag
https://phabricator.wikimedia.org/tag/security-team/
[11] Security Sub Projects
https://phabricator.wikimedia.org/project/subprojects/4420/
[12] Security Team Handbook
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Handbook#Consistency
[13] Secteam handbook-gerrit
https://www.mediawiki.org/wiki/Wikimedia_Security_Team/Handbook#Gerrit
[14] Google collab inbox
https://support.google.com/a/answer/167430?hl=en
Good day,
On Wed, Jan 29, 2020 at 19:13, Jefsey (<jefsey(a)jefsey.com>) wrote:
>
> Gentlemen,
> I run around 290 small thematic citizen research wikis (more being
> developed) under an old MediaWiki version (I fear an upgrade
> hassle). In order to simplify their set-up I systemized them by using
> a script to build the symbolic directories from a unique central one,
> so I only have to build the LocalSettings.php and the images directories
> (mainly for the wikilogo.gif particular to the site) and enter the
> templates manually. To be sure I can move them around without too
> much pain and keep them under their own password, I use SQLite.
> Around 20 minutes set-up each.
>
> With a friend we would like to transfer all this to MySQL (or
> MariaDB?) in order to share templates and WikiDB. Possibly on several
> machines. Possibly developing some extensions in the middle range.
> Possibly transferring further on to another database system (to mix
> different entries and mail entries). I feel we would first need to
> study a conceptual block map of the MediaWiki architecture, internal
> exchanges and database requests. Does that exist?
>
When it comes to upgrading you may find
<https://www.mediawiki.org/wiki/Manual:Upgrading> useful. Please make
sure you back up the data in case something goes wrong.
I am not an expert in the MediaWiki architecture area, but perhaps you
could find <https://www.mediawiki.org/wiki/Manual:MediaWiki_architecture>
useful, as well as
<https://www.mediawiki.org/wiki/Manual:Database_layout>.
MariaDB and MySQL (using InnoDB) are the most commonly used database
types <https://www.mediawiki.org/wiki/Manual:MySQL>.
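For reference, the relevant LocalSettings.php settings for a MariaDB/MySQL backend look roughly like this (server, database name, credentials, and prefix below are placeholder values):

```php
// LocalSettings.php — database backend (placeholder values)
$wgDBtype     = "mysql";       // MariaDB also uses the "mysql" type
$wgDBserver   = "localhost";   // database host
$wgDBname     = "wikidb";      // one database per wiki, or shared
$wgDBuser     = "wikiuser";
$wgDBpassword = "secret";
$wgDBprefix   = "wiki1_";      // optional: lets several wikis share one DB
```

A table prefix ($wgDBprefix) is one common way to host many small wikis in a single database, which might suit a farm of ~290 wikis; any shared-database setup should be tried on a copy of the data first.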
> Also, in order to manage the whole thing advisably I would need two tips:
> 1. is there a secure/reliable method/extension to protect pages on a
> per page basis ?
If by protection you mean the ability to *see* some wiki pages and
hide others, MediaWiki is intentionally not good at that. As you can
read at <https://www.mediawiki.org/wiki/Extension:Lockdown>, "If you
need per-page or partial page access restrictions, you are advised to
install an appropriate content management package. MediaWiki was not
written to provide per-page access restrictions, and almost all hacks
or patches promising to add them will likely have flaws somewhere,
which could lead to exposure of confidential data. We are not
responsible for anything being leaked, leading to loss of funds or
one's job." I am not sure whether that information is still accurate
nowadays, though I suspect it is.
If however you meant restricting *editing* of wiki pages to some users
or user groups, MediaWiki provides a native protection system which is
described at <https://www.mediawiki.org/wiki/Manual:Protection> and
related pages. The AbuseFilter extension
<https://www.mediawiki.org/wiki/Extension:AbuseFilter> and
TitleBlacklist (bundled with MW 1.21 or higher)
<https://www.mediawiki.org/wiki/Extension:TitleBlacklist> can also
help you further restrict edits or actions on your wiki.
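To illustrate the native protection system mentioned above, a minimal LocalSettings.php sketch restricting editing (the group names and rights are standard MediaWiki ones; the custom level name is my own placeholder):

```php
// Only logged-in users may edit; anonymous visitors can still read.
$wgGroupPermissions['*']['edit'] = false;
$wgGroupPermissions['user']['edit'] = true;

// Add a custom protection level, applicable per page via the normal
// "protect" action; only groups holding the same-named right pass it.
$wgRestrictionLevels[] = 'editors-only';
$wgGroupPermissions['sysop']['editors-only'] = true;
```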
> 2. how to get on a daily basis the access count of the wiki pages ?
>
I suspect you'll need to set up something server-side for that. I
can't remember if MediaWiki comes with a feature that allows you to
get those stats.
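One server-side approach (assuming an Apache/nginx combined-format access log; the log path here is a placeholder, and a tiny sample log is generated for the demo) is to aggregate page hits from the web server log:

```shell
#!/bin/sh
# Count hits per wiki page from a combined-format access log.
# Demo: build a small sample log, then aggregate it.
cat > /tmp/access.log <<'EOF'
203.0.113.5 - - [30/Jan/2020:10:01:02 +0000] "GET /index.php?title=Main_Page HTTP/1.1" 200 5120
203.0.113.6 - - [30/Jan/2020:10:02:03 +0000] "GET /index.php?title=Help:Contents HTTP/1.1" 200 4096
203.0.113.7 - - [30/Jan/2020:10:03:04 +0000] "GET /index.php?title=Main_Page HTTP/1.1" 200 5120
EOF

# Field 7 is the request path; keep only page views, count per title.
awk '$7 ~ /title=/ { sub(/.*title=/, "", $7); sub(/&.*/, "", $7); print $7 }' \
    /tmp/access.log | sort | uniq -c | sort -rn
```

This prints each page title with its hit count, most-viewed first; rotating the log daily (e.g. with logrotate) gives per-day counts.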
Like I said, I'm perhaps not the best contact for this, but I hope the
links provided can help you.
> Thank you !
> jfc
>
Best regards, M.