Hello,
The committee has finished selecting new members, and the new committee
candidates are (in alphabetical order):
- Amir Sarabadani
- Martin Urbanec
- MusikAnimal
- Tonina Zhelyazkova
- Tony Thomas
And the auxiliary members will be (also in alphabetical order):
- Huji
- Jayprakash12345
- Matanya
- Nuria Ruiz
- Tpt
You can read more about the members at [0].
The changes compared to last term are:
* Lucie resigned in September 2019. Tpt (as one of the auxiliary members)
filled her seat in the meantime, and now Tpt is moving back to being an
auxiliary member.
* Martin Urbanec is joining the main committee to fill Lucie's/Tpt's seat.
* Jay Prakash is joining the auxiliary members.
* Rosalie is leaving the auxiliary members.
This is not the final structure. According to the CoC [1], the current
committee publishes the new members and calls for public feedback for *six
weeks*; after that, the current committee may apply changes to the
structure based on that feedback.
Please let the committee know if you have any concerns regarding the
members or the committee's structure before *09 June 2020*; after that,
the new committee will take effect and will serve for a year.
[0]: https://www.mediawiki.org/wiki/Code_of_Conduct/Committee/Members/Candidates
[1]: https://www.mediawiki.org/wiki/Code_of_Conduct/Committee#Selection_of_new_m…
Best,
Amir, on behalf of the Code of Conduct committee
Sorry if this is an 'asked & answered' topic.
After stumbling through MediaWiki for a couple of years, I still find the documentation fairly unnavigable. (When an appropriate page is discovered, it often seems to say that something can be done and gives an example, but the markup cannot be viewed.)
I just subscribed to the mailing list today and have not yet discovered how to search its archives. Nor do I know the correct search terms.
Anyway...
We use MediaWiki for our open source software project documentation.
Can images be resized to match whatever font size the visitor is using for inline graphics? That is, are there resolution-independent units instead of pixels (px)? The 'vertical alignment' options reference text-top & text-bottom... so the image handler already knows these dimensions.
In particular, I have icons from the GUI that I'd like to fit inline without affecting the leading. I also hope to float one left, spanning 2 to 3 lines... similar to the way you'd use a drop cap.
Currently, my icon images sized in px almost disappear on a high-DPI display.
-Brian
Reference examples:
https://www.gramps-project.org/wiki/index.php/Gramps_Glossary#prim…
https://www.gramps-project.org/wiki/index.php?title=Gramps_5.1_Wiki_Manual_…
https://gramps-project.org/wiki/index.php/Template:Man_button
Can multiple diffs be fetched in the same API request? If not, what is the
best way to obtain diffs (in HTML form) for many revision IDs at once?
Suggestions involving the Wikimedia Toolforge replica databases would also
work, since I'm working on a Toolforge tool.
Thanks,
Enterprisey
Hello,
In case you haven't made any changes to the database schema of MediaWiki
core, let me explain the process (if you already know it, feel free to
skip this part):
* MediaWiki core supports three RDBMS types: MySQL, SQLite, and Postgres.
It used to be five (plus Oracle and MSSQL).
* For each of these types, a schema change has three parts: 1) change the
tables.sql file so new installations get the new schema; 2) make a .sql
schema change file, like an "ALTER TABLE", so current installations can
upgrade; 3) wire that schema change file into the *Updater.php file. (A
sketch of parts 1 and 2 follows the example below.)
* For example, this is a patch to drop a column:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/473601 It touches
14 different files, adding 94 lines and removing 30.
This is bad for several reasons:
* It is extremely complicated to do even a simple schema change. Usually
something as simple as adding a column takes me a whole day. There are
lots of complicating factors; for example, SQLite's ALTER TABLE support is
very limited, so when you want to make a patch that changes a column, you
need to make a temporary table with the new definition, copy the old
table's data into it, drop the old table, and then rename the new table
(see the sketch after this list).
** Imagine the pain and sorrow when you want to normalize a table, meaning
you need several schema changes: 1) add the new table; 2) add a column to
the old table; 3) once it's backfilled, make the new column not-nullable
and make the old column nullable instead; 4) drop the old column.
* It's almost impossible to test all DBMS types. I don't have MSSQL or
Oracle installed, and I don't even know how they differ from MySQL. I
assume most other developers are proficient in one type, not all of them.
* Writing raw SQL, especially duplicated SQL, and doubly so when we have
no CI to test it (because we won't install proprietary software in our
infrastructure), is very error-prone. My favourite example: a new column
for a table was actually added to the wrong table in MSSQL, and it went
unnoticed for two years (four releases, including one LTS).
* It's practically impossible to support more DBMS types through
extensions or other third-party systems, because the maintainer would need
to keep up with every patch we add to core and write its equivalents.
* For lots of reasons, these schemas are diverging; there has already been
quite some work just to reduce the divergence to a minimum.
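As an illustration of the SQLite point above, a patch that would be a
one-line ALTER TABLE elsewhere turns into a full table rebuild, roughly
like this (table and column names made up for illustration):

  -- SQLite can't alter an existing column, so rebuild the whole table:
  CREATE TABLE /*_*/example_table_tmp (
    ex_id INTEGER NOT NULL PRIMARY KEY,
    ex_text BLOB NOT NULL  -- column with the new definition
  );
  INSERT INTO /*_*/example_table_tmp (ex_id, ex_text)
    SELECT ex_id, ex_text FROM /*_*/example_table;
  DROP TABLE /*_*/example_table;
  ALTER TABLE /*_*/example_table_tmp RENAME TO /*_*/example_table;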
There was an RFC to introduce an abstract schema and abstract schema
changes; it was accepted, and I have been working on implementing it:
https://phabricator.wikimedia.org/T191231
This is not a small task, and like any big piece of work, it's important
to cut it into small pieces and improve things gradually. So my plan is to
first abstract the schema (the tables.sql files), then slowly abstract the
schema changes. For now, the plan is to have these .sql files
automatically generated by maintenance scripts. We will have a file called
tables.json, and running something like:
php maintenance/generateSchemaSql.php --json maintenance/tables.json --sql
maintenance/tables-generated.sql --type=mysql
It would produce the tables-generated.sql file. The code that produces it
uses Doctrine DBAL, which is already installed as a dev dependency of
core: you only need Doctrine if you want to make a schema change; if you
just maintain an instance, you should not need anything extra. Most of the
work for automatically generating the schema is already merged, and the
last part, which wires it up (and migrates two tables), is up for review:
https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
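For a feel of what the abstract schema looks like, here is a rough sketch
of a tables.json entry for a made-up table. The real format follows
Doctrine DBAL's column types and options, so treat the exact keys below as
illustrative rather than authoritative:

  [
    {
      "name": "example_table",
      "columns": [
        { "name": "ex_id", "type": "integer",
          "options": { "autoincrement": true, "notnull": true } },
        { "name": "ex_name", "type": "binary",
          "options": { "notnull": true, "length": 255 } }
      ],
      "indexes": [
        { "name": "ex_name", "columns": [ "ex_name" ], "unique": true }
      ],
      "pk": [ "ex_id" ]
    }
  ]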
My request: I need to make lots of patches, and since I'm doing this in my
volunteer capacity, I need developers to review them (and potentially help
with the work, if you're as excited about this as I am). Let me know if
you're willing to be added to future patches; the current patch also
welcomes any feedback: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595240
I have documented the plan and future changes at
https://www.mediawiki.org/wiki/Manual:Schema_changes. The ideal end state
is that when you want to do a schema change, you just change tables.json
and create a JSON file that is a snapshot of the table before and after
the change (remember, SQLite's limited ALTER TABLE means it has to know
the whole table). Also, once we are in good shape migrating MediaWiki
core, we can start cleaning up extensions.
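To illustrate the before-and-after snapshot idea, such a file might look
something like this (purely a sketch of the idea described above, not a
final format; the table is made up):

  {
    "comment": "Drop ex_old_column from example_table (illustrative only)",
    "before": {
      "name": "example_table",
      "columns": [
        { "name": "ex_id", "type": "integer", "options": { "notnull": true } },
        { "name": "ex_old_column", "type": "binary", "options": { "length": 255 } }
      ]
    },
    "after": {
      "name": "example_table",
      "columns": [
        { "name": "ex_id", "type": "integer", "options": { "notnull": true } }
      ]
    }
  }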
Any feedback is also welcome.
Best
--
Amir (he/him)
Hi,
A new episode of the MediaWiki podcast Between the Brackets is out: this
one is an interview with Will Kavanagh, who is the Global Community Lead
for Fandom, the website previously known as Wikia. We talked about the
ongoing project to upgrade Fandom's wikis from MediaWiki 1.19 to 1.33, the
benefits of video games, and a bunch of other topics. You can listen to
the whole episode here:
https://betweenthebrackets.libsyn.com/episode-61-will-kavanagh
-Yaron
Hello,
Recently, I stumbled upon a technical writing course that I found very
useful, and I wanted to share it, so I thought of sending an email to
wikitech-l recommending it. I've also been looking for a resource about
VueJS without much luck and wanted to send an email asking if anyone knows
of any.
Instead, I have this idea of a virtual library for developers, so they can
share useful resources with each other. You go to a wiki page and see a
list of courses, books, and conference videos on each topic, with
different people recommending them. You can also request a resource on a
topic and people respond to you. If the wiki page grows too big, we can
split it into subpages by topic, and so on.
I started the page at https://www.mediawiki.org/wiki/User:Ladsgroup/Library
but I'm planning to move it to the main namespace if no one objects.
Please take a look, add more recommendations, co-sign, request a resource,
respond to a request for a resource, etc.
What do you think? Please let me know if you think it's a horrible idea or
if you have feedback on the details (mediawiki.org? Maybe we should move
it to wikitech.wikimedia.org?).
I hope this will be useful.
Best
--
Amir (he/him)
Hello,
The documentation of $wgDBprefix [0] has explicitly said since 2006 that
you should not set this if you're using Postgres as your DBMS.
We now depend on this assumption in new places [1], so if you have set it,
please unset it and fix your database ASAP; otherwise you will end up with
some tables that have the prefix and some that don't, which I assume
wouldn't be much fun.
Thank you for your understanding.
[0]: https://www.mediawiki.org/wiki/Manual:$wgDBprefix
[1]: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/595316#message-9c1b408f7b…
Best
--
Amir (he/him)
Hi,
There's a new episode of the MediaWiki podcast Between the Brackets: this
one is an interview with Amir Aharoni, who is a strategist in the Language
team of the Wikimedia Foundation. You can listen to the episode here:
http://betweenthebrackets.libsyn.com/episode-60-amir-aharoni
-Yaron