In a discussion in the German Pirate Party the idea came up that we
might want to have cryptographically signed wiki pages.
I could not find any existing implementation of this.
Thus, can we develop an extension that provides cryptographically signed
wiki pages?
A brief and preliminary sketch: any user who provides a
matching public key could sign any existing page.
Before a page + signature is saved, the signature is checked for
validity.
Editing a signed page is possible without re-signing it.
There must be a page display that allows copying and pasting the page with
its signature for external verification.
There should be a button triggering the verification via an external
online service.
Maybe the signature display on signed pages should be suppressible.
Any number of independent signatures per page must be possible.
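The flow above could be prototyped roughly like this; a minimal stdlib-only
sketch, where HMAC with a per-user secret stands in for real OpenPGP
public-key verification (all names are illustrative):

```python
import hashlib
import hmac

# HMAC stands in for real public-key signatures (OpenPGP etc.) so this
# sketch stays stdlib-runnable; a deployment would verify against the
# user's registered public key instead.
def sign(key: bytes, text: str) -> str:
    return hmac.new(key, text.encode(), hashlib.sha256).hexdigest()

def verify(key: bytes, text: str, sig: str) -> bool:
    return hmac.compare_digest(sign(key, text), sig)

class Page:
    def __init__(self, text: str):
        self.text = text
        self.signatures = []  # any number of independent signatures

    def add_signature(self, user: str, key: bytes, sig: str):
        # Reject the save if the signature does not match the current text
        if not verify(key, self.text, sig):
            raise ValueError("invalid signature")
        self.signatures.append((user, sig))

    def valid_signatures(self, keys: dict) -> list:
        # Signatures survive later edits, but only count as valid while
        # they still match the current page text
        return [u for (u, s) in self.signatures if verify(keys[u], self.text, s)]

alice = b"alice-key"
page = Page("Example article text")
page.add_signature("alice", alice, sign(alice, page.text))
print(page.valid_signatures({"alice": alice}))  # ['alice']

page.text = "Edited without re-signing"         # editing is still allowed
print(page.valid_signatures({"alice": alice}))  # []
```

This also shows why "editing without re-signing" is cheap to support: old
signatures simply stop validating against the new text rather than blocking
the edit.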
Does that make sense? Anything vital forgotten?
Feedback welcome.
Greetings -- Purodha
Hi guys,
The tools.wmflabs.org certificate expired this morning. This
directly impacts wikis that embed services, such as maps, in their pages.
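For monitoring this kind of thing, a certificate's expiry can be checked
from the command line; a quick openssl sketch, run here against a locally
generated throwaway certificate (checking the live host needs network
access):

```shell
# Generate a throwaway 1-day self-signed cert as a stand-in for the real one
openssl req -x509 -newkey rsa:2048 -keyout /tmp/key.pem -out /tmp/cert.pem \
  -days 1 -nodes -subj "/CN=tools.example" 2>/dev/null

# Print the expiry date, and warn if the cert expires within the next 2 days
openssl x509 -in /tmp/cert.pem -noout -enddate
openssl x509 -in /tmp/cert.pem -noout -checkend 172800 || echo "renew soon"
```

Against the live service, the certificate can be fetched with
`openssl s_client -connect tools.wmflabs.org:443` and piped into the same
`x509 -noout -enddate` check.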
Thanks,
Strainu
Hi folks,
Executive summary:
T108255 is the default option for our Wednesday RfC review (E66)[0].
As part of improving our database use, we need to start gating our
code review on better shared norms of SQL correctness. We need to
enable strict mode, cleanup/enforce primary keys (T17441), and start
using row-based replication (T109179). Let's talk about this on
Wednesday.
Details:
We're still not 100% decided what our topic for this week's RfC review
meeting will be, but I'm leaning pretty heavily toward T108255. Jaime
Crespo (Jynus) asked me about it last week, which inspired me to turn
T108255 into an RfC. After he cleared up my writeup, I think there's
something for us to talk about.
In particular, I originally thought this was merely about enabling
MariaDB's strict mode, and all of the rainbows and unicorns that would
result from that. Jaime corrected me, pointing out that there is
other database related cleanup we would need to do to get the benefits
of this.
So, as of this writing, T108255 by title still appears to be about
merely enabling strict mode. It's tempting to split this ticket into
two tickets:
1. RfC: Write/enforce SQL correctness guidelines
2. Enable MariaDB/MySQL's Strict Mode
I may make a separate ticket tomorrow unless someone convinces me that
kittens will die as a result.[1]
Regarding SQL correctness guidelines, we have a mess of stuff on
mediawiki.org, which doesn't seem to be very discoverable, and also
doesn't seem to have any teeth to it. We have a modest number of
pages marked as "MediaWiki development policies"[2], but of the 5
pages that were there, only 1 was specifically about
databases: the weakly named [[Database optimization]][3]. Since
[[Database optimization]] didn't seem to have gotten the review that
[[Security for developers]] or [[Gerrit/+2]] had, I changed its status
to "{{draft}}".
We *do* have something that actually looks more policy-like, which is
the "#Database patches" section of the [[Development policy]] page[4].
However, it's not clear that the "Development policy" page gets read,
and it has gotten pretty crufty. It's tempting to put "{{draft}}" on
that one too.
It seems there are a number of sources we could/should be pulling from
to make a database development policy[5]. T108255 (or some
to make a database development policy[5] T108255 (or some
database-related RfC) should be about pulling all of these together
into a coherent set of guidelines. These guidelines should be
well-known to frequent committers, and should be well-written for a
beginning developer.
What we need to actually *do* is not merely enable strict mode, but
also cleanup/enforce primary keys (T17441), and start using row-based
replication (T109179). Before completing all of this, we need our code
review gated on actually making this work.
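For concreteness, the server-side switches themselves are small (the hard
part is making all of MediaWiki's writes survive them); a my.cnf sketch
with illustrative values:

```ini
[mysqld]
# Reject invalid/over-long data on write instead of silently coercing it
sql_mode = STRICT_ALL_TABLES,NO_ENGINE_SUBSTITUTION
# Row-based replication (T109179)
binlog_format = ROW
```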
The fact that we have a mess of documentation and norms is the reason
why I'm leaning toward this topic for the E66 meeting this week. If
you believe we should talk about this, please participate at T108255
and help get this as far along as possible so that we can wrap things
up at the E66 meeting. If you believe we should be talking about
something else in our IRC meeting, please say so in E66 on Phab.
Rob
[0] IRC meeting:
<https://phabricator.wikimedia.org/E66>
"RfC: Enable MariaDB/MySQL's Strict Mode"
<https://phabricator.wikimedia.org/T108255>
[1] if someone decides to jfdi, I would recommend using T108255 for
the "Write/enforce SQL correctness guidelines" RfC, and make a new
ticket for the less important "Enable MariaDB/MySQL's Strict Mode".
The comments on the ticket seem to relate more to the former than the
latter, and the subscribers will probably be more interested in the
former.
[2]
<https://www.mediawiki.org/wiki/Category:MediaWiki_development_policies>
[3] <https://www.mediawiki.org/wiki/Database_optimization>
[4] <https://www.mediawiki.org/wiki/Development_policy#Database_patches>
[5] Other database-related guidance for developers:
<https://www.mediawiki.org/wiki/Performance_guidelines>
<https://www.mediawiki.org/wiki/Manual:Coding_conventions/Database>
<https://www.mediawiki.org/wiki/Architecture_guidelines>
<https://wikitech.wikimedia.org/wiki/Schema_changes>
Have been struggling with this issue for a good few days.
I am currently trying to get OAuth to work from Node using
passport-mediawiki.
I get as far as getting asked to authorise the oauth application for the
user on a screen that reads:
Hi user1,
ResourceEdge Dev 5 would like to have basic access on your behalf on all
projects of this site.
Privacy Policy Cancel Allow
Once I press Allow, I get the same box, but with the statement
"There are problems with some of your input."
But no info as to what the issue is. Not certain if there are any logs or
similar that might tell me what the actual issue is; I have looked at the
standard ones, including the debug log (I can't debug the actual window as
it's a pop-up box, not certain how I would debug that). I even gave the user
access to every part of OAuth, no success.
Any suggestions where to look or what the issue might be?
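One server-side place to look is MediaWiki's grouped debug logs; assuming
the OAuth extension logs under an 'OAuth' channel (worth verifying for your
MediaWiki version), a LocalSettings.php sketch:

```php
// Route the OAuth extension's log channel to its own file (path is an
// example); rejected /authorize submissions should leave a trace here.
$wgDebugLogGroups['OAuth'] = '/var/log/mediawiki/oauth.log';
```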
Dear Brian,
On 9/13/15, Brian Wolff <bawolff(a)gmail.com> wrote:
> On 9/12/15, wp mirror <wpmirrordev(a)gmail.com> wrote:
>> 0) Context
>>
>> I am currently developing new features for WP-MIRROR (see <
>> https://www.mediawiki.org/wiki/Wp-mirror>).
>>
>> 1) Objective
>>
>> I would like WP-MIRROR to generate all image thumbs during the mirror
>> build process. This is so that mediawiki can render pages quickly using
>> precomputed thumbs.
>>
>> 2) Dump importation
>>
>> maintenance/importDump.php - this computes thumbs during importation, but
>> is too slow.
>> mwxml2sql - loads databases quickly, but does not compute thumbs.
>>
>> 3) Question
>>
>> Is there a way to compute all the thumbs after loading databases quickly
>> with mwxml2sql?
>>
>> Sincerely Yours,
>> Kent
>> _______________________________________________
>> Wikitech-l mailing list
>> Wikitech-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
> Hi. My understanding is that wp-mirror sets up a MediaWiki instance
> for rendering the mirror. One solution would be to set up 404-thumb
> rendering. This makes it so that instead of pre-rendering the needed
> thumbs, MediaWiki will render the thumbs on-demand whenever the web
> browser requests a thumb. There's some instructions for how this works
> at https://www.mediawiki.org/wiki/Manual:Thumb.php This is probably
> the best solution to your problem.
Right. Currently, wp-mirror does set up mediawiki to use 404-thumb
rendering.
This works fine, but can cause a few seconds latency when rendering pages.
Also, it would be nice to be able to generate thumb dump tarballs, just
like we used to generate original size media dump tarballs. I would like
wp-mirror to have such dump features.
> Otherwise, MW needs to know what thumbs are needed for all pages,
> which involves parsing pages (e.g. via refreshLinks.php). This is a
> very slow process. If you already had all the thumbnails generated,
> you could just copy over the thumb directory perhaps, but I'm not sure
> where you would get a pre-generated thumb directory.
Wp-mirror does load the *links.sql.gz dump files into the *links tables,
because this method is two orders of magnitude faster than
maintenance/refreshLinks.php.
>--
>-bawolff
Idea. I am thinking of piping the *pages-articles.xml.bz2 dump file
through an AWK script to write all unique [[File:*]] tags into a file. This
can be done quickly. The question then is: Given a file with all the media
tags, how can I generate all the thumbs. What mediawiki function shall I
call? Can this be done using the web API? Any other ideas?
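The AWK step could equally be a small script; a sketch of the extraction in
Python (the regex is simplified and ignores localized namespace aliases
such as [[Bild:...]]):

```python
import re

# Simplified pattern for [[File:...]] / [[Image:...]] media links
FILE_RE = re.compile(r"\[\[\s*(?:File|Image)\s*:\s*([^|\]]+)", re.IGNORECASE)

def file_targets(wikitext):
    """Return the set of unique media targets referenced in wikitext."""
    return {m.group(1).strip() for m in FILE_RE.finditer(wikitext)}

# The real input would be streamed from the dump, e.g.:
#   import bz2
#   for line in bz2.open("enwiki-pages-articles.xml.bz2", "rt"): ...
sample = "Intro [[File:Foo.jpg|thumb|200px]] text [[Image:Bar.png]] end"
print(sorted(file_targets(sample)))  # ['Bar.png', 'Foo.jpg']
```

Each resulting target could then be requested once through thumb.php
(e.g. thumb.php?f=Foo.jpg&w=220) to warm the 404-rendering cache that
wp-mirror already sets up.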
Sincerely Yours,
Kent
Hi,
currently change owners use various ways to mark changes
that are not yet ready for review. Recurring patterns are
commit messages beginning with "[WIP]" or "DO NOT MERGE"
and/or -1 votes by the change owner. A common problem with
these solutions is that they cannot be used in Gerrit
searches, for example if someone is looking for open changes
to review, they must filter the results manually. This also
affects scripts & Co. like the Wikimedia Dashboard at
http://korma.wmflabs.org/browser/ that need to use
heuristics to determine if a change is a work in progress
and thus should be ignored for statistical purposes.
There was a bug (https://phabricator.wikimedia.org/T52842)
to implement a "Work in progress" button/status with the
underlying goal to prevent dashboards/queues from the added
noise of "draft changes"/"[WIP]" changes, but it was
declined because a button is not going to be added.
I want to suggest adding a new label "WIP", inspired by
OpenStack's "Workflow" label. Its "neutral" value is 0
("Ready for reviews"). If it is "voted" to -1 ("Work in
progress"), the change cannot be submitted. This vote will
stick with new uploads until it is changed back to 0.
For searches, this will allow Gerrit users to restrict
search results by adding "label:WIP+0" to their filters.
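With such a label, filtering becomes a plain search term; a review-queue
query might then look like this (project name is illustrative):

```
status:open project:mediawiki/core label:WIP+0
```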
Untested, the change would be something like:
| diff --git a/project.config b/project.config
| index 151eebd..93291e1 100644
| --- a/project.config
| +++ b/project.config
| @@ -12,6 +12,7 @@
| owner = group ldap/ops
| label-Code-Review = -2..+2 group Project Owners
| label-Code-Review = -1..+1 group Registered Users
| + label-WIP = -1..+0 group Registered Users
| create = group Project Owners
| editTopicName = group Registered Users
| viewDrafts = group JenkinsBot
| @@ -78,6 +79,11 @@
| value = +2 Looks good to me, approved
| copyAllScoresOnTrivialRebase = true
| copyAllScoresIfNoCodeChange = true
| +[label "WIP"]
| + function = AnyWithBlock
| + value = -1 Work in progress
| + value = 0 Ready for reviews
| + copyMinScore = true
| [access "refs/meta/dashboards/*"]
| create = group Project Owners
| create = group platform-engineering
Tim
https://github.com/subbuss/parsoid_visual_diffs
I've pushed a bunch of updates over the last week which should now make
this usable for comparing HTML files from different sources (not
restricted to PHP parser and Parsoid). I did this so that this could be
used to compare the rendering of Tidy and HTML5depurate HTML (T89331).
You can now provide options in a config file (example config file
included in bin/settings.js.example).
You can provide DOM-post-processing scripts to inject (in the context of
a browser window) and process the HTML before it is screenshotted. This
is useful in the PHP and Parsoid cases where we need to make sure all
the closed-by-default tabs / tables / etc. are opened, custom CSS is
injected (to use Parsoid styling), chrome is removed, etc. PHP parser
and Parsoid HTML injectable scripts are in lib/*postprocess.js. Similar
scripts can be provided for other use cases.
You can run a visualdiffing node service (which is currently still
targeted at visual diffing PHP parser and Parsoid output, but could
potentially be extended to not be hardcoded for those use cases). The
codebase also provides a testreduce client for use with Parsoid's
testreduce service for running these kinds of tests on a set of pages in
an automated fashion.
This codebase is not pretty or polished but it does the job for the use
cases it is targeted for. I wanted to spend only as much time on it as
required to get the job done rather than overengineer it for multiple
use cases.
As part of the RFC discussion on T89331, there was some interest in
using this outside of Parsoid's visual diffing use case. That is the
motivation for this wider update.
This first pass of cleanup, refactoring, and generalization is primarily
to support work being done as part of T89331. More testing to be done.
But, ideas for using this in other scenarios welcome. More generally,
other feedback and pull requests welcome.
Subbu.
Hi, we are looking for volunteers willing to get involved in the
organization of Wikimedia's participation in the upcoming round of
Outreachy, a program to involve underrepresented communities in open source
projects. https://www.gnome.org/outreachy/
Experienced org admins like Niharika Kohli, Andre Klapper and myself are
looking forward to supporting new volunteers in this role. We also have a
solid and reasonably well documented process in place that helps with the
onboarding of new org admins, mentors, and interns. This is a good chance
to grow your tech community management experience in a safe and supportive
context, dedicating about 2-4 hours per week between October and February.
About half a year ago, I asked for volunteers to become co-org-admins of the
upcoming Google Summer of Code and Outreachy round. Niharika Kohli answered
back, and she has been an amazing org admin since then. I'm not
exaggerating when I say that her involvement has been one of the best
things impacting my work this year. She has learned and enjoyed a lot, and
she has been very helpful to the interns and mentors that are concluding
their projects as we speak.
Interested? Just reply to this email or comment at
https://phabricator.wikimedia.org/T112267
--
Quim Gil
Engineering Community Manager @ Wikimedia Foundation
http://www.mediawiki.org/wiki/User:Qgil