Sorry for all the trolling, but why, instead of discussing how we need a
responsive skin for MediaWiki and waiting for Winter to come, don't we just
do it:
* Move Minerva out of MobileFrontend
* Leave all mobile-specific improvements, "improvements" and hacks in MF
* Polish Minerva to do everything a normal desktop skin does
* Bundle it with MW by default
----
[0] https://en.wikipedia.org/wiki/James_Randi?useskin=minerva
--
Best regards,
Max Semenik ([[User:MaxSem]])
Hi everyone,
There has been a lot of talk over the past year or so about SOA and
what it means for MediaWiki. What I've taken from the conversation is
the following:
1. There is a general desire for us to separate user interface code
from data manipulation code. This is mainly in service of making
alternative user interfaces simpler and cleaner (e.g. mobile web,
mobile apps, print, offline). The developers that are consumers of
said systems (i.e. user interface developers) probably don't care so
much what goes on inside the data manipulation sausage factory, so
long as the APIs used to access our data are well-defined and easy to
use. It may be one data manipulation service, it may be 100 tiny
little services with a brokering layer, but as long as there's a clean
split, all is well. I've created an RFC which basically spells this
out:
https://www.mediawiki.org/wiki/Requests_for_comment/Service_split_along_pre…
Note that there is nothing terribly creative about this idea, and
*that is the point*. With this RFC, let's document which aspects are
utterly uncontroversial so that we can make clear the urgency to come
to consensus on more specific proposals, such as Trevor and Timo's
proposal to redo the skin system[1].
2. Within the larger data manipulation area, there is also potential
for another split between public information (e.g. pages on public
wikis, the vast majority of old revisions, etc) and private
information (e.g. pages on private wikis, revdel'ed revisions,
CheckUser information). With public information, we can choose
technologies that optimize for replication and data delivery (speed
and volume). With private information, we can lock things down more,
possibly losing efficiency in exchange for greater security. Since
private information will presumably be accessed less frequently in
most cases, and the limited cases where high frequency access is
needed (e.g. authn/authz) typically don't involve a lot of data flow,
we can generally make different design decisions. I've also created
an RFC that spells this out too:
https://www.mediawiki.org/wiki/Requests_for_comment/Service_split_along_pub…
These RFCs are both admittedly vague, somewhat on purpose. It's
useful to agree on a general direction before getting down in the
weeds on specific proposals. Proposal #1, other than the lack of
specificity, is hopefully completely uncontroversial, and thus (maybe
with a *little* fleshing out) could sail through the process.
Proposal #2 may be a bit more controversial, but something that is
worth some discussion.
For the specifics on these proposals, the best place to discuss these
is the talk page. If/when the conversation dies out there (or if it
never really starts), I'll summarize on this list.
If there are other sensible places for cleaner fissures in the system,
please document where you think they should go. Even if the fissures
we define are not boundaries between hardware clusters, but instead
just library boundaries, that's still useful to mark where those lines
should go.
Rob
[1] https://www.mediawiki.org/wiki/Requests_for_comment/Redo_skin_framework
Just a quick note that I have submitted a patch
<https://gerrit.wikimedia.org/r/190819> that will make MediaWiki avoid
serving SVG images to Opera 12 (the old, Presto-based one) when a PNG
fallback is available.
We support it as a Grade A browser, which means that we should do our best
to provide the best possible experience. In my opinion, the best possible
experience involves not giving it any SVGs.
Opera 12 has issues when rendering SVG background-images together with
border-radius or background-size rules (see task
<https://phabricator.wikimedia.org/T87504> for details and examples), and
both of these are becoming increasingly common in our codebase. One common
complaint was the new MediaWiki UI checkboxes, where the check icon would
sometimes not appear for Opera users.
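For context, the gist of such a check is sniffing the User-Agent for Opera's Presto engine token. This is a hedged sketch only: the helper name is made up, and the actual patch's logic lives inside MediaWiki's SVG fallback handling rather than in a standalone function.

```php
<?php
// Hypothetical helper: Presto-based Opera (up to 12.x) advertises "Presto/"
// in its User-Agent string, so we can detect it and prefer the PNG fallback.
function shouldServePngFallback( $userAgent ) {
	return strpos( $userAgent, 'Presto/' ) !== false;
}

// Opera 12 gets the PNG fallback:
var_dump( shouldServePngFallback(
	'Opera/9.80 (Windows NT 6.1) Presto/2.12.388 Version/12.16' ) ); // bool(true)
// A WebKit-based browser keeps getting SVG:
var_dump( shouldServePngFallback(
	'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36' ) ); // bool(false)
```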
--
Bartosz Dziewoński
Quarterly review minutes and/or slides of the following teams have
been posted in recent days:
Multimedia:
https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/Quarter…
Legal & Community Advocacy:
https://commons.wikimedia.org/wiki/File:LCA_Q2_Slides.pdf (abridged slides only)
Fundraising and Fundraising Tech:
https://meta.wikimedia.org/wiki/WMF_Metrics_and_activities_meetings/Quarter…
Communications:
https://commons.wikimedia.org/wiki/File:Communications_WMF_Quarterly_Review…
(slides only, as a report - no actual meeting took place)
With this, documentation from all 20 quarterly review meetings that
took place about Q2 (October-December 2014) has been published.
On Wed, Dec 19, 2012 at 6:49 PM, Erik Moeller <erik(a)wikimedia.org> wrote:
> Hi folks,
>
> to increase accountability and create more opportunities for course
> corrections and resourcing adjustments as necessary, Sue's asked me
> and Howie Fung to set up a quarterly project evaluation process,
> starting with our highest priority initiatives. These are, according
> to Sue's narrowing focus recommendations which were approved by the
> Board [1]:
>
> - Visual Editor
> - Mobile (mobile contributions + Wikipedia Zero)
> - Editor Engagement (also known as the E2 and E3 teams)
> - Funds Dissemination Committee and expanded grant-making capacity
>
> I'm proposing the following initial schedule:
>
> January:
> - Editor Engagement Experiments
>
> February:
> - Visual Editor
> - Mobile (Contribs + Zero)
>
> March:
> - Editor Engagement Features (Echo, Flow projects)
> - Funds Dissemination Committee
>
> We’ll try doing this on the same day or adjacent to the monthly
> metrics meetings [2], since the team(s) will give a presentation on
> their recent progress, which will help set some context that would
> otherwise need to be covered in the quarterly review itself. This will
> also create open opportunities for feedback and questions.
>
> My goal is to do this in a manner where even though the quarterly
> review meetings themselves are internal, the outcomes are captured as
> meeting minutes and shared publicly, which is why I'm starting this
> discussion on a public list as well. I've created a wiki page here
> which we can use to discuss the concept further:
>
> https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings/Quarterly_r…
>
> The internal review will, at minimum, include:
>
> Sue Gardner
> myself
> Howie Fung
> Team members and relevant director(s)
> Designated minute-taker
>
> So for example, for Visual Editor, the review team would be the Visual
> Editor / Parsoid teams, Sue, me, Howie, Terry, and a minute-taker.
>
> I imagine the structure of the review roughly as follows, with a
> duration of about 2 1/2 hours divided into 25-30 minute blocks:
>
> - Brief team intro and recap of team's activities through the quarter,
> compared with goals
> - Drill into goals and targets: Did we achieve what we said we would?
> - Review of challenges, blockers and successes
> - Discussion of proposed changes (e.g. resourcing, targets) and other
> action items
> - Buffer time, debriefing
>
> Once again, the primary purpose of these reviews is to create improved
> structures for internal accountability, escalation points in cases
> where serious changes are necessary, and transparency to the world.
>
> In addition to these priority initiatives, my recommendation would be
> to conduct quarterly reviews for any activity that requires more than
> a set amount of resources (people/dollars). These additional reviews
> may however be conducted in a more lightweight manner and internally
> to the departments. We’re slowly getting into that habit in
> engineering.
>
> As we pilot this process, the format of the high priority reviews can
> help inform and support reviews across the organization.
>
> Feedback and questions are appreciated.
>
> All best,
> Erik
>
> [1] https://wikimediafoundation.org/wiki/Vote:Narrowing_Focus
> [2] https://meta.wikimedia.org/wiki/Metrics_and_activities_meetings
> --
> Erik Möller
> VP of Engineering and Product Development, Wikimedia Foundation
>
> Support Free Knowledge: https://wikimediafoundation.org/wiki/Donate
>
> _______________________________________________
> Wikimedia-l mailing list
> Wikimedia-l(a)lists.wikimedia.org
> Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
In the next RFC meeting we will discuss the following RFC:
* Improving extension management
<https://www.mediawiki.org/wiki/Requests_for_comment/Improving_extension_man…>
Also, we will have a general discussion about services in the second
half hour.
The meeting will be on the IRC channel #wikimedia-office on
chat.freenode.net at the following time:
* UTC: Wednesday 21:00
* US PST: Wednesday 13:00
* Europe CET: Wednesday 22:00
* Australia AEDT: Thursday 08:00
-- Tim Starling
Hi, I am getting JavaScript errors in the Metrolook skin. The error is in:
$( document ).click( function ( e ) {
	if ( !$( e.target ).closest( '#' + openDiv ).length ) {
		toggleDiv( openDiv );
	}
} );
and the error says "Object expected".
Source code is at https://git.wikimedia.org/summary/mediawiki%2Fskins%2FMetrolook and https://github.com/paladox/Metrolook
If I have to split the JS out of MetrolookTemplate.php, could I have some help doing that, please? Thanks.
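A common cause of "Object expected" is the snippet running before jQuery is available, since inline scripts in a page template execute immediately. One possible fix, sketched below with a made-up module name and placeholder file paths: move the code into its own file and register it as a ResourceLoader module.

```php
// Sketch only (module name and paths are placeholders): in the skin's
// setup file, register the JS -- moved out of MetrolookTemplate.php into
// resources/metrolook.js -- as a ResourceLoader module.
$wgResourceModules['skins.metrolook.js'] = array(
	'scripts' => array( 'resources/metrolook.js' ),
	'position' => 'bottom',          // load at the end of the page
	'localBasePath' => __DIR__,
	'remoteSkinPath' => 'Metrolook',
);

// Then have the skin request the module, e.g. in its page setup:
// $out->addModules( 'skins.metrolook.js' );
```

In current MediaWiki, jQuery is a base module loaded before registered modules execute, which avoids the load-order problem.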
Hello.
I installed Mediawiki (and a couple of extensions) as an intranet website.
Everyone in the company is entitled to publish, so my fear is people
missing important information. My idea is to send a weekly e-mail listing
every modified page (link to page, author, last modification date).
I started to write a PHP script as a demonstration and it now works pretty
well. So I have two questions:
- I first unsuccessfully tried to find something similar, but maybe I'm
re-inventing the wheel?
- If I'm the first to build such a feature, maybe someone else could have
some interest in it? If yes, how should I publish it (this is not an
extension, but standalone software reading a MySQL database)?
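For what it's worth, a minimal core of such a script could be a single query. This is a sketch only: the connection details are placeholders, and it assumes MediaWiki's default schema, where recent edits land in the recentchanges table with 14-digit UTC timestamps.

```php
<?php
// Sketch: weekly "recently modified pages" report straight from the database.
// Connection details are placeholders; table and column names follow
// MediaWiki's default schema (the recentchanges table).

// MediaWiki stores timestamps as 14-digit strings (YYYYMMDDHHMMSS, UTC).
function wikiTimestamp( $unixTime ) {
	return gmdate( 'YmdHis', $unixTime );
}

$cutoff = wikiTimestamp( time() - 7 * 24 * 3600 );

$sql = "SELECT rc_title, rc_user_text, MAX(rc_timestamp) AS last_edit
        FROM recentchanges
        WHERE rc_timestamp > :cutoff
        GROUP BY rc_title, rc_user_text
        ORDER BY last_edit DESC";

// $pdo = new PDO( 'mysql:host=localhost;dbname=wikidb', 'wikiuser', 'secret' );
// $stmt = $pdo->prepare( $sql );
// $stmt->execute( array( ':cutoff' => $cutoff ) );
// foreach ( $stmt as $row ) {
//     printf( "%s by %s at %s\n",
//         $row['rc_title'], $row['rc_user_text'], $row['last_edit'] );
// }
```

Note that recentchanges is periodically pruned, but it easily covers a one-week window.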
Georges
((Jump to the -- TL;DR -- if you just want to answer my question))
Hey guys, right now managing extensions is a complete mess for those of
us with a wiki running dozens of extensions on a VPC.
Even if you use git to make things easier, you still need to batch
fetch/pull multiple git repos. And then some extensions don't work via
git; instead you install them using composer.
((And don't say "use mediawiki/extensions.git": that would conflict with
composer-installed extensions, doesn't help when an extension uses
something like version tags, and appears to have the same issues with
bulk-updating a few extensions.))
I convinced my boss to let me spend some time (when I have no client
projects to work on) building a tool to make managing extensions easier
for CLI users.
Some notes on the idea here:
https://www.mediawiki.org/w/index.php?title=User:Dantman/Code_Ideas&diff=13…
Please note that this is NOT meant to be the new way we manage
extensions forever. This is a hack, a workaround for current reality
until the grand future when we're supposed to have a complete extension
management system built into core with its own web interface.
-- TL;DR --
Now, my question.
Do you guys want me to build the local tool in PHP as a maintenance
script or in node.js?
Actually perhaps I should instead ask "Are you guys fine if I build this
in node.js?" because I have a feeling this will be hell to develop if I
have to write it in PHP.
PHP Pros:
- We could bundle this with MediaWiki core if people like it.
- You don't need to install Node.js (though it's not 'that' hard).
PHP Cons:
- Every time I've tried using PHP proc functions I've had to spend
endless time debugging. And this tool requires dozens of proc calls.
- The tool is going to be difficult to access until at least the next
MediaWiki version, since it won't be bundled.
-- It may end up useless for a bunch of people right now when it's
supposed to help.
-- It may also end up locked down, so only users with more recent
installations of MediaWiki may use it.
Node.js Pros:
- Once you have node, getting the tool will be as trivial as `[sudo] npm
install -g mediawiki-...something...`.
- The tool will be available for and should work on any MediaWiki
version you can get a current extension to work on, not just future
releases.
- Executing git and composer from node to download things is trivial.
-- I not only expect it to be easy, I'm already doing it. The server
code is Node.js and is already happily chugging away fetching all our
gerrit-based git repos, doing multiple in parallel.
- I could make parts of the tool interactive and much more user friendly.
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]
Hi, how can I use strpos() multiple times? I am trying to filter the skins API to only show skins, and this works, but it only allows you to check one type. It currently looks like this:
if ( strpos( $ret['type'] = $type, 'parserhook' ) === 0 ) {
	continue;
}
but I would like to get it to do something like:
if ( strpos( $ret['type'] = $type, 'parserhook', 'specialpage' ) === 0 ) {
	continue;
}
Or is there something similar I could use? Or is there a way to express that only skins are allowed to show through the skins API, instead of blacklisting the other types? Please help.
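For what it's worth, strpos() only takes a single needle, so it cannot match several types at once (and note that `$ret['type'] = $type` inside the call is an assignment, which is probably not intended). A sketch of two common alternatives, assuming `$type` holds the extension type string and the function names here are made up: blacklist several types with in_array(), or whitelist 'skin' directly.

```php
<?php
// Blacklist approach: skip any entry whose type is one of several
// unwanted types, using in_array() with strict comparison.
function isBlacklistedType( $type ) {
	return in_array( $type, array( 'parserhook', 'specialpage' ), true );
}

// Whitelist approach: since only skins should show through the skins API,
// it is simpler to allow just 'skin' than to blacklist everything else.
function isSkinType( $type ) {
	return $type === 'skin';
}

var_dump( isBlacklistedType( 'parserhook' ) ); // bool(true)  -> continue;
var_dump( isSkinType( 'skin' ) );              // bool(true)  -> keep it
```

In the loop, the whitelist version becomes `if ( !isSkinType( $type ) ) { continue; }`.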