Hello. Our wiki (hewiki) got the Group 2 deployment an hour ago, and
something is wrong with it: Special:Watchlist behaves very strangely.
I use grouped revisions, and all single-element groups are already
marked as read before I open them, including those from before the
deployment hour. And it's not just me: I got a complaint on my user
talk page from somebody else too. Stranger still, a revision can be
marked as read while the orange notice is still there, even though I
deliberately opened it while logged out. I tried safe mode - same
problems. And by the way, our wiki's WLM (WatchList Manager) gadget
shows garbage, which suggests the whole HTML structure of
Special:Watchlist is broken. Help!
Igal (User:IKhitron)
I noticed that core files have @author annotations.
My take on this is as follows: any active codebase (such as MediaWiki)
sees constant change; code is refactored, rewritten, and renamed, and
files are moved around, split up, and so on. The only meaningful
interpretation of "@author" would be the person who first created that
file / function, no matter how small that piece of code was. At that
level it is not very meaningful, especially in the face of refactoring
and restructuring; git log --follow might provide a better picture for
an individual file. I think all @author annotations should be removed.
When editing a piece of code, I imagine some developers might find it a
little annoying, and confusing especially during refactoring, to decide
whether to retain it or not. I find these annotations misleading and
wonder why they exist and what purpose they serve. Would appreciate a
discussion on this.
Alternatively, I would appreciate it if someone could point me to a wiki
page / phab task / essay that explains the rationale for why these
annotations should exist and be preserved.
Thanks,
Subbu.
James: thanks for asking; I'm copying that question to the Wikitech list.
While we're on that topic, what's happening to multimedia? I believe that
at one time there was a multimedia team, and I could understand how pairing
multimedia with maps in the same team could make sense. If multimedia is
separate, it would be good to know where that's being housed now; I believe
that there's work happening with 3D files for Commons, and I vaguely recall
hearing about improvements to the Commons upload wizard.
Pine
On Mon, Jun 12, 2017 at 8:07 AM, James Heilman <jmh649(a)gmail.com> wrote:
> Looks like a reasonable change. Glad to see the degree of internal input
> that went into it.
>
> Does maps also include other rich content like graphs, charts, heat maps
> and other forms of data visualization?
>
> Best
> James
>
> On Mon, Jun 12, 2017 at 8:44 AM, Toby Negrin <tnegrin(a)wikimedia.org>
> wrote:
>
> > Hi Jan --
> >
> > Thanks for the question. We'll be making a more specific announcement
> > this week about the future of the discovery projects. Sadly we don't
> > have a lot of new information for maps in particular and will need to
> > do a bit more scenario planning before we talk to the community.
> >
> > As far as focus, most of our "reading" features are actually content
> > created by editors that is consumed by readers, and maps is no
> > different. While we don't have specifics as far as the roadmap, both
> > authoring and consumption features are totally in scope.
> >
> > Hope this helps to provide some information (if not clarity :) about
> > how we are approaching this.
> >
> > -Toby
> >
> > On Thu, Jun 8, 2017 at 2:21 PM, Jan Ainali <ainali.jan(a)gmail.com> wrote:
> >
> > > 2017-06-07 23:12 GMT+02:00 Toby Negrin <tnegrin(a)wikimedia.org>:
> > >
> > > >
> > > > The team working on maps, the search experience, and the project
> > > > entry portals (such as Wikipedia.org) will join the Readers team.
> > > > This realignment will allow us to build more integrated experiences
> > > > and knowledge-sharing for the end user.
> > > >
> > > Does maps going to Readers mean that there will be less focus on
> > > editor tools for adding maps to articles, and more focus on readers'
> > > ability to interact with the maps? If so, what is actually in the
> > > pipeline for maps?
> > >
> > > /Jan
> > > _______________________________________________
> > > Wikimedia-l mailing list, guidelines at:
> > > https://meta.wikimedia.org/wiki/Mailing_lists/Guidelines and
> > > https://meta.wikimedia.org/wiki/Wikimedia-l
> > > New messages to: Wikimedia-l(a)lists.wikimedia.org
> > > Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/wikimedia-l,
> > > <mailto:wikimedia-l-request@lists.wikimedia.org?subject=unsubscribe>
> > >
>
>
>
> --
> James Heilman
> MD, CCFP-EM, Wikipedian
>
> The Wikipedia Open Textbook of Medicine
Hi all,
Last month I did a minor refactor of Parser, ApiParse, and OutputPage to
better deal with module loading. Specifically to resolve T130632 ("Needless
loading of mediawiki.toc on mobile views.")
* OutputPage: Hardcoded some modules in OutputPage::output. – Moved to
Skin with https://gerrit.wikimedia.org/r/354682
* Parser: Loaded 'mediawiki.toc' (via ParserOutput). – Moved to Skin with
https://gerrit.wikimedia.org/r/353573
* (Other changes at [1])
Parser and OutputPage no longer load modules by default. OutputPage is
still where modules are added for the output, but OutputPage itself no
longer adds any; the defaults now come from Skin::getDefaultModules
instead. Skins thus have full control over which modules to load.
ApiParse (used by VisualEditor, live preview, etc.) now has an explicit
'useskin' parameter. When set, the API wraps the ParserOutput object in
an OutputPage and calls addParserOutputMetadata() and
Skin::getDefaultModules. The result is a 'modules' property in the Parse
API that much more closely resembles what a regular page view would load.
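As a rough sketch of how a client might use this (the endpoint is a
placeholder, not a real wiki, and the helper function is hypothetical):

```python
from urllib.parse import urlencode

# Hypothetical endpoint; substitute your wiki's api.php.
API = "https://example.org/w/api.php"

def build_parse_query(page, skin="vector"):
    """Build an action=parse request URL. Setting 'useskin' asks the
    API to wrap the ParserOutput in an OutputPage and apply the skin's
    default modules, so the returned 'modules' list resembles what a
    regular page view would load."""
    params = {
        "action": "parse",
        "page": page,
        "prop": "text|modules",  # ask for the modules property too
        "useskin": skin,
        "format": "json",
    }
    return API + "?" + urlencode(params)

print(build_parse_query("Main_Page"))
```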
Thanks to Brad Jorsch (Anomie), Bartosz Dziewoński (MatmaRex) and Fomafix
for helping me make this happen!
As part of this refactor, OutputPage::enableTOC() was removed. It
contradicted the one-way logic flow (Parser > ParserOutput > OutputPage).
It was introduced in MediaWiki 1.22 to allow MobileFrontend to disable
the default TOC. However, as we now know (T130632), that only solved the
problem partially. No other callers were found in Wikimedia Git, so I
chose not to keep this exception to the logic flow.
-- Krinkle
[1]
https://gerrit.wikimedia.org/r/#/q/project:mediawiki/core+topic:mwtoc+is:me…
---------- Forwarded message ----------
From: "Aaron Halfaker" <aaron.halfaker(a)gmail.com>
Date: Jun 12, 2017 12:00 PM
Subject: [AI] ORES operating at slightly reduced capacity
To: "Application of Artificial Intelligence and other advanced computing
strategies to Wikimedia Projects" <ai(a)lists.wikimedia.org>
Cc:
One of our worker nodes in CODFW went down (scb2005). See
https://phabricator.wikimedia.org/T167638 for details.
This should not affect the service except that we'll have a slightly
reduced capacity to handle high request rates. Let us know if you see any
issues with ORES.
https://ores.wikimedia.org
https://mediawiki.org/wiki/ORES
-Aaron
_______________________________________________
AI mailing list
AI(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/ai
On Mon, 2017-06-12 at 12:00 +0000, wikitech-l-request(a)lists.wikimedia.org wrote:
> On 8 Jun 2017 01:42, "zppix e" <megadev44s.mail(a)gmail.com> wrote:
>
> Hello,
> Wikimedia's AI team (side note: I'm unaffiliated with the WMF; I was
> given permission to send this email) needs help setting up some
> wordlists for the tawiki ORES system (see T166052). If you have any
> knowledge of the Tamil language, please come join us at
> chat.freenode.net, channel #wikimedia-ai (webchat.freenode.net). Feel
> free to cross-post this or ask users on-wiki.
>
I am a native Tamil speaker. How can I be of help?
--
Regards,
Kaartic Sivaraam <kaarticsivaraam91196(a)gmail.com>
I am working with Tamil Wikipedia team to get the details requested.
Will share details in a week.
On 8 Jun 2017 01:42, "zppix e" <megadev44s.mail(a)gmail.com> wrote:
Hello,
Wikimedia's AI team (side note: I'm unaffiliated with the WMF; I was
given permission to send this email) needs help setting up some
wordlists for the tawiki ORES system (see T166052). If you have any
knowledge of the Tamil language, please come join us at
chat.freenode.net, channel #wikimedia-ai (webchat.freenode.net). Feel
free to cross-post this or ask users on-wiki.
Thanks,
Zppix
Volunteer developer for WMF
enwp.org/User:Zppix
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
Hi all,
The Wikimedia technical community has recently adopted a Code of
Conduct. You have probably heard more about it than you wanted to, but
if you have somehow missed it, you can read the related blog post [1].
We started adding a CODE_OF_CONDUCT file with a link to all repos (a new
convention, promoted by GitHub, for declaring what a project's code of
conduct is), which resulted in a debate about whether that is the right
thing to do. If you are interested, please join the discussion on the
Phabricator task [2].
[1] https://blog.wikimedia.org/2017/06/08/wikimedia-code-of-conduct/
[2] https://phabricator.wikimedia.org/T165540
I'm trying to set up two Parsoid servers to play nicely with two
MediaWiki application servers and am having some issues. I have no
problem getting things working with Parsoid on a single app server, or
with multiple Parsoid servers used by a single app server, but I ran
into issues when I increased to multiple app servers. To try to get this
working I started making the app and Parsoid servers communicate through
my load balancer. So an overview of my config is:
Load balancer = 192.168.56.63
App1 = 192.168.56.80
App2 = 192.168.56.60
Parsoid1 = 192.168.56.80
Parsoid2 = 192.168.56.60
Note, App1 and Parsoid1 are the same server, and App2 and Parsoid2 are the
same server. I can only spin up so many VMs on my laptop.
The load balancer (HAProxy) is configured as follows:
* 80 forwards to 443
* 443 forwards to App1 and App2 port 8080
* 8081 forwards to App1 and App2 port 8080 (this will be a private network
connection later)
* 8001 forwards to Parsoid1 and Parsoid2 port 8000 (also will be private)
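For concreteness, the forwarding rules above might look roughly like
this in haproxy.cfg (a sketch only: the frontend/backend names are made
up, and TLS and the 80-to-443 redirect details are omitted):

```
frontend www
    bind *:443             # 80 redirects here (not shown)
    default_backend mw_app

frontend mw_private
    bind *:8081            # will be a private network connection later
    default_backend mw_app

frontend parsoid_private
    bind *:8001            # also will be private later
    default_backend parsoid

backend mw_app
    balance roundrobin
    server app1 192.168.56.80:8080 check
    server app2 192.168.56.60:8080 check

backend parsoid
    balance roundrobin
    server parsoid1 192.168.56.80:8000 check
    server parsoid2 192.168.56.60:8000 check
```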
On App1/Parsoid1 I can run `curl 192.168.56.63:8001` and get the
appropriate response from Parsoid. I can run `curl 192.168.56.63:8081` and
get the appropriate response from MediaWiki. The same is true for both on
App2/Parsoid2. So the servers can get the info they need from the services.
Currently I'm getting the error "Error loading data from server: 500:
docserver-http: HTTP 500. Would you like to retry?" when attempting to
use VisualEditor. I've tried various settings and have not always gotten
that specific error, but I am getting it with the settings I currently
have in localsettings.js and LocalSettings.php (shown below in this
email). Removing the proxy config lines from these settings gave
slightly better results: I did not get the 500 error, but instead it
would sometimes work after a very long time. It also may have been
throwing errors in the Parsoid log (with debug on); I have those logs
saved if they would help. I'm hoping someone can just point out some
misconfiguration, though.
Here are snippets of my config files:
On App1/Parsoid1, relevant localsettings.js:
parsoidConfig.setMwApi( {
    uri: 'http://192.168.56.80:8081/demo/api.php',
    proxy: { uri: 'http://192.168.56.80:8081/' },
    domain: 'demo',
    prefix: 'demo'
} );
parsoidConfig.serverInterface = '192.168.56.80';
On App2/Parsoid2, relevant localsettings.js:
parsoidConfig.setMwApi( {
    uri: 'http://192.168.56.80:8081/demo/api.php',
    proxy: { uri: 'http://192.168.56.80:8081/' },
    domain: 'demo',
    prefix: 'demo'
} );
parsoidConfig.serverInterface = '192.168.56.60';
On App1/Parsoid1, relevant LocalSettings.php:
$wgVirtualRestConfig['modules']['parsoid'] = array(
    'url' => '192.168.56.80:8001',
    'HTTPProxy' => 'http://192.168.56.80:8001',
    'domain' => $wikiId,
    'prefix' => $wikiId
);
On App2/Parsoid2, relevant LocalSettings.php:
$wgVirtualRestConfig['modules']['parsoid'] = array(
    'url' => '192.168.56.80:8001',
    'HTTPProxy' => 'http://192.168.56.80:8001',
    'domain' => $wikiId,
    'prefix' => $wikiId
);
Thanks!
--James