Could we add support for Twitter cards in Wikimedia projects? It wouldn't
be hard to do, and it would be really nice: articles, images, videos -
everything linked from Wikimedia on Twitter could have a nice little
preview.
The documentation is at https://dev.twitter.com/docs/cards - basically it
involves adding some meta tags around existing content server-side.
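In MediaWiki terms this could be as small as a hook along these lines - just a
rough sketch with placeholder card values; a real version would need per-page
logic for descriptions and images:

$wgHooks['BeforePageDisplay'][] = function ( OutputPage $out, Skin $skin ) {
    $title = $out->getTitle();
    // "summary" is the simplest card type; pages with a lead image
    // could use "summary_large_image" instead.
    $out->addMeta( 'twitter:card', 'summary' );
    $out->addMeta( 'twitter:title', $title->getPrefixedText() );
    // Placeholder: a real implementation would pull an article extract here.
    $out->addMeta( 'twitter:description', $title->getPrefixedText() );
    return true;
};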
User:Mono
Hi,
I'm a bit confused by the various options I have in Gerrit, and it
seems the docs are not up to date on them.
* What is the difference between Verified and Code Review? When would I
put +1 in one of them but -1 in the other?
* What is the difference between +1 and +2, especially in Verified?
* Why do we even have +2? +1 means that someone else must still approve.
What does +2 mean, then? Apparently no one else has to approve, but the
change is not merged anyway - why?
It seems the docs (http://www.mediawiki.org/wiki/Code_review_guide) do
not explain it.
Juliusz
Hi,
I could find a method to convert a timestamp into the user's preferred
timezone only in the Language class, which looks like the wrong place to me.
Is there any other way (think: a global function) to convert to the user's
timezone and preferred format?
Also, is there any common script to do this in JS?
With reference to bug 43365
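For reference, the closest I've found is along these lines - a sketch using
Language::userTimeAndDate(), though whether Language is the right home for
this is exactly my question:

// Format a timestamp in the requesting user's timezone and
// preferred date/time format, via the Language object.
$context = RequestContext::getMain();
$formatted = $context->getLanguage()->userTimeAndDate(
    wfTimestampNow(),
    $context->getUser()
);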
--
Happy Holidays,
Nischay Nahata
nischayn22.in
bawolff wrote:
> Wikimedia is a pretty big player. Has anyone from the foundation with
> some sort of fancy-sounding title called up the ISP in question and
> asked "wtf?". The original email on wikimedia-l made it sound like the
> issue is unintentional.
Dear Bawolff,
As far as I know, no one from the WMF has called up the ISPs. Some
local Wikipedians have, but they have received no answer.
Just to add: the issue discussed happens with all the ISPs in the
country, not just a single one.
For some background information please take a look at these short sections:
<https://en.wikipedia.org/wiki/Human_rights_in_Uzbekistan#Internet>
<https://en.wikipedia.org/wiki/Uzbek_Wikipedia#Blocking_of_Wikipedia>
With all best wishes.
Do we have one extra machine left? Then we could set it up as a NAT router. It would replace another machine if we do not have a spare IP left. The original ports would then need to be forwarded to it.
Cheers
Marco
-------- Original Message --------
From: Leslie Carr <lcarr(a)wikimedia.org>
Sent: Fri Dec 28 00:03:33 CET 2012
To: Wikimedia Mailing List <wikimedia-l(a)lists.wikimedia.org>
Subject: Re: [Wikimedia-l] No access to the Uzbek Wikipedia in Uzbekistan
On Thu, Dec 27, 2012 at 2:37 PM, Marco Fleckinger
<marco.fleckinger(a)wikipedia.at> wrote:
>
> Leslie Carr <lcarr(a)wikimedia.org> wrote:
>
>>On Thu, Dec 27, 2012 at 1:39 PM, Marco Fleckinger
>><marco.fleckinger(a)wikipedia.at> wrote:
>
>>> Just an idea, which is not very beautiful: What about a router
>>> forwarding ports to the correct machine by using iptables? Would that
>>> also work in connection with search engines?
>>
>>Are you suggesting we use different nonstandard ports for each
>>different wiki/language combo that resides on the same IP ?
>>
> Yes, exactly!
>
I guess that is theoretically possible with a more intrusive load
balancer in the middle. We need the Host information from the HTTP
header, since our Varnish caches serve multiple services rather than
one (or more) per language/project combo. I'm pretty sure that LVS,
which we use, doesn't have this ability. Some large commercial load
balancers can rewrite some headers, but that would be a pretty
intensive operation (think lots of CPU, since the balancer needs to
terminate SSL and then rewrite headers) and would probably be
expensive. If you have another way you think we can do this, I am all
ears!
We may want to move this discussion to wikitech-l, as all the technical
discussions probably bore most of the people on wikimedia-l.
Leslie
--
Leslie Carr
Wikimedia Foundation
AS 14907, 43821
http://as14907.peeringdb.com/
Hi,
Mozilla is asking for an update on bug
https://bugzilla.mozilla.org/show_bug.cgi?id=758857
Can anybody help?
--
Amir
---------- Forwarded message ----------
From: Ryan Lane <rlane32(a)gmail.com>
Date: 2012/6/19
Subject: Re: [Wikitech-l] HTTPS Wikipedia search for Firefox?
To: Wikimedia developers <wikitech-l@lists.wikimedia.org>
On Tue, Jun 19, 2012 at 3:39 AM, Chris Peterson <cpeterson(a)mozilla.com> wrote:
> Hi, I'm a developer at Mozilla and I have a patch [1] that would switch
> Firefox's Wikipedia search box from HTTP to HTTPS.
>
> Who would be an appropriate technical contact at Wikimedia that I can
> coordinate with? Is this a change Wikimedia would welcome? Or would the
> increased SSL server load be an undue burden for Wikimedia? Just to be
> clear, this change would only affect Firefox users who search Wikipedia
> using Firefox's search box.
>
> A few months ago, Mozilla switched Firefox 14 (currently in Beta) to use
> Google's HTTPS search [2]. If I check in my Wikipedia patch soon, the change
> would ride Firefox's Nightly, Aurora, and Beta release channels [3] and be
> released to the general public in Firefox 16 (October 2012).
>
Please don't do so. HTTPS is a new service, and we haven't properly
load-tested it yet. The first target for production load testing is
logged-in users.
I'm not completely opposed to the change, but I'd prefer to let you
guys know when we're ready.
Thanks,
- Ryan
Last week I was working on a feature that I didn't want to surface on
a disambiguation page. I was surprised to find there was no way I
could distinguish between a normal article and a disambiguation page.
Disambiguation pages have no clearly marked templates I can use, and
they are in the same namespace as normal articles.
Does anyone have a suggestion for how I might do this?
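The only workaround I can sketch is keying off a page_props row, which is
purely hypothetical - nothing in core sets a 'disambiguation' prop today, so
some template or extension would have to populate it first:

// Hypothetical: $title is the Title being checked; assumes something
// populates a 'disambiguation' row in page_props, which nothing does yet.
$dbr = wfGetDB( DB_SLAVE );
$isDisambig = (bool)$dbr->selectField(
    'page_props',
    'pp_propname',
    array(
        'pp_page' => $title->getArticleID(),
        'pp_propname' => 'disambiguation',
    ),
    __METHOD__
);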
Jon
Hello,
I attended a talk [1] by Elaine Weyuker [2] on Wed, 7 Nov 2012.
The talk, “Looking for Bugs In All the RIGHT Places”, discussed her work on
predicting where bugs will be found in the next release of a software product.
She and her collaborators have created a well-validated tool that predicts, in
under a minute, which 20% of the product's source files, frozen before the
next release, will contain about 80% of the faults corrected in that release.
The tool is not a silver bullet, but it is useful, especially because it
sometimes points attention to files that were not expected to have many
problems.
The tool has two parts: a prediction front end and a back-end interface to the
revision control system and bug tracker. As I remember it, the entire system
consists of under 800 lines of Python and under 3000 lines of C++. Using it
here would require adding a new back end.
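To give a flavor of the front end, the prediction step amounts to something
like the following toy sketch - emphatically not her model, which is a
carefully validated regression; the inputs and weights here are invented:

<?php
// Toy illustration of "rank files, flag the top 20%". Weyuker's actual
// predictor is a validated regression; these weights are made up.
function rankFiles( array $files ) {
    foreach ( $files as $name => $f ) {
        // Per-file inputs: faults fixed in prior releases, number of
        // recent changes, size in KLOC.
        $files[$name]['score'] = 3.0 * $f['priorFaults']
            + 2.0 * $f['recentChanges']
            + 1.0 * $f['kloc'];
    }
    uasort( $files, function ( $a, $b ) {
        if ( $a['score'] == $b['score'] ) {
            return 0;
        }
        return ( $a['score'] > $b['score'] ) ? -1 : 1; // descending
    } );
    // The top 20% of files by score are predicted to hold ~80% of the faults.
    return array_slice( $files, 0, (int)ceil( count( $files ) * 0.2 ), true );
}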
I thought that this tool might be useful in MediaWiki development. She was
amenable to helping get it working if there is interest.
[1] http://www.ece.udel.edu/spotlight/WeyukerDLS.php
[2] http://en.wikipedia.org/wiki/Elaine_Weyuker
--
Jim Laurino
wican.x.jimlaur(a)dfgh.net
Please direct any reply to the list.
Only mail from the listserver reaches this address.