Hello all,
Over the last few weeks, I have been converting links with an explicit HTTP protocol to protocol-relative URLs, and converting external links to internal link format, with my bot on all content pages of some (few) Wikimedia projects (please see the details in [1]). I think this is an important task because it lets users navigate without changing the protocol in use, and I will prepare and post the regular expressions in a few days so they can be used by other bot operators.
However, most of the projects are still pending, this task is not included in the GBs' scope, and the community is raising many doubts; I think that, in some way, running this task on all pages and projects, even without exceptions, should be allowed.
What can we do? Is there a common and workable solution? Best regards, and thanks in advance. :-)
[1] https://meta.wikimedia.org/wiki/User:Invadibot/scope/meta-2
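To make the two tasks concrete, here is a minimal sketch of the second one (external link to internal link format) for a single, hard-coded case. The regex, the restriction to English Wikipedia, and the underscore handling are my own illustrative assumptions, not the bot's actual rules:

```python
import re

# Illustrative sketch only: rewrite an external-style link to English
# Wikipedia as an internal interwiki link, e.g.
#   [http://en.wikipedia.org/wiki/Main_Page the main page]
#   -> [[:en:Main Page|the main page]]
# The ":en:" prefix and the underscore-to-space conversion are simplifying
# assumptions; a real fix would cover more domains and percent-encoding.
EXT_LINK = re.compile(r'\[http://en\.wikipedia\.org/wiki/([^\s\]]+) ([^\]]+)\]')

def external_to_internal(wikitext):
    """Convert matching external links into internal link format."""
    return EXT_LINK.sub(
        lambda m: '[[:en:%s|%s]]' % (m.group(1).replace('_', ' '), m.group(2)),
        wikitext)
```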
On 05/17/2013 04:23 PM, David Abián wrote:
Hello all,
Over the last few weeks, I have been converting links with an explicit HTTP protocol to protocol-relative URLs, and converting external links to internal link format, with my bot on all content pages of some (few) Wikimedia projects (please see the details in [1]).
You quote Ryan Lane, "A number of templates, CSS, and Javascript on projects are improperly referencing resources, and as such, they are being loaded incorrectly. All resources should be referenced using protocol-relative URLs now (//<resource-url> vs http://<resource-url>)."
But he is talking about resources like CSS/images/JavaScript, which can cause mixed content warnings. Your bot only does links, which is a separate issue.
Note that public WMF wikis do not have such external content in wikitext. Images can only be from the local wiki or Commons (which of course handles the protocol right).
The rest (CSS, JavaScript, other images, etc.) can only be from extensions, gadgets, and user scripts.
However, most of the projects are still pending, this task is not included in GBs scope, the community is raising many doubts, and I think that, in some way, running this task on all pages and projects, even without exceptions, should be allowed.
I use HTTPS Everywhere myself, so I get where you're coming from. But I see this as a normal task that should follow the normal per-wiki bot approval process(es) (if any).
Matt Flaschen
Some web resources are only available via HTTP or only via HTTPS. Making links to them protocol-relative will subtly break them.
(Are you "fixing" in this way all links, or only links to Wikimedia wikis? The Meta page doesn't explain that, and this is quite crucial.)
On 05/17/2013 06:05 PM, Bartosz Dziewoński wrote:
Some web resources are only available via HTTP or only via HTTPS. Making links to them protocol-relative will subtly break them.
(Are you "fixing" in this way all links, or only links to Wikimedia wikis? The Meta page doesn't explain that, and this is quite crucial.)
It's a little buried, but https://meta.wikimedia.org/wiki/User:Invadibot/scope/meta-2 does specify that it's limited to a subset (seems to be all the public ones) of WMF wikis.
Also, it's only for HTTP, so you can still force a link to HTTPS ("log in securely").
Matt Flaschen
That's right, Matt. Feel free to modify anything that you think is wrong.
I've posted the regular expressions for use with Pywikipediabot in user-fixes.py (https://meta.wikimedia.org/wiki/User:Invadibot/scope/meta-2/user-fixes.py). You can debug them, distribute them, "translate" them into other programming languages and, of course, run them on your wikis.
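For readers who want the flavour of such a fix without opening the page, here is a hedged sketch of the protocol-relative conversion as a plain Python function. The domain list and the single-subdomain pattern are my own simplifications, not the published expressions:

```python
import re

# Sketch: convert explicit http:// links to Wikimedia-family wikis into
# protocol-relative ones, e.g.
#   [http://en.wikipedia.org/wiki/Foo Foo] -> [//en.wikipedia.org/wiki/Foo Foo]
# Only one subdomain level is handled and the domain list is illustrative;
# non-Wikimedia and https:// links are deliberately left untouched, matching
# the scope discussed in this thread.
PR_PATTERN = re.compile(
    r'\[http://((?:[a-z0-9-]+\.)?'
    r'(?:wikipedia|wikimedia|wiktionary|wikibooks|wikiquote|wikisource|'
    r'wikinews|wikiversity|wikivoyage)\.org/[^\s\]]*)')

def make_protocol_relative(wikitext):
    """Replace http:// with // in external links to Wikimedia domains only."""
    return PR_PATTERN.sub(r'[//\1', wikitext)
```

In a real user-fixes.py these would be entries in a `fixes` replacement table rather than a standalone function, but the substitution logic is the same.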
Thank you.