I initially registered on Phabricator with my MediaWiki account. Now,
when I opt for "Login or Register via MediaWiki", it takes me to a new
account registration page, even though I have an existing Phabricator
account (@dg711) already linked to that same MediaWiki account. Is that
a bug?
Please help me resolve this.
Thanks,
Devang Gaur
On Fri, Jan 29, 2016 at 1:41 PM, Bináris <wikiposta(a)gmail.com> wrote:
> 2016-01-29 18:56 GMT+01:00 Brad Jorsch (Anomie) <bjorsch(a)wikimedia.org>:
>
> > by going to
> > https://www.mediawiki.org/wiki/Special:ApiFeatureUsage, entering your
> > agent
> > (or any useful prefix of it), and looking for "https-expected".
> >
>
> What does *unclear-"now"-timestamp* mean here?
>
For various API parameters of timestamp type, you can pass unusual
values such as the empty string or "0", and they will be interpreted as
meaning "now". That doesn't make much sense, except that it has always
worked that way. If you really mean "now", you should pass "now" as the
value instead.
action=edit even has to hack around this to avoid spurious edit conflicts
if you do it for the 'basetimestamp' parameter. Ideally we'd make empty
string and '0' be rejected as invalid timestamps, but first people have to
stop passing them in.
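For example, instead of sending an empty basetimestamp, fetch the real
timestamp and pass it along. A minimal sketch in Python (the page title
and wiki are placeholders):

    import requests

    API = "https://en.wikipedia.org/w/api.php"

    # Fetch the current revision timestamp so we can pass a real value
    # for basetimestamp instead of "" or "0".
    r = requests.get(API, params={
        "action": "query",
        "prop": "revisions",
        "titles": "Sandbox",
        "rvprop": "timestamp",
        "format": "json",
    }).json()
    page = next(iter(r["query"]["pages"].values()))
    base = page["revisions"][0]["timestamp"]  # e.g. "2016-01-29T18:56:00Z"
    # Later, pass basetimestamp=base in the action=edit POST.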
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
It currently still works to POST to the API via http instead of https,
but we'd really like to stop allowing that (see
https://phabricator.wikimedia.org/T105794). The API will therefore now
return a warning if https was expected but not used.
If you run a bot, please check your configuration to make sure that you're
using https rather than http. If you're using a distinctive user agent for
your bot (which you all are, right?[1]), you can now check whether your bot
is using http by going to
https://www.mediawiki.org/wiki/Special:ApiFeatureUsage, entering your agent
(or any useful prefix of it), and looking for "https-expected".
If for some reason your bot does not support https, you really should
upgrade it so that it does.
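If your framework lets you set the endpoint and user agent directly,
the fix is usually a one-liner. A minimal sketch using Python's
requests (the bot name and contact address are placeholders):

    import requests

    session = requests.Session()
    # Identify your bot distinctively, per the User-Agent policy [1].
    session.headers["User-Agent"] = (
        "ExampleBot/1.0 (https://example.org/bot; bot@example.org)")

    # Use the https endpoint; plain-http requests now trigger the
    # "https-expected" warning and will eventually stop working.
    API = "https://en.wikipedia.org/w/api.php"
    r = session.get(API, params={
        "action": "query", "meta": "siteinfo", "format": "json"})
    print(r.json()["query"]["general"]["sitename"])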
[1]: https://meta.wikimedia.org/wiki/User-Agent_policy
--
Brad Jorsch (Anomie)
Senior Software Engineer
Wikimedia Foundation
This is now entering its final comment period, so please weigh in at
https://phabricator.wikimedia.org/T124365.
Based on your input, the Parsing, Editing & Services teams will make a
decision on this next Wednesday, Feb 2nd.
Thanks,
Gabriel
On Thu, Jan 21, 2016 at 4:29 PM, Gabriel Wicke <gwicke(a)wikimedia.org> wrote:
> Hi,
>
> we are considering a policy for REST API end point result format
> versioning and negotiation. The background and considerations are
> spelled out in a task and mw.org page:
>
> https://phabricator.wikimedia.org/T124365
> https://www.mediawiki.org/wiki/Talk:API_versioning
>
> Based on the discussion so far, we have come up with the following
> candidate solution:
>
> 1) Clearly advise clients to explicitly request the expected mime type
> with an Accept header. Support older mime types (with on-the-fly
> transformations) until usage has fallen below a very low percentage,
> with an explicit sunset announcement.
>
> 2) Always return the latest content type if no explicit Accept header
> was specified.
>
> We are interested in hearing your thoughts on this.
>
> Once we have reached rough consensus on the way forward, we intend to
> apply the newly minted policy to an evolution of the Parsoid HTML
> format, which will move the data-mw attribute to a separate metadata
> blob.
>
> Gabriel Wicke
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Luigi,
On Thu, Jan 28, 2016 at 2:09 AM, XDiscovery Team <info(a)xdiscovery.com> wrote:
> I tried the /rest_v1/ endpoint and it is terribly fast.
that is great to hear. A major goal is indeed to provide high volume
and low latency access to our content.
> @Strainu / @Gabriel, what does the 'graph' extension do?
If you refer to
https://en.wikipedia.org/api/rest_v1/?doc#!/Page_content/get_page_graph_png…,
this is an end point exposing rendered graph images for
https://www.mediawiki.org/wiki/Extension:Graph (as linked in the end
point documentation).
> I have a few questions about using the proxy cache:
> 1# Is it possible to query a page by page_ID, including redirects?
We don't currently provide access by page ID. Could you describe your
use case a bit to help us understand how access by page id would help
you?
> /page/title/{title}
> allows getting metadata by page, including the pageID, but I would
> like the final redirect target (e.g. 'dna' returns 7956, and I would
> like to fetch 7955 for the redirect target 'DNA')
We are looking into improving our support for redirects:
https://phabricator.wikimedia.org/T118548. Your input on this topic
would be much appreciated.
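In the meantime, the action API can resolve redirects for you. A
minimal sketch (not the REST API; title normalization and redirect
resolution both happen server-side):

    import requests

    # Resolve the redirect and get the target's page ID.
    r = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query",
        "titles": "dna",
        "redirects": "",
        "format": "json",
    }).json()
    page = next(iter(r["query"]["pages"].values()))
    print(page["pageid"], page["title"])  # e.g. 7955 DNA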
> /page/html/{title} gets the article, but the page_ID / curid is
> missing in the source. I would like to get the two combined.
This information is actually included in the response, both in the
`ETag` header and in the <head> of the HTML itself. I have updated the
documentation to spell this out more clearly in [1]. The relevant
addition is this:
The response provides an `ETag` header indicating the revision and
render timeuuid separated by a slash (ex: `ETag:
701384379/154d7bca-c264-11e5-8c2f-1b51b33b59fc`). This ETag can be
passed to the HTML save end point (as `base_etag` POST parameter), and
can also be used to retrieve the exact corresponding data-parsoid
metadata, by requesting the specific `revision` and `tid` indicated by
the `ETag`.
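Concretely, extracting the revision and tid looks roughly like this; a
sketch assuming a /page/data-parsoid/{title}/{revision}/{tid} end point
shape (check the end point documentation for the exact path):

    import requests

    BASE = "https://en.wikipedia.org/api/rest_v1"
    r = requests.get(BASE + "/page/html/DNA")
    etag = r.headers["ETag"].strip('"')  # e.g. 701384379/154d7bca-...
    revision, tid = etag.split("/")

    # Fetch the data-parsoid metadata for this exact render.
    dp = requests.get(
        BASE + "/page/data-parsoid/DNA/%s/%s" % (revision, tid))
    print(dp.status_code)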
> 2# The rest are experimental:
> what could happen if a query fails?
> Does it raise an error, return a 404 page, or something else?
The stability markers are primarily about request and response formats,
not about technical availability. Experimental end points can change at
any time, which can result in errors (if the request interface changed)
or in a different response format.
We are currently discussing the use of `Accept` headers for response
format versioning at
https://www.mediawiki.org/wiki/Talk:API_versioning. This will allow us
to more aggressively stabilize end points by giving us the option of
tweaking response formats without breaking existing clients.
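Once that lands, a client pinning a response format would look roughly
like this (the profile version string below is hypothetical; the exact
value is still under discussion):

    import requests

    # Hypothetical profile string; check the end point documentation
    # for the real value once `Accept` support is introduced.
    accept = ('text/html; charset=utf-8; '
              'profile="https://www.mediawiki.org/wiki/Specs/HTML/1.1.0"')
    r = requests.get(
        "https://en.wikipedia.org/api/rest_v1/page/html/DNA",
        headers={"Accept": accept})
    print(r.headers.get("Content-Type"))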
> I am thinking about using api.wikipedia as a fallback and the proxy
> cache as the primary source. Any Ajax example for doing that, to
> handle possible failures?
Yes, this is certainly possible. However, you can already rely on the
end points currently marked as "unstable" in the REST API. Basically all of them
are used by a lot of production clients at this point, and are very
reliable. Once we introduce general `Accept` support, basically all of
the unstable end points will likely become officially "stable", and
several `experimental` end points will graduate to `unstable`.
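If you still want a belt-and-braces fallback, the pattern is simple; a
sketch in Python rather than Ajax, using action=parse as the fallback:

    import requests

    def fetch_html(title):
        # Try the REST content API first ...
        try:
            r = requests.get(
                "https://en.wikipedia.org/api/rest_v1/page/html/" + title,
                timeout=5)
            r.raise_for_status()
            return r.text
        except requests.RequestException:
            # ... and fall back to the action API on any failure.
            r = requests.get("https://en.wikipedia.org/w/api.php", params={
                "action": "parse", "page": title,
                "prop": "text", "format": "json"}, timeout=5)
            return r.json()["parse"]["text"]["*"]

    print(fetch_html("DNA")[:80])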
> 3# Does the /rest/ endpoint also exist for other languages?
Yes, it is available for all 800+ public Wikimedia projects at /api/rest_v1/.
[1]: https://github.com/wikimedia/restbase/pull/488/files#diff-2b6b60416eaafdf0a…
--
Gabriel Wicke
Principal Engineer, Wikimedia Foundation
Hello all,
I would like to announce the release of MediaWiki Language Extension
Bundle 2016.01. This bundle is compatible with MediaWiki 1.25.x and
1.26.x.
The next MLEB release is expected in three months. If there are major
changes or important bug fixes, we will make an intermediate release.
Please give us your feedback at
[[Talk:MLEB|https://www.mediawiki.org/wiki/Talk:MLEB]].
* Download: https://translatewiki.net/mleb/MediaWikiLanguageExtensionBundle-2016.01.tar…
* sha256sum: 7a46bb96f852aa42f728c68e4e21558878c8cba703ce9f8f6c2316af7bbe03e3
Quick links:
* Installation instructions are at: https://www.mediawiki.org/wiki/MLEB
* Announcements of new releases will be posted to a mailing list:
https://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
* Report bugs to: https://phabricator.wikimedia.org/
* Talk with us at: #mediawiki-i18n @ Freenode
Release notes for each extension are below.
-- Kartik Mistry
== Babel, CleanChanges and LocalisationUpdate ==
* Localisation updates only.
== CLDR ==
* Hebrew names for Cebuano and Norwegian are fixed.
* cldr was converted to extension registration. Please update your
LocalSettings.php!
== Translate ==
* Old custom tokens were deprecated in favor of the regular "csrf"
(previously known as "edit") token. If you are using the Translate
WebAPIs you might need to migrate; see the sketch after this list.
* Special:Translations no longer shows PHP notices for pages with
invalid language codes.
* Special:Translate no longer shows deprecation warnings about access
keys in the JavaScript console.
* Translate is now syntactically compatible with PHP7.
* Message group selector on Special:SearchTranslations no longer has
glitchy behavior after selecting a group.
* MessageIndex code was optimized. It now ignores messages not in
$wgTranslateMessageNamespaces. By default this should be fine, but if
you have translations in custom namespaces, check that they are
included.
* Pages with <languages /> now load faster with cold caches.
* MachineTranslationAid no longer throws uncaught exceptions with the
default configuration, which broke other translation aids.
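For the token migration mentioned above, the regular token can be
fetched via meta=tokens. A minimal sketch (the wiki URL is a
placeholder, and you need a logged-in session):

    import requests

    session = requests.Session()
    API = "https://translatewiki.net/w/api.php"  # your wiki's API

    # Fetch the regular "csrf" token that replaces the old custom tokens.
    r = session.get(API, params={
        "action": "query", "meta": "tokens",
        "type": "csrf", "format": "json"})
    token = r.json()["query"]["tokens"]["csrftoken"]
    # Pass this as the `token` parameter in Translate WebAPI POSTs.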
== UniversalLanguageSelector ==
* ULS now uses extension registration and thus requires MediaWiki 1.25
or later.
* Input methods should now work inside VisualEditor.
* Fonts:
** OpenDyslexic font updated to latest upstream.
** Akkadian font should work again.
* New input methods:
** Rodali (Assamese) layout.
** OdiScript (Oriya) layout.
** Yoruba layout.
* Updated input methods:
** Updated Oriya Lekhani layout.
** Digit fixes in Southern Kurdish layout.
** Minor fixes in Sinhala layout.
--
Kartik Mistry/કાર્તિક મિસ્ત્રી | IRC: kart_
{kartikm, 0x1f1f}.wordpress.com
Hi everyone,
Those with a keen eye will notice that I filed T124255
<https://phabricator.wikimedia.org/T124255>, which calls for renaming
#MediaWiki-RfCs in Phab to "#ArchCom-RfC". This would be a boring Phab
administrivia email if it were simply that.
The reason I want the rename: ArchCom is the mechanism by which we hope
to ensure that we build and deploy increasingly excellent software on
the Wikimedia production cluster in a consensus-oriented manner.
MediaWiki is at the center of this, but ArchCom's responsibility doesn't
end with MediaWiki.
T124255 <https://phabricator.wikimedia.org/T124255> is an odd place to have
a more sweeping conversation about the scope of ArchCom, but it'll do for
now. Feel free to comment there or on this mailing list.
Thanks
Rob
Over the last few weeks we have been exploring ways to improve our
technical consensus building and decision making process. I wrote a
short RFC [1] describing some issues, and proposed adopting ideas from
the Rust community [2] to address them. The discussion on the task and
in an IRC meeting showed broad support for the proposals.
In yesterday's architecture committee meeting, we decided to adopt
much of the Rust RFC decision making process [3] on a trial basis.
Concretely, this means:
- We will nominate a member of the architecture committee as a
shepherd, guiding an active RFC through the process. Among other
things, the shepherd is responsible for informing all relevant
stakeholders of the ongoing discussion on the task. The shepherd might
also lead an IRC discussion on the RFC, which will be summarized on
the task.
- Once the discussion on a task plateaus or stalls, the shepherd (in
coordination with the RFC author(s)) announces and widely publicizes a
one-week "Final Comment Period".
- At the end of the "Final Comment Period", the architecture committee
decides based on the points made in the RFC discussion, and justifies
its decision based on the overall project principles and priorities.
If any new facts or aspects are surfaced in this discussion, a new
Final Comment Period needs to be started before making a decision.
For now, we are holding off on the second part of the RFC, the
introduction of working groups. There is agreement that we need to
broaden the involvement and scale the process, but the details of how
are still under discussion.
Gabriel
[1]: https://www.mediawiki.org/wiki/Requests_for_comment/Governance
[2]: https://github.com/rust-lang/rfcs/blob/master/text/1068-rust-governance.md
[3]: https://github.com/rust-lang/rfcs/blob/master/text/1068-rust-governance.md#…