Found this interesting article on designing an API, for what it's worth. Thought some people may find it interesting.
http://mathieu.fenniak.net/the-api-checklist/

--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com
Thanks Tyler, some points are very interesting.
On Wed, Apr 17, 2013 at 5:50 AM, Tyler Romeo tylerromeo@gmail.com wrote:
Found this interesting article on designing an API, for what it's worth. Thought some people may find it interesting.
http://mathieu.fenniak.net/the-api-checklist/

_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
On Tue, Apr 16, 2013 at 11:50 PM, Tyler Romeo tylerromeo@gmail.com wrote:
Found this interesting article on designing an API, for what it's worth. Thought some people may find it interesting.
Some thoughts:
#2: HTTP Basic authentication is simple, but not particularly secure.
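To illustrate the Basic-auth point in #2 (a sketch of my own, not from the post; the username and password are made up): the credentials are only base64-encoded, not encrypted, so anything that can read the header can read the password.

```python
import base64

# HTTP Basic auth just base64-encodes "user:password"; that's obfuscation,
# not encryption, so it is only reasonable over TLS.
creds = base64.b64encode(b"user:hunter2").decode()
header = f"Authorization: Basic {creds}"

# Anyone who can see the header recovers the password trivially:
decoded = base64.b64decode(creds).decode()  # "user:hunter2"
```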
#3, #4, #6, #12, #18, #21: It depends on your API design. Are you using HTTP as a transport (which is what the MediaWiki API does) or is HTTP your API (e.g. REST)?
#7, #8, #16, #17: Easier said than done, unless you're writing your own webserver, or hooking into the existing webserver at a deep level (which may not even be possible).
#9: There's no generic mechanism in HTTP for compressing the request body; some servers may support such a thing (if configured to do so), but it's not standardized. "Accept-Encoding" tells the server which encodings it is allowed to use for the response body, and "Content-Encoding" indicates which encoding was actually used. This, too, is often handled automatically at a lower level than you're likely to care much about.
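As a rough sketch of the response side of #9 (the function name and header shapes here are mine, not from the thread): the client advertises what it can handle via Accept-Encoding, then undoes whatever Content-Encoding the server reports.

```python
import gzip

def decode_body(body: bytes, headers: dict) -> bytes:
    # Undo the Content-Encoding the server applied (gzip only, as a sketch;
    # a real client would also handle "deflate", "identity", etc.).
    if headers.get("Content-Encoding") == "gzip":
        return gzip.decompress(body)
    return body

# Simulated exchange: we sent "Accept-Encoding: gzip", the server compressed.
compressed = gzip.compress(b'{"ok": true}')
body = decode_body(compressed, {"Content-Encoding": "gzip"})
assert body == b'{"ok": true}'
```

In practice libraries like .NET's HttpClient or Python's requests do this transparently, which is the "lower level" mentioned above.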
#24: The downside being that then you have to maintain multiple different versions of your API. If you are doing this, it would be best to have a clear policy on removing old versions so you're not stuck supporting them forever. And if the underlying code or data structure has to change in a way that makes something prohibitive or impossible in the old version of the API, you're still stuck breaking things.
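The maintenance burden and the retirement policy in #24 could be sketched like this (handler shapes and the sunset date are invented for illustration):

```python
# Hypothetical version dispatch: every supported version is a handler that
# must keep working, so a published sunset date is what keeps this table
# from growing forever.
HANDLERS = {
    "v1": lambda params: {"result": "legacy response shape"},
    "v2": lambda params: {"data": "current response shape"},
}
SUNSET = {"v1": "2014-06-01"}  # made-up announced removal date

def dispatch(path: str, params: dict):
    version = path.lstrip("/").split("/", 1)[0]
    handler = HANDLERS.get(version)
    if handler is None:
        return 404, {"error": "unknown or retired API version: " + version}
    response = handler(params)
    if version in SUNSET:  # warn clients still on a deprecated version
        response["warning"] = "this version is removed after " + SUNSET[version]
    return 200, response
```

Once the sunset date passes, the "v1" entry is deleted and old clients get a 404 instead of an indefinitely maintained compatibility layer.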
#26: Since he mentions REST several times (e.g. #31), I note that I've yet to see a method for really combining REST and bulk operations beyond something akin to "rm -rf".
#34: "If your API accepts the same authentication configuration that your interactive users use": not quite. Try "If whatever your API uses for auth is automatically sent by a browser", whether you intend it to be interactive or not.
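That distinction could be sketched like this (the helper and the mechanism names are mine, purely illustrative): CSRF is a risk exactly when the browser attaches the credential on its own, regardless of whether the API was meant to be interactive.

```python
# Credentials a browser re-sends automatically on cross-site requests.
# A bearer token in an Authorization header set by client code is not
# among them, which is why token-header auth sidesteps classic CSRF.
AUTO_SENT_BY_BROWSER = {"session-cookie", "cached-http-basic", "tls-client-cert"}

def needs_csrf_protection(auth_mechanism: str) -> bool:
    # Illustrative rule of thumb, not an exhaustive security check.
    return auth_mechanism in AUTO_SENT_BY_BROWSER

assert needs_csrf_protection("session-cookie")
assert not needs_csrf_protection("bearer-token-header")
```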
#40: OTOH, hand-written documentation easily gets out of date. Having both hand-written and automatically-generated is nice.
Regarding #7 in that list (Expect: 100-Continue), I think it would be nice if Wikimedia wikis did this.
I know that at least in .NET, if I send a POST request to http://en.wikipedia.org/w/api.php, the Expect: 100-Continue header will be set, which results in a 417 Expectation Failed error.
.NET has a switch to turn that header off, and with it the request works fine. But I think it would be nice if Wikimedia wikis supported this.
I think this is an issue with something in Wikimedia's configuration (Squid? or maybe something like that) and not MediaWiki itself, because it works fine for my local MediaWiki installation even with Expect: 100-Continue set.
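For reference, this is roughly what the handshake looks like on the wire (a sketch of my own; no network I/O, and the path and body are made up). The client sends only the request head first, then waits:

```python
# Build the request head a 100-continue client sends before the body.
body = b"action=edit&title=Sandbox"
request_head = (
    "POST /w/api.php HTTP/1.1\r\n"
    "Host: en.wikipedia.org\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    f"Content-Length: {len(body)}\r\n"
    "Expect: 100-continue\r\n"
    "\r\n"
)
# The client now waits for "HTTP/1.1 100 Continue" before transmitting `body`;
# a server (or proxy) that rejects the expectation instead replies
# "HTTP/1.1 417 Expectation Failed", which is what .NET runs into here.
```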
Petr Onderka [[en:User:Svick]]
On Wed, Apr 17, 2013 at 5:50 AM, Tyler Romeo tylerromeo@gmail.com wrote:
Found this interesting article on designing an API, for what it's worth. Thought some people may find it interesting.
http://mathieu.fenniak.net/the-api-checklist/
On Wed, Apr 17, 2013 at 4:18 PM, Petr Onderka gsvick@gmail.com wrote:
Regarding #7 in that list (Expect: 100-Continue), I think it would be nice if Wikimedia wikis did this.
I know that at least in .NET, if I send a POST request to http://en.wikipedia.org/w/api.php, the Expect: 100-Continue header will be set, which results in a 417 Expectation Failed error.
.NET has a switch to turn that header off, and with it the request works fine. But I think it would be nice if Wikimedia wikis supported this.
I think this is an issue with something in Wikimedia's configuration (Squid? or maybe something like that) and not MediaWiki itself, because it works fine for my local MediaWiki installation even with Expect: 100-Continue set.
Well, PHP and Apache do not support the 100-Continue workflow, i.e., there's no way to tell Apache to let PHP do its thing before sending the 100-Continue and receiving the rest of the data.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com
My understanding is it's not really possible to do this in PHP in a way that would actually be of use to anyone. See https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1
-bawolff
On 4/17/13, Petr Onderka gsvick@gmail.com wrote:
Regarding #7 in that list (Expect: 100-Continue), I think it would be nice if Wikimedia wikis did this.
I know that at least in .NET, if I send a POST request to http://en.wikipedia.org/w/api.php, the Expect: 100-Continue header will be set, which results in a 417 Expectation Failed error.
.NET has a switch to turn that header off, and with it the request works fine. But I think it would be nice if Wikimedia wikis supported this.
I think this is an issue with something in Wikimedia's configuration (Squid? or maybe something like that) and not MediaWiki itself, because it works fine for my local MediaWiki installation even with Expect: 100-Continue set.
Petr Onderka [[en:User:Svick]]
I didn't necessarily mean that the 100-Continue workflow should be fully supported. I think ignoring the header would be much better than refusing to work with it entirely (and replying with a 417 error), and my guess is that doing that should be possible.
Looking at the HTTP specification [1], it says that a proxy *has to* return 417 if the target server doesn't support HTTP/1.1, though I have no idea why the specification would require this.
Petr Onderka [[en:user:Svick]]
[1]: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html#sec8.2.3
On Wed, Apr 17, 2013 at 10:33 PM, Brian Wolff bawolff@gmail.com wrote:
My understanding is it's not really possible to do this in PHP in a way that would actually be of use to anyone. See https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1
-bawolff
On Wed, Apr 17, 2013 at 4:47 PM, Petr Onderka gsvick@gmail.com wrote:
I didn't necessarily mean that the 100-Continue workflow should be fully supported. I think ignoring the header would be much better than refusing to work with it entirely (and replying with a 417 error), and my guess is that doing that should be possible.
Looking at the HTTP specification [1], it says that a proxy *has to* return 417 if the target server doesn't support HTTP/1.1, though I have no idea why the specification would require this.
So you're suggesting we go *against* the HTTP standard? That's not exactly what you're supposed to do.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2015
Major in Computer Science
www.whizkidztech.com | tylerromeo@gmail.com
So you're suggesting we go *against* the HTTP standard? That's not exactly what you're supposed to do.
Well, ignoring the header makes more sense to me and, personally, I would prefer that behavior.
But it's a minor issue and I think going against the standard is not actually worth it.
Petr Onderka
On 17 April 2013 22:33, Brian Wolff bawolff@gmail.com wrote:
My understanding is it's not really possible to do this in PHP in a way that would actually be of use to anyone. See https://bugzilla.wikimedia.org/show_bug.cgi?id=26631#c1
Still, supporting this in a way “that wouldn’t be of use”, i.e. sending the 100 status immediately instead of 417, would probably make it a tiny bit easier for clients. However, this is not a bug/problem/feature-request for MediaWiki, but for Squid. It seems Apache and PHP would handle this correctly, but Squid rejects such requests. There is a configuration variable doing exactly what Svick is proposing: <http://www.squid-cache.org/Doc/config/ignore_expect_100/>. But I agree turning it on would not be a good idea. And FYI: Squid 3.2 seems to support 100-continue somehow, though I'm not sure how much: <http://wiki.squid-cache.org/Features/HTTP11>
-- [[cs:User:Mormegil | Petr Kadlec]]
On 2013-04-18 1:13 AM, "Petr Kadlec" petr.kadlec@gmail.com wrote:
Still, supporting this in a way “that wouldn’t be of use”, i.e. sending the 100 status immediately instead of 417, would probably make it a tiny bit easier for clients.
I disagree. If we supported it, people would expect it to actually work and would add the extra complexity of 100-continue support to their bots. That would be bad, since it would essentially be a no-op and just slow things down.
-bawolff