Hi,
I have several private wikis that are used by one group of users. All wikis are on MW 1.27.1. What I would like is:
- Same users and passwords for all wikis
- Single sign-on
- API requests from one wiki to another
I'm getting lost in OAuth, CentralAuth, API:Login and I would like some advice on what to do.
Can anyone put me on the right track?
Thanks!
Ad
Ad Strack van Schijndel writes:
- Same users and passwords for all wikis
- Single sign-on
- API requests from one wiki to another
Does the enterprise you are hosting these for have an SSO source like Active Directory?
Cindy Cicalese has just finished updating PluggableAuth for MW 1.27. I think it would prove very helpful to you.
You don't say what you're using the API requests for, but that seems trivial. Are you running into problems there?
For one of my clients, I have written PluggableSSO (which uses Cindy's PluggableAuth) to take care of authentication based on HTTP headers that Apache provides using mod_auth_kerb. I then have an AD user that queries AD via LDAP for user information.
Another client of mine uses a CA-provided module for Apache that takes care of SAML authentication and then sends me headers for the user. I'm thinking of replacing that Apache module with Cindy's SimpleSAMLphp extension that uses PluggableAuth.
Hope this helps!
Mark.
Hi Mark,
Nice to hear from you! I'm very interested in understanding more about this, but as you can see below, I'm not sure I have the right skills.
Ad
On 6 Oct 2016, at 17:54, Mark A. Hershberger mah@nichework.com wrote:
Ad Strack van Schijndel writes:
- Same users and passwords for all wikis
- Single sign-on
- API requests from one wiki to another
Does the enterprise you are hosting these for have an SSO source like Active Directory?
No, this is for our own company; these are our first steps down this road. Would it be an idea to have a shared user table for all wikis?
Cindy Cicalese has just finished updating PluggableAuth for MW 1.27. I think it would prove very helpful to you.
Mmm, at first glance it seems that the situation where PluggableAuth is especially useful is not ours. And I wouldn't know how to use it; I'm more the 'use extensions as-is' kind of guy :-).
You don't say what you're using the API requests for, but that seems trivial. Are you running into problems there?
Semantic External Query Lookup! So I am not doing the API requests myself, but I want to try to contribute to this fine extension and make it usable with private wikis. SEQL works fine if the 'source' wiki is public.
For one of my clients, I have written PluggableSSO (which uses Cindy's PluggableAuth) to take care of authentication based on HTTP headers that Apache provides using mod_auth_kerb. I then have an AD user that queries AD via LDAP for user information.
In the meantime I have tried OAuth 'Owner-only consumers': https://www.mediawiki.org/wiki/OAuth/Owner-only_consumers#PHP. I added this code to James's QueryResultFetcher in what I figured would be the right place, with the values I got when registering the wiki as a consumer. This should add the authorization header to the request. That looks fine, but it doesn't work and I can't figure out what happens next. And I must admit: I'm not that knowledgeable about these things...
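Roughly, what I added boils down to this (a much-simplified sketch of signing the request with the owner-only consumer credentials; the keys, URL and query parameters below are placeholders, not the real values):

<?php
// Much-simplified sketch: build an OAuth 1.0a Authorization header for a GET
// request to the source wiki's api.php, signed with the owner-only consumer
// credentials. All values below are placeholders.

$consumerKey    = 'consumer-key-from-registration';
$consumerSecret = 'consumer-secret-from-registration';
$accessToken    = 'access-token-from-registration';
$accessSecret   = 'access-secret-from-registration';

$url    = 'https://source-wiki.example/w/api.php';
$params = [ 'action' => 'ask', 'format' => 'json', 'query' => '[[Category:Example]]' ];

// Standard OAuth protocol parameters.
$oauth = [
    'oauth_consumer_key'     => $consumerKey,
    'oauth_token'            => $accessToken,
    'oauth_signature_method' => 'HMAC-SHA1',
    'oauth_timestamp'        => time(),
    'oauth_nonce'            => md5( uniqid( '', true ) ),
    'oauth_version'          => '1.0',
];

// Signature base string: HTTP method, URL, and the sorted, percent-encoded parameters.
$all = array_merge( $params, $oauth );
ksort( $all );
$pairs = [];
foreach ( $all as $k => $v ) {
    $pairs[] = rawurlencode( $k ) . '=' . rawurlencode( $v );
}
$base = 'GET&' . rawurlencode( $url ) . '&' . rawurlencode( implode( '&', $pairs ) );
$key  = rawurlencode( $consumerSecret ) . '&' . rawurlencode( $accessSecret );
$oauth['oauth_signature'] = base64_encode( hash_hmac( 'sha1', $base, $key, true ) );

// This header then has to be attached to the HTTP request for api.php.
$authHeader = 'OAuth ' . implode( ', ', array_map(
    function ( $k ) use ( $oauth ) {
        return rawurlencode( $k ) . '="' . rawurlencode( $oauth[$k] ) . '"';
    },
    array_keys( $oauth )
) );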
Another client of mine uses a CA-provided module for Apache that takes care of SAML authentication and then sends me headers for the user. I'm thinking of replacing that apache module with Cindy's SimpleSAMLphp extension that uses PluggableAuth.
Hope this helps!
Mark.
--
Mark A. Hershberger
NicheWork LLC
717-271-1084
Ad Strack van Schijndel writes:
Does the enterprise you are hosting these for have an SSO source like Active Directory?
No, this is for our own company; these are our first steps down this road. Would it be an idea to have a shared user table for all wikis?
The shared table would work, but it wouldn't give you SSO. Typically, SSO requires some kind of service like Microsoft's AD, Amazon's AWS Directory Service, or a Kerberos infrastructure.
Cindy Cicalese has just finished updating PluggableAuth for MW 1.27. I think it would prove very helpful to you.
Mmm, at first glance it seems that the situation where PluggableAuth is especially useful is not ours. And I wouldn't know how to use it; I'm more the 'use extensions as-is' kind of guy :-).
If you had a way to work with SAML, it would just work, but...
In the meantime I have tried OAuth 'Owner-only consumers' https://www.mediawiki.org/wiki/OAuth/Owner-only_consumers#PHP
Ah, I hadn't seen that. That looks like the approach you want if you want a set-up like the WMF's.
To continue, though, I think you'd really need CentralAuth, but as that extension's page notes:
Warning: CentralAuth was designed specifically for Wikimedia...
If you are starting a new wiki farm from scratch ... it is much easier to set up global accounts using $wgSharedDB rather than using CentralAuth.
However, this provides no solution to the single-sign-on feature (sign-in on one wiki, like wikipedia.org, automatically makes the user signed-in to e.g. a shared repository under a different, shared domain, like commons.wikimedia.org), nor global account locking.
So, unless you're prepared to tie yourself to WMF's code roll-outs, I would avoid CentralAuth.
So, you're stuck with shared tables, and that means you have no SSO.
You *might* be able to do something to make SSO work by setting $wgSharedTables, $wgSharedPrefix, and/or $wgCookiePrefix, but I haven't tried this.
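Untested, but the sketch I have in mind for each wiki's LocalSettings.php would be something like the following (the database name and cookie domain are placeholders, and $wgCookieDomain is my own addition here; it only helps if all the wikis sit under one parent domain):

// Untested sketch; add to LocalSettings.php on each wiki in the family.
// 'wiki_shared' and '.wiki.example.org' are placeholders.

// Read the user tables from one designated wiki's database.
$wgSharedDB     = 'wiki_shared';
$wgSharedPrefix = '';                               // table prefix in that database
$wgSharedTables = [ 'user', 'user_properties' ];

// If all the wikis live under one parent domain, sharing the cookie domain
// and prefix means a session cookie set by one wiki is also sent to the
// others. The wikis would still need to share session storage for that
// session to be accepted, which is the part I haven't tried.
$wgCookieDomain = '.wiki.example.org';
$wgCookiePrefix = 'sharedwiki';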
I added this code to James's QueryResultFetcher in what I figured would be the right place, with the values I got when registering the wiki as a consumer.
See, this looks like an interesting problem. Even with the above work you wouldn't have the inter-wiki API authorization set up (unless that all happens via JS).
Mark.
The scenario with shared user tables and no SSO is second best, but perfectly acceptable.
Now for the inter-wiki API authorization. On https://www.mediawiki.org/wiki/API:Login#How_to_log_in it says "Note that logging in and remaining logged in requires correct HTTP cookie handling by your client on all requests. Typically your framework or HTTP request library will handle this for you."
So the question is: what framework or HTTP request library is that, and how can it be used?
Do you have an idea what that could be?
Ad
Ad,
I've been meaning to reply to your email for the past two weeks, but I'm just now getting around to it.
I am curious what progress you've made on this. But I also have some information (see below) that may help if you need it.
Ad Strack van Schijndel writes:
https://www.mediawiki.org/wiki/API:Login#How_to_log_in it says "Note that logging in and remaining logged in requires correct HTTP cookie handling by your client on all requests. Typically your framework or HTTP request library will handle this for you."
So the question is, what framework or HTTP request library is that and how can it be used.
MediaWiki has a built-in HTTP client class[1], but if you're building this outside of the MediaWiki framework, you could use the Guzzle library[2].
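With Guzzle, for example, the cookie handling is just a matter of enabling the client's cookie jar. Here is a rough, untested sketch of logging in to a private wiki's API and reusing the session (the URL and credentials are placeholders; I'd use a bot password from Special:BotPasswords rather than the main account password):

<?php
// Rough, untested sketch: log in to a private wiki's API with Guzzle and let
// its cookie jar carry the session for later requests.
// The URL and credentials below are placeholders.

require 'vendor/autoload.php';

use GuzzleHttp\Client;

$client = new Client( [
    'base_uri' => 'https://private-wiki.example/w/',
    'cookies'  => true,   // one shared cookie jar for all requests
] );

// 1. Fetch a login token.
$res = $client->get( 'api.php', [ 'query' => [
    'action' => 'query', 'meta' => 'tokens', 'type' => 'login', 'format' => 'json',
] ] );
$token = json_decode( (string)$res->getBody(), true )['query']['tokens']['logintoken'];

// 2. Log in (with a bot password this is the 'name@label' / generated password pair).
$client->post( 'api.php', [ 'form_params' => [
    'action'     => 'login',
    'lgname'     => 'SomeUser@apiclient',     // placeholder
    'lgpassword' => 'generated-bot-password', // placeholder
    'lgtoken'    => $token,
    'format'     => 'json',
] ] );

// 3. Later requests reuse the same cookies, so they run as that user.
$check = $client->get( 'api.php', [ 'query' => [
    'action' => 'query', 'meta' => 'userinfo', 'format' => 'json',
] ] );
echo $check->getBody();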
I've written code using both of these, so if you need help, let me know and I'll try to provide a speedier reply.
Mark.
Footnotes:
[1] https://doc.wikimedia.org/mediawiki-core/master/php/classHttp.html
[2] http://docs.guzzlephp.org/en/latest/
Mark,
Same here! Busy with other things. But although this topic is not urgent it is definitely important.
Actually we have 2 use cases: one inside the MW framework and one outside. Is it possible/wise to use the Guzzle library for both?
The 'inside' use case is doing SEQL[1] queries from one wiki to another (private) wiki. The 'outside' use case is a script that reads email messages and creates pages from them in a private wiki.
Ad
[1] https://www.mediawiki.org/wiki/Extension:Semantic_External_Query_Lookup
Ad Strack van Schijndel writes:
Same here! Busy with other things. But although this topic is not urgent it is definitely important.
I'm making an effort to get on top of things here. Hence, the “early” reply.
Actually we have 2 use cases: one inside the MW framework and one outside. Is it possible/wise to use the Guzzle library for both?
It is certainly possible, especially now that you can use composer.local.json to install Guzzle inside of MediaWiki. Since Guzzle uses the \GuzzleHttp namespace, you shouldn't have a problem with conflicting names inside MW.
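For example, a composer.local.json along these lines in the wiki's root directory should do it, followed by a run of "composer update" from that directory (the version constraint is just an example):

{
    "require": {
        "guzzlehttp/guzzle": "^6.2"
    }
}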
It seems like you'd only need to use one set of APIs this way, which is a bonus!
Mark.
I'll try to keep up with you.
Why can't Guzzle be installed with composer.json? I put other libraries in there as well...
Ad
Heiya Ad,
in contrast to "composer.local.json" the "composer.json" file gets overwritten on every update or upgrade, so you will have to recreate this file every time (four to six times a year). This does not happen to "composer.local.json".
Cheers Karsten
Hi Karsten,
OK, so it is better to use composer.local.json for the extensions as well: the Composer equivalent of LocalSettings.php.
Is that it?
Ad
Heiya Ad,
That's true. Indeed, this is another way to say it. :)
Cheers Karsten