I thought this would be easy: write an extension with a cURL GET
function that fetches some HTML from another server, which would
let me do AJAX fetches from that server using JavaScript supplied
by that server.
It's not happening. I get back status = 0, as if MediaWiki looks to
that server like a file:// request or something.
I can call the same URL the cURL code uses from a plain browser address
bar; I get the HTML and am then able to fetch JSON strings back from the
server, but not when using the same HTML in MediaWiki.
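For what it's worth, if the 0 is coming from the server-side cURL call itself (rather than from the browser's XHR), it usually means the request never completed at all, and curl_error() will say why. A minimal sketch of such a fetch; the function name, options, and user-agent string are purely illustrative:

```php
<?php
// Illustrative sketch only -- not from the original extension.
function fetchRemoteHtml( $url ) {
    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );  // return body as a string
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true );  // follow redirects
    curl_setopt( $ch, CURLOPT_USERAGENT, 'MyWikiExtension/0.1' ); // some hosts reject blank UAs
    $html = curl_exec( $ch );
    $status = curl_getinfo( $ch, CURLINFO_HTTP_CODE );
    $error = curl_error( $ch );
    curl_close( $ch );
    if ( $status === 0 ) {
        // DNS failure, blocked outbound connection, or malformed URL;
        // $error carries the specific reason.
        return false;
    }
    return $html;
}
```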
Does any of that ring any bells?
Many thanks in advance.
Jack
With respect, you need to contact the people at wikipedia.org.
This is purely a user-to-user list; no one here can help you unless,
by pure luck, they happen to be a Wikipedia admin.
----- Original Message -----
From: Cetateanu Moldovanu <cetateanumd(a)gmail.com>
To: Brion Vibber <brion(a)wikimedia.org>, foundation-l(a)lists.wikimedia.org,
mediawiki-l(a)lists.wikimedia.org
Date: Sun, 21 Feb 2010 18:42:06 +0100
Subject: [Mediawiki-l] Brion Vibber delete mo. as promised ! Please wake up
> On Wed Nov 26 01:04:12 UTC 2008 a strategy for renaming subdomains was
> made public:
> http://lists.wikimedia.org/pipermail/foundation-l/2008-November/047554.html
> But to my regret, nothing has happened since. I want to reiterate how
> *IMPORTANT* it is that you rename the mo.wikipedia.org domain to
> mo-cyrl.wikipedia.org.
>
> Dozens of messages, promises, and NOT A SINGLE REAL ACTION?!
>
> With all due respect, why is it taking so much freaking time to redirect
> everything to a new freaking subdomain? Thank you for your answer.
Hi all,
At the Brazilian WMF chapter's wiki
<http://br.wikimedia.org/wiki/P%C3%A1gina_principal> we are facing
problems with the search engine. It does not return any results unless
the complete name of a page is typed. If you type "Sexta", for instance,
or alternatively "poética", it does not return any page containing these
words, even though we actually have a page named "Sexta poética". Could
you give us any hint?
Thank you
--
{+}Nevinho
Venha para o Movimento Colaborativo http://sextapoetica.com.br !!
I'd like an easy way to discover which articles on my wiki use a given parser tag or parser function. Regex search would give only an approximate answer.
For example, the wiki could automatically create (hidden) categories called "Articles that use parser tag A", "Articles that use parser function B", etc., and populate them. What would be the best way to implement this with an extension?
- Use a Save hook, such as ArticleSave or ArticleSaveComplete, and perhaps some custom database tables to store the info?
- Modify each extension's code so that when the extension executes, it adds the current article to the category?
- Something else?
We can do this manually with our own extensions, but we'd rather not modify third-party extensions in this way.
Any other ideas?
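One pattern that avoids touching third-party extension code is to wrap the registration: register the tag yourself and add a tracking category during parsing. A rough sketch, assuming a MediaWiki of this era; the hook function names and the category title are invented for the example:

```php
<?php
// Illustrative sketch only: a tag hook that, besides rendering, adds the
// page to a tracking category. Names here are not an existing API.
$wgHooks['ParserFirstCallInit'][] = 'wfRegisterTrackedTag';

function wfRegisterTrackedTag( $parser ) {
    $parser->setHook( 'mytag', 'wfTrackedTagHook' );
    return true;
}

function wfTrackedTagHook( $input, array $args, $parser, $frame ) {
    // Every page that renders <mytag> lands in this category; the category
    // page itself can be marked hidden with __HIDDENCAT__.
    $parser->getOutput()->addCategory( 'Articles_that_use_parser_tag_mytag', '' );
    return htmlspecialchars( $input ); // the tag's real rendering would go here
}
```

Collecting the category at parse time rather than in a save hook also catches pages where the tag arrives via template transclusion, which a scan of the article's own wikitext would miss.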
Thanks,
DanB
Hello,
I'm building an international dispute resolution wiki. The English version
of the site is launched - but now I'm trying to launch language versions of
the site.
My wiki mainly uses forms to ask users for information. The English version
has 92 properties, which are called by the forms.
Is there any way I can mass-migrate these Properties to e.g. the Spanish
instance of the site?
Here's the English Properties page:
http://baseswiki.org/w/en/index.php?title=Special:Properties&limit=100&offs…
Here's the Spanish Properties page:
http://beta.baseswiki.org/es/Especial:Properties
92 properties × 6 languages = more than 500 repetitions of the same
manual operation. Yikes! How can I make this more automatic?
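One possible shortcut, assuming shell access to both wikis: Semantic MediaWiki properties are ordinary wiki pages (namespace 102 by default), so they can be dumped to XML in bulk and imported elsewhere. The paths below are illustrative, and the property names themselves would still be in English afterwards:

```shell
# Export current revisions of the Property: namespace (SMW default ns 102)
cd /var/www/baseswiki-en
php maintenance/dumpBackup.php --current --filter=namespace:102 > properties.xml

# Import them into the Spanish instance
cd /var/www/baseswiki-es
php maintenance/importDump.php properties.xml
```

Smaller batches can also go through Special:Export and Special:Import without shell access.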
Thanks,
Kyle Stone
Hi All,
I don't see one on the extensions pages, but does anyone know of an extension that would allow users to make a copy of a page? I know this would duplicate data that then has to be maintained on both pages, but we have a citation method in which each citation is its own page, and sometimes our authors would like to copy an old citation that shares most of its information with a new one they are creating.
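Not an extension, but possibly enough: MediaWiki's edit form accepts a preload parameter that pre-fills a not-yet-existing page with the text of an existing one. The host and page titles in this URL are placeholders:

```
http://yourwiki.example/index.php?title=New_citation&action=edit&preload=Old_citation
```

The author still reviews and saves the copy, so nothing is duplicated silently.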
Thanks!
Courtney Christensen
Welcome to mediawiki-l. This mailing list exists for discussion and questions
about the MediaWiki software[0]. Important MediaWiki-related announcements
(such as new versions) are also posted to this list.
Other resources.
If you only wish to receive announcements, you should subscribe to
mediawiki-announce[1] instead.
MediaWiki development discussion, and all Wikimedia technical questions, should
be directed to the wikitech-l[2] mailing list.
Several other MediaWiki-related lists exist:
- mediawiki-api[5] for API discussions,
- mediawiki-enterprise[6] for discussion of MediaWiki in the enterprise,
- mediawiki-cvs[7] for notification of commits to the Subversion repository,
- mediawiki-i18n[8] for discussion of MediaWiki internationalisation support,
- wikibugs-l[9] for notification of changes to the bug tracker.
List administrivia (unsubscribing, list archives).
To unsubscribe from this mailing list, visit [12]. Archives of previous postings
can be found at [3].
This list is also gatewayed to the Gmane NNTP server[4], which you can use to
read and post to the list.
Posting to the list.
Before posting to this list, please read the MediaWiki FAQ[10]. Many common
questions are answered there. You may also search the list archives to see if
your question has been asked before.
Please try to ask your question in a way that enables people to answer you.
Provide all relevant details, explain your problem clearly, etc. You may
wish to read [13], which explains how to ask questions well.
To post to the list, send mail to <mediawiki-l(a)lists.wikimedia.org>. This is a
public list, so you should not include confidential information in mails you
send.
When replying to an existing thread, use the "Reply" or "Followup" feature of
your mail client, so that clients that understand threading can sort your
message properly. When quoting other messages, please use the "inline" quoting
style[11], for clarity.
When creating a new thread, do not reply to an existing message and change the
subject. This will confuse people's mail readers, and will result in fewer
people reading your mail. Instead, compose a new message for your post.
Messages posted to the list have the "Reply-To" header set to the mailing list,
which means that by default, replies will go to the entire list. If you are
posting a reply which is only interesting to the original poster, and not the
list in general, you should change the reply to only go to that person. This
avoids cluttering the list with irrelevant traffic.
About this message.
This message is posted to the list once per week by <river(a)wikimedia.org>.
Please contact me if you have any questions or concerns about this mailing.
References.
[0] http://www.mediawiki.org/
[1] http://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[2] http://lists.wikimedia.org/mailman/listinfo/wikitech-l
[3] http://lists.wikimedia.org/pipermail/mediawiki-l/
[4] http://dir.gmane.org/gmane.org.wikimedia.mediawiki
[5] http://lists.wikimedia.org/mailman/listinfo/mediawiki-api
[6] http://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise
[7] http://lists.wikimedia.org/mailman/listinfo/mediawiki-cvs
[8] http://lists.wikimedia.org/mailman/listinfo/mediawiki-i18n
[9] http://lists.wikimedia.org/mailman/listinfo/wikibugs-l
[10] http://www.mediawiki.org/wiki/FAQ
[11] http://en.wikipedia.org/wiki/Posting_style#Inline_replying
[12] http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
[13] http://www.catb.org/~esr/faqs/smart-questions.html
> From: Juan Jesús Cremades Monserrat <relicary(a)gmail.com>
>
> On the Main Page of Wikipedia there are sections like "Today's featured
> article" which show part of an article with a "more..." link. I've
> studied the code, but I don't understand it. How can I do something
> similar? My objective is to show part of another wiki page on my own
> main page. I don't need the random selection of a page of the day.
> Thanks!
Not sure how they're doing it, but you might check out the DPL
extension.
We use that on our home page to transclude a short section of a random
page:
{{note|<dpl>
namespace=
includepage=*
includemaxlength=600
escapelinks=false
resultsheader=__NOTOC__ __NOEDITSECTION__
randomcount=1
mode=userformat
addpagecounter=true
listseparators=<h2>A Random Selection From Page: , [[%PAGE%]]</h2>,,\n\n
</dpl>}}
----------------
Would that there were an award for people who come to understand the
concept of enough. Good enough. Successful enough. Thin enough. Rich
enough... When you have self-respect you have enough, and when you
have enough you have self-respect. -- Gail Sheehy
:::: Jan Steinman, EcoReality Co-op ::::