I've looked at the diagram on the mediawiki site -- and it's still not
clear to me what the page_latest column references (the values don't
seem to be page_ids)...
Can someone please clarify?
Thanks.
Since the very beginning of Wikipedia, the parser has contained a
special pattern for the letters ISBN followed by 10 (now 10 or 13)
digits, resulting in a link pertaining to the specified
International Standard Book Number. MediaWiki inherited this
behaviour from UseModWiki. At one point, a Special:Booksources
page was introduced as an intermediary between the ISBN mentioned
in an article and a list of alternative libraries and bookstores.
Since June 2003, the Special:Booksources page has transcluded
user-editable content from [[Wikipedia:Book sources]], substituting
the word MAGICNUMBER with the requested ISBN.
Today, new numbering and reference schemes are not introduced in
the parser, but instead through templates. If the above had been
invented today, we would write {{ISBN | 9876543210}} instead of
the "naked" pattern ISBN 9876543210.
For many years now, people have asked whether we could have ISSN
support similar to the ISBN support. ISSNs are International
Standard Serial Numbers, identifying newspapers and journals
(collectively called "serials" by librarians), such
as Nature (ISSN 0028-0836) and Science (ISSN 0036-8075).
An increasing number of Wikipedias have introduced the template
{{ISSN|0028-0836}} but the template designer then has to decide
whether this should link to
http://dispatch.opac.ddb.de/DB=1.1/LNG=EN/CMD?ACT=SRCHA&IKT=8&TRM=0028-0836
(as in the English Wikipedia) or to
http://worldcat.org/issn/0028-0836
(as in the Danish Wikipedia) or to
http://libris.kb.se/showrecord.jsp?q=numm:0028-0836
(as in the Swedish Wikipedia) or to
http://www.gegnir.is/F/?func=find-a&find_code=ISSN&request=0028-0836
(as in the Icelandic Wikipedia).
The national libraries (gegnir.is, kb.se) are very useful for one
country's local serials, but the international ones (worldcat.org)
are more useful for internationally renowned journals.
It would be a lot easier for the template designer if the
existing template (without modifying the wiki parser!) could link
to an intermediary page called Special:Serialsource, which could
transclude text from a user-editable [[Wikipedia:Serial sources]].
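To illustrate, such a hypothetical [[Wikipedia:Serial sources]] page
could mirror the MAGICNUMBER convention of [[Wikipedia:Book sources]],
with Special:Serialsource substituting the requested ISSN. The page
content below is only a sketch; the link targets are the ones named
earlier in this mail:

```
== Serial sources ==
* [http://worldcat.org/issn/MAGICNUMBER Find this serial on WorldCat]
* [http://libris.kb.se/showrecord.jsp?q=numm:MAGICNUMBER LIBRIS (Swedish national library)]
```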
Is this something we could hope to see implemented?
It is important to note that other numbering schemes such as ASIN
(Amazon), LCCN (Library of Congress), OCLC and DOI are tied to one
supplier, and there is no need to link to different sources.
ISBN and ISSN (and perhaps other schemes?) are different.
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
Hi,
I am trying to include another extension in my own extension. I need
to parse the other extension as wikitext, but it doesn't work.
I am extending http://www.mediawiki.org/wiki/Extension:SelectCategoryTagCloud
to include not only a tag cloud, but also a hierarchical view on
categories. Instead of writing the code from scratch, I would like to
reuse http://www.mediawiki.org/wiki/Extension:CategoryTree. I am
trying to include the <categorytree> tags in the area where the tag
cloud is. Users would then be able to select categories from the
categorytree and assign them to the article.
I have used the following code, but I get an error.
$outputtext = $wgParser->recursiveTagParse(
    '<categorytree depth="2" onlyroot="on" mode="all" ' .
    'style="float:none; margin-left:1ex; border:1px solid gray; ' .
    'padding:0.7ex; background-color:white;">My Category</categorytree>'
);
The error is:
Fatal error: Call to a member function getUseTeX() on a non-object in
D:\Apache2.2\htdocs\mike2\includes\Parser.php on line 551
Just to clarify, I have a hook extension where I need to make sure
that the wikitext is parsed before I include it in the output of the
extension.
How can I get rid of the error? Or does anyone have other suggestions?
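For what it's worth, that fatal error typically means the global
$wgParser has not yet been set up with ParserOptions, because it is
being called outside a normal page parse. A sketch of one workaround,
assuming your extension is a tag hook (the function name and simplified
tag attributes are made up; the key point is using the Parser instance
MediaWiki passes to the callback, whose options are already
initialized):

```php
// Hypothetical tag-hook callback: MediaWiki passes the active Parser
// as the third argument, and its ParserOptions are already set up,
// unlike the global $wgParser when used outside a page parse.
function efMyTagHook( $input, $args, $parser ) {
    $outputtext = $parser->recursiveTagParse(
        '<categorytree depth="2" onlyroot="on" mode="all">My Category</categorytree>'
    );
    return $outputtext;
}
```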
Thanks,
Andi
On 9/13/07, Brion Vibber <brion(a)wikimedia.org> wrote:
> Yousef Ourabi wrote:
> > I've been looking at the *-page.sql dump from wikipedia, I'm now
> > interested in looking at the recentchanges table, can anyone point out
> > which of the sql dumps (if any) contains that table?
>
> None, we don't make it available as
>
> a) it's primarily duplicates of other tables' data
>
> b) it contains private data (IP addresses of editors etc)
>
The only way to get the public information would be to filter
stub-meta-history.xml for the most recent edits, right?
Or, I guess you could get it from pages-meta-history.xml also, if that
file was ever successfully dumped.
Hi fellow MediaWiki admins,
I'm having difficulty setting up SMTP email notification under
MediaWiki. Does anyone have any leads or links to a checklist, a
sequence of criteria for successful MediaWiki SMTP email setup? If I
could see a successful MediaWiki SMTP setup, then I could emulate it and
troubleshoot my non-successful setups. Even a successful example using
Google GMail SMTP, or other external freebie, would at least allow me to
confirm if I've got the underpinnings in order.
When I click on a user name from "Special:Listusers", and then
click on "E-mail this user", then "Special:Emailuser/(username)"
returns: "No send address. You must be logged in and have a valid
e-mail address in your preferences to send e-mail to other users." ...
but I am logged in, and both the username and my current log-in have
legitimate, working email addresses entered into and saved in each of
their preferences (unverified, of course)!
I click on "My preferences" ("Special:Preferences") then select
"Confirm your e-mail address" ("Special:Confirmemail"), and it then
displays: "A confirmation code has already been e-mailed to you; if you
recently created your account, you may wish to wait a few minutes for it
to arrive before trying to request a new code." ... but of course none
has ever been sent. Then I click on "Mail a confirmation code", and it
returns a blank screen, and no email ever arrives.
--
My background searches:
- I've had no luck finding out how to turn on PHP error logging
(which is where I presume the email is handled) and then reading the
error logs (if any) to see where it's stuck - does anyone have a link
to samples of PHP error logging? This may not relate to email.
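In case it helps, PHP error logging can be switched on from PHP itself,
with no web-server changes; these are standard PHP calls, but the log
path below is just an example - pick any directory Apache can write to:

```php
// Enable PHP error reporting and logging while debugging; remove
// these lines once the problem is found.  The log path is an example.
error_reporting( E_ALL );
ini_set( 'display_errors', '1' );
ini_set( 'log_errors', '1' );
ini_set( 'error_log', 'D:\\Apache2.2\\logs\\php_errors.log' );
```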
- I had no luck searching Google or MediaWiki for "gmail smtp in
php" or equivalent, plus or minus any of these and related words, so I'm
hoping someone here will have a successful sample to share.
I appreciate there are at least 4 types of SMTP, and each
requires different setup:
- private: SMTP server in the same computer as the MediaWiki
installation, directly accessed
- local: SMTP accessed via Intranet on another, in-house
computer
- remote: SMTP via ISP or web host - remote access
- public: SMTP via external, such as GMail, et cetera, over the
Internet
Have I got those alternatives right? Anyway, even successfully
testing with a public SMTP server would help! I can then search through
my local settings to make changes, or try to install a private SMTP
server if I find no satisfactory alternative. I'd just like to see it
work at all, ever, any way, at least once!
Would anyone who's had success with MediaWiki SMTP in any form
please cut and paste their successful settings into examples at
http://www.mediawiki.org/ in a sample SMTP page - and include a link to
it in a response here?
--
Here are some sample "LocalSettings.php" elements that may
be relevant; I've tried these with a variety of settings, to no avail:
$wgEnableEmail = true;
$wgEnableUserEmail = true;
$wgGroupPermissions['autoconfirmed']['autoconfirmed'] = true;
$wgGroupPermissions['emailconfirmed']['emailconfirmed'] = true;
$wgSMTP = array(
    'host'     => "smtp.gmail.com",  // also tried "mail.google.com", ...
    'IDHost'   => "google.com",      // also tried "mail.google.com",
                                     // "gmail.com", or commented out
    'port'     => 25,                // also tried 465, or commented out
    'auth'     => true,              // also tried false, or commented out
    'username' => "myaccont(a)gmail.com",
    'password' => "********"
);
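For comparison, a GMail configuration along the following lines is
commonly reported to work, assuming PEAR's Mail package is installed
(I have not verified it against MediaWiki 1.10, and the account details
are placeholders):

```php
// GMail requires SSL; PEAR Mail expresses that with an ssl:// host
// prefix, and 465 is GMail's SSL SMTP port.  Credentials below are
// placeholders.
$wgEnableEmail     = true;
$wgEnableUserEmail = true;
$wgSMTP = array(
    'host'     => 'ssl://smtp.gmail.com',
    'IDHost'   => 'gmail.com',
    'port'     => 465,
    'auth'     => true,
    'username' => 'yourname@gmail.com',
    'password' => 'your-password'
);
```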
--
"Special:Version" returns the following, for your information:
Version
- MediaWiki: 1.10.0
- PHP: 5.2.2 (apache2handler)
- MySQL: 5.0.41-community-nt
Extensions
- Special pages
- LastUserLogin (version 1.0.6)
Parser hooks
- DynamicPageList2 (version 1.2.1)
- Subpage List 2
Extension functions
- wfLastUserLogin, wfUpdateUserTouched, wfDynamicPageList3,
wfDynamicPageList2 and efSubpageList
Parser extension tags
- <dpl>, <section>, <subpages> and <pre>
Parser function hooks
- dpl, int, ns, urlencode, lcfirst, ucfirst, lc, uc, localurl,
localurle, fullurl, fullurle, formatnum, grammar, plural, numberofpages,
numberofusers, numberofarticles, numberoffiles, numberofadmins,
numberofedits, language, padleft, padright, anchorencode, special and
defaultsort
Hooks
- LanguageGetMagic wfDynamicPageList3_Magic
- LoadAllMessages wfDynamicPageListSPloadMessages
--
... I see on other wikis:
Hooks
- PrefsEmailAudit logPrefsEmail (from SpecialPreferences.php, not
documented)
- UserCanSendEmail wfUserCanSendEmailExt
... but I can't find a reference to see if they matter to my
situation or how to engage them anyway!
--
Thanks in advance for sharing your successful MediaWiki SMTP setups,
- Peter Blaise
I've been looking at the *-page.sql dump from Wikipedia. I'm now
interested in looking at the recentchanges table; can anyone point out
which of the SQL dumps (if any) contains that table?
Thanks.
-Yousef
I'd like to voice my support for this feature, which makes us fairly
evenly split by the numbers. But the problem is, the argument is already
rather heated, and I'd like to see this decided on technical grounds,
rather than on the basis of "who's got the biggest ego".
So I'd like to suggest that we all take a deep breath before starting a
discussion under this thread. I think we should discuss the matter on a
detached, technical level, and avoid the use of emotionally-charged
words such as "fragile" and "bogus". Does that sound reasonable?
-- Tim Starling
Hi,
I am thinking about creating a hook extension that would display
external content (i.e. a list of bookmarks from another site, e.g.
Digg, Scuttle etc.) below the list of articles on an article page.
Whenever a category page title corresponds to the description, tag or
title of a bookmark, I want the bookmark to show up. So users can see
all links to local pages in that category, but also some external
bookmarks.
I have looked, but I can't seem to find any hooks that would
allow me to inject my content.
I could create a tag extension and display it on top of the page, but
that isn't that intuitive for the user (why have external articles
before local ones?).
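A sketch of one possible approach (assuming the OutputPageBeforeHTML
hook exists in your MediaWiki version and fires for category pages;
everything here apart from the hook name is hypothetical, and
fetchBookmarksFor() is a function you would implement yourself):

```php
// Register a handler that runs just before page HTML is output.
$wgHooks['OutputPageBeforeHTML'][] = 'efAppendExternalBookmarks';

function efAppendExternalBookmarks( &$out, &$text ) {
    global $wgTitle;
    if ( $wgTitle && $wgTitle->getNamespace() == NS_CATEGORY ) {
        // Append external bookmark links *after* the normal category
        // listing, so local pages still come first.
        $text .= fetchBookmarksFor( $wgTitle->getText() );
    }
    return true; // let other hooks run
}
```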
Any help is appreciated!
Thanks,
Andi
An automated run of parserTests.php showed the following failures:
This is MediaWiki version 1.12alpha (r25821).
Reading tests from "maintenance/parserTests.txt"...
Reading tests from "extensions/Cite/citeParserTests.txt"...
Reading tests from "extensions/Poem/poemParserTests.txt"...
Reading tests from "extensions/LabeledSectionTransclusion/lstParserTests.txt"...
17 still FAILING test(s) :(
* URL-encoding in URL functions (single parameter) [Has never passed]
* URL-encoding in URL functions (multiple parameters) [Has never passed]
* Table security: embedded pipes (http://lists.wikimedia.org/mailman/htdig/wikitech-l/2006-April/022293.html) [Has never passed]
* Link containing double-single-quotes '' (bug 4598) [Has never passed]
* message transform: <noinclude> in transcluded template (bug 4926) [Has never passed]
* message transform: <onlyinclude> in transcluded template (bug 4926) [Has never passed]
* BUG 1887, part 2: A <math> with a thumbnail- math enabled [Has never passed]
* HTML bullet list, unclosed tags (bug 5497) [Has never passed]
* HTML ordered list, unclosed tags (bug 5497) [Has never passed]
* HTML nested bullet list, open tags (bug 5497) [Has never passed]
* HTML nested ordered list, open tags (bug 5497) [Has never passed]
* Inline HTML vs wiki block nesting [Has never passed]
* Mixing markup for italics and bold [Has never passed]
* dt/dd/dl test [Has never passed]
* Images with the "|" character in the comment [Has never passed]
* Parents of subpages, two levels up, without trailing slash or name. [Has never passed]
* Parents of subpages, two levels up, with lots of extra trailing slashes. [Has never passed]
Passed 527 of 544 tests (96.88%)... 17 tests failed!
Hi,
Thanks for the support.
On FreeBSD, texvc needs to be compiled after installation in order to work.
The following worked for me:
cd /path/to_my_mediawiki/math
make   (or gmake)
Still left to fix is LaTeX.
John