> From: András Kardos <k.andris(a)gmail.com>
>
> Wikipedia has a lot of information, and it is heavily crosslinked.
> But it's not indexed. I mean an index of people, an index of places
> and an index of things. And events. And countries. And lakes. And
> whatever.
Although nothing exactly like what you're describing currently
exists, I think (as Brion points out) there is already similar
functionality.
For one thing, Categories. I think they are poorly managed at
present, and could use some additional support. I say this without
having given it much thought, and without any suggestions, so I know
I'm setting myself up for criticism here! :-)
For another thing, Special:Allpages. This *is* an index already.
(What you're asking for is not, formally, an *index*.) The concept
could be expanded by hacking a copy of Special:Allpages into a
concordance, where both the link and the link text would have meaning,
as well as the preceding and following words. That could be done
mechanically, which (IMHO) is a vast advantage over giving the
WikiPedia public yet another tool to master or misuse!
> "indexers" would be wikipedians who index things. Make and index, like
> "countries" or "operating systems" or "mysteries". And then collect
> things into
> that index.
I dated a professional indexer once. At least before Microsoft Word
made everyone an (untrained, unskilled) indexer, this was a
profession with its own society and conferences and such. Doing it
"correctly" is difficult, specialized work. Doing it by rote *should*
be automatic and mechanical, rather than depend on untrained indexers.
:::: Insanity: doing the same thing over and over and expecting
different results
:::: Jan Steinman <http://www.Bytesmiths.com/Item/99AU22>
Hi y'all
Background
I have an old wiki in ISO-8859-1 format and tried to upgrade to 1.5.4. I
ran upgrade1_5.php, which stated it should convert to UTF-8. The problem
is that it didn't. Upon closer inspection of the script it was clear that
the global variable $wgUseLatin1 must have a non-false value for it to
convert anything. In an attempt to minimize side effects I set
$wgUseLatin1 = true in the subroutine that does the conversion. This gave
me converted page names, but the page contents were still ISO. Emboldened
by my success, I simply applied the conversion subroutine to the data that
comes from the cur.text field. This seems to have worked; as far as I can
tell my data is now properly UTF-8 encoded.
Questions
1. MediaWiki is supposed to convert old ISO-format article text to UTF-8
on the fly, right? How is this triggered? Do I need to set $wgUseLatin1
to get it to work?
2. I now have a wiki, converted as described above. Is there a downside
to my approach? Does some text not get converted?
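For reference, the conventional trigger seems to be the flag itself: a
minimal sketch, assuming the stock upgrade1_5.php, is to set it in
LocalSettings.php before running the updater, instead of patching the
conversion subroutine:

  # LocalSettings.php -- tells upgrade1_5.php that the existing data is
  # Latin-1 and must be converted to UTF-8 (remove once conversion is done)
  $wgUseLatin1 = true;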
-------------------------------------------------
Anders Nygård
Operations Specialist
Gl. Køge Landevej 55
2500 Valby Denmark
Phone 45-7730 12 00
Direct 45-7730 12 74
Mobile 45-4144 38 77
www.uni2.dk
Hello,
I am from the editorial side, not the developer side, and so need to
hire some help for my nonprofit's arts wiki (history and criticism).
These are my problems; let me know how long it would take, when you
could start, and what you think is fair value for the work.
I am anxious to have the work done very soon, so please have time
available now if you are interested.
Needed fixes:
1. For some reason there is no "Create New Article" element in my wiki.
2. I need someone to switch the flower logo out and put in my own.
3. I need someone to edit the navigation for me.
4. I need my Creative Commons logo put in the far bottom box.
5. The Upload File page says it is set to no uploads - I would like
uploads to be enabled.
6. I need the permissions checked and set so anyone can edit but only
registered, logged-in users can create a new article (see the sketch
after this list).
7. There are errors in red in my index.php file and I need someone to
evaluate them and fix them if they are significant.
8. Is there any kind of hack out there to simplify the writer's access
to image/media files while working in the central wiki page space?
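A minimal sketch for items 5 and 6, assuming a MediaWiki version whose
'createpage' right is separate from 'edit' (check your DefaultSettings.php;
these lines go in LocalSettings.php):

  $wgEnableUploads = true;                            # item 5: enable uploads
  $wgGroupPermissions['*']['edit']          = true;   # item 6: anyone may edit
  $wgGroupPermissions['*']['createpage']    = false;  # anons may not create pages
  $wgGroupPermissions['user']['createpage'] = true;   # logged-in users may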
Thanks,
Matt
Thank you for taking the time - I'll try to be precise.
Situation:
a) our task is to set up new wikis all the time, each based on an individual design
b) my task is to provide the admins with a custom design
c) we have to handle all future upgrades of MediaWiki on all those wikis
Problem:
After initial fiddling with the monobook skin and cross-browser problems, I came up with a design on a working wiki.
Now the admins need a "package" from me for further installations of new wikis.
They have to support those wikis and therefore need to update/upgrade all of them.
We need the best strategy for creating a custom design with this in mind.
Theory:
Obviously it is easier to re-install a user skin
than to sync all your individual changes with the base design
after a regular upgrade of the wiki software
(which I will have to do on a growing number of wikis regularly).
Question:
I need to make sure every user sees the customized design immediately.
a) Can I (and how do I) make a skin the default view for the wiki, and
b) prevent users from changing it?
What I have checked so far:
a) I can completely disable user skins.
b) Many people just make their changes to monobook etc. -
but how do you best handle regular upgrades then?
c) I understand I need to do a lot of cross-browser compatibility fiddling :-(
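A minimal sketch of the default-skin side, assuming a custom skin
installed as skins/MyCompany.php with class SkinMyCompany (hypothetical
names); locking the preference varies by release, so verify any such
setting against your DefaultSettings.php:

  $wgDefaultSkin = 'mycompany';   # users and anons get this skin by default
  # Some releases offer a switch to disable skin selection entirely,
  # e.g. $wgAllowUserSkin = false; -- confirm it exists in your version.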
Thanks for your input
Hello,
Is there any way to set the level of headings you want listed in the TOC?
In case I'm not clear, I've got something like this:
== A ==
=== a ===
=== b ===
=== c ===
== B ==
=== 1 ===
=== 2 ===
=== 3 ===
And this results in a TOC like this:
A
a
b
c
B
1
2
3
And I'd like something like this:
A
B
I want to do this in a single article, not the whole wiki. Hope
I'm clear enough.
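The closest knob I know of is site-wide rather than per-article, so it
only half answers this; a sketch, with the exact cutoff semantics
possibly varying by version:

  # LocalSettings.php -- limit TOC depth for the whole wiki so that
  # sub-headings (=== a === and deeper) drop out of every TOC
  $wgMaxTocLevel = 2;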
Thanks,
Rune
-------- Original Message --------
Subject: Re: [Wikitech-l] create new pages by accessing the database
directly FAILS
Date: Sat, 07 Jan 2006 21:50:57 +0100
From: edward hage <edward(a)confirmat.nl>
To: Wikimedia developers <wikitech-l(a)wikimedia.org>
References: <A8338714-BC86-476A-9597-E48DF2F109D4(a)mcneilco.com>
<43C022FB.5090501(a)confirmat.nl>
<e92136380601071224nbd4a98es(a)mail.gmail.com>
What I want to make is a site for scuba divers. When they log in they
get a special Userpage (a kind of homepage for every user). On that page
they have the possibility to make a new dive entry. If they click on that
link they go to a special page where they can fill in a table with
depth, time in the water, pressure, and so on. When they submit that data,
a new page is created with a nice layout (I use a template in MediaWiki)
with all the dive data. Also, a new link to that new page is automatically
created on the Userpage when the dive entry is submitted. So after a few
logged dives a complete list is created.
Because it is a wiki they can make additional pages about dive locations
(like travelwiki), upload underwater pictures, and create pages about
marine biology, etc.
Any pointers are welcome.
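A minimal sketch of the "proper special page" route Rob describes below,
in the 1.5 style; the names LogDive and wfSpecialLogDive and the form
fields are hypothetical:

  <?php
  # LogDive.php -- include from LocalSettings.php
  if ( !defined( 'MEDIAWIKI' ) ) exit;

  $wgExtensionFunctions[] = 'wfSetupLogDive';

  function wfSetupLogDive() {
      # register Special:LogDive without creating a new entry point
      SpecialPage::addPage(
          new SpecialPage( 'LogDive', '', true, 'wfSpecialLogDive', false ) );
  }

  function wfSpecialLogDive() {
      global $wgOut, $wgRequest;
      if ( $wgRequest->wasPosted() ) {
          # build the dive page from the submitted form data
          $title = Title::newFromText( 'Dive/' . $wgRequest->getText( 'divename' ) );
          if ( !is_null( $title ) && !$title->exists() ) {
              $article = new Article( $title );
              $article->insertNewArticle( $wgRequest->getText( 'divetext' ),
                  'New dive entry', false, false );
              $wgOut->redirect( $title->getFullURL() );
          }
      } else {
          $wgOut->addHTML( '...depth/time/pressure form posting back to this page...' );
      }
  }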
After I solve the issue of making the new pages flawless, I have to add
a privilege letting users protect their own pages (so their own dive
data is not molested by other users). I saw on mediawiki.org that some
patches and extensions were already written for that, so that issue is
next on my agenda.
Also, I am very interested to hear why new entry points cannot be made.
Greetings, Edward
Rob Church wrote:
>A special page should NEVER define a new entry point, ever. Create it
>as a proper special page, and use the proper, built-in classes, member
>functions, and where needed, the globals.
>
>Telling us what you're trying to *do* would also be a big step, so we
>can give pointers.
>
>
>Rob Church
>
>On 07/01/06, edward hage <edward(a)confirmat.nl> wrote:
>
>
>>Hello John, Hendrik,
>>
>>I also made a SpecialPage, but I did not post back to the page itself; I
>>posted to a new page where I defined some basic stuff by:
>>define( 'MEDIAWIKI', true );
>>require_once( '../includes/Defines.php' );
>>require_once( '../LocalSettings.php' );
>>require_once( '../includes/Setup.php' );
>>
>>Basically I made a new entry point. I hope this does not cause a
>>safety issue! (I don't know the do's and don'ts about this.)
>>
>>
>>
--
CONFIRMAT Mechatronics & product development
Veluwehof 71
5709 KJ Helmond
http://www.confirmat.nl
Thanks... That was enough to steer me in the right direction. I now
remember that in December I tried the upgrade but had some problems
with my custom skin, so I just stuck with my old version. This time
when I did the upgrade, it saw that the new tables were already there and
just skipped them.
-----Original Message-----
From: mediawiki-l-bounces(a)Wikimedia.org
[mailto:mediawiki-l-bounces@Wikimedia.org] On Behalf Of Brion Vibber
Sent: Wednesday, January 11, 2006 9:35 AM
To: MediaWiki announcements and site admin list
Subject: Re: [Mediawiki-l] Upgrade OK, but showing old data
Anderson, Patrick G wrote:
> Everything looked OK at first glance. However, when I browsed some of
> the pages, I noticed they were several months old. I checked the
> "Recent Pages" and it showed that I made edits in the past week, but
> when I clicked on the diff link and looked at those, it displayed old
> data as well. Finally I used SQLyog to open the database directly and
> look at the tables. In the wiki_cur table I can see the latest edits
> and it has the correct data there.
>
> Is there something obvious I am missing?
Don't look at the 'cur' table; it's obsolete, left over from your
upgrade.
Look in 'page', 'revision', and 'text'.
Also check for duplicate cur entries if you started with an old version
of the schema without the unique title index; this could perhaps cause
the wrong duplicate to be picked during the upgrade.
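A quick maintenance-style sketch of that duplicate check (assuming the
"wiki_" table prefix used above, run from a script that loads the usual
MediaWiki setup files):

  $dbr = wfGetDB( DB_SLAVE );
  $res = $dbr->query( 'SELECT cur_namespace, cur_title, COUNT(*) AS n
      FROM wiki_cur GROUP BY cur_namespace, cur_title HAVING n > 1',
      'checkCurDuplicates' );
  while ( $row = $dbr->fetchObject( $res ) ) {
      # each row printed here has more than one cur entry for one title
      print "{$row->cur_namespace}:{$row->cur_title}: {$row->n} rows\n";
  }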
-- brion vibber (brion @ pobox.com)
I tried to upgrade my MediaWiki site from 1.4.1 to 1.5.5.
Luckily, I decided to test before I did this on my live site. Here is
what I did:
1. Set up Apache, MySQL, and PHP on my local workstation.
2. Imported live database into my local workstation's MySQL.
3. Copied MediaWiki 1.5.5 files to local Apache htdocs directory.
4. Ran the install/config script and pointed it at the local copy of the
database. The script saw the database tables and performed an upgrade on
the tables as needed.
5. Copied LocalSettings.php to main directory.
6. Opened my upgraded local MediaWiki site.
Everything looked OK at first glance. However, when I browsed some of
the pages, I noticed they were several months old. I checked the "Recent
Pages" and it showed that I made edits in the past week, but when I
clicked on the diff link and looked at those, it displayed old data as
well. Finally I used SQLyog to open the database directly and look at
the tables. In the wiki_cur table I can see the latest edits and it has
the correct data there.
Is there something obvious I am missing?
Inspired by a post on wikitech-l, I've created a simple but powerful
(and, I hope, useful) extension which allows sysops to maintain a
blacklist of usernames through a system message.
The list supports pattern matching through regular expressions, and
the documentation should be sufficient for wiki owners at most
levels of experience. With luck, this will help cut down on the
offensive usernames found on too many wikis.
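For the curious, a sketch (not the extension's actual code) of how a
one-pattern-per-line system message might be applied to a proposed name:

  # true if $username matches any line of MediaWiki:Usernameblacklist
  function wfMatchesUsernameBlacklist( $username ) {
      $lines = explode( "\n", wfMsgForContent( 'usernameblacklist' ) );
      foreach ( $lines as $line ) {
          $pattern = trim( $line );
          if ( $pattern === '' ) {
              continue;
          }
          # @ suppresses warnings from malformed patterns in the message
          if ( @preg_match( "/{$pattern}/i", $username ) ) {
              return true;
          }
      }
      return false;
  }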
It's fresh into CVS - extensions module, UsernameBlacklist folder.
Feedback welcome.
Rob Church
Hello,
I've been trying to create a special page for research papers that creates
multiple articles (e.g. a separate article for the author(s), the paper
summary article, and several categories that tie them all
together).
The main code for saving an article uses Article::insertNewArticle() to
insert the article text and Title::exists() to determine whether the
article exists before I save anything. I've looked at the EditPage class
to mimic its way of saving pages, which appears to be pretty straightforward.
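For reference, a minimal sketch of the save path described above
(1.5-era API; the page name and text are hypothetical):

  $title = Title::newFromText( 'Research papers/Example paper' );
  if ( !is_null( $title ) && !$title->exists() ) {
      $article = new Article( $title );
      # insertNewArticle( $text, $summary, $isminor, $watchthis )
      $article->insertNewArticle( "Summary text...\n[[Category:Research papers]]",
          'Created from paper entry form', false, false );
  }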
The pages are created as I expected, but the categories are messed up
(i.e. some articles appear in categories they shouldn't be
in, and other articles don't appear in categories where they should).
Am I trying to do something that MediaWiki is not designed to do?
Any help would be greatly appreciated.
I'm using MediaWiki 1.5.
Thanks,
-Pat.
--
============================
Patrick Day
uf069(a)victoria.tc.ca