Is there nothing that can help me?
Please try to help me.
------------------------------
I have a problem with a Wikipedia mirror using MediaWiki 1.7.1.
I have installed a Wikipedia dump, for use with a search engine, following the instructions here:
http://en.wikipedia.org/wiki/Wikipedia:Database_download#Images_and_uploade…http://meta.wikimedia.org/wiki/Data_dumps
Where is the problem? The problem is with the images. I cannot download the images of Wikipedia (the image dump is 76 GB...), so at the moment, on every page, the name of each image appears as a link pointing to the upload URL where the image could be stored, but nothing is stored at that link.
For example :
At http://encyclopedia.meta99.com/wiki/Pets I have a box with the words "Image:Man and dog.jpg" that points to the link http://encyclopedia.meta99.com/wiki?title=Special:Upload&wpDestFile=Man_and….
At http://encyclopedia.meta99.com/wiki/George_washington I have a box with the words "Image:Lawrence Washington.jpg" that points to the link "http://encyclopedia.meta99.com/wiki?title=Special:Upload&wpDestFile=Lawrenc…".
I only need to remove these strange words relating to the images that I cannot use.
I have tried many settings in LocalSettings.php to disable these image links, but I have failed. What is the way to disable the words above, relating to the images? I have not found any instructions in the MediaWiki documentation about disabling them. Do you have a setting, instructions or links that would solve this?
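As far as I can tell there is no single LocalSettings.php switch that simply hides these red upload links for missing images. One workaround is to strip the image markup out of the wikitext itself, either in the dump before importing or in the page text afterwards. Below is a deliberately simple, hypothetical Python sketch: the regex only handles the plain [[Image:...]] form and does not cope with nested [[links]] inside captions.

```python
import re

# Matches the common [[Image:Name.jpg|...]] wikitext form.
# Note: this deliberately does NOT handle nested [[links]] inside
# image captions; a real importer would need a small parser for that.
IMAGE_LINK = re.compile(r"\[\[Image:[^\[\]]*\]\]")

def strip_images(wikitext: str) -> str:
    """Remove simple image inclusions from a page's wikitext."""
    return IMAGE_LINK.sub("", wikitext)

if __name__ == "__main__":
    sample = "A dog. [[Image:Man and dog.jpg|thumb|A man and his dog]] More text."
    print(strip_images(sample))
```

Run over every page's text during the import, this removes the "Image:..." boxes entirely instead of leaving dead upload links.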
I have also been trying to squeeze a hierarchical system into MW.
Actually, it is not a full hierarchy; there are only two levels, so
you could call it "content subspaces" -- distinct groups of pages that
have a mostly distinct group of users.
== Outline of problem ==
The wiki I am setting up is for local communities, so they will mostly
only be interested in the subspace of pages that relates to their
local region. A distinct group of people will want to maintain that
subspace: look for recent changes, check out new pages, and possibly
search within it. The wiki will use a specific content framework such
that there would never be more than about 30 pages within one region
(excluding Talk pages).
I guess the same problem would be faced by cities.wikia.com if one
city grew to more than a few pages.
I am trying to decide between two options:
1. Use a separate MediaWiki for each local region. This would be a
setup similar to wikia.com, with some shared templates, help pages and
navigation.
__Pros__
* "Recent changes" applies to only one region, as most people would want it to.
* [[Links]] go to pages within the subspace.
* No need to use naming conventions or disambiguation pages to
avoid name conflicts.
__Cons__
* There is not much interaction between regions. Since they are on
different wikis with different logins, one cannot look at "recent
changes" or "requests for comment" across the whole site, so we lose the
benefit of experienced users from one region helping small/emerging
communities in another.
* Software setup would be difficult, particularly for things like
shared templates.
2. Use subpages to organise content. There would be a main page for
each region, as a normal article. All other pages relating to that
region would be subpages of it.
__Pros__
* All on one wiki, so all well integrated and allows interaction
across communities.
* Pages within the subspace have an automatic link back to the main
page, so new pages aren't "dead ends".
* No need to use disambiguation pages to avoid name conflicts (using
subpages is essentially a naming convention).
__Cons__
* Can't easily see "recent changes" just within a subspace.
* New pages for the subspace must be subpages, e.g.
"Region_name/Some_page", and anyone who creates a page must remember
that.
* To link to a page in the subspace you have to type
[[Region_name/Some_page|Some_page]] (is there a shortcut way to do
this?).
The usual solution (as on Wikipedia etc.) of having everything in one
namespace, using categories and disambiguation pages, would not, I think,
work well in this case. There will be many pages within each
subspace that share a common name, such as "Erosion", "Population" or
"Water quality", so the disambiguation pages could become very large and
the naming conventions confusing.
As for namespaces, there are too many local regions to set up a
namespace for each.
== Technical challenges ==
I prefer option 2 (above): using subpages to organise content. I have
set up a prototype in that way. As an interim method for seeing
"recent changes" within a subspace, I have put links to all subpages
on their main (parent) page, and then used "related changes". However,
this is not ideal:
* it does not include changes to the parent page itself;
* it breaks if links are removed from the main page, pages are moved, etc.;
* it does not include talk pages unless they are also linked from the main page.
So, to see "recent changes" within a subspace (i.e. all subpages of a
specified page, including the parent page itself), I think a new
special page would be required. Also, a special page to see a list of
all subpages would be useful (a subset of Special:Allpages).
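Such a special page would essentially run one query against MediaWiki's recentchanges table, selecting rows whose title is the parent page or one of its subpages. The sketch below is hypothetical and uses SQLite as a stand-in for MediaWiki's MySQL schema; the column names (rc_namespace, rc_title, rc_timestamp) match the real recentchanges table, but the data and everything else are illustrative.

```python
import sqlite3

# Stand-in for MediaWiki's recentchanges table (real columns include
# rc_namespace, rc_title and rc_timestamp).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recentchanges "
             "(rc_namespace INTEGER, rc_title TEXT, rc_timestamp TEXT)")
rows = [
    (0, "Region_name",            "20060826120000"),
    (0, "Region_name/Erosion",    "20060826130000"),
    (0, "Region_name/Population", "20060826140000"),
    (0, "Other_region/Erosion",   "20060826150000"),
]
conn.executemany("INSERT INTO recentchanges VALUES (?, ?, ?)", rows)

def subspace_changes(parent):
    """Recent changes on the parent page itself or any of its subpages."""
    cur = conn.execute(
        "SELECT rc_title, rc_timestamp FROM recentchanges "
        "WHERE rc_namespace = 0 AND (rc_title = ? OR rc_title LIKE ?) "
        "ORDER BY rc_timestamp DESC",
        (parent, parent + "/%"))
    return cur.fetchall()

for title, ts in subspace_changes("Region_name"):
    print(ts, title)
```

The same "title = parent OR title LIKE parent/%" condition against Special:Allpages data would give the subpage listing as well.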
Does this sound sensible and feasible?
Thanks
Felix
On 8/26/06, Morten Blaabjerg <morten(a)crewscut.com> wrote:
> Yusuf, it sounds to me like you're trying to squeeze a hierarchical system
> into MW. A scheme like the one you describe may work, but it risks
> restraining yourself and your users too much, and not really taking
> advantage of the freedom a wiki can give you. There is basically no
> "correct" way of organizing information in a wiki - it is up to you and your
> users to organize the content as you see fit.
>
> If you want a more hierarchical organization of content, you may be better
> off with another CMS or wiki engine which is better set up for this kind of
> thing. If you want your site to be a wiki, I'd suggest you simply throw
> everything together in the main namespace and use categories like Java, C,
> C++ etc. to tie related pages together. This will be the least restrained
> option, leaving you and your users with the greatest freedom to organize
> things as you go.
>
> You may use custom namespaces as you suggest, but namespaces are not an
> ideal way of organizing content; they are primarily meant to differentiate
> between different types of content and functionality, i.e. between content
> pages, user pages, discussion pages, system pages, etc.
>
> Best wishes,
> Morten :-)
>
--
Felix Andrews / 安福立
Beijing Bag, Locked Bag 40, Kingston ACT 2604
Building 48A, Linnaeus Way, The Australian National University ACT 0200
Australia
http://www.neurofractal.org/felix/
mailto:felix@nfrac.org
voice:+86_1051404394 (in China)
mobile:+86_13439575331 (in China)
mobile:+61_410400963 (in Australia)
xmpp:foolish.android@gmail.com
#xmpp:floybix@jabber.cn (in China)
#xmpp:foolish@jabber.org.au (in Australia)
3358 543D AAC6 22C2 D336 80D9 360B 72DD 3E4C F5D8
From: "Rob Church" <robchur(a)gmail.com>
>
[ ... ]
Really, that Special:AllpagesRaw would help. What was your opinion of it?
I'm downloading a few wikis not maintained by me; I'm not registered
as an author on them.
The export help said to check the pages from Special:Allpages.
I want to do that automatically, and right now that is not so simple.
I have started writing C code to download the Allpages listing, but
it is not a simple task. I can use trickery to extract
all the pages that way, but it is trickery of a questionable sort.
(E.g., first search for "<hr", then the pages are in the next table.)
My wikiget will download both the HTML and XML versions
of the pages, and the images, small and large. The URLs will be
converted to relative ones. The filenames will be converted from
"Category:Functions" to "Category-Functions.html", because
lynx, galeon and mozilla do not like ":" characters in filenames.
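For comparison, here is what that scraping-plus-renaming step might look like in Python rather than C. This is only a sketch of the heuristic described above (the table after the first "<hr" holds the page links); the sample HTML is made up, and a real Special:Allpages page has more structure around it.

```python
import re

def sanitize_filename(title):
    """Turn a page title like "Category:Functions" into a filename
    that lynx/galeon/mozilla accept (no ":" characters)."""
    return title.replace(":", "-") + ".html"

def extract_titles(allpages_html):
    """Heuristic title extraction from Special:Allpages HTML:
    take the table that follows the first <hr>, then pull the
    link titles out of it (the '<hr, then next table' trick)."""
    after_hr = allpages_html.split("<hr", 1)[1]
    table = re.search(r"<table.*?</table>", after_hr, re.S).group(0)
    return re.findall(r'title="([^"]+)"', table)

# Made-up miniature of an Allpages page, just to exercise the heuristic:
html = ('<p>header</p><hr /><table><tr>'
        '<td><a href="/wiki/Category:Functions" title="Category:Functions">'
        'Category:Functions</a></td></tr></table>')
print(extract_titles(html))
print(sanitize_filename("Category:Functions"))
```

As the original post says, this kind of HTML screen-scraping is fragile trickery; something like the proposed Special:AllpagesRaw would make it unnecessary.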
Juhana
--
http://music.columbia.edu/mailman/listinfo/linux-graphics-dev
for developers of open source graphics software
If I wanted to have a single page (Special:Userlogin) use a different
skin, how would you go about doing that? I actually don't need it to
have a skin, but I would like to be able to customize the display of
all displayed elements, so another skin might be easiest.
I was looking at SkinTemplate, and there are some skin variables there
but when they are changed they seem to be ignored. There is also some
mention of child classes being able to override the settings but I
don't know how that would work.
As a high-level description, I'm trying to make MediaWiki display a
fairly empty login page to everyone who isn't already logged in,
separate from the general site layout. This is a wiki where
only logged-in users can contribute, and I would rather not have the
login page display any content (sidebar etc.).
Any pointers would be great :D
Judson
Hello,
we very often get an HTTP 403 (Forbidden) error in the wiki (1.7.1);
it occurs most often when the save button is clicked.
This happens randomly on different pages, but very frequently.
There are no restrictions set for the users.
Peter
--
Newsreader: http://mesnews.net/index-gb.php
Deutsche Hilfedatei: http://www.lastwebpage.de/download/mesnews-de.zip
This must be quite simple, but I cannot find the right information.
I would like to use images from my own uploads directory ([[Image:
image.jpg]]) together with images from Commons ([[commons:Image:Image.jpg]]).
Is that possible?
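For reference, MediaWiki of that era has a shared-repository feature for roughly this: the local uploads directory is checked first, and a missing file falls back to a shared directory such as a local Commons mirror, so the plain [[Image:...]] syntax works for both. A hedged LocalSettings.php sketch; the variable names are real settings, but the paths below are made-up placeholders.

```php
# LocalSettings.php -- illustrative values only
$wgEnableUploads = true;            # keep using the local uploads directory
$wgUseSharedUploads = true;         # fall back to a shared repository
# Made-up placeholder locations for a shared/Commons image mirror:
$wgSharedUploadPath = 'http://example.org/shared/images';
$wgSharedUploadDirectory = '/var/www/shared/images';
```

Note that this serves the shared files from your own mirror; the [[commons:Image:...]] form is just an interwiki link to the Commons site itself.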
Hello all,
I'm trying to edit the page that users see when they go to create an
account.
I want them to use their Windows ID when creating an account in hopes
that someday we'll have this running with LDAP authentication working.
I changed MediaWiki:Username to "Username (Windows ID):", but I still
only see "Username:" on the login page and on the create
account page.
Where can I change that text that says "Username:" on those two pages?
Also, on the create account page, where it says "Already have an account?
Log in.", I'd like to add a line under it instructing them to use their
Windows user ID. I have been looking all over and can't find the places
to do this.
Thanks in advance,
~Eric
Hi All,
I want to run the same wiki on my ISP's web server and on a local machine. One
reason is having a functional backup; the other is that editing may be more
convenient on the local machine than on the ISP's server.
To copy the wiki from the local machine to the server, I currently dump the DB
and import it on the server, and I just copy all the files to their remote
directories.
Then I adjust LocalSettings.php to account for the different paths on the two
machines. In particular, I change
$IP = "/var/www/all-inkl.com/karba.ch/wiki";
$wgScriptPath = "/all-inkl.com/karba.ch/wiki";
to
$IP = "/www/htdocs/*/karba.ch/wiki";
$wgScriptPath = "/wiki";
(the asterisk hides my username).
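A small script can make that path rewrite repeatable instead of a manual edit. This is a hypothetical sketch, not part of any MediaWiki tooling; the two path pairs are simply the ones from the example above.

```python
import re

# Local-machine value -> server value, taken from the example above.
REWRITES = {
    '$IP': ('/var/www/all-inkl.com/karba.ch/wiki', '/www/htdocs/*/karba.ch/wiki'),
    '$wgScriptPath': ('/all-inkl.com/karba.ch/wiki', '/wiki'),
}

def adapt_settings(text):
    """Rewrite $IP and $wgScriptPath assignments in a LocalSettings.php
    string from the local-machine values to the server values."""
    for var, (local, remote) in REWRITES.items():
        pattern = re.escape(var) + r'\s*=\s*"' + re.escape(local) + '"'
        text = re.sub(pattern, '%s = "%s"' % (var, remote), text)
    return text

local_conf = ('$IP = "/var/www/all-inkl.com/karba.ch/wiki";\n'
              '$wgScriptPath = "/all-inkl.com/karba.ch/wiki";\n')
print(adapt_settings(local_conf))
```

Running it as part of the dump-and-copy step removes one manual edit from each deployment.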
This works well except that some image links are still the old ones from the
local machine... There seems to be some magic going on, since occasionally
they change to the correct links...
What can solve the problem? Do I have to run a certain file from the
maintenance area before I dump the local DB, to clear some cache?
Cheers,
- Moritz
--
m:o