I am experiencing exactly the same problem as 'Shawn' described below. I have
been struggling with this for a long time, but could not find a solution. Can
anybody tell me why this happens?
Regards
Umesh
contact: umeshtmi1(a)gmail.com
Strange - only one account shows up - a standard user I created to
explore. Nothing under the sysop or bureaucrat groups. Is it possible
that when filling out the configuration form multiple times (I had to
tweak some SQL settings in between) I left the sysop field blank? Do I
have to start over? If so, it's not a big deal - this wiki is for me
to learn on before deploying...
-Shawn
On Jul 6, 2007, at 12:37 PM, Rob Church wrote:
> On 06/07/07, Shawn Allgeier <allgeier(a)gmail.com> wrote:
>> I'm new to wikis in general. Just got MW up and running on a machine
>> running Mac OS 10.4. When I used the setup script there was a field
>> for WikiSysop. I don't seem to be able to log in using this
>> username. How does one admin a wiki? There is some high level user
>> who has extended powers, right? I also don't see a way to remove a
>> file that has been uploaded?
>
> When you install MediaWiki, you provide two sets of credentials; the
> first is the initial administrator/bureaucrat account details, and the
> default username for this is "WikiSysop", which you may have changed.
> To check this username, visit Special:Listusers on your wiki and look
> for the user with "sysop, bureaucrat" permissions.
>
> If you've forgotten the password for this user, then you can use the
> maintenance/changePassword.php script to reset it. Once logged in as
> this user, you'll be able to use all the administrative functions,
> plus manage user rights, etc.
>
> As to the image deletion question; when you are logged in as a sysop,
> you'll notice deletion links on image description pages.
>
>
> Rob Church
>
> _______________________________________________
> MediaWiki-l mailing list
> MediaWiki-l(a)lists.wikimedia.org
> http://lists.wikimedia.org/mailman/listinfo/mediawiki-l
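For reference, a typical invocation of the maintenance script Rob mentions
looks roughly like this (a sketch only; run it from the wiki's installation
directory, and note that option names can differ between MediaWiki versions -
check the script's own help output):

```shell
# Hypothetical example: reset the password of the "WikiSysop" account.
# "newSecret123" is a placeholder; substitute your own password.
php maintenance/changePassword.php --user=WikiSysop --password=newSecret123
```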
Hi,
Given a $wgArticle object, how can I get the *rendered* article
text? I can't find any method/function that returns that...
getContent just returns the article as wiki markup.
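One possible sketch, assuming MediaWiki ~1.10 and code running inside the
wiki's request context (the globals and class names below are from that era's
API; verify them against your version before relying on this):

```php
// Sketch only: render an Article's wikitext to HTML via the global parser.
// Assumes the MediaWiki ~1.10 globals $wgParser and $wgUser are in scope.
$parserOptions = ParserOptions::newFromUser( $wgUser );
$parserOutput = $wgParser->parse(
    $wgArticle->getContent(),  // the raw wikitext
    $wgArticle->getTitle(),    // context title, used for link resolution
    $parserOptions
);
$html = $parserOutput->getText();  // the rendered HTML
```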
I am looking to replace a large number of pages with a newer version from
outside of my wiki. My plan is to delete the existing pages and then import
the new ones.
With the number of pages to be deleted (~20,000), it is possibly going to be
easier to do it through SQL.
The thing is that there are numerous tables with entries relating to the
affected pages. Is there an easy way to remove pages en masse? Unfortunately,
the 20k pages in question represent only a small portion of the wiki, so I
can't simply empty the tables.
Each of the pages in question contains the same phrase, which is itself
unique; i.e. I am happy to delete all pages that include a specific phrase.
This will only, I think, remove entries from the searchindex and text
tables. Is there a way to detect all related entries and remove them for
these pages?
Many thanks,
Paul
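As a starting point, here is a sketch of how the affected pages might be
identified in the MediaWiki 1.5+ schema ('UNIQUE PHRASE' is a placeholder,
and your installation may use a table prefix). Note that actually deleting
them cleanly means touching revision, text, searchindex, link tables and so
on, so the maintenance script deleteBatch.php (which reads a file of page
titles and deletes through the normal code path) may be a safer route than
raw SQL:

```sql
-- Sketch: find pages whose *current* revision text contains the phrase.
SELECT p.page_id, p.page_namespace, p.page_title
FROM page AS p
JOIN revision AS r ON r.rev_id = p.page_latest
JOIN text     AS t ON t.old_id = r.rev_text_id
WHERE t.old_text LIKE '%UNIQUE PHRASE%';
```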
Hi,
On the same host that runs my MediaWiki site, I produce graphics
about performance data. To create them, I call a CGI script, and it
produces an HTML page with only the graphics.
<body>
<IMG ALIGN=CENTER SRC=/tmp/gnuplot.15898>
</body>
Is there any way I could define a call to this CGI script and embed the
generated graphics/JPEG in a MediaWiki page? This would be a kind of
dynamic page.
Any feedback on ways to achieve this would be greatly appreciated.
Thank you
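One possible approach, sketched under two assumptions: that the CGI script
can be changed to emit the image itself (Content-Type: image/jpeg) instead
of an HTML wrapper, and that embedding external images is acceptable on this
wiki. MediaWiki can then inline the URL directly once external images are
enabled in LocalSettings.php:

```php
// LocalSettings.php fragment (sketch): render bare image URLs in
// wikitext as inline <img> tags instead of plain links.
$wgAllowExternalImages = true;
```

With that set, pasting the CGI script's URL into a page (a hypothetical
example: http://example.org/cgi-bin/perfgraph.cgi) embeds a freshly
generated graphic on every page view.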
Hello,
I am using MediaWiki 1.10 with Wikipedia's DB. Whenever I try making a
search I receive no output. Can you please help me?
Thank you for your time,
Waleed A. Meligy
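Not a definitive diagnosis, but one common cause of empty search results
after importing an external database is an empty searchindex table, since
importers often skip it. MediaWiki ships a maintenance script to rebuild it
(sketch below; run from the wiki's installation directory, and expect it to
take a long time on a Wikipedia-sized database):

```shell
# Sketch: rebuild the fulltext search index from the imported page text.
php maintenance/rebuildtextindex.php
```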
Hi all,
I tried to export the article Intelligent Design (all revisions) from
Wikipedia. There's a limit of at most 100 revisions for Special:Export,
but the article has many more than 100 revisions. The problem is
that I'd like to retrieve a list of the LATEST revisions, but the
export function returns only the FIRST 100 revisions... :(
I presume that most people would like to retrieve the *latest* revisions
if the number of existing revisions exceeds the maximum number allowed
for export.
http://meta.wikimedia.org/wiki/XML_export#Exporting_the_full_history
May I suggest that
1. (default) - the export list would contain the latest revisions
(current(n)...-100)
2. (option) - the user could check a check box in order to retrieve
the earliest revisions (1..100)
In this way one could get up to 200 revisions for a title (with a gap in
the middle though, if there's more than 200 revisions altogether).
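The proposed selection rule is simple; here is a small sketch of what the
two modes would return (in Python purely for illustration - Special:Export
itself is PHP), where `revisions` is the full chronologically ordered
revision list:

```python
def select_revisions(revisions, limit=100, earliest=False):
    """Pick the revisions to export when more than `limit` exist.

    Default (the proposed behaviour): the latest `limit` revisions.
    earliest=True (the opt-in checkbox): the first `limit` revisions,
    which is what Special:Export currently returns.
    """
    if earliest:
        return revisions[:limit]
    return revisions[-limit:]

# Quick illustration with 250 dummy revision numbers:
revs = list(range(1, 251))
latest = select_revisions(revs)                # revisions 151..250
first = select_revisions(revs, earliest=True)  # revisions 1..100
```

Combining both calls yields at most 200 revisions per title, with a gap in
the middle when there are more than 200 altogether, exactly as described
above.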
Regards,
// Rolf Lampa
> Gary wrote: ... What I think the original
> poster means is the links displayed below
> the title, like Page | Subpage. Simple as that.
Thanks for sharing your experience, Gary.
... but I get NO "Page | Subpage" displayed anywhere, and certainly not
below the title anywhere in my MediaWiki v1.10.0.
Does anyone have a link to an example of what that looks like?
Also, Daniel, do you have any extensions installed? Let us know what
[[special:version]] says.
Thanks.
- Peter Blaise
I'm new to wikis in general. Just got MW up and running on a machine
running Mac OS 10.4. When I used the setup script there was a field
for WikiSysop. I don't seem to be able to log in using this
username. How does one admin a wiki? There is some high level user
who has extended powers, right? I also don't see a way to remove a
file that has been uploaded?
Thank you,
-Shawn
I'm having trouble with uploads.
We're running MediaWiki 1.5 on Debian oldstable.
When I upload a file, it seems to upload fine. I can access the
description page. However I cannot access the file.
I get the following error:
The requested URL /images/e/eb/Nerdcore(small).jpg was not found on this
server.
From LocalSettings.php:
$wgScriptPath = "";
$wgUploadPath = "{$wgScriptPath}/images";
$wgUploadDirectory = "{$IP}/images";
$wgEnableUploads = true;
$wgFileExtensions = array('png', 'gif', 'jpg', 'jpeg', 'doc', 'xls', 'pdf');
$wgUseImageResize = true;
Permissions on the images directory are currently 777.
I can't seem to figure out what's going on. Any help would be greatly
appreciated.
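Two diagnostic checks that might narrow this down (the wiki root path and
hostname below are hypothetical placeholders): confirm the file really
landed on disk, and confirm the web server serves the /images path at all.
The parentheses in the filename can be written URL-encoded as %28/%29 to be
safe when testing with a client like curl:

```shell
# Did the upload actually write the file? (adjust the wiki root path)
ls -l /var/www/wiki/images/e/eb/

# Does the web server serve it at the URL MediaWiki generates?
curl -I "http://localhost/images/e/eb/Nerdcore%28small%29.jpg"
```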
thanks,
--
- Matt Miller -
Solutions for Progress
728 South Broad Street
Philadelphia, PA 19146
215-701-6108 (v)
215-972-8109 (f)
Hi all,
Problems with mwdumper
Mwdumper (http://www.mediawiki.org/wiki/Mwdumper) crashes (at around 35,000
pages) when processing the en-WP dump as of 2007-05-27, with the following error:
root@xubuntu-svn:/home/admin/Desktop# jdk1.5.0_12/bin/java -jar mwdumper.jar
--format=sql:1.5 enwp-200707 > enwp-200707.sql
...
32,000 pages (373.893/sec), 32,000 revs (373.893/sec)
33,000 pages (373.206/sec), 33,000 revs (373.206/sec)
34,000 pages (377.979/sec), 34,000 revs (377.979/sec)
35,000 pages (377.851/sec), 35,000 revs (377.851/sec)
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 2048
at org.apache.xerces.impl.io.UTF8Reader.read(Unknown Source)
at org.apache.xerces.impl.XMLEntityScanner.load(Unknown Source)
at org.apache.xerces.impl.XMLEntityScanner.skipChar(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at javax.xml.parsers.SAXParser.parse(SAXParser.java:375)
at javax.xml.parsers.SAXParser.parse(SAXParser.java:176)
at org.mediawiki.importer.XmlDumpReader.readDump(Unknown Source)
at org.mediawiki.dumper.Dumper.main(Unknown Source)
root@xubuntu:/home/admin/Desktop#
More info about the environment:
Java version:
root@xubuntu:/home/admin/Desktop# sudo ./jdk1.5.0_12/bin/java -version
java version "1.5.0_12"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_12-b04)
Java HotSpot(TM) Client VM (build 1.5.0_12-b04, mixed mode, sharing)
OS: GNU/Linux Xubuntu 6.10
Kernel release: 2.6.17-10-generic,
Kernel version: #2 SMP Fri Oct 13 18:45:35 UTC 2006
Any ideas anyone?
Regards,
// Rolf Lampa
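For comparison with the failing run above, the commonly documented way to
run mwdumper streams the compressed dump through it directly (a sketch; the
dump file name, database name, and user are placeholders). This won't
necessarily avoid the Xerces error, but it rules out a truncated or
corrupted on-disk decompression of the XML as the cause:

```shell
# Sketch: decompress on the fly and load straight into MySQL.
bzip2 -dc enwiki-20070527-pages-articles.xml.bz2 \
  | java -jar mwdumper.jar --format=sql:1.5 \
  | mysql -u wikiuser -p wikidb
```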