Although MediaWiki has a good history facility, it would be great if the
text in a page were flagged to indicate changes since the last time that
page was viewed. This could range from revision bars in the margin to
marking adds/changes/deletes similar to Word change tracking. Are old
versions of a page saved by MediaWiki or are only the changes recorded for
the revision history? Also, does MediaWiki record (a) when a user last
logged in or (b) when a user last viewed a page?
Thanks, Norbert
Does anyone know of anyone, or a website, that does cheap off-shore MediaWiki code development? I've searched the net but wasn't able to find anything. I'm looking for someone to develop extensions.
thanks
Eric
Hi again,
_in short_:
* did an installation of MediaWiki 1.9.x a few weeks ago;
* MediaWiki was working fine as expected and as used for several weeks;
* MediaWiki suddenly stopped working partly and became mostly
inaccessible a few days ago;
* currently, it has started to interfere with other services running on
the same host (requesting a page from MediaWiki kills apache2)
* can't find any helpful information in ../syslog, ../messages, the
apache2's error.log or access.log; enabled php.log (in php.ini) and
debug.log (in LocalSettings.php): the first logs nothing helpful, also,
the latter is not being created at all.
_in full_:
A few months ago, I installed MediaWiki 1.9, created some hundred nodes
and all appeared to work like a charm: Creating articles, deleting,
moving, or editing them, all reasonably fast and reliable.
Now, after several weeks of operation, the installation has partially
ceased to work: Entering an existing term in the Search box has no
effect - the browser claims to request the page, but nothing happens
(nothing = no page showing up, and no timeout appearing after several
minutes [hours]). Sometimes - but not reproducibly - the requested page
appears when I directly enter the page's full URL in the browser's
location bar. Creating new pages no longer seems to work at all. I
haven't changed the configuration in LocalSettings.php for weeks, and I
haven't noticed having changed anything else on the server (running
under Debian "Etch" since it was released).
However, when I do a "tail -f" on the webserver's logfiles, requests
from spiders and external browsers are rushing by with a pretty high
frequency (several files requested per second). What appears *really*
strange to me is that requests for pages from my own browser (Opera
9.2.1/Win) don't appear in access.log, as long as I use Opera, where I'm
logged in to the Wiki; when I switch to Firefox (2.0.0.4/Win, not logged
in to the Wiki), those requests start to appear in access.log.
Sticking with Firefox, entering page names of existing articles results
in delivering the article and showing up in access.log; however, when
entering non-existing article names (in order to create them), those
requests don't appear in access.log, and MediaWiki delivers no page
(search result etc.), like with Opera. It's also mostly the same result
if I enter a new term in the Search box or directly via the URL location
(.../w/index.php?title=New_Article&action=edit). Directly after
rebooting the server, sometimes creating new articles does work again
when directly entering the new article's URL, but only once.
Another LAMP application on the same host (Drupal) is working fine (more
or less - it's very slow, but still working), so basically
Apache, MySQL, and PHP seem to work.
The server hosts three domains, each with its own MediaWiki
installation; two of them were set up completely and checked for correct
operation a few weeks ago, but they were not in use yet. These
installations stopped working as well and show behaviour similar to that
described above.
A few days ago, requests to MediaWiki somehow started to interfere with
the other services' operation; e.g., when I'm requesting a page from
MediaWiki (e.g. Front Page) and concurrently try to save an article in
Drupal, the server neither delivers the FrontPage from MediaWiki
nor saves the article in Drupal (I'm waiting for approx. 10 minutes and
usually cancel the operation then by pressing "Esc" in the browser). At
this moment, all apache2 processes disappear from top/htop and no other
requests to either Drupal or MediaWiki are served in any way. As it
seems, MediaWiki has started, for some unknown reason, to kill apache2
in the setup I'm running.
Drupal starts responding again as soon as I do an "/etc/init.d/apache2
restart". Sometimes even this fails:
# /etc/init.d/apache2 restart
Forcing reload of web server (apache2)...(98)Address already in use:
make_sock: could not bind to address 0.0.0.0:80
no listening sockets available, shutting down
Unable to open logs
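When the restart fails with "Address already in use", something is usually still holding port 80. A possible check before starting apache2 again (a sketch, shown as a root-shell transcript; it assumes Debian's net-tools package provides netstat):

```shell
# netstat -tlnp | grep ':80 '
# killall apache2
# /etc/init.d/apache2 start
```

The first command shows which process is still bound to port 80; the second removes any orphaned apache2 workers that survived the failed restart.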
Apache2 seems to die completely:
# /etc/init.d/apache2 stop
Stopping web server (apache2)...httpd (no pid file) not running
It can then be restarted:
# /etc/init.d/apache2 start
Starting web server (apache2)....
After this, normal operation continues, as long as I don't request any
page from MediaWiki. If I request the FrontPage of any of those three
setups, this repeats all over again.
I tried to check the MediaWiki databases with phpMyAdmin
(2.9.1.1-Debian-3); even opening them takes over two minutes, checking
single small tables like "de_watchlist" results in an "OK", checking
large tables like "de_text" or even all at once does not finish after
several hours (also without phpMyAdmin giving a timeout). During these
"check" operations, CPU load according to "htop" fluctuates between 16%
and 97% (most of the time it's around or below 50%).
This is a pretty bizarre scene: In the background of my desktop is a
shell where requests logged to /var/log/apache2/access.log are
continuously floating by, while at the same time I'm unable to load or
even edit a single page on my own wiki... ;-/
/var/log/syslog and ../messages don't show anything unusual.
In LocalSettings.php, I enabled
$wgShowExceptionDetails = true;
$wgDebugLogFile = "debug.log";
However, a "debug.log" is not being created in the wiki directory (or
according to "updatedb; locate" anywhere else on the server).
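For what it's worth, a relative $wgDebugLogFile is resolved against the web server's working directory, and the Apache user may not be allowed to write there; an absolute path to a directory writable by www-data is more likely to produce a file. A sketch (the /var/log/mediawiki path is just an example, create it and adjust permissions first):

```php
// In LocalSettings.php - assuming /var/log/mediawiki exists and is
// writable by the Apache user (www-data on Debian):
$wgShowExceptionDetails = true;
$wgDebugLogFile = "/var/log/mediawiki/debug.log";
```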
Also, the php error log is pretty quiet; after doing an "apache2
restart", it reports once:
-- snip --
# tail -f php.log
[29-Jun-2007 01:39:12] PHP Warning: Module 'gd' already loaded in
Unknown on line 0
[29-Jun-2007 01:39:12] PHP Warning: Module 'mysql' already loaded in
Unknown on line 0
-- snip --
When entering some expression in the search box, php.log logs nothing
else (at least not for several minutes). While waiting for something to
happen, requests from external hosts are rushing by, also not causing
php errors.
/etc/php5/apache2/php.ini (I hope that's the right one) is set to:
...
error_reporting = E_ALL & ~E_NOTICE
display_errors = On
display_startup_errors = Off
log_errors = On
ignore_repeated_source = Off
report_memleaks = On
...
Any hints on where and how I could look to find out what's going wrong?
Thanks & regards, -asb
On our wiki we have clusters of pages with subtopics related to
specific genes, and an "On one page" that transcludes each of the
subtopic pages. This is based on the idea that some users prefer
clicking to get around, while others prefer scrolling.
What I'm struggling with is how to handle the Categories in these.
The subpages have categories, and in general I was thinking of
wrapping these in <noinclude> to keep the On one page from
duplicating the entry on each affected Category page. But I'd like
the Category links to show up at the bottom of the On one page.
I could hack the Category page (esp. since I'm already using my
hacked Category page extension) to exclude them from the sql query
that builds the categories. Anyone have any alternative ideas?
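One wikitext-level trick that may be an alternative to hacking the SQL: a category link with a leading colon renders as an ordinary link without adding the page to the category. A sketch (Category:Genes is a made-up example name):

```wikitext
<!-- On each subtopic page: membership only for the subpage itself -->
<noinclude>[[Category:Genes]]</noinclude>

<!-- On the "On one page" page: a visible link, not a membership -->
[[:Category:Genes]]
```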
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
Hi,
I'm trying to fix an old MediaWiki installation, where a few hundred
images are located at ./images; they are registered at Special:Files,
but the images themselves are missing. The paths do not point to
./images/{image.ext}, but to something like ./images/1/15/{image.ext}. I
moved a few images manually to the reported locations, which seems to
work fine, but I'd like to avoid wasting several hours to create missing
subdirectories etc.
I assume that when the wiki was started, $wgHashedUploadDirectory was
initially set to "false" and later changed to "true"; as I understand it,
there is no maintenance script to fix this. So I tried to utilize
./maintenance/importImages.php which should do the trick, I thought.
However, the script is smarter than me and checks for the existence of
the files; of course, this check results in "...could not be imported; a
file with this name exists in the wiki". As far as I can see, there is
no switch to override this check. In the script's code, I found a
routine starting with "# Check existence", which I tried to comment out;
this resulted in breaking the script since I don't speak PHP.
If this is the right approach, could someone tell me which lines I have
to comment out in importImages.php, or maybe even provide a quick patch?
If, however, this is complicated, don't bother; then I'll spend a few
hours fixing this manually.
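If moving a few images manually works, the same layout can probably be scripted. A rough sketch, assuming (as the observed ./images/1/15/... path suggests) that the subdirectories are the first one and two hex digits of the MD5 of the file name; back up ./images before trying it:

```shell
#!/bin/bash
# Sketch: move flat files into the hashed layout that MediaWiki uses
# when $wgHashedUploadDirectory is true. Assumption: subdirectory names
# are the first one and two hex digits of the MD5 of the file name,
# e.g. images/1/15/Image.ext.
move_to_hashed() {
    cd "$1" || return 1
    for f in *; do
        [ -f "$f" ] || continue                       # skip subdirectories
        hash=$(printf '%s' "$f" | md5sum | cut -c1-2) # first two hex digits
        dir="${hash:0:1}/$hash"                       # e.g. 1/15
        mkdir -p "$dir"
        mv -n "$f" "$dir/$f"                          # -n: never overwrite
    done
}

# Example call (the path is an assumption; adjust to your installation):
# move_to_hashed /var/www/wiki/images
```

Note that MediaWiki stores file names with spaces replaced by underscores, so the names on disk should already be in the form the hash is computed from.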
Thanks & regards, -asb
A few people had asked for a demo site for TableEdit, but I was
reluctant to open my wiki to anonymous editing. I still haven't done
that, but I have now changed the way TableEdit deals with
$wgUser->isAllowed so that instead of kicking users out right away, it
lets them see the TableEdit user interface and add rows, edit things,
etc. But it displays a big warning that changes won't be saved.
So... if anyone wants a taste of how it works:
http://ecoliwiki.net/colipedia/index.php/EcoliWiki:TableEdit_Sandbox
Jim
Hello dear MW users,
I'm writing this message because my wiki was attacked
by a web bot that replaced the content of a discussion
page with links to malicious websites.
This is the vandalized page:
http://web.math.unifi.it/beppolevi/index.php/Discussioni_utente:WikiSysop
and this is the page with info about that "user":
http://web.math.unifi.it/beppolevi/index.php/Speciale:Contributi/216.93.179…
All I know is its IP address, 216.93.179.108.
I tried to query the WHOIS database with the command
=================
whois -h whois.arin.net 216.93.179.108
=================
and I got
*********************************
OrgName: ServePath, LLC
OrgID: SERVEP
Address: 360 Spear Street.
Address: Suite 200
City: San Francisco
StateProv: CA
PostalCode: 94105
Country: US
ReferralServer: rwhois://rwhois.servepath.com:4321
NetRange: 216.93.160.0 - 216.93.191.255
CIDR: 216.93.160.0/19
NetName: SERVEPATH
NetHandle: NET-216-93-160-0-1
Parent: NET-216-0-0-0-0
NetType: Direct Allocation
NameServer: NS.SERVEPATH.COM
NameServer: NS1.SERVEPATH.COM
Comment:
RegDate: 2002-11-15
Updated: 2003-04-10
RNOCHandle: SN458-ARIN
RNOCName: NOC, ServePath, ServePath
RNOCPhone: +1-415-252-3600
RNOCEmail: noc(a)servepath.com
OrgTechHandle: SN458-ARIN
OrgTechName: NOC, ServePath, ServePath
OrgTechPhone: +1-415-252-3600
OrgTechEmail: noc(a)servepath.com
***************************************
The IP node is located in San Francisco
(right by the bridge, according to
Google Maps!!).
Of course I cannot be sure the cracker is
actually in California...
I tried to traceroute that IP with the command
=================
traceroute 216.93.179.108
=================
and I got the path that packets take between my
server (Florence, Italy) and San Francisco.
Of course, I'm interested in what is hidden behind
the San Francisco node. How can I discover it?
This is the traceroute output:
********************************
traceroute to 216.93.179.108 (216.93.179.108), 30 hops max, 40 byte packets
 1  10.0.0.2 (10.0.0.2)  8.861 ms  9.097 ms  10.847 ms
 2  FI1IE05R.wind.it (151.6.145.65)  8.943 ms  9.246 ms  *
 3  FIAR-B01-Ge2-0.30.wind.it (151.6.69.65)  10.060 ms  9.180 ms  9.980 ms
 4  151.6.7.29 (151.6.7.29)  15.232 ms  14.774 ms  15.806 ms
 5  212.245.228.62 (212.245.228.62)  15.541 ms  15.081 ms  15.737 ms
 6  so-8-1.car1.Milan1.Level3.net (213.242.65.29)  16.097 ms  16.010 ms  16.254 ms
 7  ae-4-4.ebr2.Paris1.Level3.net (4.69.133.134)  33.281 ms  44.139 ms  36.062 ms
 8  ae-5.ebr2.Washington1.Level3.net (4.69.132.113)  120.257 ms  118.710 ms  126.568 ms
 9  ae-92-92.csw4.Washington1.Level3.net (4.69.134.158)  123.717 ms  114.246 ms  123.178 ms
10  ae-94-94.ebr4.Washington1.Level3.net (4.69.134.189)  121.347 ms  115.675 ms  124.935 ms
11  ae-4.ebr3.LosAngeles1.Level3.net (4.69.132.81)  188.811 ms  186.195 ms  181.196 ms
12  ae-2.ebr3.SanJose1.Level3.net (4.69.132.9)  186.953 ms  190.937 ms  196.877 ms
13  ae-93-93.csw4.SanJose1.Level3.net (4.69.134.238)  198.998 ms  189.511 ms  198.439 ms
14  ae-92-92.ebr2.SanJose1.Level3.net (4.69.134.221)  190.567 ms  188.511 ms  194.894 ms
15  ae-4-4.car2.SanFrancisco1.Level3.net (4.69.133.157)  188.257 ms  189.949 ms  189.967 ms
16  ae-11-11.car1.SanFrancisco1.Level3.net (4.69.133.153)  189.608 ms  332.129 ms  199.655 ms
17  YIPES-ENTER.car1.SanFrancisco1.Level3.net (63.211.150.226)  189.971 ms  190.346 ms  190.584 ms
18  border-core1-ge3-0.sfo2.servepath.net (209.213.192.123)  188.986 ms  188.788 ms  190.316 ms
19  customer-reverse-entry.208.96.31.8 (208.96.31.8)  190.327 ms  190.334 ms  189.487 ms
20  customer-reverse-entry.216.93.179.108 (216.93.179.108)  191.396 ms  190.199 ms  189.544 ms
*********************************
Maybe the last two lines, with "customer-reverse-entry",
can offer hints for a deeper search.
Could you give me some hints on how I can
locate that cracker, and on how to avoid
this vandalism in the future?
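As for avoiding this in the future: a common first step is to require an account for editing (and, if bots start registering accounts, to restrict account creation too). A LocalSettings.php sketch using MediaWiki's standard permission settings:

```php
// In LocalSettings.php: anonymous users may read but not edit,
// and may not create accounts themselves.
$wgGroupPermissions['*']['edit'] = false;
$wgGroupPermissions['*']['createaccount'] = false;
```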
Best regards,
Giovanni Gherdovich
Hi Jim, Courtney!
Thanks very much for the suggestions. I'll try out the ideas you've mentioned and update you on my findings.
It just seems to me that if MediaWiki allows custom tags, then it should allow a reasonable number of embeddings of custom tags within custom tags.
That is, it should take the innermost custom tag, evaluate it, then use the result as $input to the enclosing custom tag, evaluate that, and so on... until we've reached the outermost custom tag, and then convert to the final XHTML.
Do we see some shades of recursion here?
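That inside-out expansion is roughly what Parser::recursiveTagParse() is for: a tag hook can pass its $input back through the parser so that nested custom tags get expanded before the outer tag renders. A minimal sketch, assuming a MediaWiki version where recursiveTagParse() is available (the tag name "mytag" and the wrapper markup are made up):

```php
// Hypothetical extension file, included from LocalSettings.php.
$wgExtensionFunctions[] = 'wfMyTagSetup';

function wfMyTagSetup() {
    global $wgParser;
    $wgParser->setHook( 'mytag', 'wfRenderMyTag' );
}

function wfRenderMyTag( $input, $args, $parser ) {
    // Expand the body first - this is what lets <mytag> nest in <mytag>.
    $inner = $parser->recursiveTagParse( $input );
    return '<div class="mytag">' . $inner . '</div>';
}
```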
Filip
Hi,
I have been reading up on http://meta.wikimedia.org/wiki/Help:System_admin and http://meta.wikimedia.org/wiki/Wikimedia_servers on the setup of Wikimedia.
Fascinating read!! But does anyone know what the actual Wikipedia or
Wikimedia LocalSettings.php file looks like? I don't know if that would be
too much of a security risk (you obviously would have to take out the
sensitive stuff), but it would be an equally interesting read...
Cheers,
Andi
I was wondering why my imports were blowing up when they got to the
gene called rfc, and then I found this:
http://www.mediawiki.org/wiki/Markup_spec/BNF/Magic_links#RFCs
How do I turn off parsing of these during importDump? Thanks!
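For reference, the magic-link behaviour looks like this in wikitext, and <nowiki> suppresses it, so one workaround may be to pre-process the dump and wrap the affected occurrences (the example sentences are made up):

```wikitext
The gene is called rfc.         <!-- lowercase alone is fine -->
See RFC 1234 for details.       <!-- becomes an automatic RFC link -->
See <nowiki>RFC 1234</nowiki>.  <!-- stays plain text -->
```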
Jim