The file "INSTALL" says, "Decompress the MediaWiki installation archive
either on your server, or on your local machine and upload the directory
tree. Rename it from "mediawiki-1.x.x" to something nice, like "wiki,"
since it'll be in your URL."
Shouldn't that be "w" instead of "wiki"?
I was running MediaWiki on a shared host with traffic around 10K views a
day (a small-to-moderate wiki). I was forced to leave that setup because
of high CPU usage: I couldn't install Squid there or do anything else to
speed things up. I talked about this on the list before and I'm
thankful for the recommendations.
Now I'm on a VPS where Squid is running and currently I don't have CPU
issues except when there's a traffic spike. So I've decided to look for a
dedicated server. I've seen on web hosting forums that (low-end?) dedicated
servers are available for pretty cheap ($100). Currently I'm paying $70 for
the VPS.
My key constraint is that the web host has to be willing to let me remain
anonymous, which limits my options; for example, they have to accept
PayPal. I haven't looked around yet at what's available, but that will be
my next step after this discussion.
To be prepared for the future, I want the server to be able to support 30K
views a day (3 times the current traffic) and display pages with no
noticeable/serious delays. I hope a $100 server with Squid can do this for
me.
Are there any server specs that I should look for? The first one would be
RAM. What's the minimum RAM I should have? Other desirable specs?
My second issue is the hit ratio for Squid: According to Squid's cache
manager, the cache hit rate is about 40% and the byte hit ratio is 20%.
Average time taken to serve a "missed" request is 0.7 seconds, while a
hit takes only 0.02 seconds (35 times faster), so a higher hit ratio would
be really nice.
Looking at Squid's access logs, I also noticed that requests for load.php
are always misses. Can anything be done to fix that?
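One common cause, in case it applies here: the default squid.conf that shipped with Squid 2.x refused to cache any URL containing a "?", and load.php URLs always carry query strings. A hedged sketch of the fix (directive names are real Squid directives; whether your config contains the old defaults is an assumption):

```
# Old Squid 2.x defaults that force a MISS on every query-string URL.
# If your squid.conf has these lines, remove or relax them:
#   hierarchy_stoplist cgi-bin ?
#   acl QUERY urlpath_regex cgi-bin \?
#   cache deny QUERY
#
# load.php sends explicit Cache-Control headers, so it is safe to let
# Squid honor them for that path:
acl loadphp urlpath_regex ^/load\.php
cache allow loadphp
```

After changing this, watch the access log for a while to confirm load.php requests start showing up as HITs.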
What can be done to optimize Squid for MediaWiki and increase the hit
ratio? I have 1.3 GB of RAM available; I told Squid it can use 130 MB,
though it goes over that, and total RAM usage usually stays around 40%. I
know 1.3 GB may be small, and I've heard you need to leave some RAM free
to keep the system stable. I may have more RAM once I get the dedicated
server.
If anyone has a high hit ratio, I would be really thankful if you could
email me your squid.conf (with any sensitive information removed) so I can
compare it with my setup, or tell me which settings I should change or
add.
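Not a complete squid.conf, but a sketch of the settings that usually matter for a MediaWiki reverse proxy, plus the MediaWiki side of the handshake. All values are illustrative assumptions, not tuned for this hardware:

```
# squid.conf fragment (illustrative values)
cache_mem 128 MB                              # in-RAM hot-object cache
maximum_object_size_in_memory 512 KB          # big files go to disk, not RAM
cache_dir aufs /var/spool/squid 2000 16 256   # ~2 GB disk cache
```

MediaWiki also needs to know Squid is in front of it, so it sends s-maxage headers and purges pages on edit; these are the standard 1.x setting names:

```php
# LocalSettings.php
$wgUseSquid = true;
$wgSquidServers = array( '127.0.0.1' );  # Squid's address as seen by Apache
$wgSquidMaxage = 18000;                  # seconds Squid may keep a page
```

Without $wgUseSquid, MediaWiki never purges Squid on page edits, so a low hit ratio is expected no matter how Squid itself is tuned.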
thanks!
Dan
Greetings!
I am trying to do transclusion from one private wiki of mine to another
(so that I do not have to duplicate pages between wikis). However, when
I try it, I am asked to log in.
Does anyone know a way to transclude that passes along login credentials?
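Interwiki transclusion itself is switched on with $wgEnableScaryTranscluding, but it does not authenticate, so the private wiki still refuses the request. One possible (untested) workaround is to let the reading wiki's server IP bypass the login requirement on the private wiki; the IP below is a placeholder, and this does open read access to that address, so weigh the security implications:

```php
# LocalSettings.php on the wiki that pulls content in:
$wgEnableScaryTranscluding = true;   // allow interwiki transclusion

# LocalSettings.php on the private wiki being read from — hypothetical
# workaround: grant anonymous read access only to the other server's IP.
if ( $_SERVER['REMOTE_ADDR'] === '203.0.113.5' ) {  // placeholder IP
    $wgGroupPermissions['*']['read'] = true;
}
```

An interwiki prefix pointing at the private wiki must also exist in the interwiki table for the transclusion to resolve.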
-Bri
Hello all,
I have two version 1.16 wikis that use a shared database configuration that
I'd like to split (I'm working towards upgrading them to 1.20). Say X is
the database that has the user and user_properties tables, and Y is the one
that shares those tables. Would it be sufficient to mysqldump the user and
user_properties tables from X, import them into Y, and set $wgSharedDB to
null in the wiki that uses Y?
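A sketch of that split as commands, assuming X and Y are the literal database names and that full backups of both databases are taken first; this is an outline, not a tested migration script:

```shell
# Dump only the shared tables from X and load them into Y.
mysqldump -u root -p X user user_properties > shared_tables.sql
mysql     -u root -p Y < shared_tables.sql

# Then, in the LocalSettings.php of the wiki that uses Y, stop sharing:
#   $wgSharedDB = null;
#   $wgSharedTables = array();   # if it was set explicitly
```

Doing this while both wikis are in read-only mode ($wgReadOnly) avoids the two user tables drifting apart between the dump and the config change.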
Also, what would the process be if I just wanted to upgrade the wikis to
1.20 but still with the shared configuration? One stumbling block so far is
that one database is about 5 GB and the shared one is around 50 GB. As a
test, I ran the 1.20 update.php script against a copy of the 5 GB database
and it took about 18 hours, though that's on a small (2 CPU, 1 GB RAM)
VMware VM. Even on the wiki database production hardware I'd be concerned
about how long an update.php run against the 50 GB database might take, as
I want to avoid a lengthy downtime. Is there any way to speed up the
process? Any field-tested recommendations on improving this procedure?
Thanks,
Justin
Hi everybody. I just configured LDAP authentication in my MediaWiki, and
it works fine.
Now I would like to create one or two groups of users and give them
different permissions (e.g. a default group where users are placed at
first login, and a second group an admin can move them into to grant
editing permissions), like in a CMS such as Joomla.
Is that possible?
Any help is appreciated. Thanks a lot.
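The built-in group permission system can do this without any extra extension; the group name 'editor' below is made up, and a sysop promotes users into it via Special:UserRights. A minimal sketch:

```php
# LocalSettings.php — illustrative group setup.
# Fresh logins land in the implicit 'user' group automatically.
$wgGroupPermissions['user']['edit'] = false;     // new LDAP users: read-only

$wgGroupPermissions['editor'] = $wgGroupPermissions['user'];
$wgGroupPermissions['editor']['edit'] = true;    // promoted users may edit
```

Any group name mentioned on the left-hand side of $wgGroupPermissions is created implicitly, so no separate "create group" step is needed.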
Nick
--
+---------------------+
| Linux User #554252 |
+---------------------+
Thanks Yury Katkov and Phil Rice for answering.
Regarding Phil's answer, I'm too new at this to understand the actual code
that you wrote and my question may have been unclear, so I'll try to
rephrase it to find out whether it is referring to my situation:
I'd like to set up a wiki where the wiki editors will not need to understand
MediaWiki tags; I want the editing page to be a form with specific
predetermined fields to fill out. e.g. The people who add content to the
wiki will see two fields, Title1 and Paragraph1, and after they fill out
those they can say that they want to add another paragraph to the page and
then two more fields will appear, Title2 and Paragraph2, etc. And finally
when they're done they can save the whole page. They can also somehow insert
links to source quotes and the source quotes would be stored via a different
form.
How do I go about this? Do I need to learn how to write a custom extension,
or is there already a way available? The Semantic Forms extension seems to
provide some of this functionality, but I have a feeling it might not be
the right tool for the job: maybe overkill, or maybe just wrong.
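For what it's worth, the repeating "add another paragraph" behaviour described above maps fairly directly onto Semantic Forms' "multiple" template feature. A sketch of a form definition, assuming a template named "Paragraph" with "Title" and "Text" fields (both names are made up for illustration):

```
{{{for template|Paragraph|multiple}}}
{| class="formtable"
! Title:
| {{{field|Title}}}
|-
! Text:
| {{{field|Text|input type=textarea}}}
|}
{{{end template}}}
{{{standard input|save}}}
```

The "multiple" parameter is what produces the "add another" button, so editors can append Title/Paragraph pairs without seeing any wikitext.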
Thanks again,
Eli
__________ Information from ESET NOD32 Antivirus, version of virus signature database 7656 (20121103) __________
The message was checked by ESET NOD32 Antivirus.
http://www.eset.com
Will the assorted modules needed for Lua templates work with 1.19, or
do we need to get an actually recent MW for this? (Does it work in any
tarball release version yet?)
- d.
Hello, I hope that I'm sending this question to the appropriate place.
I'd like to create a wiki where articles are restricted to a specific format, for example the format will be that an article can have a number of separate paragraphs and each paragraph will have a title. So I'd like to store the parts of the article (paragraphs, titles) as separate fields in a database. I will then display the database data using a custom display.
Do I have to write my own extension to MediaWiki? Can I use the Semantic Forms extension, which seems to provide all kinds of helpful ways of defining classes with properties, or is that a misuse of the Semantic Forms extension? Is there a different extension or way that I can do things like this, where I'm basically using MediaWiki to fill a database and then to display its contents? Or maybe MediaWiki is not even the tool that I should be using?
Thank you very much.
Hello,
I just installed a new wiki. I'm trying to get Short URLs working using
.htaccess.
Once things are working properly, my URLs should look like this:
http://wiki.domain.tld/Main_Page
I tried following the example here:
https://www.mediawiki.org/wiki/Manual:Short_URL/Apache
...but the example assumes two things that are not true for me:
1. you installed MediaWiki into a folder called "w"
2. you want your URLs to look like this:
http://wiki.domain.tld/wiki/Main_Page
I installed MediaWiki into my "root" domain folder, not a sub-folder, and
I don't want "wiki" to appear before /Main_Page.
Can someone please paste an example htaccess file that will work with these
two requirements?
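A sketch for a root-level install; note that the MediaWiki manual cautions against rewriting from the document root (article titles can collide with real files and directories), so test this carefully before relying on it:

```
# .htaccess in the web root
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php [QSA,L]
```

The matching LocalSettings.php side, so generated links use the short form:

```php
$wgScriptPath  = "";
$wgArticlePath = "/$1";
$wgUsePathInfo = true;
```

The two RewriteCond lines keep real files (images, skins, load.php itself) from being swallowed by the rewrite.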
Thanks very much in advance!
-Simon