We use a magic word/template to index one of the books we added to our
wiki. The book is a dictionary of terms, so we organized it by letters.
For example, on the article page we've added {{#category_index:D|datum
correction}}. We are currently translating this book into multiple
languages and have started to see the index add multiple links for the
same word -
http://wiki.seg.org/wiki/Dictionary:D:Index. In addition, we have one
article that has created multiple links with strange text (one example -
delta
UNIQ003571207dbef6cb-math-00000002-QINU, UNIQ003571207dbef6cb-math-00000003-QINU).
I have removed all the HTML from the original article -
http://wiki.seg.org/wiki/Dictionary:Delta - but the index still lists the
links with the strange text.
Does anyone know how to remove the repeating words? What issue(s) might be
happening with the long, coded article? Is there a better way to organize
the articles by letters? Thanks.
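On the organization question: one common alternative, sketched below on the assumption that plain categories would be acceptable for your dictionary, is one category per letter with a sort key, so MediaWiki generates the index pages itself (the category name "Dictionary D" is made up for illustration):

```wikitext
<!-- On the article page, instead of {{#category_index:D|datum correction}}: -->
[[Category:Dictionary D|datum correction]]
<!-- The article then appears on the automatically generated page
     Category:Dictionary D, sorted under "datum correction". Category
     membership is tracked per page, not per parsed link, so duplicate
     index entries cannot accumulate. -->
```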
Andrew
Andrew Geary | publications outreach editor
+1.918.497.4615 | ageary(a)seg.org
Society of Exploration Geophysicists (SEG)
8801 South Yale Ave, Ste. 500, Tulsa, OK 74137 USA
With extension ExternalLinks now marked unsafe and unmaintained, is there a
quick and easy way to maintain external links? The extension gave
bureaucrats a way to quickly validate all external URLs (per page) and
presented the results in a neat table.
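For what it's worth, a minimal stand-in for that report can be built on the standard API module list=exturlusage, which returns external URLs together with the pages that use them. Below is a sketch (the wiki URL is a placeholder, and the function names are my own):

```python
# Sketch of a replacement for the ExternalLinks extension's per-page
# report: pull external URLs from the MediaWiki API (list=exturlusage)
# and check each one with a HEAD request.
import json
import urllib.request

API = "https://example.org/w/api.php"  # placeholder wiki URL

def urls_by_page(exturlusage_result):
    """Group the API's list=exturlusage entries by page title."""
    table = {}
    for entry in exturlusage_result["query"]["exturlusage"]:
        table.setdefault(entry["title"], set()).add(entry["url"])
    return table

def check(url, timeout=10):
    """Return the HTTP status code for url, or the exception name."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except Exception as exc:
        return type(exc).__name__

def report(api=API):
    """Print a tab-separated table: page title, URL, status."""
    query = "?action=query&list=exturlusage&eulimit=500&format=json"
    with urllib.request.urlopen(api + query) as resp:
        result = json.load(resp)
    for title, urls in sorted(urls_by_page(result).items()):
        for url in sorted(urls):
            print(f"{title}\t{url}\t{check(url)}")

# report()  # run against a real wiki to get the table
```

A cron job could mail the output, which roughly reproduces the extension's validation table without running unmaintained code.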
Thanks,
Rob
Hi.
I tried to follow the instructions to serve from file cache directly (link
<https://www.mediawiki.org/w/index.php?title=Manual:File_cache#Serving_cache…>)
and to have short url (link
<https://www.mediawiki.org/wiki/Manual:Short_URL/Apache>).
But those two instructions don't seem to be directly compatible. How should
I write the .htaccess file at the web root?
The host's PHP 7 is slow, so I want to minimize the number of redirects
and avoid PHP where possible.
Current .htaccess at the webroot:
# Enable the rewrite engine
RewriteEngine On
# Short url for wiki pages
# RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
# Redirect / to Main Page
# RewriteRule ^/*$ %{DOCUMENT_ROOT}/w/index.php [L]
# Serve directly from cache
RewriteBase /
# If a cached page exists under /w/html_cache, do an internal rewrite to it:
RewriteCond %{HTTP_COOKIE} !UserID=
RewriteCond %{QUERY_STRING} !.
RewriteCond %{THE_REQUEST} ^GET\x20/wiki/([^\x20/]+)\x20HTTP
RewriteCond %{DOCUMENT_ROOT}/w/html_cache/%1.html -s
RewriteRule ^wiki/(.+)$ /w/html_cache/%1.html [B,L,NS]
# Redirect / to cached Main Page
RewriteRule ^/*$ %{DOCUMENT_ROOT}/w/html_cache/Main_Page.html [L]
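One way to make the two recipes coexist is to try the cache first and fall through to index.php only on a cache miss, so short URLs still work for uncached pages. A sketch (untested; the paths simply mirror the rules above and may need adjusting for your host):

```apache
RewriteEngine On
RewriteBase /

# 1) Cache hit: anonymous GET of /wiki/Title with no query string,
#    and a non-empty cached file exists -> serve it without PHP.
RewriteCond %{HTTP_COOKIE} !UserID=
RewriteCond %{QUERY_STRING} !.
RewriteCond %{THE_REQUEST} ^GET\x20/wiki/([^\x20/]+)\x20HTTP
RewriteCond %{DOCUMENT_ROOT}/w/html_cache/%1.html -s
RewriteRule ^wiki/ /w/html_cache/%1.html [B,L,NS]

# 2) Cache miss: fall through to MediaWiki for any /wiki/ URL.
RewriteRule ^wiki(/.*)?$ /w/index.php [L]

# 3) Root -> Main Page (cached copy if present, else MediaWiki).
RewriteCond %{DOCUMENT_ROOT}/w/html_cache/Main_Page.html -s
RewriteRule ^$ /w/html_cache/Main_Page.html [L]
RewriteRule ^$ /w/index.php [L]
```

The ordering matters: the [L] flag on the cache rule stops processing on a hit, so the short-URL rule only fires when no cached copy exists.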
Permissions for Folders:
0700 web_root/w/cache
0755 web_root/w/html_cache
0755 web_root/w/images
LocalSettings.php
$wgMessageCacheType = CACHE_ACCEL;
$wgUseFileCache = true; // Enable file cache
$wgCacheDirectory = "$IP/cache";
$wgFileCacheDirectory = "$IP/html_cache";
$wgDisableOutputCompression = true; // mod_deflate already compresses output
$wgUseLocalMessageCache = false;
$wgParserCacheType = CACHE_DB;
$wgEnableSidebarCache = true;
# NO DB HITS!
$wgDisableCounters = true;
$wgMiserMode = true;
$wgRevisionCacheExpiry = 3*24*3600;
$wgParserCacheExpireTime = 14*24*3600;
# Serve directly from cache
$wgFileCacheDepth = 0;
Thanks again.
Hi. How can I set up two wikis such that
1. wiki A can only read pages,
2. wiki B allows both reading and writing pages,
3. the pages on wiki A and wiki B are the same, and
4. they use the same MySQL database?
Is that possible? I am thinking about aggressively cutting down
ResourceLoader (load.php) and removing all extensions on the read-only
wiki to make it faster.
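One common pattern for this (a sketch, not a tested configuration) is a single database with two entry points sharing most of LocalSettings.php, where the read-only entry point revokes write permissions and skips extension loading. The WIKI_READONLY constant and the extension named below are assumptions for illustration:

```php
<?php
// Shared LocalSettings.php; wiki A's entry point defines WIKI_READONLY
// before including this file, wiki B's does not.

// ... shared $wgDBserver / $wgDBname / $wgDBuser settings here ...

if ( defined( 'WIKI_READONLY' ) ) {
    // Wiki A: same pages, same database, but no writes, no signups,
    // and no extensions loaded.
    $wgGroupPermissions['*']['edit']          = false;
    $wgGroupPermissions['user']['edit']       = false;
    $wgGroupPermissions['*']['createaccount'] = false;
    $wgReadOnly = 'This mirror is read-only.';
} else {
    // Wiki B: normal read/write wiki with extensions.
    wfLoadExtension( 'ParserFunctions' ); // example extension
}
```

One caveat: MediaWiki assumes one installation per database, so the two entry points must share cache invalidation (or wiki A must tolerate stale pages).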
I installed Mediawiki on a free shared host, which allows 2 cron jobs that
don't use too much cpu/memory per day.
PHP seems slow. I don't have a domain name, so I can't use Cloudflare.
I tried the file cache, enabled Apache's mod_deflate via cPanel,
requested APCu and OPcache, and set:
$wgMessageCacheType = CACHE_ACCEL;
$wgUseFileCache = true;
$wgCacheDirectory = "$IP/cache";
$wgUseLocalMessageCache = true;
$wgParserCacheType = CACHE_DB;
$wgEnableSidebarCache = true;
# NO DB HITS!
$wgDisableCounters = true;
$wgMiserMode = true;
$wgRevisionCacheExpiry = 3*24*3600;
$wgParserCacheExpireTime = 14*24*3600;
I didn't try the trick that serves cached pages directly without going
through PHP, because I worry things will slow down as the number of
pages increases.
Thanks in advance
I'm trying to set up a wiki-farm using the instructions at [fn:1] which
I know are a bit old, but I like using sequential installation
instructions.
Wiki-1 installs perfectly with no problems, but if I try to
install wiki-2 it states "A LocalSettings.php file has been detected. To
upgrade this installation, please enter the value of $wgUpgradeKey in
the box below. You will find it in LocalSettings.php."
I've tried using the upgrade key but am unable to progress further. I've
tried removing the LocalSettings.php, installing wiki-2 and then
replacing the LocalSettings.php from wiki-1. Wiki-1 still works, but
wiki-2 doesn't.
Wiki-1 has a prefix of 'df-' for its MariaDB tables, with the intention
of wiki-2 having a prefix of 'tg-'. So they're using the same database
but with different prefixes for their respective tables.
Both of the sites that I'm using are purely localhost, they will have no
internet presence at all, and are just intended to be used for
generating and checking new pages in their respective wikis.
So how can I easily install wiki-2, and perhaps others, too please?
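In case it helps, the usual farm pattern with one database and per-wiki table prefixes is a single shared LocalSettings.php that switches on the requested host or path. A sketch (the directory names are assumptions; the prefixes just mirror the 'df-'/'tg-' setup described above):

```php
<?php
// Shared LocalSettings.php: pick the wiki by URL path.
// Assumes wiki-1 lives under /wiki1/ and wiki-2 under /wiki2/.
if ( strpos( $_SERVER['REQUEST_URI'], '/wiki2/' ) === 0 ) {
    $wgDBprefix   = 'tg-';
    $wgSitename   = 'Wiki-2';
    $wgScriptPath = '/wiki2';
} else {
    $wgDBprefix   = 'df-';
    $wgSitename   = 'Wiki-1';
    $wgScriptPath = '/wiki1';
}
// Shared $wgDBserver, $wgDBname, $wgDBuser, $wgDBpassword follow here.
```

To get past the installer complaint, one approach is to temporarily move LocalSettings.php aside, run the installer for wiki-2 with the 'tg-' prefix, and then merge the generated settings into the shared file as above.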
Thanks
Sharon.
[fn:1] http://web.archive.org/web/20070715020649/http://www.steverumberg.com/wiki/…
--
A taste of linux = http://www.sharons.org.uk
TGmeds = http://www.tgmeds.org.uk
DrugFacts = https://www.drugfacts.org.uk
Debian 9.0, fluxbox 1.3.5-2, emacs 25.1.1, org-mode 9.0.7
Greetings,
I'm looking for some assistance with installing a personal MediaWiki configuration for a site that I want to set up. I've looked through almost all of the predefined hosting services at https://mediawiki.org/wiki/Hosting_services and have not been successful at finding something that works for me.
I do have a code repository of my preferred configuration settings, extensions, skins, etc. that is in the works at GitHub - my goal is to finalize it by the end of the month.
I was wondering if there was anyone who would be willing to assist me just with the initial installation of MediaWiki - I can manage just about everything else from there.
Sincerely,
Amanda
Did anyone ever fix the sitemap generator maintenance script, after it
broke in v1.16? See
https://www.mediawiki.org/wiki/Manual:GenerateSitemap.php
I put my sitemap index file into google and it found the .xml.gz files in
that directory but gave me "Invalid URL" errors for each one. So I ran
DaSch's alternative script and got Fatal error: Call to undefined method
MWNamespace::isMain() on line 248.
If it's just looking to see if we're in namespace 0, that should be trivial
to hack, but I just wanted to see what the status of MediaWiki sitemap tool
development is at this point before pursuing that. Thanks.
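If it really is just a namespace-0 check, one possible hack (untested; "line 248" is in DaSch's script, and the variable name below is an assumption about that code) is to replace the missing call with a direct comparison:

```php
// Before (fails on newer MediaWiki, where MWNamespace::isMain() is gone):
if ( MWNamespace::isMain( $ns ) ) { /* ... */ }

// After: compare against the main-namespace constant directly ...
if ( $ns == NS_MAIN ) { /* ... */ }

// ... or, if "any content namespace" is what was actually meant:
if ( MWNamespace::isContent( $ns ) ) { /* ... */ }
```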
Greetings to the MediaWiki community!
MediaWiki Stakeholders (MWStake) is organizing an Enterprise MediaWiki Conference (EMWcon) for Spring of 2018 in Houston, Texas.
MWStake is pre-announcing the conference and is also requesting your input so that we can pick a week that would be most ideal for hosting the event.
The hope is to have the conference in March or April of 2018, with the start date on a Wednesday, continuing through Thursday and Friday. Based on the community feedback, MWStake will pick the week that will allow for the most impact for the most attendees.
If you're interested in attending the conference, please choose a start date or dates that work best for you using the following link: https://doodle.com/poll/inh7ut84cn2rn3i9
Please leave comments, suggestions, and your email if you'd like to be contacted further.
Thank you,
MWStake
Hello, good afternoon! My name is Joao Vitor and I would like to share an
idea with you.
Honestly, like I believe all Brazilian citizens do, I am tired of the
corruption that is devastating our country. More than that, it infuriates
me that every government body holds unrestricted information about us,
while it is extremely difficult for any of us to obtain any kind of
information about any entity of the public administration. The society we
live in no longer tolerates secrecy of information, closed-door meetings,
and lies. On social networks we share our entire lives; to the government
we must declare all our assets and income. Criminal and tax records,
which concern the lives of citizens, are easy to obtain and sum up our
lives in a few lines.
However, any information about the government is extremely painful to
obtain. Although the Access to Information Law (Law 12527/2011) has
represented progress in this respect, the transparency portals are
confusing and bureaucratic, and it takes enormous patience to receive any
document. Moreover, they are scattered: each branch of the administration
has its own site - each agency, each state-owned company, each federative
unit, the union, and so on. It seems designed to discourage finding
information at all.
I thought about building a collaborative research tool, along lines
similar to Wikipedia's, where all public information would be easily
compiled and simple to find. Divisions by branch of power, government,
federative entities - in short, something that would make public
information truly clear to the people. Easily accessible public documents
and contracts, politicians' schedules, public court rulings,
expenditures, audits, etc. This information is already on the internet
and can be accessed; I am not talking about something like Wikileaks. I
am talking about a user-contributed tool for compiling public
transparency data, since the volume of information is extremely large.
I am writing this idea partly to vent, since I have just returned from
the Detran after a simple procedure. I spent more than two hours, the
service was scarce and horrible, and it made me wonder why, given the
absurd amount of money this agency receives in SP, its system is such
garbage. Without information, we cannot know.
Whoever is willing to help, let's get in touch! I have zero knowledge of
programming and IT, which is why I am asking for your help so that
together we can build something that can serve the people and the nation.
Best regards!
Hi all
Perhaps I am missing it, but is there any way to stop users from using
the "create2" action and only allow sysops and * to create user accounts?
I am not talking about "create" or "createaccount" as a permission. What
I don't want is for existing logged-in users to go to
Special:CreateAccount and create another account.
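As far as I know there is no separate "create2" right in core - that name is the log action recorded when a logged-in user creates an account via Special:CreateAccount, and the operation itself is still governed by 'createaccount'. A sketch (untested) that keeps anonymous signup but blocks logged-in users:

```php
// LocalSettings.php sketch: anonymous visitors may still sign up,
// but ordinary logged-in users may not create further accounts.
$wgGroupPermissions['*']['createaccount'] = true;

// Rights are the union of all of a user's groups, so setting
// $wgGroupPermissions['user']['createaccount'] = false alone changes
// nothing (the grant from '*' still applies). An explicit revocation
// is needed:
$wgRevokePermissions['user']['createaccount'] = true;
```

One caveat: sysops are also members of the 'user' group, so the revocation hits them too; exempting sysops would need a small permissions hook rather than group configuration alone.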
Thanks
Tom