Hello,
I have found a listing of SEO-related extensions:
https://www.mediawiki.org/wiki/Category:Search_engine_optimization_extensio…
Many of them seem to duplicate each other; some are listed as "beta", and others have not been touched for 2-3 years.
I see "WikiSEO" is marked as stable, was touched "recently" (<2 yr), and documents a union of the
features of the other extensions (e.g. Description2, YetAnotherKeywords). Does anyone use it, or are other extensions preferred?
Or are the more popular technologies for achieving SEO goals "Semantic MediaWiki", "DocTypes", or "Widgets"?
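For context, my reading of the WikiSEO documentation is that, once the extension is loaded from LocalSettings.php, page metadata is set with a parser function placed in the wikitext. A sketch (parameter names are taken from the extension's docs and are unverified against any particular version; the values are made up):

```wikitext
{{#seo:
 |title=My page title for search engines
 |keywords=keyword1, keyword2, keyword3
 |description=A short description shown in search result snippets.
}}
```

This would roughly cover what Description2 and YetAnotherKeywords each do on their own.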
If you work in the space, could you reply with extension(s)/tech you use and what made you choose it?
Thanks.
This message is intended for specified recipient(s) only. If received by any other party, please discard immediately.
Happy New Year to all.
My configuration:
__________________
MediaWiki <https://www.mediawiki.org/> 1.26.2
PHP <https://php.net/> 5.3.29 (apache2handler)
MySQL <https://www.mysql.com/> 5.0.95
ConfirmEdit <https://www.mediawiki.org/wiki/Extension:ConfirmEdit> 1.4.0
ContactPage <https://www.mediawiki.org/wiki/Extension:ContactPage> 2.2
Semantic Maps <https://github.com/SemanticMediaWiki/SemanticMaps/blob/master/README.md#sem…> 3.2
Semantic MediaWiki <https://semantic-mediawiki.org/> 2.3
__________________
I'm a bit confused by the following:
My contact form works fine when logged in. But when not logged in, I get
Catchable fatal error: Argument 1 passed to QuestyCaptcha::getForm() must be an instance of OutputPage, none given, called in /www/htdocs/drebbel/wiki/extensions/ContactPage/ContactPage_body.php on line 433 and defined in /www/htdocs/drebbel/wiki/extensions/ConfirmEdit/QuestyCaptcha/QuestyCaptcha.class.php on line 44
__________________
My LocalSettings.php:
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
require_once( "$IP/extensions/ConfirmEdit/QuestyCaptcha.php" );
$wgCaptchaClass = 'QuestyCaptcha';
require_once "$IP/extensions/ContactPage/ContactPage.php";
$wgCaptchaTriggers['contactpage'] = true;
$wgContactConfig['default'] = array(
	'RecipientUser' => 'xxxx', // a valid account which also has a verified e-mail address added to it
	'SenderName' => 'Contact Form on ' . $wgSitename, // "Contact Form on" needs to be translated
	'SenderEmail' => null, // defaults to $wgPasswordSender, may be changed as required
	'RequireDetails' => true, // either "true" or "false" as required
	'IncludeIP' => true, // either "true" or "false" as required
	'AdditionalFields' => array(
		'Text' => array(
			'label-message' => 'emailmessage',
			'type' => 'textarea',
			'rows' => 20,
			'cols' => 80,
			'required' => true, // either "true" or "false" as required
		),
	),
	// Added in MW 1.26
	'DisplayFormat' => 'table', // see HTMLForm documentation for available values
	'RLModules' => array(), // ResourceLoader modules to add to the form display page
	'RLStyleModules' => array(), // ResourceLoader CSS modules to add to the form display page
);
_________________
If I change $wgContactConfig['default'] = array( into
$wgContactConfig['contactpage'] = array( , I get:
Contact form error
A contact form is either not configured for this page or is configured incorrectly.
__________________
I've changed (in ReCaptcha.class.php) the function getForm( OutputPage $out
) into getForm() and now it works fine, apart from the fact that one often
has to refresh the contact page for the captcha to appear. See
http://drebbel.net/wiki/Special:Contact
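A less invasive variant of that change might be to fix the call site rather than loosen the captcha's signature. A sketch only, untested; it assumes the class in ContactPage_body.php is a SpecialPage subclass (so $this->getOutput() is available), and the variable name is illustrative:

```php
// In extensions/ContactPage/ContactPage_body.php, around line 433:
// hand the current OutputPage to the captcha instead of calling
// getForm() with no argument, so QuestyCaptcha::getForm( OutputPage $out )
// receives what its signature requires.
$captcha = ConfirmEditHooks::getInstance();
$formText .= $captcha->getForm( $this->getOutput() ); // $formText is a placeholder name
```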
Hello,
I have noticed that results from the internal wiki search on our site show wiki code in the page summary.
For example, if I search for "New York City", a page comes up and its summary is listed as raw wiki section-header markup:
== New York City == |key=city... (and the rest of my template on that page).
It is as if the search runs on the raw, unparsed wiki markup instead of on the pages as they are shown to the end user.
What could cause that? The error log shows nothing special.
Have I misconfigured, or failed to configure, some parameter?
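For what it's worth, my understanding (an assumption worth checking) is that the default database-backed search indexes the stored wikitext itself, which is why snippets can contain markup; indexing rendered text generally needs a search extension such as CirrusSearch. After configuration changes, the built-in index can be rebuilt with a maintenance script shipped in MediaWiki core (assumes shell access to the wiki root):

```shell
# Rebuild MediaWiki's MySQL full-text search index
php maintenance/rebuildtextindex.php
```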
Thanks.
Hello,
I am having trouble composing a DPL page with the "include" parameter.
It could be my misunderstanding of DPL's capabilities, or of my page structure in "NS_MAIN".
I have tried including by section name ["#Header"], by regex ["##Header.*"], and by labeled section ["date_of_interest"].
None of these three methods works, with a DPL call like this:
<DPL>
titlematch = %New York%
include = date_of_interest
mode = unordered
</DPL>
My pages are structured (for now) to have 2 "vardefine" variables that are used a lot in the page, followed by a common "header template" to provide
a consistent layout of data across pages, a page-specific midsection, and a common "footer template" for a consistent bottom layout.
If I mark with "section" (LabeledSectionTransclusion) a date in the midsection of the page (not inside another template), then DPL does find that mark "as advertised", and includes it into resulting table.
But if I mark with "section" a date inside my "header template" - DPL does not find that section.
It is as if DPL does not execute my "header template" before trying to extract data from the page.
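For clarity, the marks follow the usual LabeledSectionTransclusion form; a sketch of the two cases (the template name and date are made up for illustration):

```wikitext
<!-- Case DPL finds: the mark sits directly in the page's midsection -->
Settled on <section begin=date_of_interest />March 1, 1771<section end=date_of_interest />.

<!-- Case DPL misses: the same mark is emitted from inside the shared
     header template ("PlaceHeader" is a hypothetical name) -->
{{PlaceHeader|key=city|date_of_interest=March 1, 1771}}
```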
Is DPL not capable of finding marked sections that are produced by a template in the page it is processing (that seems unlikely)?
(MW is full of multi-layered templates, they do not cause an issue.)
Am I doing something wrong (should the DPL extract section marks in this setup)?
To summarize, I can find pages by "category" or "titlematch" filters, but I cannot extract parts of the pages at all via the two other mechanisms (header name, regex).
Extracting marked sections works only if the sections are created within the page itself, not via templates used on the page.
Any ideas what it could be, and how to diagnose the issue?
Okay, a few weeks ago my wiki stopped loading. The version I've got
installed there is ancient, but it was working fine until one day it
wasn't. At first the page was giving me indefinite load times,
sometimes followed by a 502 error (depending on which web browser I tried
it on). I conversed with someone at my webspace provider, and they
rolled the stored page info back a couple of days; now I instead get:
Parse error: syntax error, unexpected T_NAMESPACE, expecting T_STRING in
/hermes/bosoraweb123/b65/ipw.joshua-w/public_html/wiki/includes/Namespace.php
on line 46
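If it helps the next reader: this error typically appears when the host upgrades PHP to 5.3 or later, where "namespace" became a reserved keyword, and very old MediaWiki releases (before the class was renamed to MWNamespace) declare a class with that name in includes/Namespace.php. A minimal reproduction of the collision:

```php
<?php
// On PHP >= 5.3, "namespace" is a reserved keyword, so declaring a
// class with that name no longer parses; it fails with:
//   Parse error: syntax error, unexpected T_NAMESPACE, expecting T_STRING
class Namespace {
}
```

If something like that is on line 46 of includes/Namespace.php, the practical fix is upgrading MediaWiki, since the host is unlikely to roll its PHP back.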
The web-address for my wiki is
http://www.joshua-wopr.com/wiki/index.php?title=Main_Page
I haven't touched the system in ages, and the tech help person seemed to
think nothing had been altered (updated under the hood, say) at their
end recently either. Any idea what's wrong?
Hi
I'm running MW 1.26.2, PHP 5.5.42, Ubuntu 14.04 in a Bitnami-AWS
one-click virtual stack.
Somewhere along the way (around MW 1.25.x) file uploads stopped working.
I thought it might be a change in permissions of some kind, so I ran
sudo chmod 777 images.
I have $wgShowExceptionDetails = true;
but at the moment I'm not getting any details, just this error message:
Could not create directory "mwstore://local-backend/local-public/4/44".
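In case it's useful: on Bitnami stacks Apache usually runs as the "daemon" user, so ownership of the upload tree can matter more than the mode bits. A sketch (the path is the usual Bitnami default and may differ on a given stack):

```shell
# Give the web-server user ownership of the whole upload tree
# (user "daemon" and this path are Bitnami defaults; adjust as needed)
sudo chown -R daemon:daemon /opt/bitnami/apps/mediawiki/htdocs/images
```

It may also be worth confirming that $wgUploadDirectory actually points at that images directory.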
Any thoughts?
David
Folks,
Performance Issue/Question
MediaWiki:1.26.2 :: Apache:2.4.16 (Ubuntu) ::
PHP:5.6.16-2+deb.sury.org~trusty+1
MySQL:5.6.27-0ubuntu0.14.04.1-log :: Processor:Intel Xeon QuadCore X3210
:: Memory: 3 Gig
HD: 1.5Tb @ 126g usage :: 64Bit System
*****************************************
I have a very small, single-user (me) wiki in use for genealogy work. I
have very little traffic visiting the site. I keep the wiki updated
fairly religiously, including extensions as well as the physical server
itself. I have Zend OPcache running on this server as well.
My issue has always been the speed of importing XML files that have
been exported from Wikipedia. As an example, I may export a Wikipedia
page about a particular town or county that is relevant to my genealogy
research and import the XML into my wiki. It takes an excruciatingly
long time to import even a relatively small Wikipedia page.
I'm talking 5+ minutes to import a single page such as this,
https://en.wikipedia.org/wiki/Guilford_County,_North_Carolina, into my wiki.
I've been trying to fix this issue for a couple of years and have not
figured out why it takes so long to import the wikipedia pages.
I'd like to mention here that my physical server is located in a full
blown data center with 100Mb switch ports on a gigabit fiber network and
out through a 10 gig pipe to the world. Network access is not a problem.
I guess my question is, what can I do to improve the import process?
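One avenue worth trying (a sketch; it assumes shell access and that Special:Import is what's being used now) is the command-line importer shipped with MediaWiki core, which can skip the expensive per-revision link-table updates and rebuild that data afterwards:

```shell
# Import the exported XML from the command line, deferring link updates
php maintenance/importDump.php --no-updates dump.xml
# Then rebuild the deferred data
php maintenance/rebuildrecentchanges.php
php maintenance/refreshLinks.php
```

Much of the slowness on a page like that also tends to come from the large number of templates it transcludes, depending on whether the export included them.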
Thanks,
Chap
wiki.jonesipedia.com
********************
Hello,
I am trying to upgrade my MediaWiki site hosted at OVH.
My 1.20 installation no longer renders pages in the usual MediaWiki
way, although the source text is still accessible (probably a PHP
version issue?).
However, I am running into real difficulties: I followed the tutorial at
https://www.mediawiki.org/wiki/Manual:Upgrading/fr.
Since I have no SSH access to the server, I tried the web updater.
First of all, why is the mw-config directory not present by default
in the document root?
So I imported my previous mw-config, which worked fine with
MW 1.20 but not with MW 1.26.2: I get a blank page.
Thanks for any suggestion that would help me move forward.
Herve
There was some chatter in recent weeks about the most efficient methods
for upgrading an existing (older-version) install of MediaWiki.
Was there ever a consensus on the easiest method for non-programmers?
Or should I be knocking on another list for this info (and if so, which list)?
Thanks for your time.