I had the Replace_Text extension working fine until the upgrade to 1.18.
I reinstalled the latest version of the extension (version 0.9.1), and it's showing up in Special:Version,
but the page:
http://fraternityofshadows.com/wiki/Special:ReplaceText
is missing.
Any ideas?
(and thanks again to PathosChild for the CSS help)
Ron
--
Ron Laufer
Hello,
I recently set up Varnish for a MediaWiki wiki I do IT for. I noticed, though, that the hit rate was almost zero, because we use Google Analytics, which sets cookies that cause the HTTP accelerator to essentially bypass its cache and simply pass the request back to Apache. As a result, Varnish effectively can't function, and while I'm not certain, I suspect the same cookies would have a similar effect on a Squid cache as well.
The Vector skin causes similar problems because it sets a cookie whenever the user opens or closes the CollapsibleNav elements in the sidebar (the little triangles that let you open and close sections of the sidebar). These collapsible nav elements get heavy use on our wiki because the sidebar is our main navigation tool.
What makes this a little frustrating is that, after spending several days figuring out how to use Varnish (I'm an economist, after all, not a programmer, and it's a sophisticated program), I get the feeling that my effort is being thwarted by cookies that only play a role on the client side, in JavaScript. The cache should therefore be able to ignore them completely.
Varnish uses scripts in a remarkably flexible configuration language (called VCL) to determine which files to cache and which to pass to Apache. In particular, it can apply PCRE-based regexes to HTTP headers to decide how to handle a request. For example, it should be able to look for session, UserID, or Token cookies from MediaWiki. If it finds one of these, it could be configured to pass the request to Apache rather than looking in the cache. However, if those cookies aren't set and the requested resource is in the MediaWiki folder (i.e. the folder in $wgScriptPath), it could look for the object in the cache rather than contacting Apache, even if other cookies, such as the Google Analytics or Vector cookies, are set. If the requested resource is in another folder, different behavior could apply.
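For the record, here is a minimal sketch of that logic in VCL. The cookie names are MediaWiki's defaults (they get a $wgCookiePrefix prefix on a real install), so treat this as an assumption to verify against your own headers, not a drop-in config:

```vcl
sub vcl_recv {
    # Logged-in traffic: a MediaWiki session/UserID/Token cookie means
    # the response may be personalized, so hand the request to Apache.
    if (req.http.Cookie ~ "(session|UserID|Token)=") {
        return (pass);
    }
    # Anonymous traffic: drop the remaining cookies (Google Analytics
    # __utm*, Vector's collapsible-nav state) so the cache lookup can hit.
    unset req.http.Cookie;
    return (lookup);
}
```

A per-folder distinction, as described above, would just be an extra `if (req.url ~ "^/wiki/")`-style condition around the unset.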
Does this seem like a reasonable workaround to you? Does anyone have tips on how to make it work more smoothly? My wiki is quite small and most users are fairly technically unsophisticated, so we might not stumble on many of MediaWiki's more esoteric cookies (if it has any) very often. We're also quite a tight group, so if Varnish occasionally caching something it shouldn't caused any quirks, we could probably live with that by simply educating users about how to mitigate those quirks by the time they become adventurous enough to stumble on them.
Any help would be greatly appreciated.
Forest
I've just joined; I hope folks will forgive my ignorance. I'm trying to
reply to this thread from last week, about the WPTouch skin.
I've installed and activated WPTouch for our wiki, but on all devices
(including my laptop's browser) all I get is the "skated concrete"
background image. I'm not sure if this is the same problem others have with
WPTouch, or whether it's related to our unorthodox file structure (in a
subdirectory, sharing a root domain with a WordPress installation) or
something else.
Anyway, I'd also love to have a working mobile-optimized skin. Am I right
that there aren't any other mobile skins available for MediaWiki? (Aside
from iWiki<http://www.mediawiki.org/wiki/Manual:Gallery_of_user_styles/iWiki>,
which doesn't seem to be Android-friendly.) Crazy.
Michael
I have been using MediaWiki 1.16.2, PHP 5.3.3 (cgi-fcgi), and MySQL 5.1.52 for more than a year. All of a sudden, every attempt to create preview images (no matter whether I use "thumb" or another parameter) results in an error message: "Fehler beim Erstellen des Vorschaubildes" ("Error creating thumbnail"). New original files upload fine, but no smaller versions are shown, even on file pages! Strangely, the same error does not occur on a backup copy of the same wiki on the same server. I also did not change any settings in LocalSettings.php. The wiki last worked as it should only a few days ago.
Any suggestions?
Thank you
Bernhard
Hello, I run IT for a small nonprofit that just purchased a VPS to speed up our wiki. This being our first VPS, we are starting completely fresh, and are trying to find the current “best in class” software to start with. I was wondering if people could provide feedback on the software stack we are planning on using.
First, the 30,000 foot view:
* MediaWiki 1.18
* PHP 5.3.3, Apache 2.0.63, MySQL 5.0.92, and CentOS 5.7
* APC 3.1.9 (current) with $wgMainCacheType = CACHE_ACCEL;
* Varnish 3.0.2
* Our wiki gets about 25,000 pageviews per month, mostly from people who aren’t logged in. The test wiki can be viewed here
We’re aware that Squid has classically been used by Wikimedia, but many sources seem to say that Varnish, having been built from the ground up as an HTTP accelerator, is faster. I also found the following statement in a slide deck called “Wikimedia Operations Overview”: “Currently in development to replace squid for all caching. More efficient than squid, better able to use resources.”
Overall, our goal is simply to choose the best software we can find to provide a good foundation for our wiki. Any feedback would be appreciated.
In terms of other settings, because our wiki is quite small (600 articles) and because we like being able to mess around in the database, we are planning on using the MySQL 4.1/5.0 UTF-8 character encoding option. Frankly, we have no idea how to make that call, but the idea of the text being stored in a human-readable format does seem attractive.
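In case it helps anyone comment, here is a sketch of what the cache-related lines in our LocalSettings.php would look like for this stack. Only $wgMainCacheType comes from the list above; the Varnish address is a placeholder:

```php
<?php
// LocalSettings.php excerpt (sketch) — cache settings for the planned stack.
$wgMainCacheType = CACHE_ACCEL;             // APC object cache, as listed above
$wgUseSquid      = true;                     // a reverse proxy (Varnish) sits in front of Apache
$wgSquidServers  = array( '127.0.0.1' );     // placeholder: Varnish's address, so purges reach it
```

$wgUseSquid/$wgSquidServers make MediaWiki send PURGE requests on page edits, which matters for any reverse-proxy setup, Squid or Varnish.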
Thanks in advance,
Rob
Perhaps another issue from upgrading to 1.18...
A user noticed that none of the "next 200" links work on my category pages.
example:
http://www.fraternityofshadows.com/wiki/Category:NPC
it just goes back to the same page.
Any way to fix this?
Ron
--
Ron Laufer
Hello,
I have a request to set up WYSIWYG / FCKeditor. However, not
everyone will want to use it. How do I configure it so that users have
to opt in to WYSIWYG? You can switch back and forth currently, but I
would like WYSIWYG not to be the default.
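I haven't tested this myself, but the standard MediaWiki way to change a preference's default is $wgDefaultUserOptions in LocalSettings.php; the exact preference key below ('riched_disable') is an assumption based on the FCKeditor extension's naming, so verify it against the extension's documentation:

```php
// LocalSettings.php, after including the FCKeditor extension.
// NOTE: 'riched_disable' is an ASSUMED key — check the extension's docs.
$wgDefaultUserOptions['riched_disable'] = 1;  // plain wikitext editor by default
```

Users who want WYSIWYG could then turn it on themselves in Special:Preferences.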
Thanks,
Graham
I want to build a Template:MathJax page that will automatically add
the following script call to any page/article that includes this
template.
<script type="text/javascript"
src="https://d3eoax9i5htok0.cloudfront.net/mathjax/latest/MathJax.js?config=TeX-…">
</script>
According to the sources, this script call needs to be placed in the
header of an HTML page, as its purpose is to render LaTeX/TeX or MathML
input into something viewable in the browser. I want to get the same
result in MediaWiki 1.17. I am aware that I can install
Extension:MathJax, but that would require constant maintenance and
updating, as this is a somewhat active system. It just seems to me that
a template is a better way to do this. I am open to suggestions or
ideas. I figure this is a tag thing but do not have enough experience
to do it quickly.
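One caveat with the template approach: MediaWiki strips raw <script> tags from wikitext, so a template alone can't inject the tag. A site-wide alternative (a sketch, not tested on 1.17) is a BeforePageDisplay hook in LocalSettings.php. Note the ?config= value is truncated in the snippet quoted above, so it is left as a placeholder here:

```php
// LocalSettings.php — load MathJax on every page via a hook (sketch).
// The ?config= value is truncated in the original message; fill in your own.
$wgHooks['BeforePageDisplay'][] = function ( $out, $skin ) {
    // OutputPage::addScript() appends raw HTML to the page head area.
    $out->addScript(
        '<script type="text/javascript" ' .
        'src="https://d3eoax9i5htok0.cloudfront.net/mathjax/latest/MathJax.js?config=TeX-..."></script>'
    );
    return true;
};
```

This loads MathJax on every page rather than per-template, which may or may not be acceptable for your wiki.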
Thanks!
frosty
I snipped this from the terminal screen while running runJobs.php.
It seems I have a memory issue, and I'm not sure what I need to do to
fix it. The system has 8 GB of RAM, so I don't think that is the
problem. Perhaps a setting in PHP or MediaWiki 1.17? Any tips?
frosty
----------------------------------
2011-12-09 05:49:15 refreshLinks2 Template:As_of start= end= t=19954
good
2011-12-09 05:49:17 refreshLinks2 Template:Bot start= end= t=1968 good
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted
(tried to allocate 1221613 bytes)
in /home/frosty/mediawiki-1.17.0/includes/parser/Parser.php on line 393