Hello everyone,
I was careless enough to run a MediaWiki installation that allowed people to
sign up without a moderator's approval. As a result, a few hundred of them
signed up and started using the site to swap images.
How can I delete their accounts in the most expeditious way?
Thanks.
Boris.
I invite you to the yearly Berlin hackathon. It's 1-3 June and
registration is now open. If you need financial assistance or help with
hotel/hostel, just mention it in the registration form.
https://wmberlin.eventbrite.com/
This is the premier event for the MediaWiki and Wikimedia technical
community. We'll be hacking, designing, and socialising, primarily
talking about Gadgets, the switch to Lua, Wikidata, and Wikimedia Labs.
Our goals for the event are to bring 100-150 people together, with
lots of people who have not attended such events before. User
scripts, gadgets, API use, Toolserver, Wikimedia Labs, mobile,
structured data, templates -- if you are into any of these things, we
want you to come!
Details: https://www.mediawiki.org/wiki/Berlin_Hackathon_2012
Thanks to Wikimedia Germany for hosting and coordinating this event.
(Venue still to be determined.)
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Greetings.
I am writing in response to an unanswered question on the discussion page of
the Header Tabs extension, here:
http://www.mediawiki.org/wiki/Extension_talk:Header_Tabs#switchtablink_pars…
This issue exists on my installation as well. I am hoping to get some
developer insight into why it is occurring and, failing that, to at least
make sure you are all aware of the issue.
My installation:
MediaWiki Version 1.18.1
Header Tabs Version 0.9
Semantic MediaWiki 1.7.0.2
Semantic Drilldown 1.1
Semantic Forms 2.3.2
Semantic Results 1.7.1
ParserFunctions 1.4.0
Skin: GuMadDD
Regards.
> From: Daniel Kinzler <daniel(a)brightbyte.de>
>
> I propose a pluggable handler system for different
> types of content, similar to what we have for file uploads. So, I propose to
> associate a "content model" identifier with each page, and have handlers for
> each model that provide serialization, rendering, an editor, etc.
>
> The background is that the Wikidata project needs a way to store structured data
> (JSON) on wiki pages instead of wikitext.
The "pluggable" part sounds great, as long as it isn't JSON-centric. I could see a need for XML or even SQL adaptors.
It would be great if namespaces, subpages, or even regexp title matches could trigger a particular content rendering. For example, I have LOTS of pages that contain identical wikitext in order to access SQL data, using subpages:
{{plant used for|{{SUBPAGENAME}}}}
This simply fires off an identical template for each subpage, passing in the SUBPAGENAME, which then is used as a query term to display a page of structured data. For example:
http://www.EcoReality.org/wiki/Plant_used_for/Adaptogen
I've also got LOTS of pages with "{{plant needs|{{SUBPAGENAME}}}}", "{{plant supplies|{{SUBPAGENAME}}}}", etc., not to mention pages like "{{annual harvest for|...}}" and "{{monthly harvest for|...}}".
I seem to have LOTS of templates whose name ends with "for" that take one argument and that display data. I've often thought "there should be an easier way..." I played with SMW for a bit, but couldn't easily bend it to my needs.
Is this the sort of use you had in mind?
----------------
Life isn't fair. It's just fairer than death, that's all. -- William Goldman
:::: Jan Steinman, EcoReality Co-op ::::
Can anyone tell me exactly what I'm supposed to do with step 2 of this
set of instructions:
Installation
1. Download the extension and unpack it in the extensions directory.
2. Source GlobalUsage.sql:
   php maintenance/sql.php extensions/GlobalUsage/GlobalUsage.sql
Note: do this from the wiki installation where you want the GlobalUsage
data to be located. Typically this is your shared image repository.
Running update.php on that wiki will also create the table.
3. Edit LocalSettings.php and add to the bottom of the file:
require_once( "$IP/extensions/GlobalUsage/GlobalUsage.php" );
4. In LocalSettings.php, set $wgGlobalUsageDatabase to the identifier of the
wiki where the GlobalUsage data is located (usually the database name).
It should be something understandable to wfGetDB. Example (a combined
LocalSettings.php sketch follows these instructions):
$wgGlobalUsageDatabase = 'commonswiki';
At present, this can NOT be the database name of a wiki, but must be a
name given in a load-balancer configuration (Manual:$wgLBFactoryConf), a
complex and largely undocumented configuration structure used by the WMF
and Wikia. For a normal MediaWiki setup using multiple databases,
configuring and maintaining this data structure involves significant
difficulty.
5. Run refreshGlobalimagelinks.php on all wikis in your farm. This will
take a long time.
   php extensions/GlobalUsage/refreshGlobalimagelinks.php
6. Visit Special:GlobalUsage and enjoy.
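For what it's worth, steps 3 and 4 together amount to something like the following in LocalSettings.php on each wiki in the farm; the 'commonswiki' value is only the example identifier from step 4, so substitute whatever name your load-balancer configuration actually uses:

  require_once( "$IP/extensions/GlobalUsage/GlobalUsage.php" );

  // Identifier of the wiki that holds the GlobalUsage data;
  // 'commonswiki' is just the example name from step 4 above.
  $wgGlobalUsageDatabase = 'commonswiki';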
Hello,
I do IT for a nonprofit that runs a specialized MediaWiki wiki for people with chronic pain. It's our main project, and we'd like to use CentralNotice on it because of several features that make it superior to SiteNotice for our use.
I've installed it and set up a test banner and a campaign. Interestingly, the test notice doesn't appear when you access a page via the short URL, but does appear when you access it through the long URL.
For example, the SiteNotice currently is set to display the phrase, "For testing purposes only: CentralNotice is working." You will only see that phrase if you navigate to the first of the following two links:
http://www.tmswiki.org/w/index.php?title=The_Tension_Myositis_Syndrome_Wiki
http://www.tmswiki.org/ppd/The_Tension_Myositis_Syndrome_Wiki
Theoretically, the above two URLs refer to the same resource.
We used the "Recommended how-to guide (setup used on Wikipedia)" method for setting up short URLs. The key line, in httpd.conf, is:
Alias /ppd /home/ptpn/public_html/w/index.php
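For context, that guide pairs the Alias directive with LocalSettings.php settings along the following lines (the /ppd prefix is taken from the line above; the exact values on any given wiki may differ):

  $wgScriptPath  = "/w";         // long-URL entry point, i.e. /w/index.php
  $wgArticlePath = "/ppd/$1";    // short-URL form, e.g. /ppd/Page_title
  $wgUsePathInfo = true;         // let index.php take the title from the path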
Many thanks for any help you can offer. We're very eager to get this set up.
Forest
Resources:
http://www.tmswiki.org/ppd/Special:CentralNotice
http://www.tmswiki.org/ppd/Special:Version (v. 1.18; CentralNotice was installed from the 1.18 version today)
There are file types that MediaWiki doesn't allow you to upload, including
Office (MS & Open) documents. Therefore, we have what I call an
"auxiliary" storage, where people can download a copy via the web.
Currently, I simply move files to a UNIX web server via the UNIX CLI (scp,
mkdir, chmod, etc.). I'd like a simpler method, one that even our MS
sysadmins could use.
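As background on the restriction mentioned above: which file types MediaWiki accepts for upload is controlled by a whitelist in LocalSettings.php, so in principle Office formats can be allowed there. A sketch, with the added extensions purely illustrative:

  // The default whitelist ships with only a few image types.
  $wgFileExtensions = array( 'png', 'gif', 'jpg', 'jpeg' );

  // One way to also accept Office documents (the list here is illustrative):
  $wgFileExtensions = array_merge( $wgFileExtensions,
      array( 'doc', 'docx', 'xls', 'xlsx', 'odt', 'ods', 'pdf' ) );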
*Anyone have any suggestions?*
I was thinking of using our internal SVN repository for this. We can
download via this repository's web interface, but cannot upload, update,
etc. I've been playing with RapidSVN, and it's easier than using CLI
commands, but I have not found it very intuitive (maybe because I'm not so
fluent in SVN terminology).
I think I'd prefer a web app that lets people upload to the SVN repository
from their browsers.
But, it doesn't have to be SVN, if I hear of something better.
Thanks
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Haim (Howard) Roman
Computer Center, Jerusalem College of Technology
Phone: 052-8-592-599 (6022 from within Machon Lev)
Hello.
Is it possible to receive an email when someone makes a comment on a
discussion page in my wiki? I'd like to be informed when that happens.
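One thing worth checking, assuming a fairly standard install: MediaWiki's built-in notification mails for page changes are controlled by settings like the following in LocalSettings.php, and apply to discussion pages on your watchlist (or to your own talk page):

  $wgEnableEmail     = true;  // master switch for outgoing email
  $wgEnotifUserTalk  = true;  // mail a user when their own talk page changes
  $wgEnotifWatchlist = true;  // mail users when a watched page changes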
Thanks and take care
Hello All,
First time post here.
Trying to resolve a problem where my wiki server requests an RSS feed
from a secure Tomcat web server (under my control), but I'm getting this
error:
Failed to load RSS feed from https://my.web.server.com/feed: Error
fetching URL: Peer certificate cannot be authenticated with known CA
certificates
This may be a simple fix, either by importing the SSL certificate from
the server I wish to get the feed from, or by telling PHP to ignore the
certificate? I'm not sure, since I don't really know whether the wiki's
PHP code uses cURL to make the request or something else. And I wouldn't
be too sure where to import the certificate to if I need to do that.
I tried going into the /includes/HttpFunctions.php file and setting
"protected $sslVerifyHost = false;" and "protected $sslVerifyCert =
false;" but no go.
Any help would be nice.
Chris