I've been looking at the various GA extensions and can't find one that meets
my specific requirements. I need to track my whole wiki, and also let
others track subsets of pages they're interested in. These subset users
should not see GA data for the entire wiki.
Is this a scenario that anyone has encountered or could offer advice on?
I'm currently experimenting with
http://www.mediawiki.org/wiki/Extension:Google_Analytics_Integration but it
only allows a single GA account number, and it includes all pages in the wiki.
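One possible approach, since Google Analytics' classic async tracker supports named secondary trackers on the same page: fire the site-wide account on every page, plus a per-subset account keyed off page categories. A rough, untested sketch for LocalSettings.php — efMultiGA, $wgGASubsetAccounts, the category name, and the UA-XXXXXX-1 / UA-YYYYYY-1 IDs are all invented placeholders:

# LocalSettings.php — rough, untested sketch.
$wgGASubsetAccounts = array(
	'Sales pages' => 'UA-YYYYYY-1', // category name => subset GA account
);

$wgHooks['BeforePageDisplay'][] = 'efMultiGA';
function efMultiGA( &$out, &$skin ) {
	global $wgGASubsetAccounts;
	// Site-wide tracker: sees every page.
	$js = "var _gaq = _gaq || [];\n" .
		"_gaq.push(['_setAccount', 'UA-XXXXXX-1']);\n" .
		"_gaq.push(['_trackPageview']);\n";
	// For pages in a tracked category, also fire a named secondary
	// tracker pointed at that subset's own GA account, so the subset
	// owner only ever sees data for their own pages.
	foreach ( $out->getCategories() as $cat ) {
		if ( isset( $wgGASubsetAccounts[$cat] ) ) {
			$ua = $wgGASubsetAccounts[$cat];
			$js .= "_gaq.push(['sub._setAccount', '$ua']);\n" .
				"_gaq.push(['sub._trackPageview']);\n";
		}
	}
	// Standard ga.js async loader.
	$js .= "(function() {" .
		"var ga = document.createElement('script');" .
		"ga.type = 'text/javascript'; ga.async = true;" .
		"ga.src = ('https:' == document.location.protocol ?" .
		" 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';" .
		"var s = document.getElementsByTagName('script')[0];" .
		"s.parentNode.insertBefore(ga, s);" .
		"})();";
	$out->addInlineScript( $js );
	return true;
}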
Thanks
Bill
My MediaWiki 1.17 installation has gone nuts. It is no longer matching
titles where only the case of the first letter differs. Example: a
Template:tlx call does not find Template:Tlx. In the past this has made
no difference. The only recent change is the installation of
Extension:WorkingWiki. The extension seems to be doing its job, though
there are some issues with LaTeX (.sty) style files. I just do not know if
WorkingWiki, which did alter the editing interface, is the cause. I will be
contacting the author about this.
Any tips?
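One thing worth ruling out: MediaWiki normally upper-cases the first letter of every title, which is why Template:tlx and Template:Tlx have always resolved to the same page. That behaviour hangs on a single setting, so it's worth checking whether anything in the recent changes flipped it. A sketch of what to look for in LocalSettings.php:

# The default; if something has set this to false, titles differing
# only in the case of their first letter become distinct pages.
$wgCapitalLinks = true;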
One of my MediaWiki installations went from 1.17 through 1.18 RC1 to
1.18 with no problems.
But another MediaWiki has had some problems going directly from 1.17 to
1.18. I took out Recaptcha and now see this:
Fatal error: Cannot redeclare wfprofilein() (previously declared in
/xxxxxxxx/public_html/includes/profiler/Profiler.php:14) in
/xxxxx/public_html/includes/ProfilerStub.php on line 25
Not sure why...
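For what it's worth, a guess at the cause: 1.18 moved the profiler into includes/profiler/, so if the 1.18 tree was untarred over the 1.17 one, the old includes/ProfilerStub.php is still lying around and gets loaded alongside the new includes/profiler/Profiler.php — exactly the duplicate-declaration pattern in that message. A sketch of the cleanup (back up first; paths assume an untar-over-the-top upgrade):

# From a shell on the server: remove 1.17 leftovers that shadow
# the relocated 1.18 profiler files.
cd /path/to/public_html
rm includes/ProfilerStub.php includes/Profiler.php
# ...and check for a stale StartProfiler.php in the wiki root, too.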
Gordo
--
Gordon Joly
gordon.joly(a)pobox.com
http://www.joly.org.uk/
Don't Leave Space To The Professionals!
$wgDBtransactions gets set to true if using InnoDB tables. Is there
an advantage to using InnoDB tables?
The disadvantage is that with MySQL there is a file, ibdata1, that
seems to grow endlessly if InnoDB tables are used. See
http://bugs.mysql.com/bug.php?id=1341
We're wondering if we should just convert everything to MyISAM. Any
thoughts?
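On the trade-off: InnoDB is what gives MediaWiki row-level locking and transactional crash recovery, so converting everything to MyISAM gives up data integrity rather than fixing the bloat. The usual mitigation for ibdata1 growth is InnoDB's file-per-table mode — a sketch of the my.cnf change (it only affects tables created or rebuilt after it is enabled, and ibdata1 itself never shrinks without a dump and re-import):

[mysqld]
# Each table gets its own .ibd file; dropping or rebuilding a table then
# returns its space to the filesystem instead of growing shared ibdata1.
innodb_file_per_table = 1

Existing tables can then be moved into their own files one at a time with ALTER TABLE sometable ENGINE=InnoDB.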
=====================================
Jim Hu
Associate Professor
Dept. of Biochemistry and Biophysics
2128 TAMU
Texas A&M Univ.
College Station, TX 77843-2128
979-862-4054
What: 1.18 Triage
When: Friday, Nov, 21:30 UTC
Time zone conversion: http://hexm.de/ap
Where: #wikimedia-dev on freenode
Use http://webchat.freenode.net/?channels=wikimedia-dev
if you don't have an IRC client
With the recent release of 1.18, I want to hold a triage to see whether
there are enough issues to warrant a point release. This Friday, I'll be
covering bugs listed on the tracking bug
https://bugzilla.wikimedia.org/32711 to determine the severity and
number of issues.
If you know of issues not listed on that bug that have shown up in your
1.18 installation, especially regressions in MediaWiki behavior, please
add them.
Hope to see you at the triage!
Mark.
My dear wiki and semantic wiki-ers,
It's my pleasure and honor to announce the next International Semantic
MediaWiki Conference - SMWCon Spring 2012 on April 25-27 in Carlsbad, CA,
USA - Greater San Diego area.
This SMWCon will feature tutorials, keynotes, presentations, demos,
lightning talks, group discussions, and developer hacking sessions.
For more information, please visit
http://semantic-mediawiki.org/wiki/SMWCon_Spring_2012.
We are calling for presentations and tutorials; please feel free to edit
the conference page if you'd like to contribute.
Thanks,
Jesse
On behalf of the SMWCon Spring 2012 organization committee
Hello,
I am developing an extension to import Word documents into MediaWiki. I've
noticed that MediaWiki is case-sensitive. If, for example, I import a page
with the title "The future of tv" and then re-import the same page with the
corrected title "The future of TV", MediaWiki creates a separate page for
each import.
Is this normal?
How would you solve this problem?
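For context, MediaWiki titles are case-sensitive apart from the first character, so those really are two distinct pages. One way for an importer to catch such near-duplicates is a case-insensitive lookup before creating the page — a rough, untested sketch (efFindExistingTitle() is an invented helper, not part of MediaWiki, and the fallback query is MySQL-specific):

<?php
function efFindExistingTitle( $text ) {
	$title = Title::newFromText( $text );
	if ( !$title ) {
		return null; // not a valid title at all
	}
	if ( $title->exists() ) {
		return $title; // exact match (first letter is normalized anyway)
	}
	// Case-insensitive fallback: page_title is stored as binary, so it
	// must be converted before LOWER() will fold non-first letters.
	$dbr = wfGetDB( DB_SLAVE );
	$row = $dbr->selectRow(
		'page',
		array( 'page_namespace', 'page_title' ),
		array(
			'page_namespace' => $title->getNamespace(),
			'LOWER(CONVERT(page_title USING utf8)) = ' .
				$dbr->addQuotes( mb_strtolower( $title->getDBkey(), 'UTF-8' ) )
		),
		__METHOD__
	);
	return $row
		? Title::makeTitle( $row->page_namespace, $row->page_title )
		: null;
}

The importer can then update (or rename) the page it finds instead of creating a sibling that differs only in case.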
regards,
Ramon
Hello,
This morning I downloaded MediaWiki 1.18.0 and attempted to upgrade my
existing wiki from version 1.17.0. Unfortunately,
maintenance/update.php fails when it attempts to create the table
"uploadstash" with a MySQL error:
1064: You have an error in your SQL syntax; check the manual that
corresponds to your MySQL server version for the right syntax to use
near 'TYPE=InnoDB' at line 17 (localhost)
A more detailed snippet from the output of update.php is included below,
beneath my sig. Fortunately, I always back up before performing updates
and I was able to download and update to 1.17.1 just fine. When I
attempted to update from 1.17.1 to 1.18.0 I received the same error. I
am currently running 1.17.1 without problems.
There are also a bunch of XCache-related warnings, likely due to my
current XCache configuration. I currently have xcache.var_size = 96M,
so I have no idea why it would be "too small to enable var data
caching". XCache works just fine with the rest of my custom site code.
However, this is a separate issue from the CREATE TABLE problem, which
is the real show-stopper for me, so I'm a lot less worried about it.
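(One guess about those warnings, not verified: update.php runs under PHP's command-line SAPI, and XCache generally does not provide a variable cache to CLI PHP no matter what xcache.var_size says, so maintenance scripts can complain even when the web configuration is fine. A quick check from a shell on the server:

php -r 'var_dump(extension_loaded("xcache"), ini_get("xcache.var_size"));'
)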
Some extra relevant info: I'm running Fedora 15, Apache 2.2.21, PHP
5.3.8, and MySQL 5.5.14. My wiki is pretty much vanilla with no
extensions. I have tweaked the MonoBook skin with a few customizations,
but most of my custom skin consists of symbolic links to the MonoBook
files and I make sure to check diffs of my changes against the originals
after each upgrade. My wiki is public but locked down such that only I
can post and edit articles.
Thanks for any help anyone can provide.
--
Jeff Darlington
General Protection Fault
http://www.gpf-comics.com/
Relevant portion of update.php output:
======================================
Creating uploadstash table...PHP Warning: xcache_get(): xcache.var_size
is either 0 or too small to enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 17
PHP Warning: xcache_get(): xcache.var_size is either 0 or too small to
enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 17
PHP Warning: xcache_set(): xcache.var_size is either 0 or too small to
enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 35
PHP Warning: xcache_get(): xcache.var_size is either 0 or too small to
enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 17
PHP Warning: xcache_set(): xcache.var_size is either 0 or too small to
enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 35
PHP Warning: xcache_set(): xcache.var_size is either 0 or too small to
enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 35
PHP Warning: xcache_unset(): xcache.var_size is either 0 or too small
to enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 47
PHP Warning: xcache_unset(): xcache.var_size is either 0 or too small
to enable var data caching in
{path-to-wiki}/includes/objectcache/XCacheBagOStuff.php on line 47
A database query syntax error has occurred.
The last attempted database query was:
"CREATE TABLE `uploadstash` (
us_id int unsigned NOT NULL PRIMARY KEY auto_increment,
us_user int unsigned NOT NULL,
us_key varchar(255) NOT NULL,
us_orig_path varchar(255) NOT NULL,
us_path varchar(255) NOT NULL,
us_source_type varchar(50),
us_timestamp varbinary(14) not null,
us_status varchar(50) not null,
us_size int unsigned NOT NULL,
us_sha1 varchar(31) NOT NULL,
us_mime varchar(255),
us_media_type ENUM("UNKNOWN", "BITMAP", "DRAWING", "AUDIO", "VIDEO",
"MULTIMEDIA", "OFFICE", "TEXT", "EXECUTABLE", "ARCHIVE") default NULL,
us_image_width int unsigned,
us_image_height int unsigned,
us_image_bits smallint unsigned
) TYPE=InnoDB
"
from within function "DatabaseBase::sourceFile(
{path-to-wiki}/maintenance/archives/patch-uploadstash.sql )".
Database returned error "1064: You have an error in your SQL syntax;
check the manual that corresponds to your MySQL server version for the
right syntax to use near 'TYPE=InnoDB' at line 17 (localhost)"
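A hedged reading of that failure: the literal TYPE=InnoDB in the generated CREATE TABLE suggests a table-options string carried over from an older install, and MySQL 5.5 removed the long-deprecated TYPE= spelling in favor of ENGINE= — which would explain why MySQL 5.5.14 rejects it. If LocalSettings.php contains the old form, updating it may be all that's needed:

# LocalSettings.php — old installs often carry:
#   $wgDBTableOptions = "TYPE=InnoDB";
# MySQL 5.5 accepts only the ENGINE= spelling:
$wgDBTableOptions = "ENGINE=InnoDB";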
Hi,
http://wiki.cn2en.org
I installed MediaWiki and set the system not to auto-accept registrations, but there wasn't any link to create an account, so I found this:
http://www.mediawiki.org/wiki/Extension:ConfirmAccount
I followed the instructions except for:
Change to the MediaWiki directory and run:
php maintenance/update.php
The link to create an account appeared (top right of wiki.cn2en), and when clicked you can log in or 'request one'. Clicking on that took me to a page that I couldn't see unless logged in, but I found a fix: whitelisting that page in LocalSettings.php (where most of the extension setup and customising takes place). Now you can view the request page, but after entering a username and password you get this error:
Database error
A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was:
(SQL query hidden)
from within function "efCheckIfAccountNameIsPending". Database returned error "1146: Table 'cn2en_wiki.account_requests' doesn't exist (localhost)".
So I went back to the MediaWiki website and found the extension's discussion page, where someone had posted the same problem, along with a fix that I don't know how to apply:
http://www.mediawiki.org/wiki/Extension_talk:ConfirmAccount#Database_error
I can see the update file at http://wiki.cn2en.org/maintenance/update.php, but I don't know how to 'run' it.
My hosts are looking into it.
That error is on every page now.
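(For reference, 'running' update.php means executing it from a command line on the server — over SSH, or whatever shell access the host provides — not opening it in a browser; fetching http://wiki.cn2en.org/maintenance/update.php will not run the updater. Roughly, with placeholder host and path:

ssh user@yourhost
cd /path/to/wiki            # the directory that contains LocalSettings.php
php maintenance/update.php
)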
Best regards, Chris
Forwarding this question on behalf of someone from the Public Library of
Science -- please reply both to me and to her, as she's not on the list.
Thanks.
> On Tue, Nov 8, 2011 at 10:56 PM, Jennifer Lin <jlin(a)plos.org> wrote:
>
> Hi, Daniel. I received your contact info from Peter Binfield.
>
> At PLoS, we are extending our ALM app to include Wikipedia citations
> for PLoS articles, but recently came upon an issue, which we've
> logged with your developers:
>
> https://bugzilla.wikimedia.org/show_bug.cgi?id=32026
>
> The gist of it is that the Wiki API has a limitation of 100 returns.
> For PLoS articles that are referenced more than 100 times, we get
> back a different set of results each time: some of the same sources
> are returned, some new ones. Although we have the total number of
> citations, we do not know if we are capturing all of them, even if we
> run the query many times over. It seems likely that there could be a
> more efficient way of pulling the exhaustive set. The bug report
> explains the technical details.
>
> Although there was some initial contact with a Wiki developer,
> activity seems to have stopped. Do you have any suggestions on how to
> proceed at this point in time? We are very eager to add Wikipedia as
> an ALM source given its importance to the research community at large.
>
> Thanks so much, Daniel.
>
> Cheers, -jlin
>
> Jennifer Lin
> Public Library of Science, Product Manager
> (415) 935-2095
> 1160 Battery Street
> Koshland Building East, Suite 100
> San Francisco, CA 94111, USA
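A general note on the 100-result ceiling: the MediaWiki API caps each request but returns a query-continue block precisely so callers can page through the full set instead of re-sampling the first batch. A rough, untested sketch of that loop in PHP — the list=exturlusage module and the placeholder DOI URL are assumptions; the exact query PLoS runs is in the bug report:

<?php
# Collect a full API result set by following query-continue.
$api = 'http://en.wikipedia.org/w/api.php';
$params = array(
	'action'  => 'query',
	'list'    => 'exturlusage',
	'euquery' => 'dx.doi.org/10.1371/journal.pbio.0000000', // placeholder DOI
	'eulimit' => 'max',
	'format'  => 'json',
);
// Wikimedia blocks requests without a User-Agent.
$ctx = stream_context_create( array(
	'http' => array( 'header' => "User-Agent: alm-continue-sketch/0.1\r\n" )
) );
$rows = array();
do {
	$url = $api . '?' . http_build_query( $params );
	$data = json_decode( file_get_contents( $url, false, $ctx ), true );
	$rows = array_merge( $rows, $data['query']['exturlusage'] );
	// Feed the returned continuation parameters straight into the next
	// request; stop when no query-continue block comes back.
	$more = isset( $data['query-continue']['exturlusage'] );
	if ( $more ) {
		$params = array_merge( $params, $data['query-continue']['exturlusage'] );
	}
} while ( $more );
echo count( $rows ) . " pages link to that URL\n";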
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation