Brian,
thank you very much for that great info!
I noticed that my databases from the new installations of MediaWiki look
like this:
| Field     | Type           | Collation |
+-----------+----------------+-----------+
| user_name | varbinary(255) | NULL      |
On the old databases, I'm going to change columns like varchar(255) to
varbinary(255) via ALTER TABLE user MODIFY user_name varbinary(255);
That way the old databases will look exactly like the new ones, and if
I understand it correctly, no special characters will get mangled by
the command above. If that's the case, I should have done it this way
right from the beginning :)
And I will change all remaining tables using MyISAM to InnoDB as you
suggested.
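Rather than typing one MODIFY per column, the conversion can be scripted. A minimal sketch, assuming the old wiki's database is named `datawiki` (substitute your own schema name), that generates the ALTER statements from information_schema for review before running them:

```sql
-- Emit one ALTER statement per varchar column in the target schema.
-- This only generates the statements; inspect the output and take a
-- backup before executing any of them.
SELECT CONCAT('ALTER TABLE `', TABLE_NAME, '` MODIFY `', COLUMN_NAME,
              '` varbinary(', CHARACTER_MAXIMUM_LENGTH, ');') AS stmt
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'datawiki'
  AND DATA_TYPE = 'varchar';
```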
Thanks again and cheers,
Till
On 9/22/2016 9:51 PM, Brian Wolff wrote:
btw, an alternative solution might have been to change the charset to
binary (not utf8_bin or latin1_bin, but just binary). utf8 in MySQL
takes up to 3 bytes per character (indexes reserve the full 3, instead
of the 1 byte the binary charset needs), which is probably why the
limit was hit. Additionally, MySQL's utf8 charset can't actually encode
all UTF-8 characters (if that matters to you), but binary can. Of
course, switching to InnoDB is a good idea too, for a very wide variety
of reasons.
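A sketch of the alternative described above, applied to the table from the error below (back up first; this converts every character column in the table, not just gu_name):

```sql
-- Switch the whole table to the plain binary charset, so each indexed
-- character costs 1 byte instead of the 3 bytes utf8 reserves.
ALTER TABLE globaluser CONVERT TO CHARACTER SET binary;
```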
--
bawolff
On Thu, Sep 22, 2016 at 3:11 PM, Till Kraemer <info(a)till-kraemer.com> wrote:
> I'm sorry to hear that, Jasmine :( Maybe you can get help on Stack
> Overflow or Server Fault?
>
> With the help of this article at
>
https://confluence.atlassian.com/crowdkb/unable-to-perform-administrative-f…
> I entered:
>
> SELECT * FROM information_schema.TABLES WHERE table_schema =
> 'centralauth' AND table_collation != 'utf8_bin';
>
> ...and I noticed that most of the tables use engine MyISAM.
>
> I changed that using ALTER TABLE globaluser ENGINE=INNODB; and then I
> was able to do things like:
>
> ALTER TABLE globaluser CONVERT TO CHARACTER SET utf8 COLLATE utf8_bin;
>
> Everything works perfectly now.
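> For anyone following along: the per-table conversion above can also be
> generated in bulk. A sketch, assuming the CentralAuth schema is named
> 'centralauth' as in the query earlier in this thread:
>
> ```sql
> -- Emit the engine-conversion statement for every remaining MyISAM
> -- table; review the generated statements before running them.
> SELECT CONCAT('ALTER TABLE `', TABLE_NAME, '` ENGINE=InnoDB;') AS stmt
> FROM information_schema.TABLES
> WHERE TABLE_SCHEMA = 'centralauth'
>   AND ENGINE = 'MyISAM';
> ```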
>
> Have a nice evening!
>
> Thanks and cheers,
>
> Till
>
>
> On 9/22/2016 4:29 PM, Jasmine Smith wrote:
>> Sadly, there is virtually no help for CentralAuth. You'll likely get
>> told to use a shared users table.
>>
>> I had an issue months ago that no one helped with.
>>
>>> On 22 Sep 2016, at 03:22 pm, "Till Kraemer" <info(a)till-kraemer.com> wrote:
>>>
>>> P.S.: I also created the user User on the datawiki and tried to rename
>>> him to টিল via Special:RenameUser but that resulted in the same error :/
>>>
>>>
>>>> On 9/21/2016 5:33 PM, Till Kraemer wrote:
>>>> Hi,
>>>>
>>>> I'm running a little pool wiki for files, a data wiki for, well,
>>>> data, and several language wikis (MediaWiki 1.27.1).
>>>>
>>>> Users টিল and Till have accounts on enwiki. I can create a local user
>>>> টিল on poolwiki via createLocalAccount.php with no problems. I also can
>>>> create a local user Till on datawiki without any trouble, but when I try
>>>> to create টিল on datawiki, I'm getting the following error:
>>>>
>>>> A database query error has occurred.
>>>> Query: SELECT
>>>> gu_id,gu_name,lu_wiki,gu_salt,gu_password,gu_auth_token,gu_locked,
>>>> gu_hidden,gu_registration,gu_email,gu_email_authenticated,gu_home_db,
>>>> gu_cas_token
>>>> FROM `globaluser` LEFT OUTER JOIN `localuser` ON ((gu_name=lu_name) AND
>>>> lu_wiki = 'datawiki') WHERE gu_name = 'টিল' LIMIT 1
>>>> Function: CentralAuthUser::loadState
>>>> Error: 1267 Illegal mix of collations (latin1_bin,IMPLICIT) and
>>>> (utf8_general_ci,COERCIBLE) for operation '=' (127.0.0.1)
>>>>
>>>> Table globaluser on database centralauth looks like this:
>>>>
>>>> | Field                        | Type             | Collation         |
>>>> +------------------------------+------------------+-------------------+
>>>> | gu_id                        | int(11)          | NULL              |
>>>> | gu_name                      | varchar(255)     | latin1_bin        |
>>>> | gu_enabled                   | varchar(14)      | latin1_swedish_ci |
>>>> | gu_enabled_method            | enum('opt-in',   | latin1_swedish_ci |
>>>> |                              | 'batch','auto',  |                   |
>>>> |                              | 'admin')         |                   |
>>>> | gu_home_db                   | varchar(255)     | latin1_bin        |
>>>> | gu_email                     | varchar(255)     | latin1_bin        |
>>>> | gu_email_authenticated       | char(14)         | latin1_bin        |
>>>> | gu_salt                      | varchar(16)      | latin1_bin        |
>>>> | gu_password                  | tinyblob         | NULL              |
>>>> | gu_locked                    | tinyint(1)       | NULL              |
>>>> | gu_hidden                    | varbinary(255)   | NULL              |
>>>> | gu_registration              | varchar(14)      | latin1_bin        |
>>>> | gu_password_reset_key        | tinyblob         | NULL              |
>>>> | gu_password_reset_expiration | varchar(14)      | latin1_bin        |
>>>> | gu_auth_token                | varbinary(32)    | NULL              |
>>>> | gu_cas_token                 | int(10) unsigned | NULL              |
>>>>
>>>> I tried to do things like ALTER TABLE `globaluser` MODIFY
>>>> `gu_password_reset_expiration` varchar(14) CHARACTER SET utf8 COLLATE
>>>> utf8_bin; but that doesn't work with gu_name:
>>>>
>>>> ERROR 1071 (42000): Specified key was too long; max key length is 1000 bytes
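>>>>
>>>> (A single utf8 varchar(255) key part costs 255 × 3 = 765 bytes, which
>>>> is under the 1000-byte cap on its own, so presumably a composite index
>>>> on the table is what pushes it over. A sketch for checking which key
>>>> it is:
>>>>
>>>> ```sql
>>>> -- List the indexes on globaluser and the columns each one spans, to
>>>> -- see which key exceeds 1000 bytes once utf8 triples the width of
>>>> -- every character column in it.
>>>> SHOW INDEX FROM globaluser;
>>>> ```
>>>> )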
>>>>
>>>> I can do gu_name varchar(200), but I'm probably not supposed to do that.
>>>>
>>>> I also can change gu_name varchar(255) to utf8_bin in a freshly created
>>>> centralauth database, but not in the one that is already populated with
>>>> my users.
>>>>
>>>> Exporting the old centralauth database, changing gu_name to utf8_bin
>>>> in an editor, and importing it doesn't work either: same error as
>>>> mentioned above.
>>>>
>>>> How can I fix this?
>>>>
>>>> I'm running MariaDB 10.0.25, which seems to use XtraDB by default.
>>>> Would it help to switch to InnoDB?
>>>>
>>>> And: my latest language wikis have $wgDBTableOptions = "ENGINE=InnoDB,
>>>> DEFAULT CHARSET=binary"; so I changed the old wikis from
>>>> $wgDBTableOptions = "TYPE=InnoDB"; in LocalSettings.php to the
>>>> settings of the new wikis. I don't know if that's a problem.
>>>>
>>>> Aside from the problems mentioned above, should I change every database
>>>> and table of the old wikis from latin1_bin to utf8_bin, so that they are
>>>> in line with the new MediaWiki installations?
>>>>
>>>> Any help is more than welcome!
>>>>
>>>> Thanks and cheers,
>>>>
>>>> Till