(1) [Historic facts. Skip if in a hurry] When we made the transition from the Ripuarian Test Wikipedia [ .... ]
I think we can make an exception for this, provided the tool has a planned, limited running time and you pledge not to save these passwords in any form.
The relevant part of the tool is only a few lines of PHP code, so verification should be easy. It is history by now; if someone needs it for another transition of the kind, it is available on request, to be downloaded or installed on the Toolserver.
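For illustration only, a minimal sketch of what such a check could look like, assuming the old wiki's user table with MediaWiki's legacy ":A:"/":B:" MD5 password hashes; the actual tool may have worked differently. The point is merely that it fits in a few lines and never stores or logs the password.
<pre>
<?php
// Sketch only: check a username/password pair against MediaWiki's legacy
// ":A:" / ":B:" password hash formats.  Nothing is logged or stored.
function verify_password(mysqli $db, string $username, string $password): bool {
    $stmt = $db->prepare('SELECT user_password FROM user WHERE user_name = ?');
    $stmt->bind_param('s', $username);
    $stmt->execute();
    $stmt->bind_result($stored);
    $found = $stmt->fetch();
    $stmt->close();
    if (!$found) {
        return false;                                   // no such user
    }
    $parts = explode(':', $stored);
    if (count($parts) === 4 && $parts[1] === 'B') {
        // ":B:salt:md5(salt '-' md5(password))"
        return hash_equals($parts[3], md5($parts[2] . '-' . md5($password)));
    }
    if (count($parts) === 3 && $parts[1] === 'A') {
        // ":A:md5(password)"
        return hash_equals($parts[2], md5($password));
    }
    return hash_equals($stored, md5($password));        // very old raw MD5
}
</pre>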
(2) I am planning a tool to 'bulk' insert redirects [ .... ]
Mm, I don't think it's a good idea to run a bot in a user context, because it is harder to block when it gets out of control. The second point is that it fakes the edit count of a user.
So if you really want to run this tool, please use another server.
I like discussing ideas a bit before implementing potentially weak solutions. :-) I think the edit-count issue already rules my approach out.
Control was imho not such a big issue; I had already been planning to drastically limit the bot to a fixed number of edits per hour, per user, per IP, etc., each limit independent of the others. Since a simple regexp creating 'all possible combinations of ...' on a mid-size word can easily lead to tens of thousands of expansions, and almost all of those are never used in real life, carefully chosen limitations and rejections are needed.
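To make that concrete, a minimal sketch of such an expansion with a hard cap; the brace-alternation syntax and the cap value are illustrative assumptions, not the syntax or limits actually planned.
<pre>
<?php
// Sketch only: expand a simple alternation pattern like "Jan{uar|ewar|uwar}"
// into all concrete spellings, refusing to go past a hard cap.
function expand_pattern(string $pattern, int $cap = 200): array {
    $variants = [''];
    $offset = 0;
    while (preg_match('/\{([^{}]*)\}/', $pattern, $m, PREG_OFFSET_CAPTURE, $offset)) {
        $literal = substr($pattern, $offset, $m[0][1] - $offset);
        $choices = explode('|', $m[1][0]);
        $next = [];
        foreach ($variants as $v) {
            foreach ($choices as $c) {
                $next[] = $v . $literal . $c;
                if (count($next) > $cap) {
                    throw new RuntimeException("more than $cap expansions, rejected");
                }
            }
        }
        $variants = $next;
        $offset = $m[0][1] + strlen($m[0][0]);
    }
    $tail = substr($pattern, $offset);
    return array_map(fn($v) => $v . $tail, $variants);
}

// e.g. expand_pattern('Jan{uar|ewar|uwar}') => ['Januar', 'Janewar', 'Januwar']
</pre>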
Anyway, automatically and systematically creating redirects is an issue to be solved somehow. Just see: we are in a dialect continuum between German [[de]], Low German (Plattdüütsch) [[nds]], Dutch [[nl]], Limburgish [[li]], Luxembourgish [[lb]], and Pälzsch [[pfl]], of which only the last does not have its own Wikipedia yet. For example, we have most of the spelling variants of the month names from each of these, plus several in between, times 366 day articles -- it would be insane to leave all these redirects to be manually typed by volunteer editors. Not having them creates complaints and duplications, so that is not an option either.
Now I am thinking of having 'job queue' pages for a bot editing under its own name: admins move requests from a "suggested" page (unprotected or semi-protected) to an "accepted" page (protected), while the bot moves them from "accepted" to "done", or simply removes them from "accepted".
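A minimal sketch of the bot's side of that queue, under the assumption that something like fetch_page()/save_page() wraps whatever talks to the wiki (api.php, a bot framework, ...) and handle_request() does the actual redirect edits; the page names and the one-request-per-line format are placeholders, not a fixed design.
<pre>
<?php
// Sketch only: one pass of the bot over the queue pages.
function run_queue_pass(callable $fetch_page, callable $save_page, callable $handle_request): void {
    $accepted = $fetch_page('User:redirbot/accepted');   // protected, admin-curated
    $done     = $fetch_page('User:redirbot/done');

    $left = [];
    foreach (explode("\n", $accepted) as $line) {
        if (strpos($line, '* [[') !== 0) {
            $left[] = $line;                 // keep headings and anything unrecognized
        } elseif ($handle_request($line)) {
            $done .= "\n" . $line;           // processed: move the entry to "done"
        } else {
            $left[] = $line;                 // failed or rate-limited: leave it queued
        }
    }

    $save_page('User:redirbot/accepted', implode("\n", $left));
    $save_page('User:redirbot/done', $done);
}
</pre>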
Users should be given the chance to enter request lists and/or regexps, so that they can be credited properly while usually making only one edit for a whole set of bot-created redirects.
Two ways of organizing this come to mind:
A: have users put their suggestions on [[Target_Article/redirects]], which the bot detects when created or altered; these pages can be kept forever.
B: have users put suggestions on [[User:redirbot/suggestions]], e.g. in a format like
== year month ==
* [[Target Article]] [[alias1]] [[alias2]] ... [[aliasN]]
(alias* can be regexps), where entries are removed when they become obsolete for whatever reason.
In either case the associated talk pages can be used to resolve questions.
I currently prefer approach B (it leaves less drivel behind); a rough parsing sketch for its format follows below. Ideas and comments?
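For illustration, a minimal sketch, assuming exactly the entry format shown under B (one "* [[Target]] [[alias]] ..." line per request under "== year month ==" headings); plain aliases only, since regexp aliases would be expanded separately as sketched above.
<pre>
<?php
// Sketch only: parse an approach-B suggestion page into target => aliases,
// grouped by the "== year month ==" section headings.
function parse_suggestions(string $wikitext): array {
    $sections = [];
    $current  = '';
    foreach (explode("\n", $wikitext) as $line) {
        if (preg_match('/^==\s*(.+?)\s*==$/', $line, $m)) {
            $current = $m[1];                        // e.g. "2008 September"
            continue;
        }
        if (strpos(ltrim($line), '*') === 0
            && preg_match_all('/\[\[([^\[\]]+)\]\]/', $line, $m)
            && count($m[1]) >= 2) {
            $target = array_shift($m[1]);            // first link is the target article
            $sections[$current][$target] = array_merge(
                $sections[$current][$target] ?? [],
                $m[1]                                // remaining links are the aliases
            );
        }
    }
    return $sections;
}

// Example:
// $text = "== 2008 September ==\n* [[1. Januar]] [[1 Januar]] [[1. Jannewa]]";
// parse_suggestions($text)
//   => ['2008 September' => ['1. Januar' => ['1 Januar', '1. Jannewa']]]
</pre>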
Thank you.
Purodha