Last night, 59 new accounts were created on our wiki. They then made multiple edits to roughly 40 different pages, with about 15-20 changes to each page.
The weird part is that the last spider/bot to visit each page erased everything the others had added, leaving each page (from what I can tell so far) in its original state.
The site they were injecting into the pages was: http://WTHP5.disney.com
Now for my questions:
1. I had enabled "only registered users can edit," but this didn't help because they obviously automated the registration process. Is there something stronger I can do while still preserving the spirit of the wiki?
2. Some of the pages won't seem to revert back to my last edit. Can I somehow completely delete the changes they made and remove their edits from the page histories?
3. Is there a faster way to revert a bunch of pages at once? It's taking forever to read each page and verify it's OK.
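For question 3, here's the kind of thing I was hoping existed: a small script that reads the recent-changes list and tells me which pages the suspect accounts touched, so I only have to review those. This is a rough Python sketch; the field names, account names, and page titles are all made up, loosely modeled on a MediaWiki-style recent-changes export, since I don't know what our wiki software actually exposes.

```python
# Sketch: given a recent-changes export (a list of dicts with "user",
# "title", and "revid" keys -- my guess at the field names, based on a
# MediaWiki-style API), list the pages any suspect account edited.

def pages_touched_by(changes, suspect_accounts):
    """Return a sorted list of page titles edited by any suspect account."""
    touched = {c["title"] for c in changes if c["user"] in suspect_accounts}
    return sorted(touched)

# Example with made-up data:
changes = [
    {"user": "SpamBot01", "title": "Interlibrary Loan", "revid": 101},
    {"user": "tony",      "title": "Staff Directory",   "revid": 102},
    {"user": "SpamBot02", "title": "Interlibrary Loan", "revid": 103},
    {"user": "SpamBot02", "title": "Reference Desk",    "revid": 104},
]
suspects = {"SpamBot01", "SpamBot02"}
print(pages_touched_by(changes, suspects))
# -> ['Interlibrary Loan', 'Reference Desk']
```

Even just a list like that would cut the verification time way down, since I'd know which page histories to check instead of reading everything.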
I'm pretty disheartened by this. If it continues, I'll have to cut off external access to our wiki (this is for an academic library). We already get quite a few visits from other libraries, and we have some valuable information to share.
-Tony