I can view articles, and even use the edit window's preview function, but I can't save any changes.
On 16/01/2008, Elias Friedman elipongo@gmail.com wrote:
> I can view articles, and even use the edit window's preview function, but I can't save any changes.
Obviously the devs are trying the simplest and most obvious fundraising technique ;-)
Ahh, stuff appears to be b0rken. I get database errors on diffs. I assume all are scrambling to action as we write.
- d.
On 16/01/2008, Elias Friedman elipongo@gmail.com wrote:
> I can view articles, and even use the edit window's preview function, but I can't save any changes.
Large page deleted, backend servers snarled up a bit. I think it's at the "solved but choking it all down" stage now, but you may not want to quote that :-)
I just got a DB error on trying to log in: "User::invalidateCache". MySQL returned error "1114: The table 'user' is full (10.0.0.235)".
Not good...
WikiEN-l mailing list WikiEN-l@lists.wikimedia.org To unsubscribe from this mailing list, visit: http://lists.wikimedia.org/mailman/listinfo/wikien-l
Got onto IRC, I guess someone posted a virus to the sandbox and an admin tried to delete it. I guess that kind of annoys the servers when something with as huge a revision history as the sandbox gets deleted.
On 16/01/2008, Elias Friedman elipongo@gmail.com wrote:
> Got onto IRC, I guess someone posted a virus to the sandbox and an admin tried to delete it. I guess that kind of annoys the servers when something with as huge a revision history as the sandbox gets deleted.
Oh for deletion of individual revisions ... it's easier to zap a rev by oversight than by just plain deletion.
- d.
Yeah, no kidding :)
I'm thinking the sandbox should be completely wiped every night to prevent exactly this.
On 16/01/2008, Josh Gordon user.jpgordon@gmail.com wrote:
> I'm thinking the sandbox should be completely wiped every night to prevent exactly this.
Delete and recreate? Sounds good. It's not like it's a valuable historical record. Or, move and recreate, then delete at leisure, as people don't like bots having admin powers.
- d.
On 16/01/2008, David Gerard dgerard@gmail.com wrote:
> Delete and recreate? Sounds good. It's not like it's a valuable historical record. Or, move and recreate, then delete at leisure, as people don't like bots having admin powers.
The bot could just add a special edition speedy tag and wait for the next admin to come along.
Why does it not simply have a "Create your own sandbox" box (like at WP:RFCU) or redirect into an individual [[User:Myusername/Sandbox]]?
New editors would be better off using an official one.
Alex wrote:
> On 16/01/2008, Nathan nawrich@gmail.com wrote:
>> Why does it not simply have a "Create your own sandbox" box (like at WP:RFCU) or redirect into an individual [[User:Myusername/Sandbox]]?
> New editors would be better off using an official one.
I disagree. I found the official sandbox perfectly useless when I started editing Wikipedia, and that was several years ago. The first thing I did was find out how to set up my own sandbox, so I could try some test edits in peace, without other people's edits (and, in particular, blankings) interfering with my own. Such contention can only be worse (much worse) by now.
Agreed - an Etch-a-Sketch that anyone can shake isn't nearly so much fun as having one all to yourself, and only if people haunt the sandbox and point out problems to people is it of any use to have a shared one (that I can see).
Discussion going on at:
http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Deletion...
I believe (but could be wrong) that the sandbox gets "cleaned" right away when speedy deletion tags are added, so unless something gets reprogrammed, that would not work.
On Jan 16, 2008 7:23 PM, Elias Friedman elipongo@gmail.com wrote:
> Discussion going on at:
> http://en.wikipedia.org/wiki/Wikipedia:Village_pump_%28technical%29#Deletion...
On Jan 17, 2008 10:53 AM, Nathan nawrich@gmail.com wrote:
> Agreed - an Etch-a-Sketch that anyone can shake isn't nearly so much fun as having one all to yourself, and only if people haunt the sandbox and point out problems to people is it of any use to have a shared one (that I can see).
That's nice for people with accounts; the "official" sandbox is useful in a try-before-you-buy sense.
Steve Summit wrote:
> I disagree. I found the official sandbox perfectly useless when I started editing Wikipedia, and that was several years ago. [...]
Of course it's useless, but we need to keep it as a test to figure out if they're smart enough to build a sandbox of their own where they can say, "Hello World."
Ec
On Wed, 2008-01-16 at 17:51 -0800, Ray Saintonge wrote:
> Of course it's useless, but we need to keep it as a test to figure out if they're smart enough to build a sandbox of their own where they can say, "Hello World."
I think the idea of limiting the number of revisions saved is fabulous, and whoever can do this (developers?) should really get working on it. Has anybody suggested it elsewhere yet?
Ian [[User:Poeloq]]
On 16/01/2008, Steve Summit scs@eskimo.com wrote:
> I disagree. I found the official sandbox perfectly useless when I started editing Wikipedia, and that was several years ago. [...]
Lots of editors set up their own sandbox, but the current sandbox is still used by n00bs. I suppose a "set up your own" link would be a nice thing to put there, of course.
- d.
On 1/16/08, David Gerard dgerard@gmail.com wrote:
> Delete and recreate? Sounds good. It's not like it's a valuable historical record. Or, move and recreate, then delete at leisure...
Somebody once told me the number of incoming links (which must change color) also factors into the amount of disruption when a page is deleted. Is this true or would the latter issue be (calmly) handled by the job queueueue?
—C.W.
I agree. What is the reason to keep old sandbox edit history?
On Jan 16, 2008 6:10 PM, Josh Gordon user.jpgordon@gmail.com wrote:
> I'm thinking the sandbox should be completely wiped every night to prevent exactly this.
On Wed, Jan 16, 2008 at 03:10:03PM -0800, Josh Gordon wrote:
> I'm thinking the sandbox should be completely wiped every night to prevent exactly this.
Whose night? Wikipedia never sleeps.
On Thu, 2008-01-17 at 14:25 +1100, Brian Salter-Duke wrote:
> Whose night? Wikipedia never sleeps.
Good point, but I think you will find that it doesn't really matter when it gets wiped. Do we even need revisions at all, apart from, say, 2 or 3 for the purpose of feature demonstration? However, if you are really asking for a time: wiping at 0:00 UTC would make the most sense...
Ian [[User:Poeloq]]
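As a concrete illustration of the timing point (purely a sketch; the thread never specifies an implementation, and the function name is invented), computing the next 0:00 UTC wipe time is trivial:

```python
from datetime import datetime, timedelta, timezone

def next_wipe_time(now: datetime) -> datetime:
    """Return the next 00:00 UTC instant after `now` (the proposed wipe time)."""
    now_utc = now.astimezone(timezone.utc)
    midnight = now_utc.replace(hour=0, minute=0, second=0, microsecond=0)
    return midnight + timedelta(days=1)

print(next_wipe_time(datetime(2008, 1, 16, 23, 10, tzinfo=timezone.utc)))
# → 2008-01-17 00:00:00+00:00
```

Since the servers keep time in UTC anyway, any fixed instant works equally well; 0:00 UTC is just the conventional choice.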
On Jan 16, 2008 9:42 PM, Ian A Holton poeloq@gmail.com wrote:
> Good point, but I think you will find that it doesn't really matter when it gets wiped. [...] Wiping at 0:00 UTC would make the most sense...
Adding special cases to the code to restrict the number of revisions that a particular page will save...messy and possibly difficult.
Adding a bot to move a page, create a new version of the original page, and add a speedy tag to the moved page...relatively clean, not too hard to program, and keeps the main code pristine.
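The bot option could be sketched as below. This is a hypothetical illustration only: the `Wiki` class and its `move`/`create`/`prepend` methods are stand-ins for whatever bot framework gets used, not the real MediaWiki API, and the template names are assumptions.

```python
# Sketch of the proposed sandbox-rotation bot: move the sandbox aside,
# recreate a fresh copy, and tag the moved page for speedy deletion.

class Wiki:
    """Stand-in for a bot framework's wiki interface (not the MediaWiki API)."""
    def __init__(self):
        self.pages = {}  # title -> wikitext
    def move(self, src, dst):
        self.pages[dst] = self.pages.pop(src)  # history travels with the move
    def create(self, title, text):
        self.pages[title] = text
    def prepend(self, title, text):
        self.pages[title] = text + self.pages[title]

def rotate_sandbox(wiki, date_stamp):
    trash = "Wikipedia:Sandbox/Trash-%s" % date_stamp
    wiki.move("Wikipedia:Sandbox", trash)
    wiki.create("Wikipedia:Sandbox",
                "{{Please leave this line alone (sandbox heading)}}\n")
    # Tag the moved copy; a human admin deletes it at leisure, so the bot
    # itself never needs the sysop bit.
    wiki.prepend(trash, "{{db-g6|Routine sandbox rotation}}\n")

wiki = Wiki()
wiki.create("Wikipedia:Sandbox", "test edits...")
rotate_sandbox(wiki, "2008-01-17")
```

The key property is that none of the three actions (move, create, edit) is admin-only.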
On Wed, 2008-01-16 at 22:19 -0600, Rich Holton wrote:
> Adding a bot to move a page, create a new version of the original page, and add a speedy tag to the moved page...relatively clean, not too hard to program, and keeps the main code pristine.
True, a bot would be better. It wouldn't need admin rights, would it?
Ian [[User:Poeloq]]
To move, create, and tag for speedy, the bot should *not* need the sysop bit.
Rjd0060 wrote:
> To move, create, and tag for speedy, the bot should *not* need the sysop bit.
What's wrong with giving bots sysop access? Are you worried they might rise up and overthrow the human sysops?
-- Tim Starling
On Jan 17, 2008 7:39 PM, Tim Starling tstarling@wikimedia.org wrote:
> What's wrong with giving bots sysop access? Are you worried they might rise up and overthrow the human sysops?
Damage control on programming errors, mostly.
George Herbert wrote:
> Damage control on programming errors, mostly.
Sounds like nonsense to me. It's not like sysops can do anything irreversible. You're just wasting the time of human sysops by making them do jobs which bots could do.
-- Tim Starling
On Fri, 2008-01-18 at 14:48 +1100, Tim Starling wrote:
> Sounds like nonsense to me. It's not like sysops can do anything irreversible. You're just wasting the time of human sysops by making them do jobs which bots could do.
Tim, a bot that has admin rights and could potentially move wrong pages to wrong locations if it all goes bad is creating more work for humans, not less. It's just a rule I have: limit the power you give the machines. But maybe that's just me watching too much of The Matrix ;-)
Ian [[User:Poeloq]]
On Jan 17, 2008 7:48 PM, Tim Starling tstarling@wikimedia.org wrote:
> Sounds like nonsense to me. It's not like sysops can do anything irreversible. You're just wasting the time of human sysops by making them do jobs which bots could do.
Take this up with the en.wp bot auth group.
There are bots with the admin bit set, run by a few people, but it's a small subset of the total (and, predictably, where some of the really heinous problems came from).
While every action is basically reversible, not all actions are practically reversible. While it's unlikely that an admin bot would be doing stuff which could accidentally lead to some of the nightmare bot vandalism attack stuff I've gamed out playing red team, I prefer caution, as do those running en.wp these days.
George Herbert wrote:
> While every action is basically reversible, not all actions are practically reversible. [...] I prefer caution, as do those running en.wp these days.
A properly written bot would be safer than a human. Automation is a good way to avoid common mistakes. Like when a vandal moves a large article to a suspicious title and replaces the contents with nonsense, then tags it for speedy deletion. A human deletes the page and wonders why the servers crash. A bot could check the page move log.
Yes, I know the human could check the page move log too. But computers can process data much faster than humans can, and they don't get bored or lazy. So they can run lots of complex checks every single time and never risk an error due to a moment of inattention.
-- Tim Starling
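Tim's pre-deletion check might look like the sketch below; the log format, field names, and revision threshold are invented for illustration, not taken from MediaWiki.

```python
# Illustrative safety check an admin bot could run before every deletion:
# refuse pages that recently arrived at their current title via a move
# (possible vandal move-and-blank), and refuse pages with huge histories.

def safe_to_delete(title, move_log, revision_count, max_revisions=5000):
    recently_moved_here = any(entry["to"] == title for entry in move_log)
    return not recently_moved_here and revision_count <= max_revisions

move_log = [{"from": "Some large article", "to": "Some nonsense title"}]
print(safe_to_delete("Some nonsense title", move_log, revision_count=40000))  # → False
print(safe_to_delete("Obvious new junk page", move_log, revision_count=3))    # → True
```

The point is not these particular checks but that a bot can afford to run all of them on every single action, which a bored human will not.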
On Jan 17, 2008 8:33 PM, Tim Starling tstarling@wikimedia.org wrote:
> A properly written bot would be safer than a human. Automation is a good way to avoid common mistakes. [...] A bot could check the page move log.
I am a large fan of system administrator automation tools... cfengine, kickstart and jumpstart, (insert long list here). I write scripts and tools at (insertdayjob).
In my observation, the WP interface and how bots are being coded to do stuff lead to more errors than I like to see.
If the bots were as reliable as cfengine, or my disk management scripts and patch audit scripts have been, then there would be no problem.
I mean there is no reason it needs the bit to do those things. I am all for giving a bot sysop access if it needs it and goes through RFA. This bot, to perform these functions, does not need the bit, so why give it one?
On 18/01/2008, Tim Starling tstarling@wikimedia.org wrote:
> What's wrong with giving bots sysop access? Are you worried they might rise up and overthrow the human sysops?
More or less. There's lots of paranoia on en:wp about admin bots going batshit in sorcerer's apprentice mode. Though I don't think it's warranted, as *anything* an admin can do is easily reversible except history merges. (Making those *easily* reversible is one for the wishlist.)
- d.
On 18/01/2008, David Gerard dgerard@gmail.com wrote:
> More or less. There's lots of paranoia on en:wp about admin bots going batshit in sorcerer's apprentice mode. [...]
But that's not true when bots are involved. A human can only screw up at roughly the same speed as another human can fix it, so it's not a big deal, but a bot can screw up a million times in a few minutes - that's not practically reversible without using another bot to undo it all, which takes a lot of preparation (the bot needs to be written, tested to make sure it's not going to screw things up even more, and approved - that's likely to take a day or so at least).
Personally, I wouldn't object to open source admin bots ("With enough eyes, all bugs are shallow", or whatever the quote is), but closed source ones are too likely to go wrong and are thus too risky (the chance of them going wrong is still quite small, but the potential damage is enormous, so the risk is still high). Also, an open source bot can probably be modified by any programmer to fix its own mistakes quite easily; doing that with a closed source bot requires the author. (So a closed source, supervised bot wouldn't be so bad, but I'd still rather not have them.)
There is also the fact that an admin bot account can be compromised. I think it would be easier for an admin to run the bot under their own admin account: one less account to worry about, and it's not like you need a bot flag for one delete a day.
Chris
Are closed source bots prevalent? Isn't part of the BRFA process evaluation of the underlying code? Any admin bot should probably be relatively slow, and make up for the slowness with long periods of uptime. Some of the paranoia is a bit farfetched: it shouldn't be incredibly difficult to get well designed bots that don't screw up, and that notice when they do. It might be exceptional among bots, but it should still be possible. Bot RfAs have been doomed from the outset recently, because most of the !voters don't have the technical skills to evaluate whether or not it's well designed (myself included).
Sticking to the issue of a particular bot that would move [[Wikipedia:Sandbox]] to [[Wikipedia:Sandbox/Trash]] and tag it for speedy deletion - or, if we really want to keep the history, move it to numbered or dated archives - and then create a new page: we don't need it to have a sysop bit. The one extra speedy a day is well within the administrators' work capacity.
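A minimal sketch of the proposed rotation, to make the steps concrete. The archive naming, the `client` object, and its `move`/`save` calls are all hypothetical (no real bot framework is assumed); only the title helper is concrete, and the speedy tag stands in for whatever deletion tag is appropriate:

```python
from datetime import date

SANDBOX = "Wikipedia:Sandbox"

def archive_title(day: date) -> str:
    """Dated archive title for the old sandbox history, e.g.
    'Wikipedia:Sandbox/Archive/2008-01-19' (naming is illustrative)."""
    return "%s/Archive/%s" % (SANDBOX, day.isoformat())

def rotate_sandbox(client, day: date) -> str:
    """Outline of the daily rotation against a hypothetical `client`.
    Note: no sysop bit needed if a human admin handles the deletion."""
    target = archive_title(day)
    client.move(SANDBOX, target)                 # park the old history
    client.save(target, "{{db-g6}}")             # tag the archive for speedy
    client.save(SANDBOX, "{{Sandbox heading}}")  # recreate a fresh sandbox
    return target
```

The point of the sketch is that every step here is an ordinary editor action; only the final deletion of the tagged archive needs an admin.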
On Jan 19, 2008 2:11 AM, Martijn Hoekstra martijnhoekstra@gmail.com wrote:
Sticking to the issue of a particular bot that would move [[Wikipedia:Sandbox]] to [[Wikipedia:Sandbox/Trash]], and tag is for speedy deletion, or if we really want to maintain it, move it to numbered or dated archives, and then create a new page, we don't need it to have a sysop bit. The one extra speedy a day is well within the administrators work capacity.
Note that the sandbox is move-protected, and has been so for a very long time. Not that that can't be questioned, but it's something to consider, since if it remains the status quo any bot would need sysop in order to be able to move it (and presumably move-protect the new one each day).
Good point.
Is it a good idea to split off a new discussion from this thread about the proposal to limit sandbox history in this way, or maybe set up some discussion at the village pump?
Yes, I was thinking this should be moved to the VP also.
I'm usually not any good at formulating those. Anyone willing to step up to propose something?
Martijn Hoekstra wrote:
Is it a good idea to split of a new discussion from this thread about the proposal to limit sandbox history in this way, or maybe set up some discussion at the village pump?
Perhaps it would be a good idea to have a special class of users with SANDBOX rights for this. Admins could set up committees to determine who is eligible for these rights. ;-)
Ec
On the idea of deleting the sandbox on a regular basis, two thoughts:
1. Firstly, would it not stand to reason that the huge process involved with deleting the sandbox would still have to be run, even if the history wasn't viewable to non-administrators? I'm unsure of the MediaWiki database layout, but if there are separate SQL tables for the content of a revision and its required permissions for viewing (public, sysop., etc...), then technically deleting the sandbox each night would be pointless: revisions deleted by administrators are simply hidden from public view, rather than actually dropped from the database. Then again, I may just be raving: I don't work with MW's internal operations that much.

2. Secondly, and if my thoughts above are just plain nonsense, then would it be useful to have a sysopped bot account (à la the RedirectDeletion bot currently running) to perform this function? It would enable us to get along with the job of actually writing the encyclopedia, rather than tying the community up with yet another mundane-but-necessary job.
Anthony
User:AGK en.wikipedia.org
In true too-impatient-to-read-the-entire-thread style, I've missed the fact that somebody has already brainstormed the idea of an admin. bot to delete the sandbox.
Ignore my last :)
AGK
I'm sure some of the keen programmers around would like to see the code for any such sysop bot that hits BRFA, just to look for any open errors or programming holes. And for the bots that do go wrong, we always have access to the tools we need to stop them.
Another idea is to have a wiki page that holds the bot's controls, fully protected so that admins can start and stop the bot whenever a problem occurs. E.g. BotName looks at [[User:BotName/controls]] and sees that the param in the edit box is "botstatus=on;", and then continues its duties at the sandbox. If it sees "botstatus=off;", it kills the process altogether and waits a certain period before trying again.
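The control-page check described above could be sketched like this. The page name `[[User:BotName/controls]]` and the `fetch` call are hypothetical; the parsing itself is concrete, and it deliberately fails safe (anything other than an explicit "on", including a blanked or vandalized page, stops the bot):

```python
import re

def bot_enabled(control_page_text: str) -> bool:
    """Parse the hypothetical [[User:BotName/controls]] page text and
    report whether the bot should keep running. Only 'botstatus=on;'
    enables it; everything else disables it, failing safe."""
    m = re.search(r"botstatus\s*=\s*(\w+)\s*;", control_page_text)
    return bool(m) and m.group(1).lower() == "on"

# In the bot's main loop (fetching the page is left to the bot framework):
# if not bot_enabled(fetch("User:BotName/controls")):
#     raise SystemExit("disabled via control page")
```

Full protection on the control page means any admin can flip the switch, without needing shell access to wherever the bot runs.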
I've seen it around, just cannot remember where I found it ;)
- E
-------------------------------------------------- From: "Nathan" nawrich@gmail.com Sent: Saturday, January 19, 2008 12:25 AM To: "English Wikipedia" wikien-l@lists.wikimedia.org Subject: Re: [WikiEN-l] Servers down?
Somebody once told me the number of incoming links (which must change color) also factors into the amount of disruption when a page is deleted. Is this true or would the latter issue be (calmly) handled by the job queueueue?
If the bot were to move the page first and then delete it, this should not happen.
-Chris
Again?
This does get tiring, especially when it's been >20 times this year. I suspect we're all doomed to the eternal *sigh* - this problem doesn't seem to be improving, especially in Europe.
Anthony
User:AGK en.wikipedia.org
ClueBot and VoAbot do this - I assume you mean that sort of style?
Stwalkerster
On 17/01/2008, Brian Salter-Duke b_duke@bigpond.net.au wrote:
On Wed, Jan 16, 2008 at 03:10:03PM -0800, Josh Gordon wrote:
I'm thinking the sandbox should be completely wiped every night to prevent exactly this.
Whose night? Wikipedia never sleeps.
Look at a week's worth of edits to the page. Pick the least popular hour. Simple...
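Picking the "least popular hour" from a week of edits is simple to sketch (timestamps are assumed to be datetime objects, e.g. parsed out of the page's edit history; the helper name is mine):

```python
from collections import Counter
from datetime import datetime

def least_popular_hour(edit_times):
    """Given timestamps for a week's worth of sandbox edits, return the
    UTC hour (0-23) with the fewest edits - the natural slot for a daily
    wipe. Hours with zero edits win outright; ties go to the earliest hour."""
    counts = Counter(t.hour for t in edit_times)
    return min(range(24), key=lambda h: counts.get(h, 0))
```

Scanning `range(24)` rather than just the observed hours matters: an hour with no edits at all should be the first choice, and it would never appear in the counter.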
Elias Friedman wrote:
Got onto IRC, I guess someone posted a virus to the sandbox and an admin tried to delete it. I guess that kind of annoys the servers when something with as huge a revision history as the sandbox gets deleted.
While there is a reason to maintain the revision history for most articles, is there any point to keeping a long revision history for the sandbox?
Ec
Sandbox was deleted.