I see-- and what treatment are you giving to towns which exist in violation of the various UN and US-Israel agreements over the years?
I've no idea whether they're in Adam's data or not, but if they _exist_, they _exist_ and this fact should be noted.
It's only been in the past couple of months that Israel has bent to international pressure to dismantle a settlement -- how are you treating these? Are they registered as non-existent? Are you adding the former Arabic names of towns, recently and not-so-recently incorporated into Israel?
Steve, do you *HAVE THIS DATA*?
If you *HAVE IT*, you can add it.
I have it! I have it! It's from http://www.allthatremains.com/ and I've added it to http://www.wikipedia.org/wiki/List_of_destroyed_villages_during_the_1948_Ara... But there was so much of it that adding it by hand wasn't feasible, so could I be assisted by a bot? Compare http://www.wikipedia.org/wiki/Iqrit and http://www.allthatremains.com/Acre/Iqrit/index.html for an example. If so, do I have to program the bot myself, or can I lease one from someone?
BL
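For what it's worth, the formatting step such a bot would need is simple. Here is a minimal sketch in Python; the field names and example values are illustrative placeholders, not sourced figures, and actually posting the result to Wikipedia would be a separate step:

```python
# Sketch of turning one destroyed-village record into a wiki stub.
# Field names and the example values below are illustrative, not sourced.

def village_stub(name, district, population_1945, depopulated):
    """Format one village record as a short piece of wiki text."""
    return (
        "'''%s''' was a Palestinian Arab village in the %s district. "
        "It had a population of %s in 1945 and was depopulated in %s."
        % (name, district, population_1945, depopulated)
    )

print(village_stub("Iqrit", "Acre", "490", "1948"))
```

A real bot would loop this over every record in the data set and submit each stub as a new page.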
The RamBot was a bad idea, and the IsraelBot is too. Wikipedia articles should be generated individually, not en masse.
A semi-reasonable compromise would be to tag the auto-generated entries so that they didn't show up in the site statistics, etc. That is, they'd act as preformatted resources for editors who wanted to create an entry on a town, but wouldn't be considered real entries.
--tc
The Cunctator wrote:
The RamBot was a bad idea, and the IsraelBot is too. Wikipedia articles should be generated individually, not en masse.
A semi-reasonable compromise would be to tag the auto-generated entries so that they didn't show up in the site statistics, etc. That is, they'd act as preformatted resources for editors who wanted to create an entry on a town, but wouldn't be considered real entries.
--tc
Your argument makes no sense. There are certain things simply taken as facts that should be included in the Wikipedia. To wait for someone to enter all these things by hand would take forever. A bot can enter the data much more quickly and with a consistent format for easy reading. Sure, someone should go back and look at all these bot entries to add more information they might have from personal experience, but to not let bots seed the 'pedia with factual stubs first would lessen Wikipedia's content.
As long as the information in a bot's stubs is factual, why not count it in the site statistics?
John wrote:
The Cunctator wrote:
The RamBot was a bad idea, and the IsraelBot is too. Wikipedia articles should be generated individually, not en masse.
A semi-reasonable compromise would be to tag the auto-generated entries so that they didn't show up in the site statistics, etc. That is, they'd act as preformatted resources for editors who wanted to create an entry on a town, but wouldn't be considered real entries.
--tc
Your argument makes no sense. There are certain things simply taken as facts that should be included in the Wikipedia. To wait for someone to enter all these things by hand would take forever. A bot can enter the data much more quickly and with a consistent format for easy reading. Sure, someone should go back and look at all these bot entries to add more information they might have from personal experience, but to not let bots seed the 'pedia with factual stubs first would lessen Wikipedia's content.
As long as the information in a bot's stubs is factual, why not count it in the site statistics?
Your argument, sir, makes as much sense as mine. The need for Wikipedia seeding is highly debatable; and the advisability of doing it even more debatable, as the more that bots are used, the less likely it is that someone will go back and look at all those bot entries.
There should be a balance between encyclopedia size and contributor participation--the best way to ensure that is to not auto-generate entries.
My suggestion was to let bot-seeding happen, but for those seeds to remain "hidden" until someone goes and looks for the information. Take, for example, [[Leominster, Massachusetts]].
My proposal would be as follows:
1. If you went to [[Johnny Appleseed]], for example, the link to Leominster would be in new-page or stub-link format (? or !), not a standard link, until someone actually made a change other than the Rambot entry.
2. If you click to that link, you see the Rambot content.
3. If you search for Leominster, the Rambot entry would show up in the listing (perhaps tagged).
4. The page wouldn't count in the statistics until someone's made at least one edit to the entry.
That way you get the benefit of the content-seeding without the distortion of the size or intent of Wikipedia, even if hundreds of thousands of such seedings are done.
There's also the side argument that all Rambot did was take information that should be in tables and graphs and put it into paragraph form to make it seem like Wikipedia style. It probably would have been better just to supply the tables and graphs for eventual inclusion in real entries about the towns.
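Point 4 of the proposal amounts to a one-line rule. A minimal sketch in Python, where the revision list and the "Rambot" author name are illustrative assumptions about how an edit history might be represented:

```python
# Sketch of the proposed counting rule: a seeded page counts as a
# real entry only once someone other than the seeding bot edits it.
# The revision-dict shape used here is an assumption for illustration.

def counts_in_statistics(revisions, bot_name="Rambot"):
    """Return True if any revision was made by a non-bot author."""
    return any(rev["author"] != bot_name for rev in revisions)

seeded_only = [{"author": "Rambot"}]
touched = [{"author": "Rambot"}, {"author": "Alice"}]
print(counts_in_statistics(seeded_only))  # False
print(counts_in_statistics(touched))      # True
```

The same predicate could drive the link styling in point 1 and the search tagging in point 3.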
Having kept a strict distance from 172's edits for a long time, trying to avoid an edit war, I believe it will be possible to work with him on some subjects. I still think that 172 is very POVed. That's not a problem in itself, but 172 is still systematically reverting edits to his work, and it seems it will still be very hard. I'm an amateur while 172 is a professional; he's a native English speaker, and my skills in English are limited. Even so, I still think it's not acceptable to write that the Khmer Rouge were US-backed, for instance, even if the US administration did not actively oppose them. I invite everyone to have a look at the Cold War article.
Ericd
--- The Cunctator cunctator@kband.com wrote:
Your argument, sir, makes as much sense as mine. The need for Wikipedia seeding is highly debatable; and the advisability of doing it even more debatable, as the more that bots are used, the less likely it is that someone will go back and look at all those bot entries.
There should be a balance between encyclopedia size and contributor participation--the best way to ensure that is to not auto-generate entries.
My suggestion was to let bot-seeding happen, but for those seeds to remain "hidden" until someone goes and looks for the information. Take, for example, [[Leominster, Massachusetts]].
My proposal would be as follows:
1. If you went to [[Johnny Appleseed]], for example, the link to Leominster would be in new-page or stub-link format (? or !), not a standard link, until someone actually made a change other than the Rambot entry.
2. If you click to that link, you see the Rambot content.
3. If you search for Leominster, the Rambot entry would show up in the listing (perhaps tagged).
4. The page wouldn't count in the statistics until someone's made at least one edit to the entry.
That way you get the benefit of the content-seeding without the distortion of the size or intent of Wikipedia, even if hundreds of thousands of such seedings are done.
There's also the side argument that all Rambot did was take information that should be in tables and graphs and put it into paragraph form to make it seem like Wikipedia style. It probably would have been better just to supply the tables and graphs for eventual inclusion in real entries about the towns.
Sounds like a good idea, but I think that should apply to all articles that have only been edited by one person. We shouldn't single out Rambot.
The other thing is that this (your idea or my variant, either one) would make the article count jump down by at least 25,000. This is very significant. LDan
__________________________________ Do you Yahoo!? Yahoo! SiteBuilder - Free, easy-to-use web site design software http://sitebuilder.yahoo.com
--- John Flockmeal@cox.net wrote:
There are certain things simply taken as facts that should be included in the Wikipedia. To wait for someone to enter all these things by hand would take forever. A bot can enter the data much more quickly and with a consistent format for easy reading. Sure, someone should go back and look at all these bot entries to add more information they might have from personal experience, but to not let bots seed the 'pedia with factual stubs first would lessen Wikipedia's content.
I agree that bots can serve a good purpose.
However, the problem I see is that they can insert a lot of trivial information into the W. (For example, who needs the *exact* number of residents in any particular town?)
I think bots can be used to do busywork (for this is why computer programs, and computers, were created in the first place), but they must mark their passage and make it easy to re-bot the info. For example, they might segregate the botted info like <bot id="120">some data in XHTML or XML</bot>
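A sketch of how a bot might later find and refresh its own tagged block under that convention; this is illustrative, not an existing tool:

```python
import re

# Sketch of "re-botting": replace only the contents of this bot's
# tagged block, leaving hand-written text outside the tags untouched.
# The <bot id="..."> convention is the one proposed above.

def rebot(page_text, bot_id, new_data):
    """Swap in fresh data between this bot's open and close tags."""
    pattern = re.compile(
        r'(<bot id="%s">).*?(</bot>)' % re.escape(bot_id), re.DOTALL
    )
    return pattern.sub(lambda m: m.group(1) + new_data + m.group(2), page_text)

page = 'Hand-written intro. <bot id="120">old figures</bot> Hand-written outro.'
print(rebot(page, "120", "updated figures"))
```

Because human edits outside the tags survive a re-run, the bot could refresh its data (say, after a new census) without reverting anyone's prose.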
===== Christopher Mahan chris_mahan@yahoo.com 818.943.1850 cell http://www.christophermahan.com/
I agree that bots can serve a good purpose.
However, the problem I see is that they can insert a lot of trivial information into the W. (For example, who needs the *exact* number of residents in any particular town?)
Well, should they round it? If so, why should they? Is there anything inherently wrong with knowing precise statistics as opposed to approximate ones? Will readers' brains be overloaded with information?
I think bots can be used to do busywork (for this is why computer programs, and computers, were created in the first place), but they must mark their passage and make it easy to re-bot the info. For example, they might segregate the botted info like
<bot id="120">some data in XHTML or XML</bot>
That would be nice, but it would be kinda useless after Rambot. LDan
--- The Cunctator cunctator@kband.com wrote:
The RamBot was a bad idea, and the IsraelBot is too. Wikipedia articles should be generated individually, not en masse.
A semi-reasonable compromise would be to tag the auto-generated entries so that they didn't show up in the site statistics, etc. That is, the'd act as preformatted resources for editors who wanted to create an entry on a town, but wouldn't be considered real entries.
Well, I understand this view -- how can any stats be non-political? Does this mean that we accept them outright? I think that the RamBot was a great idea. A pain in the butt on recent changes, but....
To have Native Americans argue about US cities is one thing altogether-- those disputes were "settled" a long time ago.
~S~