I don't see anything described in [[Manual:Bots]] that would actually
help you detect bot editing traffic.
&
Yeah, I don't see anything at [[Manual:Bots]] that mentions a user-agent convention for bots that edit Wikipedia.
&
I suppose this page was linked because from there one can click [[API:Client code]] > https://www.mediawiki.org/wiki/API:Etiquette#User-Agent_header > https://meta.wikimedia.org/wiki/User-Agent_policy, which is the convention mentioned.
Sorry, I pasted the wrong link; the correct one is:
https://en.wikipedia.org/wiki/Wikipedia:Bot_policy
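For anyone following along, the linked User-Agent policy asks clients to send an identifiable name, version, and contact information. A minimal sketch of what such a header might look like (the bot name, URL, and email below are hypothetical placeholders, not real values from the policy):

```python
def build_user_agent(name: str, version: str, url: str, email: str) -> str:
    """Build a User-Agent string in the general shape suggested by the
    Wikimedia User-Agent policy: ClientName/version (url; email)."""
    return f"{name}/{version} ({url}; {email})"

# Hypothetical bot identity used purely for illustration.
ua = build_user_agent(
    "ExampleWikiBot", "1.0",
    "https://example.org/examplewikibot", "examplewikibot@example.org",
)
print(ua)  # ExampleWikiBot/1.0 (https://example.org/examplewikibot; examplewikibot@example.org)
```

The header would then be sent with every API request, e.g. as the `User-Agent` field of an HTTP request to `https://en.wikipedia.org/w/api.php`.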
---
I wonder if Marcel means "crawlers".
Toby, is that what you mean when referring to spiders? Yes, I think they are equivalent terms. Do you think we should change the naming there?
---
Why not just "Bot", or "MediaWikiBot", which at least encompasses all sites that the client
can communicate with?
I personally agree with you; "MediaWikiBot" seems to have better semantics.
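One practical upside of a shared token like "MediaWikiBot" is that server-side detection of bot traffic (the question at the top of this thread) reduces to a substring check on the User-Agent header. A sketch, assuming the token proposed here were adopted (nothing in the current policy mandates it):

```python
# Hypothetical shared token; this is the proposal under discussion,
# not an established convention.
BOT_TOKEN = "MediaWikiBot"

def is_bot_request(user_agent: str) -> bool:
    """Classify a request as bot traffic if its User-Agent
    carries the shared bot token."""
    return BOT_TOKEN in user_agent

print(is_bot_request("ExampleWikiBot/1.0 MediaWikiBot/1.0 (https://example.org/bot)"))  # True
print(is_bot_request("Mozilla/5.0 (X11; Linux x86_64)"))  # False
```

Of course this only catches cooperating clients; it does nothing against crawlers that ignore the convention.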
---
So this email is not meant to advertise the convention, right? Because the audience of this mailing list certainly doesn't include crawler operators.
No, it's not. It is to reach consensus on the convention and identify things that we can do to improve its application. Thanks for pointing that out, it was unclear in the initial email.