Couple thoughts:
1. The ORES platform (ores.wikimedia.org) was designed to host a wide range
of machine learning models, not just the ones built by Aaron Halfaker himself.
So, if there is a computer scientist out there who is interested in
training and maintaining a new bot-detection model, it can be hosted on and
surfaced through ORES. Then anyone with some bot- or web-development skills
can build tools on top of that model. Noting this because that's one of the
main points of having a "scoring platform": it separates the (necessarily
WMF-led) work of production platform development from the development of
purpose-built tools.
2. If anyone knows a computer scientist who is interested in developing and
piloting a model like this, please send them our way. Members of the
Research team, or Aaron, *may* have capacity to support a formal
collaboration.
3. This seems way too complex for a GSOC project to me, but I'd love to be
wrong about that. If there are students who are interested in working on
this, please send them our way (no promises, obvs).
4. Modifying the charter of an existing WMF product team seems somewhat out
of scope for this ask, task, and venue. :)
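To make point 1 concrete, here is a minimal sketch of how a purpose-built tool could request scores from the ORES v3 scores API. A bot-detection model is hypothetical at this point, so the example uses a model ORES actually hosts ("damaging"); the helper name `score_url` is my own.

```python
# Sketch: building a request URL for the ORES v3 scores API, which a
# purpose-built tool would fetch and parse. Model names are examples;
# a bot-detection model would slot in the same way once hosted.
from urllib.parse import urlencode

ORES_BASE = "https://ores.wikimedia.org/v3/scores"

def score_url(wiki, rev_ids, models):
    """Build an ORES request URL scoring the given revisions with the given models."""
    query = urlencode({
        "models": "|".join(models),
        "revids": "|".join(str(r) for r in rev_ids),
    })
    return f"{ORES_BASE}/{wiki}/?{query}"

# A tool would fetch this URL (e.g. with requests.get) and read the
# per-revision probabilities out of the returned JSON.
print(score_url("enwiki", [123456], ["damaging"]))
```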
- J
On Mon, Feb 11, 2019 at 2:19 PM Pine W <wiki.pine(a)gmail.com> wrote:
Thanks for the replies.
I think that detailed discussion of the pros and cons of the Tech Wishlist
should be separate from this thread, but I agree that one way to get a
subject like unflagged bot detection addressed could be through the Tech
Wishlist assuming that WMF is willing to devote resources to that topic if
it ranked in the top X places.
It sounds like there are a few different ways that work in this area could
be resourced:
1. As mentioned above, making it a Tech Wishlist item and having
Community Tech work on it;
2. Having the Anti-Harassment Tools team work on it;
3. Having the Security team work on it;
4. Having the ORES team work on it;
5. Funding work through a WMF grants program;
6. Funding through a mentorship program like GSOC. I believe that GSOC
previously supported work on CAPTCHA improvements.
Of the above options I suggest first considering 2 and 4. Having AHAT staff
work on unflagged bot detection might be scope creep under the existing
AHAT charter, but perhaps AHAT's charter could be modified into something
resembling the charter of an "Administrators' Tools Team". And if the ORES
team has already done some work on unflagged bot detection, then perhaps
ORES and AHAT staff could collaborate on this topic.
In the first half of the next WMF fiscal year, I think that planning for an
existing WMF team or combination of staff from existing teams to work on
unflagged bot detection would be good. If WMF does not resource this topic
and community members want unflagged bot detection to be resourced, we can
consider other options such as 1 and 5.
Pine
(https://meta.wikimedia.org/wiki/User:Pine)
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
--
Jonathan T. Morgan
Senior Design Researcher
Wikimedia Foundation
User:Jmorgan (WMF) <https://meta.wikimedia.org/wiki/User:Jmorgan_(WMF)>