On Thu, Dec 13, 2018, 7:08 AM Bowen Yu bowen-yu@umn.edu wrote:

Hello,
ORES has been deployed and serving the Wikipedia community for a while now, for purposes such as counter-vandalism. Having seen the wide usage and effectiveness of ORES in the community, we'd like to continue working on its development. We plan to improve and redesign ORES algorithms by incorporating feedback from all of the stakeholders involved in the ORES ecosystem, such as ORES application developers and ORES application operators. We want to understand their concerns and values, and come up with effective algorithmic designs that balance trade-offs and mitigate potential conflicts between interests (such as edit quality control vs. newcomer protection) to further improve ORES performance.
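To make that trade-off concrete: a counter-vandalism tool typically asks ORES for per-edit scores from two models, one aimed at edit quality control ("damaging") and one that speaks more to newcomer protection ("goodfaith"). The Python sketch below is purely illustrative and not part of the proposal; it assumes the public v3 scoring endpoint and the English Wikipedia model names, the revision ID is made up, and the response shape is written from memory, so check it against the live API before relying on it.

import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki"

def score_edit(rev_id):
    """Fetch damaging and goodfaith probabilities for a single revision."""
    resp = requests.get(ORES_URL, params={
        "models": "damaging|goodfaith",
        "revids": rev_id,
    })
    resp.raise_for_status()
    scores = resp.json()["enwiki"]["scores"][str(rev_id)]
    return {
        # How likely the edit is to be damaging (edit quality control).
        "damaging": scores["damaging"]["score"]["probability"]["true"],
        # How likely the edit was made in good faith (newcomer protection).
        "goodfaith": scores["goodfaith"]["score"]["probability"]["true"],
    }

# An edit that looks damaging but was probably made in good faith is exactly
# where the trade-off between the two values shows up for patrollers.
print(score_edit(871234567))  # made-up revision ID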
We will work with Aaron Halfaker and his team to improve ORES's quality control models and identify their limitations. Here is the project proposal on Meta-Wiki: https://meta.wikimedia.org/wiki/Research:Applying_Value-Sensitive_Algorithm_Design_to_ORES. If you are interested or have any thoughts, please feel free to reach out to me. Thanks!
On Thu, 13 Dec 2018 at 16:34, Pine W wiki.pine@gmail.com wrote:

Hi Bowen, after reading your project proposal I have a few questions and concerns.
You mention a perceived tension between protecting newcomers and protecting the quality of content. I am wondering whether that is a false dichotomy. In my experience, test edits and blatant vandalism usually look different from mistakes made by good-faith editors.
There is a feature that allows users to adjust ORES-supported edit scoring in our watchlists and Recent Changes: https://www.mediawiki.org/wiki/Edit_Review_Improvements/New_filters_for_edit.... Have you tested this feature? How would your research be useful for that feature's future development?
I think that ORES is supposed to aid human judgment, not to substitute for human judgment. How certain are you that "ORES applications will play a role in drawing a line between acceptable freestyle edits and editing policies in standard"? There may well be some human patrollers who adjust their definitions of vandalism based on ORES recommendations, but I think that you would want to know to what extent ORES has that effect.
I would also like to mention that Wikipedia policies and guidelines, like offline human laws and customs, may change over time, may have varying interpretations, and may have varying degrees of adherence among the populace.
Thanks for your interest in studying ORES. I am glad that you are collaborating with Aaron.

--
Pine ( https://meta.wikimedia.org/wiki/User:Pine )
I agree that we very rarely misidentify vandalism.
Where there is a dichotomy between quality and openness is in our handling of new unsourced content. There are no easy solutions here, but I would acknowledge both that a significant proportion of new unsourced content is added in good faith and that those who revert much of it on sight are often doing the right thing.
One difficulty for the casual observer is quickly telling the difference between someone who knows a subject you don't and is rejecting an unsourced, implausible edit, and someone who is just as ignorant of the subject as you are and is rejecting an unsourced edit from someone who actually knows their stuff and was trying to improve Wikipedia.
Jonathan
On Fri, Dec 14, 2018 at 12:18 AM Ofer Arazy ofer.arazy@gmail.com wrote:

Hi Bowen,
I've used ORES in my research on the factors driving article quality (where ORES scores are used as a proxy for article quality). If you are seeking input from the research community, I'm happy to participate in your survey.
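For anyone curious how those scores are pulled in practice, here is a rough sketch; it assumes the public ORES v3 endpoint and the enwiki "articlequality" model, the revision IDs are made up, and the exact response shape may differ from what is shown, so treat it as illustrative only.

import requests

# Made-up revision IDs; in practice these would be the revisions of the
# articles in the study sample.
revids = ["871111111", "872222222"]

resp = requests.get(
    "https://ores.wikimedia.org/v3/scores/enwiki",
    params={"models": "articlequality", "revids": "|".join(revids)},
)
resp.raise_for_status()

for rev_id, data in resp.json()["enwiki"]["scores"].items():
    score = data["articlequality"]["score"]
    # "prediction" is an enwiki assessment class (Stub, Start, C, B, GA, FA);
    # the class probabilities can also be collapsed into a continuous quality proxy.
    print(rev_id, score["prediction"], score["probability"])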
Ofer
On Fri, Dec 14, 2018 at 11:09 AM Aaron Halfaker aaron.halfaker@gmail.com wrote:

Thank you, Ofer!
I'm sure Bowen would be interested in the needs and values you bring to ORES. FWIW, we on the Scoring Platform team consider researchers to be legitimate users of ORES.
Thank you all for your interest and comments!
I've connected with several individuals for further discussion. Please feel free to contact me if you have any thoughts.