It goes without saying that a meaningful study should use a random selection process, though in practice researchers often can't get ideal populations, so they study the populations they have. Unfortunately the study is behind a paywall, so you can't see how it was designed, although you would think the Telegraph would not report on something that was obviously flawed. 

Here's another study: "This study used a pretest/posttest design and included a control group to examine the impact of harassment training on intended responses to harassment. The sample consisted of 282 full-time professionals. At time 2, trainees expressed lower intentions to confront the perpetrator than did control-group participants." [1]

This one, "Sexual harassment at work, a decade (plus) of progress," has been widely cited. [2]

This one examines federal employment practices: "Widespread training within the agency has an effect over and above that attributable to the individual's receipt of training itself and training appears to be particularly successful in clarifying men's views about the “gray” area generated by unwanted sexual behavior originating with co-workers rather than supervisors." [3]

All behind paywalls.

And everyone who has ever held a job *knows* that training works.

The point here, I think, is about jumping on solutions without examining them first, and the difficulty of trying to crowdsource solutions from WP users. Who on this mailing list has the time (or the background) to sort through all this research? This is a whole field of study with years of trial and error behind it; we need a paid professional to sort through these issues.


[1] http://pwq.sagepub.com/content/31/1/62.abstract?ijkey=134e5be01979b1d65d136065e4d4445186bb6629&keytype2=tf_ipsecsha

[2] http://jom.sagepub.com/content/35/3/503.abstract

[3] http://onlinelibrary.wiley.com/doi/10.1046/j.0038-4941.2003.08404001.x/abstract





On Tue, May 3, 2016 at 3:04 PM, WereSpielChequers <werespielchequers@gmail.com> wrote:
Significantly less likely than men who don't attend such training..........

So does that mean the targeting is correct and the people sent on such training are disproportionately those who most need it?

If you want a test of how effective that training is you could try an AB test. Study a large group of attendees, half before and half after such training. Or a large group of men a few months before and after such training to see if those who attend make more progress than those who don't. Comparing those who don't attend with those who do would only make sense if the attendees were randomly chosen.
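The split design described above can be sketched in a few lines of Python. This is a toy simulation, not an implementation of any of the cited studies: the group sizes and score distributions are made up purely to illustrate randomly dividing a trainee pool, surveying half before and half after training, and comparing group means.

```python
import random

random.seed(42)  # fixed seed so the toy example is reproducible

# 100 hypothetical trainee IDs, randomly split into two halves
participants = list(range(100))
random.shuffle(participants)
measure_before = participants[:50]  # surveyed before the training
measure_after = participants[50:]   # surveyed after the training

# Pretend survey scores (e.g. "recognizes coercion as harassment", 0-10).
# In a real study these would come from the survey instrument, not a simulation.
scores = {pid: random.gauss(6.0, 1.5) for pid in measure_before}
scores.update({pid: random.gauss(6.5, 1.5) for pid in measure_after})

mean_before = sum(scores[p] for p in measure_before) / len(measure_before)
mean_after = sum(scores[p] for p in measure_after) / len(measure_after)
print(f"mean before: {mean_before:.2f}, mean after: {mean_after:.2f}")
```

Because the split is random, the two halves should be comparable on average, so the difference in means estimates the training effect without the self-selection problem that comes from comparing attendees to non-attendees.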

WereSpielChequers


On 3 May 2016, at 15:53, Neotarf <neotarf@gmail.com> wrote:

"A study in the Journal of Applied Behavioral Science found men who participated in a university staff sexual harassment programme were “significantly less likely” to see coercive behaviour as sexual harassment."

http://www.telegraph.co.uk/women/work/sexual-harassment-training-makes-men-less-likely-to-report-inapp/

_______________________________________________
Gendergap mailing list
Gendergap@lists.wikimedia.org
To manage your subscription preferences, including unsubscribing, please visit:
https://lists.wikimedia.org/mailman/listinfo/gendergap
