On Thu, May 19, 2022 at 5:13 PM Steven Walling <steven.walling@gmail.com> wrote:
What else?
In my view, User Experience research has a lot to contribute to this conversation. Every announcement, every banner, every call to action can be user-tested, including in multiple languages. Put it in front of people and see how they respond. Do they get what you're trying to say? Are they turned away? Are they able to follow the call to action? Do they want to? How do different audiences (experienced Wiki[mp]edians, new contributors, people in the Global North or Global South, people with disabilities) respond?
That kind of testing is certainly possible for well-funded organizations; it's also possible to provide volunteers with the resources to do it.
All organizations struggle with creeping complexity over time. Hard evidence that this complexity is stifling participation can be the counterweight that motivates action: user research findings, clickthrough and completion rates for calls to action (aggregate numbers are fine, no need to track individuals!), time series data to optimize feedback periods, etc.
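To make the "aggregate numbers" point concrete, here's a minimal sketch of the kind of comparison I mean (plain Python; the helper function and all the numbers are made up purely for illustration). Given only total impressions and clicks for two wordings of the same banner, a two-proportion z-test tells you whether the difference in clickthrough rate is likely real:

    from math import erf, sqrt

    def two_proportion_ztest(clicks_a, views_a, clicks_b, views_b):
        """Two-sided z-test for a difference between two clickthrough rates."""
        rate_a = clicks_a / views_a
        rate_b = clicks_b / views_b
        # Pooled rate under the null hypothesis that both variants perform equally.
        pooled = (clicks_a + clicks_b) / (views_a + views_b)
        se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
        z = (rate_a - rate_b) / se
        # Two-sided p-value from the standard normal CDF.
        p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p

    # Hypothetical aggregate totals for two wordings of the same banner.
    z, p = two_proportion_ztest(clicks_a=420, views_a=12000,
                                clicks_b=510, views_b=12100)
    print(f"z = {z:.2f}, p = {p:.4f}")  # here p ~ 0.004: the wording matters

Nothing about this requires tracking individuals; variant-level totals are enough, and the same test works for completion rates on any call to action.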
With evidence in hand, develop standards. Wording choices carry strong connotations. Is "team" a better term than "committee"? Does "movement" foster in-group/out-group dynamics? Are feedback periods too long or too short? Do participation rates in elections go up or down?
In short, I believe an evidence-driven approach to reducing complexity could bear fruit quickly. I still think fondly of the A/B testing work Maryana P. and you organized for talk page templates. [1] I don't mean to diminish the extent to which Wikimedia is evidence-driven today! I'm sure lots of folks are measuring, testing & comparing different approaches to community engagement, and I'd love to hear about it. But perhaps an org-wide, evidence-driven campaign to simplify processes, improve communications & increase their effectiveness is needed as well.
Warmly, Erik
[1] https://meta.wikimedia.org/wiki/Research:Template_A/B_testing