Gregory Maxwell writes:
It is almost universal that when something is becoming worse the people involved think they are really fixing things. Agents provocateurs excluded, people don't set out to break their organizations.
Okay, but don't people who are fixing things also frequently believe they are fixing things?
I do think it is informative to contrast Erik's response to me with yours. The impression I get from Erik's "I don't feel that there's been significantly less or more sharing of information over the last few months than over the last couple of years." is "nothing has changed", while your response appears to be much further to the "Things are changing because they had to.. and for the better!".
Well, assuming that Erik and I would agree with your paraphrasing, I don't see anything logically contradictory between the two propositions you restate here.
And, for the avoidance of doubt, I do not think that it has failed. I am, however, concerned that *Wikimedia* is closer to substantial failure than it has been previously. And as you may be aware I am, by far, not the most important person around the organization to express this view in the last month or so.
The thing about viral memes is that they infect indiscriminately -- even the wisest can catch them.
See for example the recent meme that Hillary Clinton's primary campaign had collapsed. Everybody, even the experts, "knew" it was true.
And then it turned out not to be.
But let me suggest a different 'meme' for you: "The Emperor's New Clothes".
Sure, it's a great meme, but the thing to remember about the story is that it's not the great crowd of people who point out the truth -- it's the one observer who challenges the received wisdom. If there were only one person who was critical about the Foundation while everyone else was cheerleading about it, I'd see the more obvious parallel. But what I see on the lists (and, like you, in private conversations and other media as well) is frequently a kind of crowd-sourcing of conspiracy theories.
If our fear of self-fulfilling negative outcomes or bad press causes us to perform self-censorship or denial we will not only fail but we will very likely cause harm to the outside world in the process.
This reminds me of recent discussions I've had elsewhere about parenting. I'm a parent myself. Obviously, any parent wants to protect his or her child, but, equally obviously, you can't protect a child from every conceivable thing. There's such a thing as overprotectiveness, but that's not in itself an argument against protectiveness.
Similarly, an argument against too much self-censorship (and I agree there can be too much) is not the same thing as an argument against prudence in what we choose to say.
As the projects and Wikimedias grow there will be more things wrong, and so it will take more eyes to see them, and more mouths to speak up about them. Not less.
Of course.
Transparency is something which is objectively measurable, so we should be able to escape the trap of inferences.
I am of the view that very few human enterprises and products lend themselves to objective measurability. Back before I went to law school, I was for a while a graduate student in experimental psychology. What convinced me to leave the program was my feeling that attempts to quantify and characterize human cognitive processes were grounded irreducibly in subjectivity. You remind me of that period of my life when you say that "Transparency is something which is objectively measurable." I'm unaware of studies that demonstrate irrefutably that this is the case, and I'm not sure there's even an objective definition of "transparency." You may of course know more about experimental studies on the subject than I do -- my experience is a quarter-century out of date.
Nevertheless, I find that, when dealing with human beings and their enterprises and when dealing with something as philosophically freighted as "transparency," it's more effective to insist on trying to act ethically rather than to insist on objectively measurable criteria.
There is a fine line between actual constraints and artificially created confidentiality suicide pacts.
There's less of a fine line between studying labor law and inferring labor law from, say, a nondisparagement agreement. For example, when you research the history of nondisparagement agreements, you find that they occur because, before they occurred, there was lots of disparagement going on, leading to costly litigation.
For this reason, labor-law experts routinely advise that nondisparagement be the rule, even in the absence of a formal agreement, although they also advise formal agreements. That's why you may find that some companies, as a matter of policy, never say anything bad about a former employee unless required to by law.
Certainly that is what I would advise, regardless of whether a "suicide pact" existed, and it is what I have advised when working at other organizations.
While I don't want the foundation to be subject to a transparency suicide pact either, if transparency isn't one of the most 'expensive' soft-costs of doing business for Wikimedia then something is probably wrong.
I don't know that we disagree here.
Dishonesty is a question of motivations; such questions are often hard to resolve.
That's precisely why one should be careful, it seems to me, about implying dishonesty in the Foundation, in the community, or elsewhere. More below.
Hanlon's razor is an often-applied tool when someone is suspected of being dishonest. My own view is that Hanlon's razor isn't all that useful: stupidity or malice, if the effect is the same we still need to fix it.
Well, here you discount the benefit of Hanlon's Razor. Say that I've done something stupid, and my reaction, after reflection, may well be that I agree with you. Say I'm dishonest, and I'll feel the impulse to resist anything you say after that. (You may be different -- you might regard being called "stupid" as offensive as being called "dishonest.") I've generally found that, when trying to offer constructive criticism, implying things about people's motives (which you necessarily do when implying dishonesty) is less helpful than focusing on what is a better course.
I'll close by saying something I've noted before, which is that we have created a culture of *editors* here -- people who look at everything anyone says or writes with a critical eye -- and so we've
This sounds nice, but I don't see anything objective to support it.
It's a hypothesis, not data. For my take on the meaning of "hypothesis," see Karl Popper, The Logic of Scientific Discovery (1959). Because there is much about the world that cannot be known with certainty, it is important to favor what Popper calls "critical rationalism" rather than "justificationism." We are compelled to work with hypotheses all the time.
Here is a counter-hypothesis:
It doesn't seem to me to be "counter." That is to say, both hypotheses could logically be true.
We have created a culture of personal ownership here -- people who feel responsible for the product of their collective labors, or at least the parts they pick and choose -- and so they feel entitled to a fairly high degree of control, or at least visibility into the actions of those who are exerting influence over it, just as you might ask questions as a doctor works on your body or a plumber works on your home.
I agree with all this. I generally would not support any policy choices that, in my view, undercut the sense of personal investment in the projects. That said, things relating to professionalizing the organizational infrastructure have needed to be done, and still are being done. And this means that the Projects can't be operated like hobbies anymore.
This concern is heightened when the actors do not appear to have the same level of investment and when they do not have the credibility that doctors or plumbers have.
I want to suggest that geographical relocation is a pretty good sign of individual investment. Maybe not irrefutable, but still hard evidence.
While I am disappointed in the level of hostility in our forums, I can't personally agree with the position that we have become too critical.
Funnily enough, if you *agreed* with me, that might be taken as evidence against my hypothesis! ;)
While some areas may have become too critical at times, I think there are a lot of areas where more criticism is needed.
I don't think we disagree about this.
When the time comes that a Wikimedia representative can't make a clearly incorrect claim in the media without Wikimedians calling the person out about it (politely, of course) in our forums, come back to me.. and at that time I'll be willing to consider the idea that we're critical enough.
I'm not sure what you're referring to here, and it may not be helpful to ask you to be more specific, but I want to refer you again to Karl Popper. I don't think the problem in our memetic culture is that we're too critical -- indeed, criticism fuels the whole enterprise, and hooray about that. I do think that *reflexive* criticism, conspiracy-mongering, and hostility are destructive, and I think we all ought to be as self-aware as possible about whether we're saying things that promote destructive memes.
--Mike
Mike Godwin wrote:
Gregory Maxwell writes:
.....
Transparency is something which is objectively measurable, so we should be able to escape the trap of inferences.
I am of the view that very few human enterprises and products lend themselves to objective measurability. Back before I went to law school, I was for a while a graduate student in experimental psychology. What convinced me to leave the program was my feeling that attempts to quantify and characterize human cognitive processes were grounded irreducibly in subjectivity. You remind me of that period of my life when you say that "Transparency is something which is objectively measurable." I'm unaware of studies that demonstrate irrefutably that this is the case, and I'm not sure there's even an objective definition of "transparency." You may of course know more about experimental studies on the subject than I do -- my experience is a quarter-century out of date.
Nevertheless, I find that, when dealing with human beings and their enterprises and when dealing with something as philosophically freighted as "transparency," it's more effective to insist on trying to act ethically rather than to insist on objectively measurable criteria.
.....
--Mike
Hi,
On the topic of transparency, there was a so-called secret mailing list, as some characterized it, e.g.
http://www.theregister.co.uk/2007/12/04/wikipedia_secret_mailing/
Jimbo prefers the word 'private' to 'secret' - he's said so on several occasions. Do you see any ethical objection to the existence (and, save in defined circumstances, the membership) of all mailing lists known to the Foundation and associated with the encyclopedia and related projects being made public?
Do you see any argument for such disclosure?
Thanks in advance
On Jan 9, 2008 10:03 PM, luke brandt shojokid@gmail.com wrote:
Hi, On the topic of transparency, there was a so-called secret mailing list, as some characterized it, e.g.
http://www.theregister.co.uk/2007/12/04/wikipedia_secret_mailing/
Jimbo prefers the word 'private' to 'secret' - he's said so on several occasions. Do you see any ethical objection to the existence (and, save in defined circumstances, the membership) of all mailing lists known to the Foundation and associated with the encyclopedia and related projects being made public?
Do you see any argument for such disclosure?
I see some arguments for such disclosure:
* While the details of some discussions need to be private, the fact that there is a discussion usually requires less privacy.
* Increased disclosure can help tame the more wild and inaccurate conspiracy theories.
* Improved consistency with the claims of openness which are often made around here.
However, I don't believe such disclosure is achievable. The secret mailing list you refer to, wpcyberstalkers, was run on Jimmy's company's servers, not on Wikimedia's servers. It would be unrealistic to expect to impose a rule like this on the whole world. Wikimedia's handling of lists is generally fairly open. I suppose that's why the wpcyberstalkers list was at Wikia, and likewise for the sexually discriminatory wikichix lists.
Wikimedia-run lists are listed at http://lists.wikimedia.org/mailman/listinfo, though a quick glance indicates that lists like arbcom-l, internal-l, staff-l, and board-l are not shown (and I'm sure there are a few others that I'm forgetting to mention). Their existence is not secret, however.
Another counterpoint is that a long-reused CC list is no different from a mailing list. It would be unreasonable and unrealistic to ask everyone to disclose everyone they are emailing.
Then what about other forms of communication? Would you expect me to post a log of every phone call I've had with another Wikimedian? :)
To me it seems like an impossible task to set rules along these lines which would not violate people's privacy while not creating constant argument about conformance.
In the public space these sorts of concerns are covered by things like sunshine laws (http://en.wikipedia.org/wiki/Freedom_of_Information_Act; I'm most familiar with Florida law, http://brechner.org/Brechner%20Center%20FAQs-%20Florida%20in%20the%20Sunshin... ). I think the problem with adopting these kinds of rules for Wiki[pm]edia decision makers is that who actually is a decision maker is hard to define, and any reasonable definition likely includes more than a thousand people.
Still, I think it would be productive to look at the various techniques for government transparency and look for things which could be productively applied here.
I've long found it amusing that agents speaking for Wiki[pm]edia have claimed a revolutionary level of transparency while, in many areas, the level of transparency is below what some governments require of themselves by law. Certainly some aspects of the project are very open, but a revolutionarily open document management system should not be confused with transparency in governance.
Gregory Maxwell wrote:
Still, I think it would be productive to look at the various techniques for government transparency and look for things which could be productively applied here.
I've long found it amusing that agents speaking for Wiki[pm]edia have claimed a revolutionary level of transparency while, in many areas, the level of transparency is below what some governments require of themselves by law. Certainly some aspects of the project are very open, but a revolutionarily open document management system should not be confused with transparency in governance.
Thanks for your reply, Gregory. I take your points, e.g. that probably only a few in the Foundation knew about the 'secret' mailing list, and it's difficult to make rules to cover all possibilities on lists. But I think you are right, in that it's still worth trying to achieve a greater level of transparency in governance for the reasons you gave.