[Wikimedia-l] Wikimedia and the politics of encryption
FT2
ft2.wiki at gmail.com
Mon Sep 2 23:26:54 UTC 2013
There are many very sane comments in this thread. I agree with most of
them -
- Network encryption is important as one aspect;
- "Local" threats and "digging dirt" are an important realistic threat
(far more people are of interest to *THEIR OWN COMMUNITIES* vs nationally,
or open to so many types of harm - defamation, humiliation, "Lulz");
- Moving to https, and forcing a serious look at the technical implications
and needed workarounds, is a strong argument (a rough sketch of what "forcing
https" involves mechanically follows this list);
- Asking those affected is a strong argument;
- We are a global presence, so our stance, its strength, its
"rightness", and the signal we send, are crucial.
With all respect to local editors, whose position I wish were better, there
is more at stake in the Chinese and other affected Wikipedias than China
alone. There are questions of internet-, freedom- and privacy-related beliefs,
policies, and directions -- what one might describe as the battleground of
"privacy of thought vs. the state's right to monitor thought". That is what it
comes down to, whether now or in 5 or 15 years.
I'm reminded of the public reaction some years back when Google, as a
condition of entry to China, agreed to filter its results. Part of the logic
was "better partial information and presence than none". Did it help Google's
efforts in China? Many outside saw it as a betrayal, and Google eventually had
to leave. Are there lessons we should draw from others who have tried
different approaches in these countries?
I see no reason to believe that state oversight and interception will be
benevolent institutions, and I would disregard assurances that they are
designed as such. History teaches over and again that fallibility and the
expansion of power are the usual rule, and that good intentions easily turn
to dark uses. To take a simple scenario of how we are affected: if the passage
of time and public indifference endorse states being "usually" able to watch
what one studies and writes about, how long before immigration, access to
medical or welfare services, legal rights, marginalization, 'staged' crimes,
targeting, accusations of sedition or "anti-state activity", and so on become
informed by (among other things) a routine lookup, by state authorities and
law enforcement, of one's Wikipedia (or other online) accesses, and by
negative interpretations of what those may "mean"? Self-censorship is a grave
possibility, and it will encroach from the edges.
To give a specific example, take a Western visitor to Russia who, eight years
ago, edited a Wikipedia article to add a note on homosexuality policy in a
school or on a legal case in a county. There is no reason to expect that a
state body would not save every piece of data it can, and even under US law a
URL is probably metadata with no expectation of privacy. When immigration
authorities routinely obtain visitors' names 72 hours in advance (as some
countries expect and others may come to demand as a norm), won't they at some
point ask, as part of that process, what is known about prospective visitors,
and annotate their immigration records with "edits pro-Jewish topics" or
"seems to support homosexuality"? Perhaps editors on contentious topics
(drugs, abortion, religion) will find these noted by immigration and by less
ethical law enforcement bodies seeking visitors to target, if editing or
reading patterns become easily accessible. The same goes double for editors
attempting to uphold NPOV in countries where doing so is a risk: simply
toning down inappropriately POV wording in locally controversial articles may
put one in danger.
Twitter and Facebook may show one's daily life, but Wikipedia editing and
page reads show the areas in which one seeks to inform others and to be
informed oneself. There are workarounds, but we can't simply say "people
should know" or "if they are at risk they shouldn't edit". That's not
sustainable.
While this isn't yet explicitly "known" to happen in the US or UK, I suggest
it is the logical step around the corner, worldwide: state bodies seeking to
know more about individuals in advance, and individuals screening and
self-censoring in response. We need that not to become a habit, or NPOV can
be kissed goodbye.
A profound and poignant comment appeared in a media report a month ago:
people like Merkel act as strong advocates of privacy precisely because -
*unlike* US and UK citizens - they have actually lived under the Stasi. They
know what a file on every person, or state access to one's innermost and
private thoughts "for the common good", truly means for a country.
We probably do need to do what we can to provide a safe ecosystem, as our
whole endeavor depends on it, and we are well placed to make that point. It
may be difficult, but we have good cause to open that discussion and get the
ball rolling.
FT2
On Mon, Sep 2, 2013 at 11:23 PM, Fred Bauder <fredbaud at fairpoint.net> wrote:
> > On 31/08/13 15:17, Erik Moeller wrote:
> >> It could be argued
> >> that it’s time to draw a line in the sand - if you’re prohibiting
> >> the
> >> use of encryption, you’re effectively not part of the web. You’re
> >> subverting basic web technologies.
> >
> > China is not prohibiting encryption. They're prohibiting specific
> > instances of encryption which facilitate circumvention of censorship.
> >
> >> So, what to do? My main suggestion is to organize a broad request for
> >> comments and input on possible paths forward.
> >
> > OK, well there's one fairly obvious solution which hasn't been
> > proposed or discussed. It would allow the end-to-end encryption and
> > would allow us to stay as popular in China as we are now.
> >
> > We could open a data centre in China, send frontend requests from
> > clients in China to that data centre, and comply with local censorship
> > and surveillance as required to continue such operation.
> >
> > It would be kind of like the cooperation we give to the US government
> > at the moment, except specific to readers in China instead of imposed
> > on everyone in the world.
> >
> > It would allow WMF to monitor censorship and surveillance by being in
> > the request loop. It would give WMF greater influence over local
> > policy, because our staff would be in direct contact with their staff.
> > We would be able to deliver clear error messages in place of censored
> > content, instead of a connection reset.
> >
> > -- Tim Starling
>
> Their orders would be classified; disclosure of them would be a crime.
> Not a problem for us, but a big problem for staff on the ground in China.
>
> Fred
>
>
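(To make the geo-split Tim describes above concrete - and this is only a toy
illustration, not anyone's actual plan or infrastructure - the routing could
in principle key off the client's country, as in the Python sketch below. The
prefixes, hostnames and lookup table are invented for the example; a real
deployment would use GeoDNS or a GeoIP database at the load balancer.)

    # Toy illustration of country-based frontend selection (hypothetical).
    from ipaddress import ip_address, ip_network

    # Stand-in for a real GeoIP database (assumption; documentation ranges).
    COUNTRY_BY_PREFIX = {
        ip_network("203.0.113.0/24"): "CN",
        ip_network("198.51.100.0/24"): "US",
    }

    CLUSTERS = {
        "CN": "text-lb.cn.example.org",        # hypothetical in-country frontend
        "default": "text-lb.eqiad.example.org",
    }

    def country_for(client_ip: str) -> str:
        addr = ip_address(client_ip)
        for prefix, country in COUNTRY_BY_PREFIX.items():
            if addr in prefix:
                return country
        return "unknown"

    def pick_cluster(client_ip: str) -> str:
        """Return the frontend host that should terminate this client's TLS."""
        return CLUSTERS.get(country_for(client_ip), CLUSTERS["default"])

    if __name__ == "__main__":
        print(pick_cluster("203.0.113.7"))    # -> text-lb.cn.example.org
        print(pick_cluster("198.51.100.9"))   # -> text-lb.eqiad.example.org

Which frontend terminates the connection is exactly what determines whose
legal orders apply - which is Fred's point.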