I've just been looking at the image filter referendum. Could someone
from the Foundation please explain what you hope to gain by holding
it? The questions are extremely leading, so I doubt you will learn
anything useful from it (is anyone really going to say that they don't
think it's important to be culturally neutral?). Are you hoping to
determine people's priorities by seeing which ones they rate as 10 and
which as merely 8 or 9? If so, why? Can you not just implement them
all?
My understanding was that this referendum was intended to give the
community some say in what happened with this proposed feature. The
questions you are asking don't do that in the slightest. If you want
to be able to say the feature has community support, you need to
actually ask the community whether or not they support it.
Hi
The image filter referendum banners are currently running at 100% globally[1],
on all projects (correct me if I'm wrong). It seems rather excessive
considering the banner's size and the subject.
I have no idea whether this issue really needs such exposure. The notice for
the scheduled service outage a few months ago - an outage that took the
projects down entirely for more than an hour - was, by comparison, a much
smaller banner that didn't even run at 100%.
Can someone please tell me if this issue requires this much urgency and
exposure? It is affecting local geo-tagged banners on a lot of projects.
Thanks
Theo
[1]
http://meta.wikimedia.org/w/index.php?title=Special:CentralNotice&method=li…
On 8/16/2011 2:13 AM, foundation-l-request(a)lists.wikimedia.org wrote:
> One suggestion for archiving would be to have a complete set of projects
> filed with the copyright office and other key depositories quarterly.
>
> This could also address a potential long-term copyright problem. This
> has less to do with Wikipedia infringing on the copyrights of others
> than with the reverse. It already happens that others use Wikipedia
> material without credit in works on which they claim copyright. Re-use
> of that material on-wiki at a later date will inevitably result in a
> copyvio squabble, especially if the originally plundered version is no
> longer recognizable. This could be many years hence. What other means
> are available to protect the viral nature of freely licensed material?
>
> Forks could also be helpful in this regard. They would need to respect
> free licences, and, as a by-product, add evidence favouring the freeness
> of the material. A person creating a fork based on some topic area is
> unlikely to significantly alter all the articles imported, preferring to
> draw different conclusions from the same underlying facts. This is
> bound to leave an identifiable residue that will protect the licence.
Anything filed with the copyright office is a static slice in time.
Copyright is such a sticky issue. If you publish something and copyright it,
then go back and revise the original, you must register the copyright all over
again, because copyright is based on an "image" of something. There is a limit
to how much material copyrighted by others you can use - beyond it lies
infringement, which is well defined in law. However, if you take material that
is old enough to be out of copyright and publish a new edition of it, you can
copyright the new edition - but (as I understand it) only the image thereof,
not the underlying material.
I may well be wrong, but a rather involved example might be in order.
Someone has an original photographic print of Adolf Hitler. Originally
the rights to that image had to be cleared by 1. the subject (though as a
public figure his rights were automatically "released" unless otherwise
stated), 2. the photographer, 3. the rights-holder (originally the NSDAP),
and 4. the possessor of the print. However, point 1 was cleared upon the
subject's death (since he had no estate exercising control at the time of
death other than 3), and point 2 was cleared when the assets were seized by
the Allies. Point 3 reverted to the state of Bavaria, since the NSDAP was
chartered under its laws and was dissolved by that agency. In this special
case the state will not contest use by others unless the purpose is to
further the goals of the NSDAP, i.e. Nazism and/or fascism. So the first
three are covered, and only point 4 applies to use in new work.
Now the fun part - I buy a copy of that "new work". It has a new
copyright. Exactly what is covered here? Only the "image" in the book.
So if I went to the National Archives, found the negative of that print
and made my own copy I could use my copy without restriction. However if
I used a scanner or camera to copy the image from the "new work" itself
then the new copyright would apply and I would need to obtain permission
from the publisher.
So if someone used material "as-is" from Wikipedia in a new work, they
could not "own" the material that came from Wikipedia, only the "image"
represented by their own publication. Since the material is likely just
text, I'm guessing that the "as-is" material could be freely copied by
others. This could get tricky: it is like a government document that is
stamped "Secret" because of one word. Obscure that word and the document
can be released under FOIA.
On 8/16/2011 5:00 AM, foundation-l-request(a)lists.wikimedia.org wrote:
> A couple of months ago, three admins of the Aceh Wikipedia decided that it
> is not acceptable for them to participate in a project which hosts
> Muhammad depictions. By "the project", they mean Wikimedia in general,
> including Wikimedia Commons. It was just a matter of time before they
> would create their own wiki, and they did so a month or two after
> leaving Wikimedia. And which project do you think has a better chance
> of success: the one without editors, or the one with three
> editors? So, while the reason for leaving couldn't be counted among the
> reasonable ones, the result is the same as if they had a valid
> reason. And there are plenty of valid reasons, among them the almost
> universal problem of highly bureaucratic structures on Wikimedia
> projects.
Politics and religion are the two areas where this problem usually
occurs. It is perfectly acceptable to present differing POVs if the
parties involved can find no common ground. They must be respected as
much for their differences as for their similarities. That means that a
neutral platform such as Wikipedia must be able to host differing
opinions. This problem popped up long ago, when people of differing
opinions began altering pages and deleting the work of others. It was
addressed with the implementation of the "edit lock" and frequent monitoring.
An encyclopedia must be free to present all sides of this kind of issue
so that third parties can come to understand the reasons behind the
differences. Refusing to do so moves the platform away from its stated
mission of neutrality.
Anyone who cannot support this commitment to neutrality is free to leave
and present their own POV - but they lose that neutral credibility in
the process of doing so.
2011/8/14 Krinkle <krinklemail(a)gmail.com>
> Hi all,
>
> I've read most of the previous mails so far. I'd like to clear up some
> confusion
> (just in case). Please do correct me if I'm wrong and got caught
> by the confusion myself:
>
> The thread is about one of the following:
> * .. the ability to clone a MediaWiki install and upload it to your own
> domain
> to continue making edits, writing articles etc.
>
Installing MediaWiki yourself is easy for geeks. The only solution for
newbies is using wiki farms.
> * .. getting better dumps of Wikimedia wikis in particular (ie. Wikipedia)
>
A ten-year-old, ongoing task.
> * .. being able to install MediaWiki easier or even online (like new wikis
> on
> Wikia.com)
>
An issue for the MediaWiki developers.
> * .. making it easy for developers to fork the MediaWiki source code
> repository.
>
>
Trivial. Any developer can set up a repository with a source code snapshot.
Gerard, in the first post, was speaking about 1) forks and 2) digital
preservation.
Forking single articles is easy: you just copy/paste (with histories you
have to use import/export). Forking a set of articles is just a bit more
difficult. Forking the whole of Wikipedia is _hard_: you need good
infrastructure and skills.
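For a single article with history, the import/export route mentioned above goes through MediaWiki's Special:Export, which produces an XML document listing every revision; the target wiki ingests it via Special:Import. As a minimal sketch, here is how such an export can be read with Python's standard library. The XML snippet is a handmade stand-in, not a real dump, and the namespace version is an assumption (real exports carry a versioned namespace of this general shape):

```python
import xml.etree.ElementTree as ET

# A tiny handmade stand-in for a Special:Export document
# (illustrative only; real exports are much richer).
EXPORT_XML = """\
<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page>
    <title>Example</title>
    <revision>
      <timestamp>2011-08-01T00:00:00Z</timestamp>
      <contributor><username>Alice</username></contributor>
      <text>First draft.</text>
    </revision>
    <revision>
      <timestamp>2011-08-02T00:00:00Z</timestamp>
      <contributor><username>Bob</username></contributor>
      <text>Second draft.</text>
    </revision>
  </page>
</mediawiki>
"""

NS = {"mw": "http://www.mediawiki.org/xml/export-0.10/"}

def revisions(export_xml):
    """Yield (title, timestamp, username, text) for every revision."""
    root = ET.fromstring(export_xml)
    for page in root.findall("mw:page", NS):
        title = page.findtext("mw:title", namespaces=NS)
        for rev in page.findall("mw:revision", NS):
            yield (
                title,
                rev.findtext("mw:timestamp", namespaces=NS),
                rev.findtext("mw:contributor/mw:username", namespaces=NS),
                rev.findtext("mw:text", namespaces=NS),
            )
```

This preserves the full revision history of one article, which is what the copy/paste shortcut loses; for a set of articles you would loop over many such documents, and that is exactly where the hard infrastructure problems begin.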
Digital preservation is a big problem in computer science. It is not solved
yet, but if you make backups frequently and in several places, you have a
good chance of keeping the data safe.
To fork, you first need the data to be preserved, which links back to the
dump-generation problem above.
I think people are getting nervous about Wikipedia (me too), in the same
way people are getting worried about Google having control of their entire
online life (Gmail, Google Reader, Google Calendar, Google+, etc.). If Google
closes your account, your online life vanishes. If Google dies, so does your
online life. Of course you can export all your e-mail, contacts, etc., but
you lose the @gmail.com address, all search-engine links to your data break,
and so on. Google has a good policy on exporting data; most Internet services
don't.
Mankind is compiling all human knowledge into an encyclopedia, which is
hosted on faulty metal platters spinning thousands of times per minute,
managed by faulty humans and located in only one or two places in the world
(Florida, the land of hurricanes, and San Francisco, the land of
earthquakes).
Making fun of Wikipedia is so 2007. Playing with Wikipedia is so 2001.
Losing knowledge is so 48 BC. This is the most important mission the human
race has ever undertaken.
Regards,
emijrp
--
> Krinkle
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
[posted to foundation-l and wikitech-l, thread fork of a discussion elsewhere]
THESIS: Our inadvertent monopoly is *bad*. We need to make it easy to
fork the projects, so as to preserve them.
This is the single point of failure problem. The reasons for it having
happened are obvious, but it's still a problem. Blog posts (please
excuse me linking these yet again):
* http://davidgerard.co.uk/notes/2007/04/10/disaster-recovery-planning/
* http://davidgerard.co.uk/notes/2011/01/19/single-point-of-failure/
I dream of the encyclopedia being meaningfully backed up. This will
require technical attention specifically to making the projects -
particularly that huge encyclopedia in English - meaningfully
forkable.
Yes, we should be making ourselves forkable. That way people don't
*have* to trust us.
We're digital natives - we know the most effective way to keep
something safe is to make sure there's lots of copies around.
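"Lots of copies" only helps if each copy can be verified: the dump server publishes checksum files alongside each dump, so a mirror can detect bit rot or a truncated download. A minimal sketch of that verification step, assuming a simple {filename: expected SHA-1} map (the file names here are illustrative):

```python
import hashlib
import os

def sha1_of(path, chunk_size=1 << 20):
    """Compute the SHA-1 digest of a file, reading in chunks so
    multi-gigabyte dump files never have to fit in memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copies(checksums, directory):
    """Check local files against a {filename: expected_sha1} map.
    Returns the names of files that are missing or corrupted."""
    bad = []
    for name, expected in checksums.items():
        path = os.path.join(directory, name)
        if not os.path.exists(path) or sha1_of(path) != expected:
            bad.append(name)
    return bad
```

Run periodically on every mirror, a check like this is what turns "lots of copies" from a hope into a guarantee that at least one intact copy survives.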
How easy is it to set up a copy of English Wikipedia - all text, all
pictures, all software, all extensions and customisations to the
software? What bits are hard? If a sizable chunk of the community
wanted to fork, how can we make it *easy* for them to do so?
And I ask all this knowing that we don't have the paid tech resources
to look into it - tech is a huge chunk of the WMF budget and we're
still flat-out just keeping the lights on. But I do think it needs
serious consideration for long-term preservation of all this work.
- d.
A successful fork needs more than just the content, software and sufficient
hardware, it also needs a community.
If we are serious about having a right to fork, we need to make it easy for
editors to keep their accounts, and possibly even their user rights, in both
forks; otherwise, whichever fork you have to create a new account for is at a
huge disadvantage. But for privacy/security reasons I don't think that the
WMF should give the fork a copy of the databases that includes the user IDs
and their logins. Perhaps this could be finessed by having the WMF create a
bridge to allow Wikimedians to activate their existing accounts at the forked
wiki; the forked wiki would presumably not otherwise allow editors to create
accounts using names that had edits imported from Wikimedia.
BTW, I'm not advocating a fork at this juncture. The only scenario I can see
in the short term that might lead to a fork is a clash between the
Foundation's policy on openness and contrary decisions taken by certain
parts of the community - for example, the English Wikipedia deciding to
restrict new article creation to autoconfirmed users. Presumably the
Foundation will get the devs to code the change requested by EN wiki even if
it does make us less open. But it could quite legitimately say, "That clashes
with our core values, so we won't do that here; but if some of you want to
create a more deletionist wiki, you do of course have the right to fork."
In that scenario I'd want the option of keeping my username on both forks,
though I doubt I'd be active on the less open spinoff pedia. But I'd be
annoyed if they let someone else activate my account there.
WereSpielChequers
Good point - risk management isn't just about technical disaster;
geopolitical issues are actually a much greater long-term risk.
On 8/15/2011 2:04 AM, foundation-l-request(a)lists.wikimedia.org wrote:
> The primary value of a fork(s) is not financial or technical, but
> epistemological. We are the big kid in the playground, and that has a
> significant effect on the nature of the content. When we work so hard to
> build an aura of reliability readers begin to depend on us.
> Paradoxically, that's not always good. If we are so reliable, the reader
> is not motivated to look elsewhere for alternatives. Natural human
> laziness is bad enough by itself. We too easily fall into the trap of
> treating Group POV as Neutral POV. Forks would develop their own
> versions of NPOV, and end up with very different results that are just as
> reliable as ours, but still different. It becomes up to the
> reader to compare corresponding pages, and draw his own conclusions on
> the matter at hand.
>
> We should not be viewing forks as inherent evils to be resisted at all
> costs. We should be encouraging them, and helping them.