The abuse filter has some serious problems with the logging of personal information: what to log and why. There are also problems associated with the use of such a log, and with who has access to it. In some jurisdictions it may be legal to log and use such information for arbitrary actions against users, but that is not generally the case. In Norway it is legal to log such actions for the administration of the system, but as soon as the information is used for actions against users, a license (konsesjon) would be needed to handle it. Note that WMF may choose to disregard Norwegian law in this respect, as it is not bound by it.
I believe it is fairly easy to avoid all of those problems, but I can't find any information that says that such adaptations of the code have been made, or that any other measure has been taken to avoid said problems. Can anyone clarify the matter? It seems that nearly everyone just cheers the implementation and there is no effort to solve those issues.
John
Hello John,
I discussed this with Andrew (he is not on foundation-l), and apparently, AbuseFilter does not seem to disclose any information that would not be available elsewhere. Is there any particular information released by it you'd consider leaking private data?
We love privacy, but we want to be consistent :)
The problem is that something that previously was public (a vandal moving the page "George W. Bush" to "moron") will now be private (he gets a message that he isn't allowed to do that); this shifts the context from a public one to a private one. The extension then logs actions done in this private context to another site, and users of that site will have access to private information. It is not the information _disclosed_ which creates the problem, it is the information _collected_. It seems the information is legal for "administrative purposes", but as soon as it is used for anything else it creates a lot of problems. For example, if anyone takes action against a user based on this collected information, it could be a violation of local laws. (Imagine collected data being integrated with CU.) If such actions must be taken, then the central problems are identifying who has access to the logs and whether they are in fact accurate. That is something you don't want in a wiki with anonymous contributors! :D
The only solution I see is to avoid all logging of private actions if the actions themselves do not lead to publication of something. It will probably be legal to do some statistical analysis to administer the system, but that should limit the possibility of later identification of the involved users.
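The statistics-only logging described above could be sketched roughly as follows. This is a hypothetical illustration, not AbuseFilter's actual schema: the class and field names are invented, and the point is only that per-filter hit counts can be kept for system administration without retaining any user-identifying data.

```python
# Hypothetical sketch: count filter hits for "administration of the
# system" without retaining usernames, IPs, or edit content.
from collections import Counter

class AggregateAbuseLog:
    """Keeps per-filter hit counts only; no identifying data survives."""

    def __init__(self):
        self.hits = Counter()

    def record(self, filter_id, user=None, ip=None):
        # Deliberately ignore `user` and `ip`: only the count is stored,
        # so the log cannot later be used to act against individuals.
        self.hits[filter_id] += 1

    def report(self):
        return dict(self.hits)

log = AggregateAbuseLog()
log.record(30, user="SomeVandal", ip="192.0.2.7")
log.record(30, user="AnotherUser", ip="192.0.2.8")
log.record(12, user="SomeVandal", ip="192.0.2.7")
print(log.report())  # {30: 2, 12: 1}
```

The trade-off, of course, is that such a log can answer "how often does filter 30 fire?" but not "who triggered it?", which is exactly the property being argued for here.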
There are a lot of other problems, but I think most of them are minor to this.
John
John,
Well, this looks like a lawyer thing then, not an overall privacy policy discussion.
Privacy _is_ about law, but the extension creates the privacy problem and it must be solved. John
Hoi, Privacy is regulated by laws. Privacy is about law, but it is first and foremost about decency. By being consistent in what we do, by doing our best to provide reasonable privacy levels, we provide a decent service to our users. There is no such thing as "the law", and the law as implemented can be insane.
There are laws that we do not abide by. There are laws that we do not want to abide by. This means that the basic drive for privacy is because it is what we want, not because we are told to do so. Thanks, GerardM
2009/3/25 John at Darkstar vacuum@jeb.no
foundation-l mailing list foundation-l@lists.wikimedia.org Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l
The peculiarity in some respects of Scandinavian law seems to come up on this list fairly frequently, but it's usually short on specifics or actual cases. John, do you have any specific references to what you've described as a problem?
Adhering to your interpretation of the possible limits on "private" information would effectively eliminate the abuse filter as a useful tool. I'm having a hard time seeing this as a widespread problem; there can't be many jurisdictions that define public and private in this way, or place such restrictions on what can be done with this data that blocking someone from a private website in another country could be a violation of the law.
To my mind, private data of the sort we need to worry about is not "private" in the sense that it is owned by the Foundation or not publicly viewable, but "private" in the sense that it contains potentially sensitive details of individual editors and readers. Nothing in the abuse filter would seem to change the public availability of this sort of data, and I can hardly see Wikimedia being penalized simply for preventing vandalism instead of reacting to it.
Nathan
On Wed, Mar 25, 2009 at 8:35 AM, John at Darkstar vacuum@jeb.no wrote:
It is not refusing to accept some kind of edit that creates the problem, it is the logging of the action, because you then collect information about the users. Preventing the vandalism instead of reacting to it shifts the actions from a public context to a private context. By avoiding collecting such information and adhering to "administration of the system", most of the problem simply goes away. It's not about using or not using the extension, it's about limiting the logging so that no one can gain access to any data to take later actions against the users (i.e. the vandals).
WMF may choose to log the information anyhow, just as it may choose not to respect copyright laws in some countries. I don't think that is very wise, but I can only say what I believe is right.
John
I see the actions as 100% public. Just because the edit that was attempted was not allowed does not mean it was not meant to be public. The "logs" are just another avenue that an edit may take if it meets some conditions. The only difference between logging and the previous behavior is that the edit never made it to the "live" page. This is very similar to the flagged revisions behavior of not showing an edit until it is approved.
I asked this in the last e-mail, but I'll make it the primary point of this one - do you have specific references that led to your current understanding of the problem? Has the distinction you describe in the collection of information been litigated somewhere else, or the subject of a law in any jurisdiction? As it stands, the logging is a crucial element of the filter. It's probably possible to obscure IP data from the log, but I don't see why that would be necessary at this point.
Nathan
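Obscuring IP data from the log, as suggested above, might look something like the following hypothetical sketch. The salted-hash scheme and all names here are illustrative assumptions, not anything AbuseFilter actually does; note also that a salted hash is only weak obfuscation for the small IPv4 address space unless the salt is kept secret and rotated.

```python
# Hypothetical sketch: replace an IP address with a salted hash before
# it reaches a public log, so log readers cannot read the IP directly,
# while identical IPs still collide (useful for spotting repeat hits).
import hashlib
import ipaddress

SECRET_SALT = b"rotate-me-regularly"  # assumed server-side secret

def obscure_ip(ip_string):
    """Return a short, stable, non-reversible token for an IP address."""
    ip = ipaddress.ip_address(ip_string)  # validates IPv4/IPv6 input
    digest = hashlib.sha256(SECRET_SALT + ip.packed).hexdigest()
    return digest[:12]

entry = {"filter": 30, "action": "edit", "ip": obscure_ip("192.0.2.7")}
assert entry["ip"] == obscure_ip("192.0.2.7")  # stable for the same IP
assert entry["ip"] != obscure_ip("192.0.2.8")  # distinct IPs differ
```

Whether this would satisfy a data-protection authority is a separate legal question; the sketch only shows that the engineering side of "obscure IP data from the log" is straightforward.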
In Norway, Personopplysningsloven §7 gives explicit exemptions for artistic, journalistic and literary work. Vandalism is no such thing, but the project as such is a journalistic and literary work. When someone vandalizes, we claim it is just piggybacking on the normal use of the site and is "published". If there is no publishing, the full law applies.
We had a previous correspondence with Datatilsynet in which they claimed IP addresses to be personal information. I don't think they have changed their position on that matter. WMF may choose to dismiss the law altogether, but I'm not sure Norwegian users can do the same thing.
I seriously doubt that logging of IP addresses is a crucial element of the extension; it's simply nice to have for later retrieval and actions. Given how logging is implemented in MediaWiki, it is probably easier to keep the IP addresses than to remove them.
I don't think anything is going to change, so it is a lost case.
John
http://www.lovdata.no/all/tl-20000414-031-001.html#7
Just so everyone is clear:
1) The abuse log is public. Anyone, including completely anonymous IPs, can read the log.
2) The information in the log is either a) already publicly available by other means, or b) would have been made public had the edit been completed. So abuse logging doesn't release any new information that wouldn't have been available had the edit been completed. (Some of the information it does release, such as User ID number and time of email address confirmation, is extremely obscure though. While "public" in the sense that it could be located by the public, some of the things in the log would be challenging to find otherwise.)
3) Some of the rules are private. The log generated is the same whether the rule being triggered is public or private, and both kinds result in a publicly accessible log.
4) There is an existing bug that deletion of articles does not currently delete the corresponding entries in the Abuse Log. That can potentially allow information about deleted content to leak through in some specific cases. It is on the agenda to patch that hole.
-Robert Rohde
On Wed, Mar 25, 2009 at 5:35 AM, John at Darkstar vacuum@jeb.no wrote:
Is it a wild assumption on the part of an editor who has been warned for an "abuse" and has not pursued it (by forcing a save, if the "save" button is available) to assume that his action was lost, and thus possibly surprising to see it publicly logged?
In my opinion, pressing the preview button and then not saving is a similar use case to being warned by the abuse filter and not saving: you should not expect the lost edit in either case to be publicly available. I think at the least the abuse warning should make it clear that the action and <x, y, z data of the user> were publicly logged.
Best regards, Bence Damokos
Except his assumption when clicking save, before ever seeing the abuse filter warning, was that his edit would be publicly viewable immediately. Unless the user was purposely intending to do something that he knew would be disallowed by the abuse filter, he was fully intending for whatever he wrote to be made public.
-- Alex (wikipedia:en:User:Mr.Z-man)
Exactly. Which is why I fail to see the argument about privacy concerns. When you press any submit button in MediaWiki for an action that is logged (be it a page move, edit, deletion, or user rights change), you do so knowing full well that your actions are going to be public. If your attempted action is blocked by the filters, we now log that.
Now, I could see the argument for privacy if we started logging things that are traditionally private (login/logout, password changes, preference changes, etc), but that's not the case here.
-Chad
2009/3/25 John at Darkstar vacuum@jeb.no:
In Norway it is legal to log such actions for the administration of the system, but as soon as it is used for actions against the users it would need a license (konsesjon) to handle such information.
What kind of action against users are you thinking of? All we're likely to do is block them, which would be administering the system. Are you suggesting that contacting their ISP to report abuse would be problematic? (That's the only other action I can think of.)
Basically all actions against users given information in such a log. Contacting the ISP is a valid question; I believe contacting the ISP about completed actions is legal in most jurisdictions, contacting them about uncompleted actions is not. In the US it is legal to act on uncompleted actions after provocations (i.e. on the perpetrator's intentions); that is not legal in every other countries (e.g. quite a few countries forbid it).
As I see it, all the problems come from public or partly public logging of actions that are now in a private context.
I fail to see how you distinguish between a "completed" action and an "uncompleted" one.
When you click Save, you *complete* your action. The only difference AbuseFilter makes is whether this action shows up at http://xx.wikipedia.org/wiki/Foobar?action=history or at http://xx.wikipedia.org/wiki/Special:AbuseLog. How could the law distinguish between those?
-- [[cs:User:Mormegil | Petr Kadlec]]
And what is "every other countries"? I'm not a lawyer, but even if you are, have you done a legal study of all the countries on earth? Because there are a lot.
skype: node.ue
He said "every", not "any". "That is not legal in every other countries" (assuming that last word was intended to be singular) means there is at least one country where it is not legal. "That is not legal in any other country" would mean there were no countries where it was legal. People using "every" and "any" incorrectly is a pet hate of mine, but he got it right!
This issue has been discussed at rather great length on the Norwegian (bokmål) Wikipedia and on the mailing list admin-wikipedia-no. I haven't yet seen anyone who agrees with John's interpretation that logging attempts to save (publish) that are blocked by an abuse filter is against Norwegian law.
Finn Rindahl
2009/3/27 Thomas Dalton thomas.dalton@gmail.com
2009/3/27 Mark Williamson node.ue@gmail.com:
And what is "every other countries"? I'm not a lawyer, but even if you are, have you done a legal study of all the countries on earth, because there are a lot.
He said "every", not "any". "That is not legal in every other countries" (assuming that last word was intended to be singular) means there is at least one country where it is not legal. "That is not legal in any other country" would mean there were no countries where it was legal. People using "every" and "any" incorrectly is a pet hate of mine, but he got it right!
That's Finn's interpretation of this. Finn and some other users claim that there are no privacy concerns with the AbuseFilter, and that they have a general consensus on its use. They even claimed that the local authority "Datatilsynet" would not have any opinion on the matter and would refuse to respond to queries about it, but in fact it did reply. It was very specific on the issue and said that this use is outside the reach of the law, as the site is in the States, but that if the filter were implemented on a site in Norway the solution would have to comply with the law "Personopplysningsloven", or be used strictly for "administration of the system". That's why I said we may _choose_ to use it anyhow.
Note that Norwegian users are bound by Norwegian law both in Norway and partly also abroad, in addition to other local law. How that plays out in this case I don't know, but an admin taking action against a user because he has information from the AbuseFilter would at least be questionable.
Note also that some of the users at no.wp have claimed that we should not relate to Norwegian law at all, and that Wikipedia should somehow be regarded as "international territory" or something similar. I'm not quite sure how they argue for this idea; I simply can't follow the logic.
John
Finn Rindahl skrev:
This issue has been discussed at rather great length on the Norwegian (bokmål) Wikipedia and on the mailing list admin-wikipedia-no. I haven't yet seen anyone who agrees with John's interpretation that logging attempts to save (publish) that are blocked by an abuse filter is against Norwegian law.
Finn Rindahl
2009/3/27 Thomas Dalton thomas.dalton@gmail.com
2009/3/27 Mark Williamson node.ue@gmail.com:
And what is "every other countries"? I'm not a lawyer, but even if you are, have you done a legal study of all the countries on earth, because there are a lot.
He said "every", not "any". "That is not legal in every other countries" (assuming that last word was intended to be singular) means there is at least one country where it is not legal. "That is not legal in any other country" would mean there were no countries where it was legal. People using "every" and "any" incorrectly is a pet hate of mine, but he got it right!
That's correct. ~~~~
Thomas Dalton skrev:
2009/3/27 Mark Williamson node.ue@gmail.com:
And what is "every other countries"? I'm not a lawyer, but even if you are, have you done a legal study of all the countries on earth, because there are a lot.
He said "every", not "any". "That is not legal in every other countries" (assuming that last word was intended to be singular) means there is at least one country where it is not legal. "That is not legal in any other country" would mean there were no countries where it was legal. People using "every" and "any" incorrectly is a pet hate of mine, but he got it right!
On Fri, Mar 27, 2009 at 8:29 AM, John at Darkstar vacuum@jeb.no wrote:
As I see it, all the problems come from public or partly public logging of actions that now take place in a private context.
When you press submit, you've already completed your action, and it's public. Just because the AbuseFilter doesn't let you add your text to the page history doesn't make it any less public.
Your whole argument stems from this faulty premise of edits/moves/etc not being public if they are blocked. This is wrong.
-Chad
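The flow Chad describes can be sketched in a few lines. This is a minimal illustration only, with hypothetical names (`handle_submission`, the filter tuples, the log dicts are all invented here) and not the actual AbuseFilter code: the user's action is already complete when the filter runs, so the filter only decides whether the action enters the public history and, separately, whether the attempt is recorded in a log.

```python
def handle_submission(action, filters, log):
    """Run each filter against an already-submitted action.

    `action` is a dict describing what the user sent (e.g. a page move);
    `filters` is a list of (name, predicate) pairs; `log` collects records
    of blocked attempts.
    """
    for name, matches in filters:
        if matches(action):
            # The action is refused a place in the public page history...
            log.append({"filter": name, "action": action})
            return "blocked"  # ...but the user has still performed it.
    return "saved"

# Example echoing the page-move case from earlier in the thread: a filter
# that disallows moving the page "George W. Bush".
filters = [
    ("bush-move", lambda a: a.get("type") == "move"
                            and a.get("from") == "George W. Bush"),
]
log = []
result = handle_submission(
    {"type": "move", "from": "George W. Bush", "to": "moron"}, filters, log)
```

The disputed point in the thread is exactly the `log.append(...)` line: the submission itself happened either way, and the disagreement is over whether recording the blocked attempt collects data in a private context or merely records an already-public act.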
Exactly. It's not as if we're invading people's minds to see what they're going to do; they've already done it.
Mark
2009/3/27 Chad innocentkiller@gmail.com:
On Fri, Mar 27, 2009 at 8:29 AM, John at Darkstar vacuum@jeb.no wrote:
As I see it, all the problems come from public or partly public logging of actions that now take place in a private context.
When you press submit, you've already completed your action, and it's public. Just because the AbuseFilter doesn't let you add your text to the page history doesn't make it any less public.
Your whole argument stems from this faulty premise of edits/moves/etc not being public if they are blocked. This is wrong.
-Chad
You publish something when you push the submit button AND it is later publicly available. It is not published just because the button reads "submit" or anything else. It is the action AND the result that publish the content.
When the content is in fact published is somewhat amusing in itself; there is no universally accepted definition of when that happens.
John
Chad skrev:
On Fri, Mar 27, 2009 at 8:29 AM, John at Darkstar vacuum@jeb.no wrote:
As I see it, all the problems come from public or partly public logging of actions that now take place in a private context.
When you press submit, you've already completed your action, and it's public. Just because the AbuseFilter doesn't let you add your text to the page history doesn't make it any less public.
Your whole argument stems from this faulty premise of edits/moves/etc not being public if they are blocked. This is wrong.
-Chad