Marc A. Pelletier wrote:
... A minor random increase of size in document wouldn't even slow down [fingerprinting.]
That's absolutely false. The last time I measured the sizes of all 9,625 vital articles, there was only one at the median length of 30,356 bytes, but four more within 50 bytes above it. Scale that up to 4,300,000 articles, and are you suggesting anyone is seriously going to try fingerprinting secondary characteristics across buckets of 560 articles? It would not only slow attackers down, it would drive their false-positive rate high enough to make the attack useless.
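The bucket argument above can be sketched in a few lines (with made-up sizes, not real article data): an eavesdropper who only observes transfer sizes cannot distinguish articles that land in the same size bucket, so the count per bucket measures the attacker's ambiguity.

```python
from collections import Counter

def bucket_counts(article_sizes, bucket_width):
    """Count how many articles fall into each size bucket.

    Articles in the same bucket are indistinguishable to an
    observer who sees only transfer sizes.
    """
    return Counter(size // bucket_width for size in article_sizes)

# Illustrative sizes in bytes, not real article data.
sizes = [30356, 30360, 30370, 30380, 30390, 51200, 12800]
counts = bucket_counts(sizes, bucket_width=50)

# The five articles within the same 50-byte window share one bucket.
print(counts[30356 // 50])  # -> 5
```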
This is why we need cryptography experts instead of laypeople making probabilistic inferences on Boolean predicates.
Marc, I note that you have recommended not keeping the Perl CPAN modules up to date on Wikimedia Labs: http://www.mediawiki.org/w/index.php?title=Wikimedia_Labs/Tool_Labs/Needed_T... saying that out-of-date packages are the "best tested", when in fact almost all CPAN packages have their own unit tests. That sort of reasoning is certain to allow known security vulnerabilities to persist when they could easily be avoided.
Anthony wrote:
How much padding is already inherent in HTTPS?
None, which is why Ryan's Google Maps fingerprinting example works.
... Seems to me that any amount of padding is going to give little bang for the buck....
Again, can we please procure expert opinions instead of relying on the existing pool of volunteer and staff opinions, especially when there is so much FUD discouraging the kinds of encryption which would most likely strengthen privacy?
On 08/02/2013 05:06 PM, James Salsman wrote:
Marc, I note that you have recommended not keeping the Perl CPAN modules up to date on Wikimedia Labs: http://www.mediawiki.org/w/index.php?title=Wikimedia_Labs/Tool_Labs/Needed_T... saying that out-of-date packages are the "best tested", when in fact almost all CPAN packages have their own unit tests. That sort of reasoning is certain to allow known security vulnerabilities to persist when they could easily be avoided.
Besides being from a few months ago, and unrelated to this conversation, I think that's a mischaracterization of what he said.
He said in general he would lean towards "keeping the distribution's versions since those are the better tested ones", but noted it should be looked at on a "package-by-package basis", and that "there may well be good reasons to bump up to a more recent version" (a security vulnerability that the distro isn't fixing rapidly enough would be such a reason).
It seems from the context "better tested" meant something like "people are using this in practice in real environments", not only automated testing.
Matt Flaschen
On 08/02/2013 05:50 PM, Matthew Flaschen wrote:
It seems from the context "better tested" meant something like "people are using this in practice in real environments", not only automated testing.
And, indeed, given the constraints and objectives of the Tool Labs (i.e. no secrecy, all open source and data, high reliability), the more important concern is "tested to be robust"; I'd deviate from distribution packaging where a security issue could lead to privilege escalation, but data-leak concerns do not apply there.
And whilst I am not a cryptography expert (depending, I suppose, on how you define "expert"), I happen to be very well versed in security protocol design and zero-information analysis (but lack the mathematical acumen for cryptography proper, so I have to trust the Blums and Shamirs of this world at their word).
For what concerns us here in traffic analysis, TLS is almost entirely worthless *on its own*. It is a necessary step, and has a great number of /other/ benefits that justify its deployment without having anything to do with the NSA's snooping. I was not making an argument against it.
What I /am/ saying, OTOH, is that random padding without (at least) pipelining and placards *is* worthless as protection against traffic analysis, since any reliable fingerprinting method would necessarily be robust against deviations in size. Given that it has a cost to implement and maintain, and consumes resources, it would be counterproductive to do that. It would give false reassurance of higher security without actually bringing any security benefit, i.e. theatre.
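The robustness point can be sketched with a toy simulation (illustrative numbers, not Wikipedia traffic): an attacker who observes many responses for the same page can average away uniform random padding, because the pad's mean is the same for every page, so the size gap between two pages survives.

```python
import random

def observed_sizes(true_size, pad_max, n, rng):
    """Simulate n observed response sizes for one page,
    each with an independent random pad in [0, pad_max)."""
    return [true_size + rng.randrange(pad_max) for _ in range(n)]

rng = random.Random(1)  # fixed seed so the sketch is reproducible
a = observed_sizes(30000, 512, 5000, rng)
b = observed_sizes(30050, 512, 5000, rng)

# The random pad has the same mean for both pages, so averaging
# recovers the 50-byte gap between them despite 512 bytes of noise.
mean_gap = sum(b) / len(b) - sum(a) / len(a)
print(round(mean_gap))
```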
-- Marc
Anthony wrote:
How much padding is already inherent in HTTPS?
None, which is why Ryan's Google Maps fingerprinting example works.
Citation needed.
... Seems to me that any amount of padding is going to give little bang for the buck....
Again, can we please procure expert opinions instead of relying on the existing pool of volunteer and staff opinions, especially when there is so much FUD discouraging the kinds of encryption which would most likely strengthen privacy?
Feel free. But don't talk about what is most likely if you're not interested in being told that you're wrong.
On Fri, Aug 2, 2013 at 10:07 PM, Anthony wikimail@inbox.org wrote:
Anthony wrote:
How much padding is already inherent in HTTPS?
None, which is why Ryan's Google Maps fingerprinting example works.
Citation needed.
Also please address https://en.wikipedia.org/wiki/Block_cipher_modes_of_operation#Padding
It seems that the ciphers which run in CBC mode, at least, are padded. Wikipedia currently seems to be set to use RC4 128. I'm not sure what, if any, padding is used by that cipher. But presumably Wikipedia will switch to a better cipher if Wikimedia cares about security.
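For reference, RC4, being a stream cipher, adds no padding at all, while CBC-mode ciphers pad the plaintext up to the next block boundary at minimum (TLS 1.0-1.2 permits up to 255 extra bytes, though implementations typically send the minimum). A minimal sketch of that arithmetic, assuming the 16-byte AES block size:

```python
BLOCK = 16  # AES block size in bytes

def cbc_padded_length(plaintext_len, block=BLOCK):
    """Length after minimal block-cipher padding (PKCS#7-style):
    round up to the next multiple of the block size, adding a
    full block when the input is already aligned."""
    return (plaintext_len // block + 1) * block

# Minimal padding hides at most `block` bytes of the true length,
# so a 30,356-byte page stays distinguishable from a 51,200-byte one.
print(cbc_padded_length(30356))  # -> 30368
print(cbc_padded_length(30368))  # -> 30384
```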
On Fri, Aug 2, 2013 at 7:23 PM, Anthony wikimail@inbox.org wrote:
On Fri, Aug 2, 2013 at 10:07 PM, Anthony wikimail@inbox.org wrote:
Anthony wrote:
How much padding is already inherent in HTTPS?
None, which is why Ryan's Google Maps fingerprinting example works.
Citation needed.
Also please address https://en.wikipedia.org/wiki/Block_cipher_modes_of_operation#Padding
It seems that the ciphers which run in CBC mode, at least, are padded. Wikipedia currently seems to be set to use RC4 128. I'm not sure what, if any, padding is used by that cipher. But presumably Wikipedia will switch to a better cipher if Wikimedia cares about security.
We currently have RC4 and AES ciphers in our list, but have RC4 listed first, with a server preference list, to combat BEAST. TLS 1.1/1.2 are enabled, and I'll be adding the GCM ciphers to the beginning of the list either during Wikimania or as soon as I get back.
- Ryan
On Sat, Aug 3, 2013 at 4:19 AM, Ryan Lane rlane@wikimedia.org wrote:
On Fri, Aug 2, 2013 at 7:23 PM, Anthony wikimail@inbox.org wrote:
It seems that the ciphers which run in CBC mode, at least, are padded. Wikipedia currently seems to be set to use RC4 128. I'm not sure what, if any, padding is used by that cipher. But presumably Wikipedia will switch to a better cipher if Wikimedia cares about security.
We currently have RC4 and AES ciphers in our list, but have RC4 listed first, with a server preference list, to combat BEAST. TLS 1.1/1.2 are enabled, and I'll be adding the GCM ciphers to the beginning of the list either during Wikimania or as soon as I get back.
Rereading that it looks like I might have implied that Wikimedia didn't care about security. That was absolutely not my intended implication. Sorry about that.
On Sat, Aug 3, 2013 at 4:19 AM, Ryan Lane rlane@wikimedia.org wrote:
We currently have RC4 and AES ciphers in our list, but have RC4 listed first, with a server preference list, to combat BEAST. TLS 1.1/1.2 are enabled, and I'll be adding the GCM ciphers to the beginning of the list either during Wikimania or as soon as I get back.
If possible, could a quick announcement be made (either here or on wikitech or on bug 52496), when we start supporting GCM? Much appreciated.
-- Tyler Romeo, Stevens Institute of Technology, Class of 2016, Major in Computer Science, www.whizkidztech.com | tylerromeo@gmail.com
wikimedia-l@lists.wikimedia.org