On 08/02/2013 01:32 PM, James Salsman wrote:
> Padding each transmission with a random number of bytes, up to say
> 50 or 100, might provide a greater defense against fingerprinting
> while saving massive amounts of bandwidth.
It would slightly change the algorithm used to build the fingerprint,
but it wouldn't make matching significantly harder. You'd want some
fuzz in the match process anyway, since you wouldn't want to have to
fiddle with your database at every edit.
The combination of "at least this size" with "at least that many
secondary documents of at least those sizes, in that order" is
probably sufficient to narrow the match to a tiny minority of
articles.
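To make that concrete, here's a rough sketch of such a fuzzy match in
Python (the database layout, names, and tolerance value are all made
up for illustration):

    FUZZ = 150  # tolerance in bytes; comfortably absorbs 50-100 bytes of padding

    def matches(observed, fingerprint, fuzz=FUZZ):
        # Every observed transfer size must be within `fuzz` bytes of
        # the stored size, in load order.
        if len(observed) != len(fingerprint):
            return False
        return all(abs(o - f) <= fuzz for o, f in zip(observed, fingerprint))

    def candidates(observed, database, fuzz=FUZZ):
        # All articles whose stored fingerprints the observation matches.
        return [name for name, fp in database.items()
                if matches(observed, fp, fuzz)]

    # Hypothetical fingerprints: page size plus three secondary loads.
    db = {
        "Article_A": [48210, 1024, 20480, 512],
        "Article_B": [48300, 2048, 10240, 4096],
    }
    print(candidates([48255, 1060, 20501, 530], db))  # -> ['Article_A']

Padded sizes still land inside the tolerance window, so the match
survives.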
You'd also need to randomize delays, shuffle load order, load blinds,
etc. A minor random increase in document size wouldn't even slow the
process down.
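On the client side that might look something like this (again purely
illustrative; the decoy list and fetch function are hypothetical):

    import random, time

    def fetch_with_cover(urls, fetch, decoy_urls=(), max_delay=0.5):
        # Mix a couple of decoy requests ("blinds") into the real ones,
        # shuffle the load order, and space requests with random delays,
        # so neither order, timing, nor request count identifies the page.
        order = list(urls) + random.sample(list(decoy_urls),
                                           min(2, len(decoy_urls)))
        random.shuffle(order)
        for url in order:
            time.sleep(random.uniform(0.0, max_delay))
            fetch(url)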
-- Marc