On Dec 28, 2014 11:35 PM, "Oliver Keyes" <okeyes@wikimedia.org> wrote:
> More importantly, the HTTPS protocol involves either sanitising or completely stripping referers, rendering those chains impossible to reconstruct.
Could you elaborate? (Are we talking about hops from one page to another within the same domain name?)
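
To illustrate what I understand to break: the naive way to rebuild a chain is to join each request's referer to a previously seen URI for the same client. A toy sketch in Python (the field names client/referer/uri are made up for illustration, not the real log schema):

    # Toy sketch of why stripped referers break chain reconstruction.
    # Field names (client, referer, uri) are hypothetical.
    def build_chains(requests):
        """requests: (client, referer, uri) tuples in time order."""
        finished, open_chains = [], {}
        for client, referer, uri in requests:
            chain = open_chains.get(client)
            # An HTTP referer matching the last URI seen for this client
            # links the hop into the chain; a stripped HTTPS referer ("")
            # can't match, so every request starts a new one-page "chain".
            if chain and referer == chain[-1]:
                chain.append(uri)
            else:
                if chain:
                    finished.append(chain)
                open_chains[client] = [uri]
        return finished + list(open_chains.values())

    reqs = [
        ("c1", "", "/wiki/A"),
        ("c1", "/wiki/A", "/wiki/B"),  # HTTP hop: A -> B reconstructable
        ("c1", "", "/wiki/C"),         # HTTPS hop: referer gone, chain breaks
    ]
    print(build_chains(reqs))  # [['/wiki/A', '/wiki/B'], ['/wiki/C']]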
More generally: what is the status of Hadoop? Could we potentially give third-party users access, even if they can't sign an NDA, by having them write their own MapReduce jobs to support their research? Depending on the job, the results might need legal (LCA) review before release, or maybe some could be reviewed by other people approved by LCA.
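
As a concrete example of the kind of job I have in mind, a sketch for Hadoop Streaming that counts requests per page. The log layout (tab-separated, URI in the third column) and paths are assumptions, not the real schema:

    #!/usr/bin/env python
    # Sketch of a Hadoop Streaming job counting requests per page.
    # Assumes tab-separated log lines with the requested URI in the
    # third column (index 2) -- a made-up layout, not the real schema.
    # Run (roughly): hadoop jar hadoop-streaming.jar \
    #   -mapper "job.py map" -reducer "job.py reduce" \
    #   -input /logs/sampled -output /user/research/pagecounts
    import sys

    def mapper():
        for line in sys.stdin:
            fields = line.rstrip("\n").split("\t")
            if len(fields) > 2:
                print("%s\t1" % fields[2])

    def reducer():
        # Streaming delivers mapper output sorted by key, so we can
        # sum counts with a simple one-pass group-by.
        current, count = None, 0
        for line in sys.stdin:
            key, _, value = line.rstrip("\n").partition("\t")
            if key != current:
                if current is not None:
                    print("%s\t%d" % (current, count))
                current, count = key, 0
            count += int(value)
        if current is not None:
            print("%s\t%d" % (current, count))

    if __name__ == "__main__":
        mapper() if sys.argv[1:] == ["map"] else reducer()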
We could give researchers (all Labs users?) access to a truly sanitized dataset in the right format to use while designing jobs. Or maybe not sanitized, but filtered down to requests from just a few users who volunteered to release their data for X days.
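
The filtering step could be as simple as something like this (field names and the opt-in mechanism are hypothetical; the point is keeping only volunteers' rows and only the fields they agreed to share):

    # Sketch: keep only requests from opted-in volunteers, dropping
    # fields (e.g. IP, user agent) they didn't agree to release.
    # Field names and positions are assumptions.
    import csv, sys

    VOLUNTEERS = {"user123", "user456"}     # hypothetical opt-in IDs
    KEEP = ["timestamp", "uri", "referer"]  # fields volunteers agreed to share

    def filter_logs(reader, writer):
        for row in reader:
            if row.get("user_id") in VOLUNTEERS:
                writer.writerow({k: row.get(k, "") for k in KEEP})

    if __name__ == "__main__":
        reader = csv.DictReader(sys.stdin, delimiter="\t")
        writer = csv.DictWriter(sys.stdout, fieldnames=KEEP, delimiter="\t")
        writer.writeheader()
        filter_logs(reader, writer)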
-Jeremy