While doing CR for https://gerrit.wikimedia.org/r/#/c/232896/3/modules/ext.wikimediaEvents.search.js I came to have serious doubts about this approach.

In brief, it attempts to track user satisfaction with search results by measuring how long people stay on pages. It does that by appending fromsearch=1 to result links for 0.5% of users. However, that parameter causes these page views to bypass the cache, increasing HTML load time by a factor of 4-5 and, consequently, pushing even short pages' first paint outside the comfort zone of 1 second - and that's measured from the office, with a ping of 2-3 ms to ulsfo. My concern is that we end up distorting the very metric we're trying to measure, making the experiment inaccurate.
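
For context, the mechanism works roughly like the sketch below. This is an illustrative simplification, not the actual patch code; the link selector is my assumption, only the 0.5% sampling and the fromsearch=1 parameter come from the change itself:

    // Sketch: tag outgoing search-result links with fromsearch=1
    // for roughly 0.5% of users (1 in 200).
    ( function () {
        if ( Math.random() >= 0.005 ) {
            return;
        }

        // Hypothetical selector for links on the search results page.
        var links = document.querySelectorAll( '.mw-search-result-heading a' );
        Array.prototype.forEach.call( links, function ( link ) {
            var url = new URL( link.href, location.href );
            // Adding this query parameter is what makes the subsequent
            // page view miss the cache.
            url.searchParams.set( 'fromsearch', '1' );
            link.href = url.toString();
        } );
    }() );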

Can we come up with a less intrusive way of measuring this, or alter the requirements of the experiment?

--
Best regards,
Max Semenik ([[User:MaxSem]])