See also: Chen, M. C., Anderson, J. R., & Sohn, M. H. (2001). What can a mouse cursor tell us more?: Correlation of eye/mouse movements on web browsing. In CHI '01 Extended Abstracts on Human Factors in Computing Systems (pp. 281-282). ACM. (See attached)

Kim, N. W., Bylinskii, Z., Borkin, M. A., Gajos, K. Z., Oliva, A., Durand, F., & Pfister, H. (2017). BubbleView: An interface for crowdsourcing image importance maps and tracking visual attention. ACM Transactions on Computer-Human Interaction (TOCHI), 24(5), Article 36. https://arxiv.org/pdf/1702.05150

I don't know how readily available mouse-based attention-tracking solutions are, but the literature suggests there are good options for measuring attention through purely software means.
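
To make that concrete, here's a rough sketch of what a purely software-based approach could look like in the browser: sample the cursor position at a fixed interval and accumulate dwell counts in a coarse grid, which could later be aggregated into an attention heatmap. This is just an illustration in TypeScript; the grid size, sampling rate, and all names are my own assumptions, not anything from the papers above.

// Minimal sketch (assumes a browser/DOM context). All names and
// parameters are illustrative, not from the cited papers.

const GRID_SIZE = 50;   // grid cell size in pixels (assumption)
const SAMPLE_MS = 100;  // sampling interval, ~10 Hz (assumption)
const attentionGrid = new Map<string, number>(); // "col,row" -> dwell samples

let lastX = 0;
let lastY = 0;

// Keep track of the most recent cursor position.
document.addEventListener("mousemove", (e: MouseEvent) => {
  lastX = e.pageX;
  lastY = e.pageY;
});

// Each tick credits the grid cell under the cursor, so counts
// accumulate wherever the cursor lingers (a rough attention proxy).
setInterval(() => {
  const col = Math.floor(lastX / GRID_SIZE);
  const row = Math.floor(lastY / GRID_SIZE);
  const key = `${col},${row}`;
  attentionGrid.set(key, (attentionGrid.get(key) ?? 0) + 1);
}, SAMPLE_MS);

The per-cell counts could then be shipped to a server and aggregated across sessions, leaning on the Chen et al. finding that cursor position correlates with gaze on web pages.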

-Aaron

On Fri, Nov 9, 2018 at 11:24 AM Tilman Bayer <tbayer@wikimedia.org> wrote:
Nice! Added a link at https://meta.wikimedia.org/wiki/Research:Which_parts_of_an_article_do_readers_read#Eyetracking

On Fri, Nov 9, 2018 at 3:38 AM, Gilles Dubuc <gilles@wikimedia.org> wrote:
I just discovered, and received today, an amazing piece of hardware that a lot of you might find useful: the Tobii Eye Tracker 4C, which can be used with the Tobii Pro Sprint hosted service.


This could let us run lab user testing with gaze recording cheaply and very easily.

--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB
_______________________________________________
Research-Internal mailing list
Research-Internal@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/research-internal