See also: Chen, M. C., Anderson, J. R., & Sohn, M. H. (2001, March). What
can a mouse cursor tell us more?: correlation of eye/mouse movements on web
browsing. In CHI'01 extended abstracts on Human factors in computing
systems (pp. 281-282). ACM. (I can provide the PDF on request -- too big
to attach)
Kim, N. W., Bylinskii, Z., Borkin, M. A., Gajos, K. Z., Oliva, A., Durand,
F., & Pfister, H. (2017). BubbleView: an interface for crowdsourcing image
importance maps and tracking visual attention. ACM Transactions on
Computer-Human Interaction (TOCHI), 24(5), 36.
https://arxiv.org/pdf/1702.05150
I don't know how readily available mouse-based attention tracking solutions
are. But from the literature, it seems like there are good options for
understanding attention through purely software means.
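
For illustration only, here is a minimal sketch of the core idea behind such software-only approaches: bin logged cursor positions into a coarse grid so that heavily visited cells approximate where attention lingered (per Chen et al.'s observed eye/mouse correlation). The log format — a list of (x, y) cursor samples — is a hypothetical assumption, not any particular tool's API.

```python
from collections import Counter

def attention_grid(samples, cell=50):
    """Bin (x, y) cursor samples into a coarse grid.

    Cells with many samples mark regions where the cursor --
    and, per Chen et al. (2001), often the gaze -- lingered.
    The 50px cell size is an arbitrary illustrative choice.
    """
    grid = Counter()
    for x, y in samples:
        grid[(x // cell, y // cell)] += 1
    return grid

# Hypothetical log: three samples near the page's top-left, one lower down.
samples = [(12, 8), (30, 40), (35, 45), (400, 300)]
print(attention_grid(samples))  # most-visited cell is (0, 0)
```

A real deployment would log `mousemove` events client-side and weight samples by dwell time, but the aggregation step stays this simple.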
-Aaron
On Fri, Nov 9, 2018 at 11:24 AM Tilman Bayer
<tbayer(a)wikimedia.org> wrote:
Nice! Added a link at
https://meta.wikimedia.org/wiki/Research:Which_parts_of_an_article_do_reade…
On Fri, Nov 9, 2018 at 3:38 AM, Gilles Dubuc <gilles(a)wikimedia.org>
wrote:
Today I discovered and received an amazing piece of hardware
that a lot of you might find useful. It's called the Tobii Eye Tracker
4C <https://tobiigaming.com/product/tobii-eye-tracker-4c/>, which can
be used with the Tobii Pro Sprint <https://www.tobiipro.com/sprint/>
hosted service.
I've recorded a video demo of it here:
https://www.mediawiki.org/wiki/File:Tobii_Eye_Tracker_4C_demo.webm
This could allow us to do lab user testing where we record gaze cheaply
and very easily.
_______________________________________________
Research-Internal mailing list
Research-Internal(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/research-internal
--
Tilman Bayer
Senior Analyst
Wikimedia Foundation
IRC (Freenode): HaeB