On Fri, Nov 9, 2018 at 3:38 AM, Gilles Dubuc gilles@wikimedia.org wrote:

Today I discovered and received an amazing piece of hardware that many of you might find useful: the Tobii Eye Tracker 4C (https://tobiigaming.com/product/tobii-eye-tracker-4c/), which can be used with the Tobii Pro Sprint (https://www.tobiipro.com/sprint/) hosted service.
I've recorded a video demo of it here: https://www.mediawiki.org/wiki/File:Tobii_Eye_Tracker_4C_demo.webm
This could allow us to do lab user testing where we record gaze cheaply and very easily.
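For a sense of what the gaze stream looks like programmatically, here is a minimal sketch using the Python bindings of Tobii's Pro SDK (the tobii_research package). One caveat: the 4C is a consumer device normally driven through Tobii's own software rather than the Pro SDK, so treat this as an illustration of the general shape of gaze capture on a supported tracker, not a recipe specific to the 4C.

```python
import time

import tobii_research as tr  # Tobii Pro SDK Python bindings

def on_gaze(gaze_data):
    # Gaze points come as normalized display coordinates:
    # (0, 0) is the top-left of the screen, (1, 1) the bottom-right.
    left = gaze_data["left_gaze_point_on_display_area"]
    right = gaze_data["right_gaze_point_on_display_area"]
    print(gaze_data["system_time_stamp"], left, right)

trackers = tr.find_all_eyetrackers()
if not trackers:
    raise RuntimeError("no supported eye tracker found")

tracker = trackers[0]
tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze, as_dictionary=True)
time.sleep(10)  # collect ten seconds of gaze samples
tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, on_gaze)
```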
On Fri, Nov 9, 2018 at 11:24 AM Tilman Bayer tbayer@wikimedia.org wrote:

Nice! Added a link at https://meta.wikimedia.org/wiki/Research:Which_parts_of_an_article_do_reader...
On Fri, Nov 9, 2018 at 5:54 PM Aaron Halfaker ahalfaker@wikimedia.org wrote:

See also:

Chen, M. C., Anderson, J. R., & Sohn, M. H. (2001, March). What can a mouse cursor tell us more? Correlation of eye/mouse movements on web browsing. In *CHI '01 Extended Abstracts on Human Factors in Computing Systems* (pp. 281-282). ACM. (See attached)

Kim, N. W., Bylinskii, Z., Borkin, M. A., Gajos, K. Z., Oliva, A., Durand, F., & Pfister, H. (2017). BubbleView: An interface for crowdsourcing image importance maps and tracking visual attention. *ACM Transactions on Computer-Human Interaction (TOCHI)*, 24(5), 36. https://arxiv.org/pdf/1702.05150
I don't know how readily available mouse-based attention tracking solutions are, but from the literature it seems there are good options for understanding attention through purely software means.
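As a rough illustration of what a purely software-based approach could look like, here is a minimal sketch that samples cursor position over time using the pynput library (an assumed choice; any OS-level input hook, or a mousemove listener in the browser, would work the same way):

```python
import csv
import time

from pynput import mouse  # assumed choice of input-hook library

samples = []

def on_move(x, y):
    # Log a (timestamp, x, y) triple for every cursor movement.
    samples.append((time.time(), x, y))

# The listener runs in a background thread; sample for thirty seconds.
with mouse.Listener(on_move=on_move) as listener:
    time.sleep(30)
    listener.stop()

with open("cursor_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "x", "y"])
    writer.writerows(samples)
```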
-Aaron
Indeed, on desktop the mouse cursor can be a decent proxy for gaze. I can't remember if it was the particular paper you quoted where I read about it, but I seem to recall that the correlation varied quite a bit from one person to another: some people point the mouse at what they're looking at, while others don't.
This Microsoft paper about cursor position is also relevant: https://phabricator.wikimedia.org/T165272#3955001. It shows the limitations of predicting attention from cursor position when the person doesn't know what they're looking for when they land on the page. It works really well for the use case of people searching for "Facebook" on Bing in order to go to Facebook, which probably involves some muscle memory: people who do that already know where they're going to move their mouse.
This is the paper that got me interested in gaze tracking for performance: https://phabricator.wikimedia.org/T165272#3933730. Its methodology has limitations (participants watched videos and played a guessing game rather than browsing naturally), but the results are impressive. Enough to make me want to investigate what people look at on a Wikipedia page as it loads, to inform our decisions about how we prioritise page elements.
The upside of doing a study with a device like the Tobii 4C is that we could also verify, at the same time, the correlation between gaze and cursor position.
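Concretely, that verification could be as simple as computing a per-participant correlation between time-aligned gaze and cursor samples, since the literature suggests the relationship varies between people. A minimal sketch, assuming hypothetical file and column names and that both signals have been resampled onto a common timeline:

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical input: one row per sample, with gaze and cursor screen
# coordinates already resampled onto a common timeline.
df = pd.read_csv("participant_01.csv")

# Correlate the two signals separately on each screen axis.
r_x, p_x = pearsonr(df["gaze_x"], df["cursor_x"])
r_y, p_y = pearsonr(df["gaze_y"], df["cursor_y"])
print(f"x: r={r_x:.2f} (p={p_x:.3g}), y: r={r_y:.2f} (p={p_y:.3g})")
```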
I wonder how many Wikipedia readers still use a mouse.
--
Sherry Snyder (WhatamIdoing) Community Liaison, Wikimedia Foundation