Abstract

For information visualization researchers, eye tracking has been a useful tool for investigating research participants’ underlying cognitive processes by tracking their eye movements while they interact with visualization techniques. We used an eye tracker to better understand why participants using a variant of a tabular visualization called ‘SimulSort’ outperformed those using a conventional table with a typical one-column sorting feature (i.e., Typical Sorting). The collected eye-tracking data certainly shed light on the detailed cognitive processes of the participants: SimulSort helped with decision-making tasks by promoting efficient browsing behavior and compensatory decision-making strategies. More interestingly, however, we also found unexpected eye-tracking patterns with SimulSort. We investigated the cause of these unexpected patterns through a crowdsourcing-based study (i.e., Experiment 2), which revealed an important limitation of the eye-tracking method: its inability to capture information acquired through peripheral vision. This result serves as a caveat for other visualization researchers who plan to use an eye tracker in their studies. In addition, the method of using a testing stimulus (i.e., an influential column) in Experiment 2 to verify the existence of such a limitation would be useful for researchers who wish to verify their own eye-tracking results.

Comments

Kim, S.-H., Dong, Z., Xian, H., Upatising, B., & Yi, J. S. (2012). Does an Eye Tracker Tell the Truth about Visualizations?: Findings while Investigating Visualizations for Decision Making. IEEE Transactions on Visualization and Computer Graphics, 18(12), 2421-2430.

Keywords

Visualized decision making, eye tracking, crowdsourcing, quantitative empirical study, limitations, peripheral vision

Date of this Version

8-1-2012

DOI

10.1109/TVCG.2012.215

