Rethinking Pedagogical Use of Eye Trackers for Visual Problems with Eye Gaze Interpretation Tasks
Abstract
Eye tracking technology enables the visualisation of a problem solver's eye movements while they work on a problem. The eye movements of experts have been used to draw learners' attention to expert problem solving processes in order to teach procedural skills; such affordances appear in the literature as eye movement modelling examples (EMME). This paper extends that line of work by suggesting how eye gaze data can not only guide attention but also scaffold learning through constructive engagement with another person's problem solving process. Inferring a model's problem solving process, whether that of an expert or a novice, from a display of their eye gaze requires a learner to make interpretations rooted in the knowledge elements relevant to that kind of problem solving. If designed properly, such tasks are expected to probe or foster a deeper understanding of a topic, since solving them requires not only following an expert's gaze to learn a particular skill but also interpreting the solution process as it is evident in the gaze pattern of an expert, or even of a novice. This position paper presents a case for such tasks, which we call eye gaze interpretation (EGI) tasks. We begin with the theoretical background of these tasks, followed by a conceptual example and representation that elucidate the concept of EGI tasks. Thereafter, we discuss design considerations and pedagogical affordances using a domain-specific (chemistry) spectral graph problem. Finally, we explore the possibilities and constraints of EGI tasks in various fields that rely on visual representations for problem solving.
Article Details
FLR adopts the Attribution-NonCommercial-NoDerivs Creative Commons License (CC BY-NC-ND). That is, copyright for articles published in this journal is retained by the authors, with first publication rights granted to the journal. By virtue of their appearance in this open access journal, articles are free to use, with proper attribution, in educational and other non-commercial settings.