{"id":52,"date":"2022-07-28T09:20:52","date_gmt":"2022-07-28T09:20:52","guid":{"rendered":"https:\/\/metric.qcri.org\/blog\/?p=52"},"modified":"2022-07-28T09:20:53","modified_gmt":"2022-07-28T09:20:53","slug":"confusion-prediction-from-eye-tracking-data-experiments-with-machine-learning","status":"publish","type":"post","link":"https:\/\/metric.qcri.org\/blog\/2022\/07\/28\/confusion-prediction-from-eye-tracking-data-experiments-with-machine-learning\/","title":{"rendered":"Confusion Prediction from Eye-Tracking Data: Experiments with Machine Learning"},"content":{"rendered":"\n<p>Predicting user confusion can help improve information presentation on websites, mobile apps, and virtual reality interfaces.<\/p>\n\n\n\n<p>One promising information source for such prediction is eye-tracking data about gaze movements on the screen. Coupled with think-aloud records, we explore whether users\u2019 confusion correlates primarily with fixation-level features.<\/p>\n\n\n\n<p>We find that random forest achieves an accuracy of more than 70% when predicting user confusion using only fixation features. In addition, adding user-level features (age and gender) improves the accuracy to more than 90%.<\/p>\n\n\n\n<p>We also find that balancing the classes before training improves performance.<\/p>\n\n\n\n<p>We test two balancing algorithms, the Synthetic Minority Over-sampling Technique (SMOTE) and Adaptive Synthetic Sampling (ADASYN), finding that SMOTE yields the larger performance gain.<\/p>\n\n\n\n<p>Overall, this research has implications for researchers interested in inferring users\u2019 cognitive states from eye-tracking data.<\/p>\n\n\n\n<p>Salminen, J., Nagpal, M., Kwak, H., An, J., Jung, S.G., and\u00a0Jansen, B. J.\u00a0(2019)\u00a0<a rel=\"noreferrer noopener\" href=\"http:\/\/www.bernardjjansen.com\/uploads\/2\/4\/1\/8\/24188166\/a5-salminen.pdf\" target=\"_blank\">Confusion Prediction from Eye-Tracking Data: Experiments with Machine Learning<\/a>. 
9th International Conference on Information Systems and Technologies (ICIST 2019), Cairo, Egypt. 24\u201326 March.\u00a0Article No. 5.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Predicting user confusion can help improve information presentation on websites, mobile apps, and virtual reality interfaces. One promising information source for such prediction is eye-tracking data about gaze movements on the screen. Coupled with think-aloud records, we explore whether users\u2019 confusion correlates primarily with fixation-level features. We find that random forest achieves an accuracy &#8230; <a title=\"Confusion Prediction from Eye-Tracking Data: Experiments with Machine Learning\" class=\"read-more\" href=\"https:\/\/metric.qcri.org\/blog\/2022\/07\/28\/confusion-prediction-from-eye-tracking-data-experiments-with-machine-learning\/\" aria-label=\"More on Confusion Prediction from Eye-Tracking Data: Experiments with Machine Learning\">Read more<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[5],"tags":[],"class_list":["post-52","post","type-post","status-publish","format-standard","hentry","category-user-study"],"jetpack_featured_media_url":"","jetpack-related-posts":[{"id":62,"url":"https:\/\/metric.qcri.org\/blog\/2022\/07\/28\/fixation-and-confusion-investigating-eye-tracking-participants-exposure-to-information-in-personas\/","url_meta":{"origin":52,"position":0},"title":"Fixation and Confusion \u2013 Investigating Eye-tracking Participants\u2019 Exposure to Information in Personas","date":"July 28, 2022","format":false,"excerpt":"To more effectively convey relevant information to end users of persona profiles, we conducted a user study consisting of 29 participants engaging with three persona layout treatments. 
We were interested in confusion engendered by the treatments on the participants, and conducted a within-subjects study in the actual work environment, using\u2026","rel":"","context":"In &quot;user study&quot;","img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":287,"url":"https:\/\/metric.qcri.org\/blog\/2024\/07\/05\/eye-tracking-beyond-conventional-uses\/","url_meta":{"origin":52,"position":1},"title":"Eye Tracking Beyond Conventional Uses","date":"July 5, 2024","format":false,"excerpt":"Eye tracking is a versatile technology with applications across various fields for its ability to provide detailed insights into visual attention and behavior. With more and more organizations gaining interest in this technology\u2019s abilities, there has been a rise in numerous offline and online eye-tracking tools such as METRIC. User\u2026","rel":"","context":"In &quot;analytics&quot;","img":{"alt_text":"eye-tracking technology for early screening and diagnosis of ASD","src":"https:\/\/i0.wp.com\/metric.qcri.org\/blog\/wp-content\/uploads\/2024\/07\/openart-image_notBvuei_1720099256913_raw.png?resize=350%2C200&ssl=1","width":350,"height":200},"classes":[]},{"id":123,"url":"https:\/\/metric.qcri.org\/blog\/2023\/07\/23\/using-eye-tracking-in-user-studies-unveiling-visual-attention-patterns\/","url_meta":{"origin":52,"position":2},"title":"Using Eye-Tracking in User Studies: Unveiling Visual Attention Patterns","date":"July 23, 2023","format":false,"excerpt":"Eye tracking is critical in user studies because it provides valuable insights into users\u2019 visual attention and behavior. It allows researchers to see where the users focus their attention while interacting with their digital interfaces, products, or content. 
The following are reasons why eye tracking is important in user studies:\u2026","rel":"","context":"Similar post","img":{"alt_text":"","src":"https:\/\/i0.wp.com\/metric.qcri.org\/blog\/wp-content\/uploads\/2023\/07\/image.png?resize=350%2C200&ssl=1","width":350,"height":200},"classes":[]},{"id":28,"url":"https:\/\/metric.qcri.org\/blog\/2022\/07\/28\/confusion-and-information-triggered-by-photos-in-persona-profiles\/","url_meta":{"origin":52,"position":3},"title":"Confusion and Information Triggered by Photos in Persona Profiles","date":"July 28, 2022","format":false,"excerpt":"We investigate whether additional photos beyond a single headshot makes a persona profile more informative without confusing the end user. We conduct an eye-tracking experiment and qualitative interviews with digital content creators after varying the persona in photos via a single headshot, a headshot and photo of the persona in\u2026","rel":"","context":"In &quot;user study&quot;","img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":36,"url":"https:\/\/metric.qcri.org\/blog\/2022\/07\/28\/comparing-persona-analytics-and-social-media-analytics-for-a-user-centric-task-using-eye-tracking-and-think-aloud\/","url_meta":{"origin":52,"position":4},"title":"Comparing Persona Analytics and Social Media Analytics for a User-Centric Task Using Eye-Tracking and Think-Aloud","date":"July 28, 2022","format":false,"excerpt":"We compare a data-driven persona system and an analytics system for efficiency and effectiveness for a user identification task. Findings from the 34-participant experiment show that the data-driven persona system affords faster task completion, is easier for users to engage with, and provides better user identification accuracy. 
Eye-tracking data indicates\u2026","rel":"","context":"In &quot;user study&quot;","img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":188,"url":"https:\/\/metric.qcri.org\/blog\/2024\/06\/11\/spyware-deconstructing-3-myths-about-eye-tracking\/","url_meta":{"origin":52,"position":5},"title":"Spyware? Deconstructing 3 Myths About Eye Tracking","date":"June 11, 2024","format":false,"excerpt":"With the rise of the use of eye tracking in fields such as medical and psychological research, marketing research, and human-computer interaction studies, skepticism and conspiracy theories about the study method are unavoidable. So let's deconstruct some of these myths and theories. Heatmap from Survey2Persona Landing Page [Source: Metric] 1.\u2026","rel":"","context":"In &quot;eye tracking&quot;","img":{"alt_text":"Spyware? Deconstructing 3 Myths About Eyetracking","src":"https:\/\/i0.wp.com\/metric.qcri.org\/blog\/wp-content\/uploads\/2024\/06\/Screen-Shot-2024-06-11-at-10.51.01-AM.jpg?resize=350%2C200&ssl=1","width":350,"height":200},"classes":[]}],"_links":{"self":[{"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/posts\/52","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/comments?post=52"}],"version-history":[{"count":1,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/posts\/52\/revisions"}],"predecessor-version":[{"id":53,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/posts\/52\/revisions\/53"}],"wp:attachment":[{"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/media?parent=52"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v
2\/categories?post=52"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/metric.qcri.org\/blog\/wp-json\/wp\/v2\/tags?post=52"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}