Source: Duke University
Published: March 7, 2022
Source: National Science Foundation
Published: March 30, 2022
The minutiae of eye movement and pupillary response reveal a surprising amount about our inner state: whether we are bored or excited, whether our attention is focused, and whether we are an expert or a novice at a task. Our eyes even reveal whether we are fluent in a language, or whether we are reading a comic book or advanced literature.

Computer engineers exploring this poetic window to the soul have designed "virtual eyes" software that simulates human eye movement to study how people see the world and process visual information, with an eye toward metaverse applications. Called EyeSyn, the generative model was trained on publicly available data, such as videos of speakers addressing the media during press conferences or images of art, and its output was compared to eye-movement data from actual viewers. Training on public data sidesteps the privacy concerns that usually come with collecting gaze data from users.

Even when trained on small datasets, which notably alleviates the problem of sparse gaze data from human subjects, the program simulated distinct patterns of gaze signals and eye responses to stimuli with 90% accuracy.

"The synthetic data alone isn't perfect, but it's a good starting point," the lead researcher notes. "Smaller companies can use it rather than spending the time and money on trying to build their own real-world datasets (with human subjects). And because the personalization of the algorithms can be done on local systems, people don't have to worry about their private eye movement data becoming part of a large database."
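The article doesn't describe EyeSyn's internals, but as a rough intuition for what "simulating gaze signals" means, here is a minimal, hypothetical Python sketch of a toy gaze generator. Real gaze traces alternate between fixations (the eye holding roughly still with tiny jitter) and saccades (fast ballistic jumps to a new target); the function name, parameters, and timing ranges below are illustrative assumptions, not EyeSyn's actual model.

```python
import numpy as np

def simulate_gaze(duration_s=5.0, rate_hz=100, screen=(1920, 1080), seed=0):
    """Toy gaze simulator: alternating fixations (small jitter) and
    ballistic saccades between random targets. Purely illustrative;
    EyeSyn's actual generative model is far more sophisticated."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * rate_hz)
    xy = np.empty((n, 2))
    pos = rng.uniform([0, 0], screen)  # current gaze point (pixels)
    i = 0
    while i < n:
        # Fixation: ~200-400 ms of low-amplitude jitter around `pos`.
        fix_len = min(n - i, int(rng.uniform(0.2, 0.4) * rate_hz))
        xy[i:i + fix_len] = pos + rng.normal(0, 3, size=(fix_len, 2))
        i += fix_len
        if i >= n:
            break
        # Saccade: ~30-50 ms near-linear jump to a new random target.
        target = rng.uniform([0, 0], screen)
        sac_len = min(n - i, int(rng.uniform(0.03, 0.05) * rate_hz))
        t = np.linspace(0, 1, sac_len)[:, None]
        xy[i:i + sac_len] = pos + t * (target - pos)
        i += sac_len
        pos = target
    return xy  # shape (n, 2): gaze coordinates over time

trace = simulate_gaze()
print(trace.shape)  # e.g. (500, 2) at 100 Hz for 5 s
```

A synthetic trace like this could then be compared against recorded human gaze on the same stimulus, which is the kind of evaluation the 90% accuracy figure refers to.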
[Figure: Comparison of four tasks between actual gaze (left) and simulated gaze (right) using EyeSyn]
My rating of this study: ⭐⭐
Lan G