Simulated human eye movement aims to shape Metaverse platforms

Getting data on how people’s eyes move while watching TV or reading books is a tedious process, and app developers can now skip it thanks to Duke Engineering’s “virtual eyes.” Credit: Maria Gorlatova, Duke University

Computer engineers at Duke University have developed virtual eyes that simulate the way humans look at the world with enough precision that companies can train virtual reality and augmented reality software. The program, called EyeSyn for short, will help developers build applications for rapidly expanding metaverses while protecting user data.

The results have been accepted and will be presented at the International Conference on Information Processing in Sensor Networks (IPSN), May 4-6, 2022, a leading annual forum on research in networked sensing and control.

“If you want to detect whether a person is reading a comic book or advanced literature just by looking at their eyes, you can do that,” said Maria Gorlatova, the Nortel Networks Assistant Professor of Electrical and Computer Engineering at Duke University.

“But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time,” Gorlatova added. “We wanted to develop software that not only reduces the privacy concerns that come with collecting this kind of data, but also allows small businesses that don’t have those levels of resources to get into the metaverse.”

The poetic insight describing the eyes as windows to the soul has been repeated since at least biblical times for good reason: the small movements of our eyes and the dilation of our pupils provide an astonishing amount of information. Eye movements can reveal whether we are bored or excited, where our concentration is focused, whether we are expert or novice at a given task, or even whether we are fluent in a specific language.


“Where you prioritize your vision also says a lot about you as a person,” Gorlatova said. “It can inadvertently reveal gender and racial biases, interests we don’t want others to know about, and information we may not even know about ourselves.”

Eye movement data is invaluable to companies building platforms and software in the metaverse. For example, reading a user’s eyes allows developers to tailor content to engagement responses, or to reduce the resolution in their peripheral vision to save computing power.
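As a rough illustration of that second idea, often called foveated rendering, the sketch below shades the scene at full resolution only near the tracked gaze point and progressively coarser toward the periphery. The radii and shading rates are illustrative assumptions, not values from any particular headset or from the EyeSyn work.

```python
# Minimal foveated-rendering sketch (assumed thresholds, not from the paper):
# return how many screen pixels a single shaded sample should cover, based on
# the distance between a pixel and the user's current gaze point.
def shading_rate(px: float, py: float, gaze_x: float, gaze_y: float) -> int:
    dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5  # normalized screen coords
    if dist < 0.10:    # fovea: full resolution
        return 1
    if dist < 0.30:    # near periphery: shade in 2x2 pixel blocks
        return 2
    return 4           # far periphery: shade in 4x4 pixel blocks

# A pixel far from the gaze point gets a coarse shading rate.
print(shading_rate(0.85, 0.80, gaze_x=0.5, gaze_y=0.5))  # -> 4
```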

With such a wide range of complexity, creating virtual eyes that mimic how an average human responds to a variety of stimuli sounds like a daunting task. To climb that mountain, Gorlatova and her team, including former postdoctoral associate Guohao Lan, now an assistant professor at the Delft University of Technology in the Netherlands, and current Ph.D. student Tim Scargill, delved into the cognitive science literature that explores how humans see the world and process visual information.

Instead of collecting actual eye movement data, as shown here, Duke engineers have developed a set of “virtual eyes” that simulate the data well enough to train new metaverse applications. Credit: Maria Gorlatova, Duke University

For example, when a person is watching someone speak, their eyes alternate between the speaker’s eyes, nose, and mouth for varying lengths of time. In developing EyeSyn, the researchers created a model that extracts where those features are on a speaker and programmed their virtual eyes to statistically simulate the time spent focusing on each region.
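A minimal sketch of that idea is below: given assumed positions for a speaker’s eyes, nose, and mouth, it samples a sequence of fixations whose dwell times follow region-specific statistics. The region coordinates, dwell-time distributions, and function names are hypothetical stand-ins, not the actual EyeSyn model.

```python
import random

# Hypothetical facial regions: (normalized (x, y) centre, mean dwell time in seconds).
FACE_REGIONS = {
    "left_eye":  ((0.40, 0.35), 0.30),
    "right_eye": ((0.60, 0.35), 0.30),
    "nose":      ((0.50, 0.50), 0.15),
    "mouth":     ((0.50, 0.65), 0.25),
}

def simulate_face_gaze(duration_s, seed=None):
    """Emit (x, y, dwell) fixations until the requested duration is covered."""
    rng = random.Random(seed)
    t, fixations = 0.0, []
    while t < duration_s:
        region = rng.choice(list(FACE_REGIONS))
        (cx, cy), mean_dwell = FACE_REGIONS[region]
        dwell = rng.expovariate(1.0 / mean_dwell)   # simple dwell-time assumption
        # Small positional jitter so repeated fixations on a region differ slightly.
        fixations.append((cx + rng.gauss(0, 0.02), cy + rng.gauss(0, 0.02), dwell))
        t += dwell
    return fixations

print(simulate_face_gaze(5.0, seed=42)[:3])
```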

“If you give EyeSyn many different inputs and run it enough times, it will generate an artificial eye movement dataset large enough to train a machine-learning classifier for a new program,” Gorlatova said.
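In outline, that workflow might look like the sketch below: run a gaze simulator many times per activity, summarize each synthetic trace as a feature vector, and fit an off-the-shelf classifier on the purely artificial dataset. The two toy “simulators” and the feature choices are placeholders standing in for EyeSyn-style generators, not the published method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def simulate_reading(n=80):
    # Toy assumption: reading produces mostly horizontal sweeps with short fixations.
    x = np.cumsum(rng.uniform(0.01, 0.05, n)) % 1.0
    y = np.repeat(np.linspace(0.2, 0.8, 8), n // 8)[:n] + rng.normal(0, 0.01, n)
    dwell = rng.exponential(0.2, n)
    return np.column_stack([x, y, dwell])

def simulate_video(n=80):
    # Toy assumption: watching a speaker clusters fixations around a few face regions.
    centers = np.array([[0.4, 0.35], [0.6, 0.35], [0.5, 0.65]])
    pts = centers[rng.integers(0, 3, n)] + rng.normal(0, 0.03, (n, 2))
    dwell = rng.exponential(0.35, n)
    return np.column_stack([pts, dwell])

def features(trace):
    # Per-trace summary statistics; a real pipeline would use richer features.
    return np.concatenate([trace.mean(axis=0), trace.std(axis=0)])

X = np.array([features(simulate_reading()) for _ in range(200)] +
             [features(simulate_video()) for _ in range(200)])
y = np.array([0] * 200 + [1] * 200)  # 0 = reading, 1 = watching a video

print(cross_val_score(RandomForestClassifier(), X, y, cv=5).mean())
```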

To test the accuracy of their artificial eyes, the researchers turned to publicly available data. They first had the virtual eyes “watch” videos of Dr. Anthony Fauci speaking to the media during press conferences and compared the output to eye movement data from actual viewers. They also compared a virtual dataset of their artificial eyes looking at art with real datasets collected from people browsing a virtual art museum. The results showed that EyeSyn was able to closely match the distinct patterns of real gaze signals and simulate the different ways different people’s eyes react.


According to Gorlatova, this level of performance is good enough for companies to use it as a baseline for training new metaverse platforms and software. With that baseline level of competency in place, commercial software can then achieve even better results by personalizing its algorithms after interacting with specific users.

“The synthetic data alone isn’t perfect, but it’s a good starting point,” Gorlatova said. “Small companies can use it rather than spending the time and money trying to build their own real-world datasets with human subjects. And because the personalization of the algorithms can be done on local systems, people don’t have to worry about their private eye movement data becoming part of a big database.”




More information:

Study: maria.gorlatova.com/wp-content… 022/03/EyeSyn_CR.pdf

Conference: ipsn.acm.org/2022/

Provided by Duke University


Citation: Simulated human eye movement aims to shape metaverse platforms (March 7, 2022). Retrieved March 7, 2022 from https://techxplore.com/news/2022-03-simulated-human-eye-movement-objects.html

This document is subject to copyright. Except for fair use for private study or research purposes, no part may be reproduced without written permission. The content is provided for information only.

