Simulated human eye movement aims to train metaverse platforms

Computer engineers at Duke University, funded by the U.S. National Science Foundation, have developed virtual eyes that simulate how humans look at the world. The virtual eyes are accurate enough for companies to use them to train virtual reality and augmented reality applications.

“The aims of the project are to provide improved mobile augmented reality by using the Internet of Things to source additional information, and to make mobile augmented reality more reliable and accessible for real-world applications,” said Prabhakaran Balakrishnan, a program director in NSF’s Division of Information and Intelligent Systems.

The program, EyeSyn, will help developers create applications for the rapidly expanding metaverse while protecting user data. The study results will be presented at the upcoming International Conference on Information Processing in Sensor Networks.
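The article does not describe EyeSyn's internals, but as a rough illustration of the kind of synthetic eye-movement data such a system could produce, human gaze is often modeled as alternating fixations (dwelling on a point) and saccades (rapid jumps to a new point). The sketch below is a hypothetical toy generator; the function name, parameter ranges, and screen dimensions are assumptions, not details from the study:

```python
import random

def synthetic_gaze(n_fixations=5, screen=(1920, 1080), seed=0):
    """Toy generator of (x, y, duration_ms) fixation samples.

    Hypothetical illustration only -- not EyeSyn's actual model.
    Each sample is where a saccade lands and how long the eye
    dwells there before jumping again.
    """
    rng = random.Random(seed)  # seeded for reproducible output
    samples = []
    for _ in range(n_fixations):
        x = rng.uniform(0, screen[0])     # saccade landing point
        y = rng.uniform(0, screen[1])
        dwell = rng.uniform(150, 400)     # typical fixation: 150-400 ms
        samples.append((x, y, dwell))
    return samples

gaze = synthetic_gaze()
```

Synthetic data like this is one way to train gaze-driven models without collecting real users' eye-tracking recordings, which is the privacy benefit the researchers highlight.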
