XR developer focus — Tobii Ocumen, VR games, and interaction design

  • by Maggie Ma
  • 6 min

Tobii XR developer focus - Ocumen

Welcome to this edition of XR developer focus! Time has flown since our last update, but we are never short of excitement in seeing how eye tracking is applied to new use areas and inspires innovation in new industries. Make sure you subscribe to our mailing list to be the first to receive relevant news, insights, and more on developing with Tobii eye tracking.

The highlights:

  • Tobii Ocumen success stories and AWE talks
  • VR games featuring eye tracking
  • Design for gaze-based interactions

Tobii Ocumen empowers innovations

Since its launch two years ago, Tobii Ocumen has empowered many innovations that disrupt conventional approaches and improve efficiency in various industries. As an advanced tool for observing user behavior in VR, it provides access to new data and helps teams gain granular insights into human behavior. Developers use the framework it provides to record, organize, and analyze biometrics, and to rapidly build scientific-grade VR products in domains such as health assessment, therapeutics, and training.

Gaize is a software company working to solve the problem of drug-impaired drivers and workers. Ken Fichtler, CEO of Gaize, shared the stage with us at AWE USA 2023 to explore how the convergence of VR and eye tracking can help us understand human conditions and develop groundbreaking solutions.

AWE talks 2023

AWE USA 2023 - Collaboration, Training and Education: Unleashing the Power of VR

More success stories of Tobii Ocumen in healthcare, training, and other industries were featured in another AWE talk by Tobii’s Amanda Bentley and Robert Malmström.

AWE talks 2023

AWE USA 2023 - Healthcare & Wellness: Disrupting Healthcare, Training, and More

If Tobii Ocumen seems helpful for your use case, check out our product page and developer site for more information. You can also take advantage of our free trial to find out how Tobii Ocumen can empower your solution.

VR games take advantage of eye tracking

This year, we have seen a growing trend of VR games using eye tracking to optimize the computing power of headsets and to create more natural interactions.

Tobii eye tracking has enabled dynamic foveated rendering on many XR devices. Similar to how our eyes and brain work in the physical world, the device can concentrate its computing power on where our eyes reveal our attention. The savings can then be used for other critical tasks, such as maintaining the high framerates essential for demanding content like games. These savings will only grow for future XR devices as displays advance to 8K, 16K, and beyond.

"Foveated rendering on PS VR2 can bring a massive reduction in GPU usage while producing the same perceptual quality – and, combined with eye tracking technology, the performance gains are even better...We’ve also seen gains up to 3.6x faster when foveated rendering is combined with eye tracking."

-- Unity 
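
To get a feel for where these savings come from, here is a back-of-the-envelope sketch in Python that compares the pixel cost of a full-resolution render with a simple two-zone foveated render: full resolution inside a small foveal circle, reduced resolution everywhere else. The zone size and scaling factors are illustrative assumptions, not figures from Unity, Tobii, or any particular headset.

```python
import math

def foveated_pixel_cost(width, height, fovea_deg=20.0, fov_deg=110.0,
                        peripheral_scale=0.5):
    """Rough pixel cost of a two-zone foveated render versus a full render.

    Illustrative assumptions: the foveal zone is a circle spanning `fovea_deg`
    of a `fov_deg` field of view and is shaded at full resolution; the rest of
    the image is shaded at `peripheral_scale` resolution per axis.
    """
    full_pixels = width * height

    # Fraction of the image area covered by the foveal circle.
    fovea_fraction = math.pi * (fovea_deg / (2 * fov_deg)) ** 2

    foveated_pixels = (full_pixels * fovea_fraction
                       + full_pixels * (1 - fovea_fraction) * peripheral_scale ** 2)
    return full_pixels, foveated_pixels

if __name__ == "__main__":
    for label, (w, h) in {"2K-class panel": (2000, 2040),
                          "8K-class panel": (7680, 4320)}.items():
        full, foveated = foveated_pixel_cost(w, h)
        print(f"{label}: shade ~{foveated / 1e6:.1f} MP instead of "
              f"{full / 1e6:.1f} MP (~{full / foveated:.1f}x fewer pixels)")
```

Because the absolute number of pixels saved scales with display resolution, the same foveation settings save far more work on an 8K-class panel than on today's displays.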

By leveraging the power of our eyes and our intuitive understanding of visual cues, XR devices can enable a natural, immersive level of interaction with virtual environments. Even better, they achieve this without requiring users to learn new behavior patterns.

Here are some of the reviews from the media:

"You can select dialog choices with characters by simply looking at an option and hitting a face button instead of swiping around with an analog wheel. It’s so seamless that I almost forgot I was doing it sometimes — and I never want to go back."

-- The Verge

"And in a game like Horizon: Call of the Mountain, you can use that gaze tracking to navigate menus. And very quickly, I found this much preferred over your standard kind of head tracked look pointer...I found it so much easier and so much more intuitive and natural to just use my eyes to move the cursor..."

-- Adam Savage's Tested

"... the fleeting glance was so crucial to making me feel like I was really there, really being given the stink eye by a character, that I couldn't help but lose myself in the moment... It's the kind of immersion I haven't gotten from VR in a while."

-- WIRED
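
The gaze-plus-button pattern these reviews describe is straightforward to express in code: take the current gaze ray from the eye tracker, test which interactable it lands on, give visual feedback, and confirm the selection on a button press. Below is a minimal, engine-agnostic sketch in Python; the GazeRay and Interactable types, the angular hit test, and the confirm flag are hypothetical stand-ins for whatever your eye tracking SDK and engine actually provide.

```python
import math
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class GazeRay:
    """Hypothetical per-frame gaze sample: origin and unit direction in world space."""
    origin: tuple
    direction: tuple
    is_valid: bool  # tracking can drop out briefly (blinks, extreme angles)

@dataclass
class Interactable:
    """A selectable item, hit-tested as a small cone around the line to its center."""
    name: str
    center: tuple
    angular_radius_deg: float = 3.0  # how forgiving the gaze hit test is

    def contains(self, ray: GazeRay) -> bool:
        to_target = [c - o for c, o in zip(self.center, ray.origin)]
        distance = math.sqrt(sum(c * c for c in to_target)) or 1.0
        cos_angle = sum(d * c for d, c in zip(ray.direction, to_target)) / distance
        return cos_angle >= math.cos(math.radians(self.angular_radius_deg))

class GazeSelector:
    """Look at an option, press a button to confirm -- no pointer sweeping needed."""

    def __init__(self, targets: Sequence[Interactable]):
        self.targets = targets
        self.hovered: Optional[Interactable] = None

    def update(self, gaze: GazeRay, confirm_pressed: bool) -> Optional[Interactable]:
        # Keep the previous hover if tracking momentarily drops out,
        # so a blink does not cancel what the user is about to select.
        if gaze.is_valid:
            self.hovered = next((t for t in self.targets if t.contains(gaze)), None)

        # Visual feedback (e.g. highlighting the hovered option) would go here.

        if confirm_pressed and self.hovered is not None:
            return self.hovered  # selection confirmed this frame
        return None

if __name__ == "__main__":
    options = [Interactable("Option A", (-0.3, 0.0, 1.0)),
               Interactable("Option B", (0.3, 0.0, 1.0))]
    selector = GazeSelector(options)
    gaze = GazeRay(origin=(0.0, 0.0, 0.0), direction=(0.287, 0.0, 0.958), is_valid=True)
    picked = selector.update(gaze, confirm_pressed=True)
    print("Selected:", picked.name if picked else "nothing")
```

In a real project, the hit test would typically be a physics raycast and the feedback a subtle highlight; the key design point from the reviews above is that confirmation stays on a button, so a stray glance never triggers an action by itself.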

To help developers better leverage eye tracking, Tobii delivered a talk at GDC 2023 on pushing the limits of immersion in next-gen games using state-of-the-art eye and head tracking. It offers a deep dive into eye and head tracking technology and its use cases in games for PC and XR devices, along with tips and tricks for avoiding design traps so developers can adopt the technology to enhance immersion in their games.

Design for gaze-based interactions

This trend of using eye tracking to optimize user interaction and experience isn’t limited to gaming; with the recent launch of Apple’s Vision Pro, it extends to much broader use areas. Once again, eye tracking has been validated as an essential technology in XR.

If you are interested in catching up with this trend and designing gaze-based interactions, check out our DevZone, where you can find information about the fundamental building blocks used to design for eye tracking in VR, as well as tips and tricks on what to consider when designing gaze-based use cases.

It’s also worth mentioning that Unity has added eye tracking support to XRI 2.3. Tobii has been assisting with the concepts and research and is proud to contribute to this next level of support for developers.

Resources

Lastly, I want to leave you with a few selected materials that may interest you.

Foveated rendering is a rendering technique that is revolutionizing the way we experience virtual reality. By rendering the highest-quality image only in the area where the user is looking, it can significantly reduce the processing power required to run a VR headset, which translates into benefits such as smoother performance, higher-resolution displays, or even longer battery life.
Read our blog article on foveated rendering to learn more.

If you are working on an eye tracking study and need to interpret the data and draw conclusions from your results, you will find my colleague Ieva Miseviciute’s recent learn article, Types of eye movements, very useful.

Dr. Walter Greenleaf is a medical AR/VR expert at the Stanford Virtual Human Interaction Lab. He recently delivered an inspirational talk, Virtual Reality, Machine Learning, Biosensing - Converging to Transform Healthcare. It not only discusses trends in these important technologies, but also points to a future in which their confluence results in affordable, scalable, and more effective healthcare.

Happy developing and until the next update!

Written by

Maggie Ma
Head of Marketing XR, Tobii

As the head of marketing for the XR segment at Tobii, I get to tell amazing stories about our eye tracking sensor technology and how it is put to good use in VR and AR. I am inspired by innovations that enhance our understanding of ourselves, break down physical and financial barriers, help address incurable diseases, and fuel our curiosity to explore new frontiers. It feels great to connect the magic of technology with the needs of its users.
