XR developer focus — avatars, games, and pilot training

  • by Johan Bouvin
  • 4 min


Introducing our new blog series for XR devs

With leaps in hardware and software happening so rapidly, I believe that many devs are keen to understand how eye tracking can help enhance virtual interactions, provide innovative new metaverse experiences, and empower users with ever deeper levels of immersion. And so, I’ve started this new blog series for XR software and game developers.

In this first post, I aim to share some of the projects our XR team has been collaborating on. These make me proud because they are proof points: moments when you see the technology you've been developing make a difference to others. Read on to learn about our eye tracking for avatars, the first XR game to showcase eye tracking as a core feature, and some fascinating event presentations ranging from next-gen game development to pilot training in the enterprise world.

I’ll be talking more about developing for AR, VR, and MR devices over the coming months. To be the first to hear about new developments, subscribe to our XR newsletter on our XR developers page.

Avatars in the metaverse and beyond

From World of Warcraft characters to Second Life alter-egos, the digital avatar has long been a way for online communicators, and gamers especially, to manifest themselves in any way they desire. Avatars are great because they bring freedom of expression to the online world. You can appear however you wish, whether that's a replica of your physical self or a creative new look, from the whimsical to designs that express how you feel on the inside.

As we enter the realms of extended and virtual reality, the importance of avatars will reach even greater heights as we come face to face with the virtual avatars of others in next-gen AAA games for VR headsets or in new tools for communication and collaboration among colleagues.

Eye tracking adds an extra layer of realism to avatars and virtual characters because it lets XR emulate the eye contact and eye movements that play a significant role in how people communicate. Bringing these cues into XR extends an essential aspect of humanity online. Allowing users to adopt more true-to-life expressions and appear more authentically present adds to the immersion possible in the metaverse.

Tobii, LIV, and Ready Player Me

In March this year, Tobii announced an exciting partnership with LIV, the leader in XR game streaming, and Ready Player Me, a platform that enables users to create 3D avatars for use in hundreds of different XR apps and games.

Designing avatars is not just about creating beautiful faces, bodies, and peripherals but also about reflecting and communicating real-time attention and emotions of a user with others
Timmu Tõke, Co-Founder and CEO of Ready Player Me (at the time)

And I couldn’t agree more.

The first experiment as part of this collaboration features a game called Racket: Nx, demonstrating how real-time eye movement in a streamer’s avatar provides viewers with insights into their gameplay. Check out the video below.

Experimenting with MetaHuman

MetaHuman Creator is a tool for creating hyper-realistic characters in Unreal Engine. To explore how well eye tracking adds to that realism, Ajinkya Waghulde — one of our senior engineering managers — posted about integrating our eye tracking into MetaHuman avatars, with an impressive video example: Tobii MetaHuman avatar demo.

Take a look at how Ajinkya did it in his A gaze expression experiment with MetaHuman blog post, and why not take a deep dive into our Social Use Cases area on our XR devzone?

MetaHuman characters with eye movement enabled by Tobii eye tracking 
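Ajinkya's write-up covers the details of the integration itself. As a rough illustration of the kind of math that sits underneath any such integration, here is a minimal sketch (generic vector geometry, not Tobii's SDK) that converts a gaze direction vector into the yaw and pitch angles an avatar rig might apply to its eye bones. The coordinate convention (x right, y up, z forward) is an assumption for the example.

```python
import math

def gaze_to_eye_angles(direction):
    """Convert a gaze direction vector (x right, y up, z forward)
    into (yaw, pitch) in degrees — the kind of values an avatar rig
    could apply to its eye bones each frame."""
    x, y, z = direction
    yaw = math.degrees(math.atan2(x, z))                    # left/right
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down
    return yaw, pitch

# Looking straight ahead produces no rotation:
print(gaze_to_eye_angles((0.0, 0.0, 1.0)))  # (0.0, 0.0)
```

In practice an eye tracking SDK delivers such a direction vector per eye and per frame, and the resulting angles are fed to the character's eye joints, often with smoothing and saccade handling on top.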

Eye tracking at recent events

Starblazer — one of the first VR games to prominently feature eye tracking

Last week I attended AWE USA in sunny California with our team, meeting partners, devs, and other XR industry leaders. One of the highlights for me was the presentation I did with our good friends at Starcade Arcade, who talked about Starblazer, a game that will leverage our eye tracking in this summer’s upcoming release, Starblazer: Factions.

Starcade Arcade

Alexander Clark, Starcade Arcade’s founder, talks through the key features that eye tracking brings to Starblazer’s upcoming refresh — including attentive and reactive UI and intuitive interaction with objects. He also shares the lessons they learned during development, which may give you a head start when developing your games and apps.

And don’t miss Alexander’s answers to the great questions posed by the AWE audience during the Q&A at the end of our presentation: Building next-gen XR games with Tobii eye tracking.

Unity’s GDC presentation on building games for PS VR2

This video isn't hot off the press, but the entire presentation is worth the watch. At GDC 2022 in March, Unity gave a fascinating presentation about building next-gen games for PlayStation VR2.

You won’t be surprised to know that what caught my attention most was when Jono Forbes — senior engineering manager in the XR Team at Unity — said,

“I’ve saved what is, to me, the most exciting new design space for last, which is eye tracking of course.”

Always great to see we’re not the only ones excited about eye tracking, and Jono does a great job talking about how Unity sees eye tracking as a key part of the coming generation of VR headsets. I’ve embedded the video below to start at the eye tracking section, 31m22s into the clip, but feel free to drag back to the start to catch the full presentation.

Enterprise, eye tracking, and pilot training

Much of this post has focused on the gaming side of XR, but eye tracking has plenty to offer elsewhere, such as the enterprise sector.

Another highlight from AWE 2022 was the presentation by Rick Parker — CTO of Visionary Training Resources (VTR). Parker talks about how VTR is helping address a worldwide shortage of pilots by disrupting the early stages of pilot training, and how our eye tracking has enabled performance tracking and demonstrated ROI, helping VTR make its business case to major airlines.

Building disruptive pilot training with XR and eye tracking: VTR case of empowering major airlines.

aircraft cockpit simulation

Our XR devzone

As you may know, we have built a comprehensive devzone dedicated to augmented-, mixed-, and virtual-reality eye tracking integrations that features documentation, guides, demos, and more. Check it out: Tobii XR devzone.

We will shortly be launching a survey to gather feedback and suggestions from devs using or planning to use the XR devzone. If you want to be included, please register for our newsletter on the XR developers page.

Written by

  • Johan Bouvin


    Director of product — XR software and ecosystem at Tobii

    Hi, I work with tons of developers from all over, hoping to gain an understanding of how their eye tracking needs are evolving. My aim is to ensure that we continue to develop our SDKs, developer tools, and XR platform to align with the rapidly expanding needs of the industry. I collaborate with amazing people in different lines of work. I find the more varied the individuals, the more interesting the results.


Subscribe to our blog

Subscribe to our stories about how people are using eye tracking and attention computing.