A new technology might, for example, enable users to complete their tasks faster than before. It might bridge gaps between tasks, streamlining workflows. It might allow a user to carry out a previously manual task with a machine, which not only brings speed and efficiency but often improves outcomes, such as unbiased diagnostic tests.
The segment I am responsible for at Tobii Tech delivers a range of solutions that help our customers to add a little innovation to their products. We measure our performance on the ability of our customers to commercialize. We want to make sure that any time invested in adding eye tracking to a new or existing product or solution leads to success or at least a fast fail. Helping our customers to do this is the core purpose of our Professional Services team at Tobii Tech.
No matter what industry or sector you operate in, it’s possible to add eye tracking to your products and systems. The prerequisites are modest: a user, some value in what they are looking at, a machine, and an application. In simple terms, you can do three things with eye tracking. You can build an application that leverages the data it generates, such as eye movement patterns, blink rate, and eye openness. You can use it to facilitate the interaction between a user and an application. Or, you can be highly innovative and combine the two.
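To make the first of those three options concrete, here is a minimal sketch of an application that leverages eye tracking data: deriving a blink rate from a stream of eye-openness samples. The signal, threshold, and sample rate are illustrative assumptions, not Tobii-specific values.

```python
# Illustrative sketch: computing a blink rate from eye-openness samples,
# the kind of signal an eye tracker's data API might expose.
# The 0.2 threshold and the sample data below are assumptions.

def count_blinks(openness, closed_threshold=0.2):
    """Count blinks as open-to-closed transitions in an openness signal.

    openness: sequence of values in [0, 1], where 0 is fully closed.
    """
    blinks = 0
    was_closed = False
    for value in openness:
        is_closed = value < closed_threshold
        if is_closed and not was_closed:
            blinks += 1
        was_closed = is_closed
    return blinks

def blink_rate_per_minute(openness, sample_rate_hz):
    """Convert a blink count over a sampled window into blinks per minute."""
    duration_min = len(openness) / sample_rate_hz / 60.0
    return count_blinks(openness) / duration_min

# Two blinks in a short illustrative window:
signal = [1.0, 0.9, 0.1, 0.05, 0.8, 1.0, 1.0, 0.1, 0.9, 1.0]
print(count_blinks(signal))  # 2
```

A real application would read such samples from the eye tracker's SDK rather than a hard-coded list, but the downstream logic follows the same pattern.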
When our customer RightEye came to us, they had an idea about replacing the manual follow-my-finger test used to assess brain activity with a device that leverages eye tracking data. The value: replace a manual examination, limited by the clinician’s ability to perceive detailed eye movements, with an easy-to-read report generated from the accurate data eye tracking can capture.
Our customer ControlRad had an idea to use eye tracking in the interface of its C-arm machines. The value: reduce exposure to X-rays for patients and staff in the operating theatre. By limiting the region photographed to the surgeon’s area of interest — captured by eye tracking — their solution maintains image quality with fewer resources.
In both cases, the value is clear. But how did they get from this point to commercial products? Typically, we follow our customers on a journey through technical feasibility, prototyping, development and integration, and validation.
The best way to determine whether eye tracking is technically feasible depends on the complexity of the product. It’s a lot simpler to figure out for a standalone device with a dedicated application than for, say, a system with multiple integrations and subsystems that support a range of applications.
If you are considering adding eye tracking into a laptop, a monitor, or a tablet-like device — what we call screen-based or remote eye tracking — you can do quite a lot of feasibility testing on your own.
For example, you can use a standalone peripheral eye tracker, such as the Tobii Eye Tracker 5L, with our integration software in your test environment. The eye tracker mounts easily on any screen and connects to the rest of your system through a USB port. Our SDKs and software help developers get going, providing tools that take the grunt work out of building applications from scratch. Even if you don’t want to code at this stage, our demos can help you test eye tracking in your target device — enabling you to determine if it works as intended. That said, our Professional Services team is on hand to help overcome any difficulties you might have.
For more complex scenarios, we’ve found that a getting-started workshop is the shortest route to progress.
First, we listen. We want to know everything about your product or idea, its intended outcome, and how you envisage eye tracking will work. From our side, we talk about what’s feasible. We want to set your expectations right from the outset. We’ll discuss what the technology can do, potential solutions, and some of the things that perhaps aren’t immediately obvious. While we designed our eye tracking for universal use, to function for the broadest range of users in conditions from minimal ambient light to brightly lit rooms, some constraints apply. The distance between the eye tracker and the user, for example, has a maximum, as does the angle between the tracker and the user’s eyes.
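Those distance and angle limits can be expressed as a simple geometric check. The sketch below is illustrative only: the limit values are placeholders, not Tobii specifications, which come from the specific eye tracker's datasheet.

```python
# Illustrative operating-range check. MAX_DISTANCE_CM and MAX_ANGLE_DEG
# are placeholder values, not Tobii specifications.
import math

MAX_DISTANCE_CM = 85.0   # assumed maximum tracker-to-eye distance
MAX_ANGLE_DEG = 35.0     # assumed maximum off-axis angle

def within_operating_range(x_cm, y_cm, z_cm):
    """Check whether the user's eyes, at (x, y, z) relative to the tracker
    with z pointing away from it, fall inside the supported range."""
    distance = math.sqrt(x_cm**2 + y_cm**2 + z_cm**2)
    if distance > MAX_DISTANCE_CM:
        return False
    off_axis = math.degrees(math.atan2(math.hypot(x_cm, y_cm), z_cm))
    return off_axis <= MAX_ANGLE_DEG

print(within_operating_range(0, 10, 60))  # roughly centered head: True
print(within_operating_range(0, 70, 40))  # far off-axis: False
```

A feasibility study would substitute the real limits for the target hardware and the expected user positions for the target installation.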
We discuss solutions and needs during the workshop to ensure we are aligned on the technical road ahead. In less than a day, we’ll walk you through how eye tracking works, teach you how to integrate it, and get it running in your application.
Once you’ve determined that eye tracking is technically feasible, you may want to embed it into your prototype. And while our peripheral offers an ideal way to test the feasibility of the technology, our platform — eye tracking components and SDKs — allows you to embed the technology into your device or system, in a discreet way.
Several factors dictate how much support customers tend to need at this stage: product complexity, the physical placement of eye tracking components, and the level of application development competency. We can advise and provide application development support, everything from code snippets to building a complete application.
Imagine you are building an automated kiosk, such as a ticketing machine or a self-service airport check-in. From an engineering point of view, adding eye tracking to this kind of user interface is relatively straightforward, at least compared with, say, a device that controls surgical robots.
One of the common challenges customers face is how to mount eye-tracking components. For accuracy, the eye tracker needs to be fixed in relation to the screen when in operation. Space might be a concern, or other physical or environmental conditions may limit the possibilities for an optimum mounting angle.
In the kiosk case, the mounting angle limits the height range of applicable users. If possible, enabling the eye tracker to rotate or move can solve this.
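To see why the mounting angle constrains user height, consider a rough geometric sketch: a tracker mounted at a fixed height and tilt can only cover eyes inside its vertical tracking cone. All numbers below are illustrative assumptions, not kiosk or Tobii specifications.

```python
# Rough geometry of how a fixed mounting angle limits trackable eye heights.
# mount height, tilt, cone half-angle, and user distance are all assumptions.
import math

def eye_height_range(mount_height_cm, tilt_deg, half_fov_deg, user_distance_cm):
    """Return (min, max) eye heights trackable for a user standing
    user_distance_cm away from a tracker mounted at mount_height_cm,
    tilted tilt_deg upward, with a vertical tracking half-angle
    of half_fov_deg."""
    low = mount_height_cm + user_distance_cm * math.tan(
        math.radians(tilt_deg - half_fov_deg))
    high = mount_height_cm + user_distance_cm * math.tan(
        math.radians(tilt_deg + half_fov_deg))
    return low, high

# Tracker at 120 cm, tilted 10 degrees up, +/-20 degree cone, user at 60 cm:
lo, hi = eye_height_range(120, 10, 20, 60)
print(round(lo), round(hi))  # 109 155
```

Letting the tracker rotate or slide, as suggested above, effectively shifts this window up or down to cover taller or shorter users.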
Calibration is another common challenge. To provide accurate data, eye tracking needs to be calibrated for each user. While our software integration package offers a ready-to-use application, many customers want calibration to be a seamless part of the user experience. Depending on available competence, this kind of development activity may require our support.
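Conceptually, per-user calibration can be thought of as fitting a correction that maps raw gaze estimates onto known on-screen targets the user was asked to look at. The sketch below illustrates that idea with a simple affine least-squares fit; production SDKs handle calibration internally and use more sophisticated models, so this is purely illustrative.

```python
# Illustrative sketch of calibration as a least-squares fit: map raw gaze
# estimates to the known target positions a user looked at. Not Tobii's
# actual calibration model.
import numpy as np

def fit_calibration(raw_points, screen_points):
    """Fit x' = a*x + b*y + c, y' = d*x + e*y + f by least squares.

    raw_points, screen_points: arrays of shape (n, 2), n >= 3.
    """
    raw = np.asarray(raw_points, dtype=float)
    screen = np.asarray(screen_points, dtype=float)
    design = np.hstack([raw, np.ones((len(raw), 1))])  # rows of [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(design, screen, rcond=None)
    return coeffs  # shape (3, 2)

def apply_calibration(coeffs, raw_point):
    """Map a raw gaze estimate through the fitted model."""
    x, y = raw_point
    return tuple(np.array([x, y, 1.0]) @ coeffs)

# Calibration targets the user was asked to look at (values illustrative):
raw = [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]
targets = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
coeffs = fit_calibration(raw, targets)
print(apply_calibration(coeffs, (0.5, 0.5)))  # close to (0.5, 0.5)
```

Making this seamless is a user-experience problem as much as a mathematical one: the calibration targets have to appear as a natural part of the application's flow.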
When it comes to interacting with machines, users are familiar with keyboards, selecting objects on a touchscreen, and pointing with a mouse. Eye tracking as an input modality, however, hasn’t yet reached the same level of maturity. The difficulty is understanding touchless interaction, how to use your eyes to interact with a machine — to choose a seat, for example, or check in baggage.
Over the past two decades, Tobii has developed best practices for designing feedback mechanisms that confirm user selections and choices. These help answer interactivity questions such as: at what point does staring at the screen become “yes, I really want to make this choice”? And how can attention data be used to determine whether the user is looking elsewhere?
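One common answer to the first question is dwell-time selection: a gaze held on a target long enough counts as a deliberate choice. The sketch below is a generic illustration of that pattern under assumed thresholds and a hypothetical event stream, not Tobii's interaction design.

```python
# Generic dwell-time selection sketch: a gaze held on the same target for
# dwell_ms counts as a click. The threshold and targets are illustrative.

class DwellSelector:
    def __init__(self, dwell_ms=800):
        self.dwell_ms = dwell_ms
        self.current_target = None
        self.dwell_start_ms = None
        self.fired = False

    def update(self, target, timestamp_ms):
        """Feed the target currently under the user's gaze (or None).

        Returns the target once, when the dwell threshold is reached."""
        if target != self.current_target:
            self.current_target = target
            self.dwell_start_ms = timestamp_ms
            self.fired = False
            return None
        if (target is not None and not self.fired
                and timestamp_ms - self.dwell_start_ms >= self.dwell_ms):
            self.fired = True
            return target
        return None

# Hypothetical gaze samples over an airline seat map:
selector = DwellSelector(dwell_ms=800)
for target, t in [("seat_12A", 0), ("seat_12A", 400),
                  ("seat_12A", 900), ("seat_12A", 1000)]:
    chosen = selector.update(target, t)
    if chosen:
        print("selected", chosen)  # fires once, at t=900
```

In practice the dwell period is paired with visual feedback, such as a progress ring filling around the gazed target, so the user can see the selection coming and look away to cancel it.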
But product development is so much more than asking: does it work? The time between a first successful Hello World and full application development can be weeks, months, or even longer. While all products go through validation, many of our customers operate in the healthcare sector, governed by strict regulatory and technical requirements. In such cases, the validation phase can prolong the process further. These kinds of projects can last for several years before a commercial product is available. We are always available to support formal validation processes, such as running tests and providing test reports related to our eye tracking technology.
In short, getting from idea to commercialization is a journey that we undertake together with our customers. How much support you need from us will depend on your product’s complexity and the level of developer competency available to you. We measure our success by yours, so we are always available to support you. For more information, I’d encourage you to visit our Customer integration service or get in touch with us.