Dynamic distortion compensation — enabled by application-agnostic pupil tracking

Resource Details

  • Written by

    Tobii

  • Read time

    6 min

Summary

All lenses introduce some degree of distortion. A range of solutions, including new lens types, pre-warping algorithms, and static distortion compensation, has been developed for VR headsets to address the issue. But distortion still causes nausea in some users, which hinders adoption.

Current solutions work well when the user's pupils align with each lens's sweet spot. But they cannot compensate for the loss of image sharpness, ghosting, and flare that occur as pupil position changes — whether from headset slippage or simply because the user is moving their eyes around.

In this white paper, Tobii proposes an application-agnostic concept that leverages actual pupil position in real time, replacing static compensation with a dynamic version. To support the idea, we have created a new signal that leverages low-latency tracking of pupil position in 3D to deliver the input needed to redress distortion — in every frame, in real time.
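To make the concept concrete, a minimal sketch of a pupil-dependent pre-warp is shown below. The coefficients, scaling factors, and function name are all hypothetical illustrations, not values or APIs from the white paper; the point is only that the distortion correction becomes a function of the tracked pupil position rather than a fixed map.

```python
import numpy as np

def prewarp_points(points, pupil_offset, k_base=0.22, k_gain=0.35):
    """Illustrative radial pre-warp whose strength follows pupil position.

    points:        (N, 2) normalized image coordinates, lens axis at (0, 0).
    pupil_offset:  (2,) pupil displacement from the lens sweet spot, in mm.
    k_base/k_gain: made-up distortion coefficients for illustration only.
    """
    # Shift the distortion center toward the tracked pupil position
    # (hypothetical mm-to-normalized-coordinate scaling).
    center = 0.01 * np.asarray(pupil_offset, dtype=float)
    shifted = np.asarray(points, dtype=float) - center
    r2 = np.sum(shifted**2, axis=1, keepdims=True)
    # Distortion strength grows as the pupil leaves the sweet spot.
    k = k_base + k_gain * np.linalg.norm(pupil_offset) / 10.0
    # First-order radial term, applied about the shifted center.
    return center + shifted * (1.0 + k * r2)

# Static compensation assumes the pupil sits at the sweet spot...
static = prewarp_points(np.array([[0.5, 0.0]]), pupil_offset=[0.0, 0.0])
# ...while dynamic compensation re-warps each frame from the tracked position.
dynamic = prewarp_points(np.array([[0.5, 0.0]]), pupil_offset=[3.0, 1.0])
```

In a real renderer this per-frame warp would be folded into the existing pre-distortion pass, with the low-latency 3D pupil signal supplying `pupil_offset` each frame.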

For the user, dynamic distortion compensation is likely to minimize nausea, but perhaps more significantly, it sharpens the appearance of images, improving the user experience with existing headsets and unlocking the benefits of high-resolution displays.

Download the white paper
