For some years, the automotive industry has been promoting the use of
driver monitoring systems (DMS) in all vehicles. Most new cars already employ some form of monitoring, and regulators will likely mandate some form of driver distraction tracking in the near future. It's no surprise that the conversation has shifted from "Is your vehicle equipped with a DMS?" to "How good is it?"
A DMS primarily aims to improve safety by alerting the driver in cases of distraction or drowsiness while providing critical information to the advanced driver-assistance system (ADAS). A DMS also supports applications in collaborative driving and in-car assistance. As regulators and NCAP continue to push for the adoption of DMS in all vehicles, driver monitoring is graduating from a nice-to-have feature to a ubiquitous automotive commodity.
Against this backdrop, automotive-OEM expectations of DMS performance are rising, especially regarding user experience. And the Euro NCAP test protocol sets out demanding requirements for DMS functionality, including a maximum latency of 0.16 seconds for detecting a lizard-gaze transition.
But for many DMS providers, several roadblocks challenge the robustness of their solutions.
Because the basic signals — head pose, gaze, and eye openness — underpin all DMS features, a best-in-class DMS should prioritize the robustness and latency of these signals, with diversity and occlusion factored in. Doing so is the most reliable way to meet NCAP requirements and deliver a good user experience.
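To make this concrete, here is a minimal sketch of how those three core signals might feed distraction and drowsiness checks, with the Euro NCAP 0.16-second figure used as a latency budget. All names, thresholds, and the structure of the check are illustrative assumptions for this post, not Tobii's actual API or the protocol's exact test procedure.

```python
from dataclasses import dataclass

# Assumed, illustrative thresholds — not from the Euro NCAP protocol.
GAZE_OFF_ROAD_DEG = 30.0   # gaze yaw beyond this counts as "eyes off road"
EYE_CLOSED_RATIO = 0.2     # eye openness below this suggests closure
MAX_LATENCY_S = 0.16       # Euro NCAP latency budget mentioned above

@dataclass
class Frame:
    gaze_yaw_deg: float    # horizontal gaze angle, 0 = straight ahead
    head_yaw_deg: float    # horizontal head rotation
    eye_openness: float    # 0.0 fully closed .. 1.0 fully open
    latency_s: float       # capture-to-signal processing delay

def assess(frame: Frame) -> list[str]:
    """Return the alerts raised for a single processed frame."""
    alerts = []
    if frame.latency_s > MAX_LATENCY_S:
        alerts.append("latency-budget-exceeded")
    if abs(frame.gaze_yaw_deg) > GAZE_OFF_ROAD_DEG:
        alerts.append("distraction")
    if frame.eye_openness < EYE_CLOSED_RATIO:
        alerts.append("drowsiness")
    return alerts
```

The point of the sketch is the dependency it makes visible: every downstream alert is only as trustworthy as the gaze, head-pose, and eye-openness signals and the latency at which they arrive.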
That's why Tobii has developed new AI-based software for driver monitoring with superior robustness and low-latency gaze and head-pose tracking. It also works regardless of accessories like hats and glasses, and across all ethnicities. Watch the video below and contact us for more information about our advanced DMS, including how you can try it out yourself!
The DMS, how it works, and how it will help create a future without accident-related fatalities — Tobii's Peter Tiberg explains.