Features

Gaze: the driver-monitoring eye-tracking software which could be in your next car

We find out about the latest eye-tracking tech that offers key journey info based on where you look as you drive

We all know that some cars can monitor the driver’s eyes and flash up a warning if they think the driver is becoming drowsy or not paying attention. But what if the vehicle used the tech to get a better idea of what we’re looking at through the windscreen?

That’s the idea at the heart of the next generation of in-car systems from Nuance, a specialist in speech recognition whose tech is currently used by more than 60 firms. Called Gaze, the set-up has moved from static concept stage to real-world tests over the past six months, and Nuance’s automotive boss, Arnd Weil, says he’s in talks with car makers about bringing it to market, perhaps as soon as 2020.


Our own experience of the set-up takes place on roads near Milan, Italy, where a rented Ford Galaxy has been modified to show off the concept. There’s a chunky tablet PC mounted on the dash and, more importantly, a gaming-spec eye-tracking module taped to the top of the instrument panel.

As principal product manager Daniel Kindermann drives along the shore of Lake Maggiore, it’s clear the car is following his eyes, constantly monitoring where he’s looking. We pass a hotel and Kindermann asks: “What’s the rating on that business?” Within seconds, the system replies with not only the name, but also details on the style of hotel and its customer rating.

The same goes for restaurants; glancing sideways while driving along, Kindermann asks the system for details and gets ratings from review sites, plus a brief description. The next step, we’re told, would be the ability to call the business directly, or to use an online service to reserve a room or table.

There are various systems at work here, of course. Our car has multiple microphones, allowing it to pick out individual occupants and work out who’s talking to it. Boxes of computer hardware fill the boot, much of it made by Nvidia, the graphics chip firm behind many infotainment and instrument displays.


Gaze mixes these systems with the eye tracking: it uses the processing power to ‘back-track’ through time to the moment the user glanced at the point of interest, then cross-references that with hi-res mapping and GPS data to work out what they were looking at when they asked the question.
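Nuance hasn’t published how Gaze resolves a glance to a place, but the basic principle it describes – rewind a buffered gaze-and-position trace to the moment of the glance, work out the driver’s line of sight, and match it against mapped points of interest – can be sketched roughly as below. Everything here (the data structures, the lag figure, the angle tolerance, the POI list) is illustrative, not Nuance’s implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Sample:
    t: float          # timestamp, seconds
    lat: float        # vehicle latitude
    lon: float        # vehicle longitude
    heading: float    # vehicle heading, degrees
    gaze_yaw: float   # gaze direction relative to heading, degrees

@dataclass
class POI:
    name: str
    lat: float
    lon: float

def bearing(lat1, lon1, lat2, lon2):
    """Approximate compass bearing from point 1 to point 2, in degrees."""
    d_lon = math.radians(lon2 - lon1)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    y = math.sin(d_lon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(d_lon)
    return math.degrees(math.atan2(y, x)) % 360

def resolve_glance(trace, query_time, lag, pois, tolerance_deg=10):
    """Rewind the buffered trace by `lag` seconds, then return the mapped
    point of interest closest to the driver's line of sight at that moment."""
    glance_t = query_time - lag
    sample = min(trace, key=lambda s: abs(s.t - glance_t))   # nearest buffered sample
    sight = (sample.heading + sample.gaze_yaw) % 360          # absolute gaze bearing
    best, best_err = None, tolerance_deg
    for poi in pois:
        poi_bearing = bearing(sample.lat, sample.lon, poi.lat, poi.lon)
        err = abs((poi_bearing - sight + 180) % 360 - 180)    # smallest angular difference
        if err < best_err:
            best, best_err = poi, err
    return best
```

In practice the lag between glance and question, the richness of the map data and how tightly the angle tolerance is set would decide how often the system picks the wrong target, which is exactly the kind of error described in the next paragraph.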

The system will have the capacity to ‘learn’ what its users are viewing. At one point, Kindermann looks out towards a seafront location and asks the car to tell him about a restaurant, but it assumes he’s looking at an island in the distance. It doesn’t take long for him to correct the error, and a cloud server would flag up the area and help the system to second-guess the user more accurately in future.

There are other potential uses. Gaze could notice what you’re looking at inside the car. If a light flashes up on the dashboard, simply asking “Why has that light come on?” could prompt the system to give you safety info. It could also be used by insurance firms to monitor and grade drivers’ attentiveness, and increase or decrease their premiums as a result. 

Nuance is also working on gesture control, building on the tech found in BMW’s latest 7 and 5 Series models. Nils Lenke, Nuance’s director of corporate research, said: “We call it multimodal assistance. 

“You’re trying to accomplish one task but you could be using three ways of doing it: looking briefly at somewhere as you’re passing it, gesturing your hand towards it and at the same time, asking ‘Tell me more about this’. The system will combine these as inputs to give you a more accurate response.”
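Lenke didn’t describe how that combination works under the bonnet, but one plausible way to fuse the modalities is to treat each one as a weighted vote on a set of candidate targets and pick the candidate they agree on. The sketch below is purely illustrative; the weights, confidences and candidate names are assumptions, not Nuance’s design.

```python
from collections import defaultdict

# Illustrative modality weights: speech mostly triggers the request, while
# gaze and gesture both point at a target, so they carry more weight here.
WEIGHTS = {"gaze": 0.5, "gesture": 0.3, "speech": 0.2}

def fuse(observations):
    """Each observation is (modality, candidate, confidence).
    Returns the candidate with the highest combined weighted score."""
    scores = defaultdict(float)
    for modality, candidate, confidence in observations:
        scores[candidate] += WEIGHTS.get(modality, 0.1) * confidence
    return max(scores, key=scores.get) if scores else None

# Example: gaze and gesture both favour the hotel, speech alone is ambiguous.
print(fuse([
    ("gaze", "lakeside hotel", 0.8),
    ("gesture", "lakeside hotel", 0.6),
    ("speech", "island", 0.4),
]))  # -> "lakeside hotel"
```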

Weil admits that while the Gaze tech is approaching production standards, it may be a year or two before a brand can design the tracking sensors into a dash. 

In the meantime, Nuance’s R&D team is working on sentiment recognition. A car that not only knows what you’re looking at, but is capable of guessing why? Don’t bet against it. Not least because as Level 4 and 5 autonomy arrives in the next 10 to 15 years, we should have more time to look around us as we’re travelling. Helping us to make the most of it will be big business.

