Nissan Pro Pilot self-driving tech: driven by the autonomous car that’s ‘better than a human’
Nissan vows to make AI-based cars part of our everyday lives – from 2027. Its engineer reckons it’s a better driver than he is
This blind right turn across three lanes of Tokyo traffic would be nerve-wracking for a human driver: our view forward is obscured by a bus swinging across our path. Will Nissan’s AI-trained autonomous Ariya sit tight, or make a bolt for it?
Nissan’s head of automated driving tech, Tetsuya Iijima, sits in the electric SUV’s driving seat, not that his hands have gone anywhere near the steering for the first five minutes of this trip. Indeed he’s looking straight at me over his left shoulder, explaining his passion project, while I distractedly gaze past him at the huge white battering ram we’re engaged in a stand-off with.
The bus swings right, clears our path and reveals that oncoming traffic is scarce. Cue an invisible poltergeist turning the steering wheel and releasing the brake, with the Ariya gliding across the junction, onto a side street and into the next scenario.
This third-generation Pro Pilot system is a collaboration with British tech start-up Wayve: it produces the AI-driving model trained on millions of hours of dashcam footage (hopefully excluding those crazy Russian dashcam compilations), while Nissan focuses on systems integration and software to make the EV steer, brake and accelerate.
How does the Ariya see its way around Ginza’s streets? There are 11 cameras dotted around the car, with one behind the windscreen mirror providing the image for the digital driver’s display, annotated by a blue arrow showing our path through the urban jungle.
The black carbuncle lashed on the roof houses four of them, but critically the supporting LiDAR too. This reflected light sensor acts like binoculars, scanning much further down the road in search of hazards: 150m for small ones, twice that for big ones. “This is our insurance,” says Iijima-san, because cameras can only see 50m ahead on unlit US interstates, for example.
But “the camera is the core sensor for AI [autonomous driving] – and the Wayve brain,” he explains. This is a prototype of the system bound for customer cars, starting with the Japanese Elgrand MPV in 2027: Nissan is deadly serious about being at the cutting edge of end-to-end autonomous driving. But implementation requires ultrafast-processing ‘Software Defined Vehicles’, a final decision on the ‘seeing’ technology, and a way to package it more discreetly.
“We haven’t decided the final configuration,” says the engineering boss. “But never only camera. LiDAR must [be added] if the environment includes night time and high speed – and radar too.” The Pro Pilot Ariya does have radars at the corners, but these are to meet regulatory requirements and enable park distance control rather than being system-critical.
Having crawled down the Prince Hotel access road to a red stop light, the Ariya is now firmly in its stride, powering confidently up to the 50kph limit. But it slows a touch upon seeing a pedestrian crossing, or pedestrians. Iijima-san was once flummoxed by the car seemingly refusing to move: its 360˚ vision had spotted an overtaking cyclist in a blind spot, and was waiting for it to pass. The SUV’s ‘eyes’ and ‘brain’ vigilantly reassess the environment every 100 milliseconds: Iijima-san reckons that’s processing as fast as a human.
“An enormous quantity of training has made the AI very intelligent,” continues the engineer, facing me and leaving the Ariya to closely tail the MPV in front, getting through just before the lights go red. “The AI understands the road structure and the 3D world, it understands the relationship between driving behaviour and the [road] environment.
“Humans understand the physics of the world: for Wayve that took 4-5 years of training.”
The Ariya undertook around 30 practice runs through Ginza to prepare, with Auto Express’s demo taking the tally past the century mark. Any teething troubles? The most challenging was getting it to understand indigenous Japanese traffic signage, given London has provided the bulk of Wayve’s foundation model training.
Nissan also had to programme a boundary zone in which the car operates, for regulatory reasons, not technical ones. It navigates using a regular, turn-by-turn mapping service, in this case Mapbox, which Iijima-san says is good at learning uncharted areas. It also provides speed limit guidance, along with the on-board camera scanning for signs.
Almost 100 per cent of driving capability sits on-board, enabling the Ariya to compute – and proceed – without relying on infrastructure. Services – such as the car notifying your restaurant you’re going to be late for a booking – and over-the-air updates will rely on cloud computing.
Under shadowy bridges, giving way to passers-by, braking overly heavily on occasion – a clumsily human touch – the Ariya progresses naturally without incident. Does this unlock Level 4 autonomy – hands off, eyes off, for long periods in complicated urban areas – I ask Tetsuya Iijima?
“Level is not a technical word,” he counters. “This is technically equal to or better than the human brain. Level 2, 3 or 4 is a regulatory term.” But 20 minutes into the demonstration, I can clearly see the Ariya can assess its environment and drive – without intervention – to human norms. So I ask again.
“It’s Level 4 capable, but Level 2!” he responds. And that is a fundamental truth: until regulators in different regions have found a way to verify autonomous cars are safe, self-driving technology can’t proliferate beyond today’s Level 2-plus ‘eyes on, hands off’ capability.
Audi knows: it plumbed Level 3 ‘eyes off, hands off’ highway driving into the 2017 A8 limousine and accepted liability for any accidents – but gave up trying to get the system homologated. “Society is not ready to accept Level 4 autonomy, and nor are regulators,” states Iijima-san.
As we close in on our destination, I spot a silver Mazda 2 drifting towards us on a collision course, touching the solid central white line. The Ariya doesn’t swerve from this fleeting threat; it doesn’t react at all. That was either supreme confidence or obliviousness. Considering its smooth, seamless and spotless performance so far, I give Pro Pilot 3.0 the benefit of the doubt that it was poised to react.
Iijima-san won’t be drawn on how much the prototype costs. And affordability will be one of the many hurdles facing Nissan’s vision of making AI autonomous driving part of everyday life. Is it a hugely expensive option, tested initially by a small sample of fleet customers or the very rich? Are there limits – light levels, weather conditions, geofence areas – to when and where it can be used? And who is liable for the inevitable accidents? Nissan’s robotaxi pilot, launching in Tokyo late this year, will seek to provide more learnings and some answers.
Over 15km (9 miles), the Pro Pilot Ariya drives like a competent, considerate human. It makes navigating Tokyo look simple: something I spectacularly failed at during my test drives. And, of course, Iijima-san didn’t touch the steering, throttle or brakes once during the demonstration. His faith is total: he pays the car minimal attention.
So, I ask, is Pro Pilot 3.0 a better driver than he is? “This is learned from [millions] of safe driving episodes for precise manoeuvre training,” he responds. “My singular experience is compressed inside my brain. Even if I’m old, its knowledge is bigger, deeper, and it always concentrates on a 360˚ driving view.
“So yes – it is.”