The promise of self-driving vehicles has a long history, but the reality has been a bit slow off the brakes.
Tesla’s Elon Musk has been promising self-driving cars since at least 2013, and it’s been seven years since Waymo’s first ‘driverless’ service was launched in the US — with a back-up driver on board just in case.
But the gap between a robot chauffeur and reality has been quietly filled with a range of semi-autonomous technologies that have made Australian vehicles safer — even if we still need to grasp the wheel.
From lane detection cameras to automated braking and parking, Advanced Driver Assistance Systems (ADAS) are increasingly allowing drivers to rely more on technology to tackle complex or risky environments.
Now, though, manufacturers are ready for the next step and we can soon expect to share the road with vehicles where drivers take a back seat and let the car drive itself.
It’s a transition that will be complex for regulators.
There are six levels of vehicle autonomy, numbered zero to five, and so far we are only just scratching the surface of what is possible.
Australia uses the Society of Automotive Engineers’ international standard to describe the degree of automation at each level, and uses that system to define where responsibility for safety lies between the ‘driver’ and the autonomous vehicle manufacturer.
Level 0 is the most basic, with no automation at all, but still allowing for ADAS such as automatic emergency braking, blind spot warning and lane departure warning.
Level 1 introduces some additional support for drivers, with the car able to take on either steering or braking in certain circumstances: keeping the car within its lane, for example, or using adaptive cruise control to drop your speed as you approach another vehicle.
Level 2 is the most sophisticated automation currently available on our roads and is sometimes split between Level 2 and Level 2+.
At the lower level, the car can undertake steering and braking/accelerating at the same time, such as taking over parallel parking or reversing into a car bay, with the driver keeping their eyes on the road and their hands on the wheel.
Level 2+ is described as hands-off, eyes-on driving, although authorities agree that the human is still considered to be the one doing the driving and is ultimately responsible if something goes wrong.
But it is when the vehicle crosses the threshold from Level 2 to Level 3 that things get interesting.
Level 3 is not yet permitted on Australian roads, except in trials specifically approved by authorities. That’s because the shift from semi-autonomous to autonomous vehicles is still being assessed for safety in Australian conditions, and for alignment with our road safety regulations.
Tesla’s recent roll-out of automation for some of its vehicles in Australia, called Full Self-Driving (Supervised), is described by the manufacturer as Level 2, although as drivers barely need to touch the wheel, there have been questions over whether its capabilities extend into Level 3 territory.
At Level 3 and above, even if you are in the driver’s seat, you are not considered to be driving while automated features are engaged, although the vehicle can require you to take over. Instead, it is the manufacturer who bears liability in a crash, at least in theory.
Level 3 moves from hands-off, eyes-on driving to hands-off, eyes-off driving in limited conditions.
As an example, Level 3 traffic jam assistance combines adaptive cruise control and lane centring, so that a driver stuck in heavy traffic can reduce their attention, take their hands off the wheel, and let the vehicle navigate the stop-start conditions.
But with greater automation comes greater risk, and manufacturers working on Level 3 features tend to build in safeguards, such as eye-tracking cameras to ensure the driver is still watching the road environment, even if the car is doing the work.
If the driver’s attention wanders, the vehicle can force the issue by requiring the driver to put their hands back on the wheel or refocus on the road if they want autonomous functions to continue.
It is only at Level 4 and Level 5 that autonomous vehicles become truly self-driving, either in some conditions (such as Waymo and its city-based driverless taxis) or across all possible terrain. For Australia, both options are some distance away.
It’s not just cars that are changing with new technology; drivers have to adapt too — and some of the uncertainty around autonomous vehicle risk lies in how drivers and other road users respond.
The director of the Monash University Accident Research Centre, Stuart Newstead, says many people have experienced low-level or semi-autonomous technology such as radar cruise control and lane-trace assist, and these features are relatively well adopted.
Although they help the driver manage on the road, they don’t offer so much assistance that the driver starts to let their awareness slide.
“The evaluation evidence on these technologies show they're good because they're still requiring the driver to be in control. They're not saying don't pay attention or have a rest for a while,” he says.
“They are basically a safety net for people in their regular driving task.
“That said, some people don't trust the technology, or they find it annoying, particularly lane keep assist, and they seem to get upset by the intervention.
“The strength of intervention can vary quite a lot between different vehicles and some people find that intimidating.”
As the degree of automation increases, the response of drivers matters more. For all but the highest levels of autonomous vehicles, the driver should be available to take over control at a moment’s notice.
But studies have shown that the more distracted the driver is when the vehicle is in charge, the longer it takes them to re-engage.
It can take time for the driver to reassess the road environment, check their mirrors, retake the wheel, and understand the hazard that has required them to get involved.
In simulations conducted for the UK’s Department for Transport, in which drivers engaged in tasks ranging from eating popcorn to watching a film, many struggled to regain situational awareness or control of the vehicle when required to do so.
The median ‘time to take over’ from reading a magazine or looking at a mobile phone was about five seconds, and in one case stretched to 24 seconds — far from ideal.
But if something does go wrong, Newstead believes many people will blame the tech.
“The strange thing about people handing control away to a device or even another person is that they are far less accepting of failure in those circumstances,” he says.
“It puts the onus on the system for public acceptability to make those systems work at a really high level.
“If you are injured by an autonomous vehicle failure that has nothing to do with your input, you'd be far less accepting of that than if you have a crash yourself.”
Interaction with other road users is also largely untested, and Newstead says a road with half the vehicles running autonomously and half under human control could lead to chaos.
“Will the people who don't have the technology understand that you can game the system, and change lanes in front of autonomous cars knowing they will throw on the brakes and create a gap?” he asks.
“All those questions remain completely unanswered.”
Swinburne University Professor of Future Mobility Hussein Dia says too little is known about the real-world performance of cars at Level 3 or above and whether they will respond as humans would in different situations.
“Imagine a driver going down a suburban road and they see a soccer ball coming across the road,” he says.
“A mature driver might not only slow down, but think also that there could be a child following that soccer ball. But perhaps the driverless car might recognise the ball and say, it’s passed me so I'll speed through.
“It doesn't have the same level of thinking that a human does about whether there is something else happening it needs to anticipate.”
Professor Dia says Australia’s cautious and conservative approach has been the right one, while national guidelines and regulations for autonomous vehicles are developed.
“What we're seeing in the US, especially with Tesla, is that we're having a lot of injuries, in some cases, even fatalities,” he says.
“One injury and one fatality is too many.”
While safety remains a key reason for the go-slow on autonomous vehicles, that may not always be the case.
Waymo markets itself as ‘the world’s most experienced driver’, given its technology has driven more than 96 million miles, or more than 150 million kilometres.
Across that distance, it says its vehicles are involved in far fewer crashes that cause injury, and pose less risk to vulnerable road users.
In crashes involving pedestrians, for example, Waymo says it has 92 per cent fewer crashes compared to human-driver benchmarks, translating to 35 fewer injured pedestrians over the distance driven.
It’s an example of the benefits autonomous vehicles could offer human drivers and other road users, but Newstead says it must be balanced by good governance, transparency and regulation.
“If it's done well, autonomous driving could have major benefits, because we know that even the semi-autonomous tech we're seeing now is having significant benefits in supporting people's driving capabilities,” he says.
“People make mistakes, and the tech helps support people when they do make legitimate errors, not to suffer consequences.
“Greater autonomy in a well-developed space has great potential for road safety — but it also has the potential to be an unmitigated disaster if you don't do it well.”