Autonomous cars are coming. Actually, they’re already here – but you’re going to see a lot more of them on UK roads in the near future. The government wants fully self-driving cars on public roads by the end of 2021, and for this country to be a leader in developing such technology.
In other words, get ready to share the roads with an increasing number of autonomous cars being tested very soon. Does that make you nervous? Do you believe that self-driving cars are safe? Can we trust them? Do you want them sharing the same roads as you?
I ask such questions because encouraging greater engagement with the public over autonomous vehicle trials is a key part of updates made to the UK government’s Code of Practice for firms testing such machines on public roads. The government wants to make sure that anyone conducting autonomous car testing recognises the need to “educate the public regarding the potential benefits” of self-driving cars.
The idea is that talking to people more about autonomous vehicles can reduce some of the concerns about them, particularly in terms of safety. It’s probably a smart move, because there is evidence that self-driving cars aren’t being universally welcomed.
Google-owned Waymo has been testing cars in the US for several years, including extensive running in Chandler, Arizona. Late last year, the Arizona Republic newspaper obtained police records detailing 21 incidents, logged by Chandler police, in which Waymo autonomous vehicles and their on-board test drivers were harassed or threatened.
One machine had its tyres slashed while stopped in traffic. Another was threatened by a man brandishing a gun. The man involved in that incident was later charged with aggravated assault and disorderly conduct, and the police report of his arrest noted that he “stated that he despises and hates those [Waymo] cars.”
One Jeep Wrangler was reported to have tried to run Waymo vehicles off the road on several occasions, with its owner, Erik O’Polka, telling the New York Times: “There are other places they can test. They said they need real-world examples, but I don’t want to be their real-world mistake.”
Undoubtedly, these are isolated incidents: Waymo says its vehicles log more than 25,000 miles of autonomous running a day in Arizona. But they hint at an underlying mistrust of self-driving cars, perhaps linked to some of the high-profile incidents such machines have been involved with.
That includes the first recorded case of a pedestrian dying after being struck by an autonomous vehicle: Elaine Herzberg was killed by an Uber-run Volvo XC90 in Tempe, Arizona, in March 2018.
Join the debate
Autonomous Pilots....
We have been trusting an autopilot on all planes that fly passengers for a while now, and most crashes are due to pilot error. From what I’ve read, the plane can take off, fly and land virtually on its own, so why do we have so many concerns about cars doing the same? When autopilot was first introduced there must have been the same concerns about safety, yet some of us make it sound like we’ll have pile-ups on the roads and people injured or killed by the dozen. That’s why autonomy is being phased in gradually, and we need autonomy because of the year-on-year increase in the number of vehicles on our roads.
@ Peter Cavellini
Sorry Peter, that argument won’t fly (if you’ll excuse the pun). Planes have spacings measured in miles horizontally or thousands of feet vertically. They don’t have to look out for oncoming traffic, pedestrians, stationary vehicles, fallen trees, potholes, ice, snow or floods, and they definitely don’t run with wingtips one foot apart in contraflows.
This is not comparable.
Even then, autopilot does occasionally fail and need pilot intervention.
Flip It - if autonomous cars had come first...
...and now people were proposing that, instead of an automated pilot, you were going to let 80-year-olds with slower reactions than a sensor, or 17-year-olds with a few hours of instruction, drive past your family at a combined speed of 120mph on a tiny country road - what would you say?
Who will be held responsible in case anything happens...
I'm not ready. Not because I fear the technology side won't be ready, but because I don't see a real effort to solve the question of who's held responsible in case anything goes wrong. The current development does not address this question. As a customer, you won't be able to defend yourself properly, because the technology used (software/hardware) is, and probably will remain, controlled by the manufacturers. Right now, for example, no one can tell what software, and thus what algorithm, is on the road controlling a specific vehicle. So if your car hits me as a pedestrian, who am I going to blame? As long as both 'drivers' and the rest of the traffic participants can't tell why an autonomous vehicle behaves as it does, there is a serious risk for the general public in case of a dispute. Not because of bad technology, but because of a lack of information.