All previous incidents involving a Google AV had been put down to human error, until now. While driving autonomously in California, a Google Lexus AV moved over to avoid some sandbags in the road at a red light. When the light turned green and the car attempted to rejoin the flow of traffic, it was aware of the bus approaching in its left side mirror but believed the bus would give way. It then made contact with the bus at less than 2mph, with the bus travelling at 15mph, according to Google’s driver report.
In other words, the Google car did not have right of way. A human driver would know this and wouldn’t assume another road user would slow down to let them pull out; doing so would leave them entirely at fault and liable for any claims. Footage of the incident (below) shows little more than a full car length between the bus and the GMC Yukon in front of it. That small a gap, combined with the speed the bus was travelling at, made for a dangerous situation, no matter the size of the vehicle. Google said it has since edited its programs to "more deeply understand that buses and other large vehicles are less likely to yield to us than other types of vehicles".
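Google has not published any of its planning code, but the quoted fix describes a recognisable pattern: weighting the expectation that another vehicle will yield according to its class. The sketch below is purely illustrative; the class names, probabilities and gap threshold are all invented for the example.

```python
# Purely illustrative sketch; Google's actual planner code is not public.
# The vehicle classes, yield probabilities and gap threshold below are
# invented to show the general idea of a per-class yield heuristic.

YIELD_LIKELIHOOD = {
    "car": 0.9,    # assumption: ordinary cars usually give way
    "truck": 0.4,  # assumption: large vehicles yield less often
    "bus": 0.3,    # the behaviour the quoted fix adjusts
}

def safe_to_merge(vehicle_class: str, gap_seconds: float,
                  min_gap_seconds: float = 3.0) -> bool:
    """Merge only if the gap is comfortably large, or the oncoming
    vehicle is judged likely to yield (illustrative threshold)."""
    likely_to_yield = YIELD_LIKELIHOOD.get(vehicle_class, 0.5) > 0.5
    return gap_seconds >= min_gap_seconds or likely_to_yield

# The bus scenario: a small gap and a low yield likelihood,
# so the planner should wait rather than pull out.
assert not safe_to_merge("bus", gap_seconds=1.5)
```

A human driver applies this kind of weighting instinctively; encoding it explicitly, class by class, is what the software update amounts to.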
The Google car should have waited until there was enough space for it to join the road safely, just as a human would. Granted, in the 1.5 million miles Google’s AVs have covered on US roads, this was the first incident for which the AV was solely responsible, but I still don’t think we’re ready for AVs. Humans can apply logic and react to changing situations; at the moment, autonomous vehicles can’t.
In this case, Google accepted liability for the incident. If AVs are put into production, manufacturers could end up worse off than if they simply made good cars with technology that helps to prevent accidents. However, it is still unclear who would be to blame in the event of an accident involving an AV.
I’m not dismissing autonomous technology completely. Cruise control, for example, is brilliant and is fast becoming standard equipment. My issue is with fully autonomous technology. We’ve all had a complaint about technology at some point. Are we really ready to trust it completely with not only our own lives but also the lives of our children?
I wouldn’t buy a fully autonomous car until I felt confident it would make the same judgments as me. Full autonomy is not expected to appear before 2025, so there is still time for improvement. But, for now, this accident shows that the fully autonomous car isn’t quite ready to grace our streets full-time. Skynet is a long way off yet… apparently.