The safety of autonomous and semi-autonomous cars has once again been called into question after a Tesla Model S crashed into a police car in California, prompting concerns over whether drivers are receiving sufficient training about engaging the self-driving feature.
The aftermath of the crash was shown in a series of pictures tweeted by the Laguna Beach Police Department, which said that fortunately, no officer was in the police vehicle at the time. The woman ‘driving’ the Tesla sustained minor injuries, however.
https://www.engadget.com/2018/05/29/tesla-model-s-in-autopilot-collides-with-police-suv/
This morning a Tesla sedan driving outbound Laguna Canyon Road in “autopilot” collides with a parked @LagunaBeachPD unit. Officer was not in the unit at the time of the crash and minor injuries were sustained to the Tesla driver. #lagunabeach #police #tesla pic.twitter.com/7sAs8VgVQ3
— Laguna Beach PD PIO (@LBPD_PIO_45) May 29, 2018
The website Engadget points out that Tesla tells drivers to keep their hands on the steering wheel when the semi-autonomous ‘Autopilot’ feature is engaged, with the company reiterating that the technology “doesn't make the car impervious to all accidents."
Should a driver let go of the steering wheel for around a minute, the Autopilot feature will be disabled for the remainder of the journey (you don’t have to be a genius mathematician to work out that even at 30 miles an hour, that is enough time to travel half a mile).
But Engadget questions whether the safety warnings given to drivers using Autopilot mode are sufficient, and suggests that many Tesla drivers do not entirely appreciate their legal responsibilities when engaging it.
At least two crashes while Tesla’s Autopilot was operational have resulted in the driver of the vehicle being killed – one in 2016, when the car crashed into a truck; the motorist, reportedly watching a Harry Potter movie, had apparently ignored repeated warnings to keep his hands on the steering wheel.
An official inquiry established that he had failed to respond to seven audible warnings and six visual ones on the car’s dashboard.
The safety of self-driving cars around vulnerable road users including cyclists is something we have regularly featured here on road.cc.
Last year, 80-year-old cyclist Fred Heppell from Durham was killed in a collision involving a Tesla car, although it is unclear whether the Autopilot feature was in operation at the time.
In May last year, a robotics expert at California’s Stanford University said that the technology would result in cyclists being killed.
Post-doctoral researcher Heather Knight said in a review posted on Medium that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”
She continued: “I’d estimate that Autopilot classified ~30 per cent of other cars, and 1 per cent of bicyclists.
“Not being able to classify objects doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”
Dr Knight added: “Do not treat this system as a prime time autonomous car. If you forget that … bikers will die.”
Simon joined road.cc as news editor in 2009 and is now the site’s community editor, acting as a link between the team producing the content and our readers. A law and languages graduate, published translator and former retail analyst, he has reported on issues as diverse as cycling-related court cases, anti-doping investigations, the latest developments in the bike industry and the sport’s biggest races. Now back in London full-time after 15 years living in Oxford and Cambridge, he loves cycling along the Thames but misses having his former riding buddy, Elodie the miniature schnauzer, in the basket in front of him.