Tesla car on Autopilot crashes into police car in California

Safety of semi-autonomous vehicle under the spotlight once again following incident in California

The safety of autonomous and semi-autonomous cars has once again been called into question after a Tesla Model S crashed into a police car in California, prompting concerns over whether drivers are receiving sufficient training about engaging the self-driving feature.

The aftermath of the crash was shown in a series of pictures tweeted by the Laguna Beach Police Department, which said that fortunately, no officer was in the police vehicle at the time. The woman ‘driving’ the Tesla sustained minor injuries, however.

https://www.engadget.com/2018/05/29/tesla-model-s-in-autopilot-collides-with-police-suv/

The website Engadget points out that Tesla tells drivers to keep their hands on the steering wheel when the semi-autonomous ‘Autopilot’ feature is engaged, with the company reiterating that the technology “doesn’t make the car impervious to all accidents.”

Should a driver let go of the steering wheel for around a minute, the Autopilot feature will be disabled for the remainder of the journey (you don’t have to be a genius mathematician to work out that, even at 30 miles an hour, a minute is enough time to travel half a mile).
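
By way of a quick check of that sum, here is a minimal sketch using the figures above (purely illustrative):

```python
# Back-of-the-envelope check of the half-mile figure: distance covered
# in roughly one minute of hands-off driving at a steady 30mph.
speed_mph = 30
hands_off_minutes = 1

distance_miles = speed_mph * hands_off_minutes / 60
print(distance_miles)  # 0.5, i.e. half a mile with nobody steering
```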

But Engadget questions whether the safety warnings given to drivers using Autopilot mode are sufficient, and suggests that many Tesla drivers do not entirely appreciate their legal responsibilities when engaging it.

At least two crashes that took place while Tesla’s Autopilot was engaged have resulted in the driver of the vehicle being killed – one, in 2016, when the car crashed into a truck after a motorist who was apparently watching a Harry Potter movie ignored repeated warnings to keep his hands on the steering wheel.

An official inquiry established that he had failed to respond to seven audible warnings and six visual ones on the car’s dashboard.

The safety of self-driving cars around vulnerable road users including cyclists is something we have regularly featured here on road.cc.

Last year, 80-year-old cyclist Fred Heppell from Durham was killed in a collision involving a Tesla car, although it is unclear whether the Autopilot feature was in operation at the time.

> Durham cyclist may be world's first to die in collision with a Tesla – unclear if it was in Autopilot mode

In May last year, a robotics expert at California’s Stanford University said that the technology would result in cyclists being killed.

> Never use Tesla Autopilot feature around cyclists, warns robotics expert

Post-doctoral researcher Heather Knight said in a review posted on Medium that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”

She continued: “I’d estimate that Autopilot classified ~30 per cent of other cars, and 1 per cent of bicyclists.

“Not being able to classify objects doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”

Dr Knight added: “Do not treat this system as a prime time autonomous car. If you forget that … bikers will die.”

Simon joined road.cc as news editor in 2009 and is now the site’s community editor, acting as a link between the team producing the content and our readers. A law and languages graduate, published translator and former retail analyst, he has reported on issues as diverse as cycling-related court cases, anti-doping investigations, the latest developments in the bike industry and the sport’s biggest races. Now back in London full-time after 15 years living in Oxford and Cambridge, he loves cycling along the Thames but misses having his former riding buddy, Elodie the miniature schnauzer, in the basket in front of him.


21 comments

Yorkshire wallet | 5 years ago
1 like

Elon Musk is the Barnum of the car world. 

The company will get wound up if it carries on as it is. It can't even make 5,000 cars a month.

cyclisto | 5 years ago
1 like

The problem is that even if autonomous driving is 10 times safer than the average driver, a single accident causes tons of negative publicity and possible lawsuits. To put it simply, because we enjoy whining and taking advantage of a situation, more people will die.

PRSboy | 5 years ago
1 like

This came up on the BBC today...

https://www.bbc.co.uk/news/av/business-44460980/this-car-is-on-autopilot...

It seems obvious to me, or to anyone, that it's plainly not ready to be called Autopilot or self-driving.

CXR94Di2 | 5 years ago
2 likes

Autonomous vehicles don't have to be perfect, just better than humans at driving.

I would suggest that the few incidents involving Tesla cars are orders of magnitude rarer than 'normal' driver errors.

Lane keeping and safety-distance autocruise are useful additions to our two newest cars. They allow me to cover the brakes when approaching hazards, and the car automatically alters its speed to match that of the car in front.

Yorkshire wallet replied to CXR94Di2 | 5 years ago
1 like

CXR94Di2 wrote:

Autonomous vehicles don't have to be perfect, just better than humans at driving.

If I'm going to die whilst in a car I'd at least like it to be my fault, as odd as that sounds. I've had accidents in the past, but they were because I was driving in a manner likely to cause an accident. Twenty-odd years down the line and I've never had another accident. It would just be perverse to me to die because of a software glitch or something.

I suppose I'm discounting other arseholes driving like arseholes, but I can generally read the road and other people's car 'body language' pretty well, so stay out of most trouble.

PRSboy | 5 years ago
2 likes

Quotes from Tesla's own website... you could be forgiven for thinking that autopilot is fully autonomous, excepting the weasel words:

"All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver."

"Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously, and on wavelengths that go far beyond the human senses."

"Your Tesla will match speed to traffic conditions, keep within a lane, automatically change lanes without requiring driver input, transition from one freeway to another, exit the freeway when your destination is near, self-park when near a parking spot and be summoned to and from your garage."

"Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. This doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you."

But...

"Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction."

And of course...

"Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time."   

don simon fbpe | 5 years ago
0 likes

Twatty Volvo drivers and their twatty cars!

FluffyKittenofT... | 5 years ago
0 likes

When these things start killing third parties, where will the legal responsibility be placed?

People are talking about 'suing' and fines, etc, but someone will need to go to prison, and the question I'm not clear on is whether it will be the non-driving 'driver' (on the basis that they weren't paying attention as per the manufacturer's instructions, despite the fact that it's human nature to get bored and stop paying attention in such a situation) or the CEO or chief engineer of the company that made the flawed vehicle?

I'm concerned it might be 'none of the above' as they all blame each other and it gets written off as an 'accident' and 'just one of those things'. And that it will turn a personal act by an identifiable human who can be held morally responsible into another one of those impossible-to-prosecute cases of corporate manslaughter.

hawkinspeter replied to FluffyKittenofTindalos | 5 years ago
2 likes

FluffyKittenofTindalos wrote:

When these things start killing third parties, where will the legal responsibility be placed?

People are talking about 'suing' and fines, etc, but someone will need to go to prison, and the question I'm not clear on is whether it will be the non-driving 'driver' (on the basis that they weren't paying attention as per the manufacturer's instructions, despite the fact that it's human nature to get bored and stop paying attention in such a situation) or the CEO or chief engineer of the company that made the flawed vehicle?

I'm concerned it might be 'none of the above' as they all blame each other and it gets written off as an 'accident' and 'just one of those things'. And that it will turn a personal act by an identifiable human who can be held morally responsible into another one of those impossible-to-prosecute cases of corporate manslaughter.

I think it's important to note the distinction between Tesla's autopilot and other autonomous vehicles (e.g. Waymo, Google etc).

With Tesla, it's not autonomous and I believe the driver has to keep their hands on the wheel most of the time or else it'll leave Autopilot mode. IANAL, but I'd guess that the driver has legal responsibility for any incidents. The driver may decide that they can offload some responsibility by suing the manufacturer for misrepresenting what the vehicles can do, but that'd be a separate court case between the driver/operator and the manufacturer.

With an autonomous car (e.g. Google's one without controls), I'd guess that the responsibility would sit squarely with the manufacturer. I can't see a manufacturer wanting to displace that responsibility onto the "driver" as that'd absolutely destroy their ability to sell more autonomous vehicles. I'd expect the manufacturers to end up doing deals with the insurance companies to better manage the risk, but some manufacturers might decide to handle it themselves.

rkemb replied to FluffyKittenofTindalos | 5 years ago
1 like

FluffyKittenofTindalos wrote:

When these things start killing third parties, where will the legal responsibility be placed?

People are talking about 'suing' and fines, etc, but someone will need to go to prison, and the question I'm not clear on is whether it will be the non-driving 'driver' (on the basis that they weren't paying attention as per the manufacturer's instructions, despite the fact that it's human nature to get bored and stop paying attention in such a situation) or the CEO or chief engineer of the company that made the flawed vehicle?

I'm concerned it might be 'none of the above' as they all blame each other and it gets written off as an 'accident' and 'just one of those things'. And that it will turn a personal act by an identifiable human who can be held morally responsible into another one of those impossible-to-prosecute cases of corporate manslaughter.

Google and Volvo have already stated that they are willing to accept responsibility -- see http://fortune.com/2015/10/07/volvo-liability-self-driving-cars/ , for example.

shufflingb | 5 years ago
2 likes

Anyone who has used Siri, Alexa or similar will know precisely how "good" state-of-the-art recognition technology is. Safely controlling a car with limited mobile processing power on public roads is many orders of magnitude harder than that.

First, you have to "see", then you have to be able to classify what has been "seen", before finally triggering the appropriate response from the car. State of the art for each of these is, well, we've already mentioned Siri and Alexa; mix in software bugs and poorly designed systems, and you've got what we refer to in engineering as a clusterfuck. Eventually, it will be made to work, and it will make the roads safer, but it will be a long time before it does.
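
To make that see/classify/respond chain concrete, here is a minimal, hypothetical Python sketch; the labels, thresholds and function names are invented for illustration and are nobody's actual system:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "car", "bicyclist" or "unknown"
    confidence: float  # classifier confidence, 0.0 to 1.0
    distance_m: float  # estimated distance to the object in metres

def respond(detections):
    """Pick the most cautious action the detections justify."""
    for d in detections:
        # The dangerous case described above: the sensors "see"
        # something, but the classifier cannot say what it is.
        if d.distance_m < 30 and (d.label == "unknown" or d.confidence < 0.5):
            return "brake_and_alert_driver"
        if d.label == "bicyclist" and d.distance_m < 50:
            return "slow_and_give_room"
    return "maintain_speed"

# A poorly classified cyclist 25m ahead should force a cautious response.
print(respond([Detection("unknown", 0.3, 25.0)]))  # brake_and_alert_driver
```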

Really, the horrific thing about all of this is we could have most of the safety benefits now with far simpler systems. Except, well, we have politicians who lack leadership and an automotive industry that has a very limited interest in making sure its vehicles don't exceed speed limits and kill and maim others.

So in the meantime, we continue with the death and mayhem for the foreseeable future. That's the thing that is the saddest about all of this.

FatBoyW | 5 years ago
2 likes

I am shocked at how little auto the Autopilot does! It seems it’s going to be used as a licence to be less attentive when it is not fit for purpose, and it makes it harder to be safe: most of the time it will do the right thing, and it is much harder to keep your concentration when you are not doing anything. Also, what a weird safety response, don’t keep your hands on the wheel? OK, then the car puts you from a bad situation into a dangerous one! Surely putting the hazard lights on and bringing the car to a stop would be better? Maybe pull it to the side and don’t let it start up again for 20 minutes or something.

Regardless of my mindless musings, the basic premise is just so bonkers: each individual in society must have a ton of metal to move around in; the only way to be really safe around these machines is for everyone to be in their own metal safety cage; and their power and handling let them travel easily at 60-80mph in urban conditions with, it seems, no enforcement of speed limits. Such a shame we as a society cannot accept it’s time to give up this bonkers transport system.

rkemb replied to FatBoyW | 5 years ago
2 likes

FatBoyW wrote:

I am shocked at how little auto the Autopilot does! It seems it’s going to be used as a licence to be less attentive when it is not fit for purpose, and it makes it harder to be safe: most of the time it will do the right thing, and it is much harder to keep your concentration when you are not doing anything. Also, what a weird safety response, don’t keep your hands on the wheel?

What's weird is that there is no mandated standard for monitoring driver disengagement. GM's Supercruise watches the driver's eyes, and any extended period of not watching the road sounds an alarm, flashes lights and disengages the system. Tesla, on the other hand, seems perfectly fine with the driver paying very little attention.
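
A minimal, hypothetical sketch of that kind of eyes-off escalation (not GM's or Tesla's actual logic; the thresholds and names are invented):

```python
import time
from typing import Callable

ALARM_AFTER_S = 4.0       # assumed: alarm after 4s of eyes off the road
DISENGAGE_AFTER_S = 8.0   # assumed: hand back control after 8s

def monitor(eyes_on_road: Callable[[], bool]) -> None:
    """Escalate from alarm to disengagement as eyes-off time grows."""
    eyes_off_since = None
    while True:
        if eyes_on_road():
            eyes_off_since = None              # attentive: reset the timer
        else:
            if eyes_off_since is None:
                eyes_off_since = time.monotonic()
            elapsed = time.monotonic() - eyes_off_since
            if elapsed > DISENGAGE_AFTER_S:
                print("disengaging: driver must take over")
                return
            if elapsed > ALARM_AFTER_S:
                print("alarm + flashing lights")
        time.sleep(0.1)                        # poll the gaze tracker at 10Hz
```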

fukawitribe | 5 years ago
1 like

Crikey.... given that means there's less than 50 drivers in the UK who don't have low IQ and aren't predisposed to making poor judgements - and you're one of them by your own judgement - then we're basically fucked, aren't we?

BehindTheBikesheds replied to fukawitribe | 5 years ago
0 likes

fukawitribe wrote:

Crikey.... given that means there's less than 50 drivers in the UK who don't have low IQ and aren't predisposed to making poor judgements - and you're one of them by your own judgement - then we're basically fucked, aren't we?

Pretty much, the numbers don't lie, do they!

ConcordeCX replied to fukawitribe | 5 years ago
1 like

fukawitribe wrote:

Crikey.... given that means there's less than 50 drivers in the UK who don't have low IQ and aren't predisposed to making poor judgements - and you're one of them by your own judgement - then we're basically fucked, aren't we?

yes, but look on the bright side - their IQ must be abso-fucking-lutely enormous in order to bring the average back to 100, so they’re the people who should be writing the AI software (at least until the software can write itself).

fukawitribe replied to ConcordeCX | 5 years ago
1 like

ConcordeCX wrote:

fukawitribe wrote:

Crikey.... given that means there's less than 50 drivers in the UK who don't have low IQ and aren't predisposed to making poor judgements - and you're one of them by your own judgement - then we're basically fucked, aren't we?

yes, but look on the bright side - their IQ must be abso-fucking-lutely enormous in order to bring the average back to 100, so they’re the people who should be writing the AI software (at least until the software can write itself).

Ah, excellent point - however it gets weirder... my maths was out by a factor of a hundred, so it's actually slightly less than half a driver or, rounding up, one person. As we already know it's BTBS, that must make his IQ beyond astronomical!... which I guess is what he's been telling us all along one way or another..... who knew?
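
For the record, the sum behind the correction, sketched with an assumed round figure of 40 million UK licence holders (the figure is an assumption, not from the article):

```python
uk_drivers = 40_000_000
bad_percent = 99.999999  # the figure quoted above

good_fraction = (100 - bad_percent) / 100
print(uk_drivers * good_fraction)        # ~0.4, "slightly less than half a driver"

# The factor-of-a-hundred slip: reading 0.000001 (a percentage)
# as if it were already a fraction.
print(uk_drivers * (100 - bad_percent))  # ~40, hence "less than 50 drivers"
```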

BehindTheBikesheds | 5 years ago
4 likes

'But Engadget questions whether the safety warnings given to drivers using Autopilot mode are sufficient, and suggests that many Tesla drivers do not entirely appreciate their legal responsibilities when engaging it.'

This sums up 99.999999% of all drivers on the planet.

It doesn't matter how much tech you put in place, even fully auto; either the human in charge or the human programming the AI will still fuck it up, because governments and plod still don't give a toot about responsibilities and safety, because they themselves are entitled morons.

Allowing people with low IQ/people predisposed to making poor judgements (99.99999% again) in charge of killing machines is no better than letting some people loose with guns, jackboots, building materials and poisonous gases ... still culpable and indeed complicit. #Godwin'slaw

The_Vermonter | 5 years ago
3 likes

Considering this was in California, a state where police have a habit of running over cyclists and getting away with it, I say that's AI-induced karma.

Grahamd | 5 years ago
6 likes

In a perverse way I think this is good news. The police take a dim view of vehicles that collide with theirs, so hopefully there will be a proper investigation and whatever flaws are in the car and/or the manuals/training will be addressed promptly, making the roads safer for everyone.
