
Never use Tesla Autopilot feature around cyclists, warns robotics expert

She says people would be killed

According to a Stanford University robotics researcher, Tesla’s semi-autonomous driving technology displays a frightening inability to recognise cyclists. After testing the feature along with a colleague, Heather Knight recommended that it never be activated when cyclists are around as she believes people would be killed.

Knight and Dylan Moore are part of a research group at Stanford University’s Department of Mechanical Engineering. Fortune reports that Knight, who posted the review to Medium, has a PhD from the Robotics Institute at Carnegie Mellon University and is currently doing post-doctoral research in social robotics.

When activated, the Tesla Autopilot feature will speed up or slow down the car based on what’s in front of it, keeping it in lane and following the turns of the road. Tesla makes it clear to drivers that the system is not fully autonomous and that they should keep their hands on the steering wheel and pay attention at all times.


Knight is concerned that some will ignore the system’s limitations and put cyclists’ lives at risk. She writes that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”

Tesla's Situation Awareness Display helps the human driver have a mental model of what the car sees. Knight gave this feature an A+ rating on the grounds that "it helps the driver understand shortcomings of the car, i.e., its perception sucks."

She adds: “I’d estimate that Autopilot classified ~30% of other cars, and 1% of bicyclists. Not being able to classify objects doesn’t mean the tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”

She concludes her review by warning, “do not treat this system as a prime time autonomous car. If you forget that… bikers will die.”



18 comments

burtthebike | 6 years ago
0 likes

"Never use Tesla Autopilot feature around cyclists, warns robotics expert."

Yet again, a reversal of responsibility, with drivers allowed to use technology which endangers other road users.  Why?

How come dangerous technology is allowed on roads?  Surely this should not be allowed until it is safe for all other road users?

ribena | 6 years ago
0 likes

Google quickly abandoned the "halfway house" driver-assist technology for all these reasons.

https://medium.com/waymo/why-were-aiming-for-fully-self-driving-vehicles...

"People didn’t pay attention like they should have. We saw some silly behavior, including someone who turned around and searched the back seat for his laptop to charge his phone — while traveling 65mph down the freeway! We saw human nature at work: people trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax."

I work in the industry. There is a lack of testing generally for these technologies. Companies are just rushing them to market as part of the release cycle, and don't have the resources for the kind of up front development that Google (and Apple?) have.

But there is also no testing whatsoever of how these technologies affect driver behaviour.

Before fully autonomous cars reach the market, we could well make the roads more dangerous.


RobD | 6 years ago
0 likes

If the default position for any semi-autonomous car were that anything it can't identify or classify causes it to slow to a crawl until it has safely passed or recognised the object, I would feel less worried. As it stands, I don't know how much this system would modify its driving - probably more so than most drivers, who seem to wait until they have no option but to slow down behind you (or attempt a last-minute pass).

velo-nh | 6 years ago
0 likes

Hands on the wheel is a requirement in all fifty states.  Since the feature is pointless while your hands are on the wheel, it should not be legal for sale in the US.


Mungecrundle | 6 years ago
2 likes

Little consolation (apart from the stonking insurance payout) if you are one of those run over by a Tesla, but are they not, even at the current stage of development, statistically far less likely to be involved in a road traffic collision than human drivers?

Having said that, I hope the Tesla AI is smarter than the auto headlight dimming, auto windscreen wiping and other dumb-as-fuck driver 'conveniences' in my Citroen.

handlebarcam | 6 years ago
1 like

Quote:

Never use Tesla Autopilot feature around cyclists, warns robotics expert

Countdown to people deliberately going out in their Teslas, switching on the autopilot when they see cyclists, and then posting the results on YouTube, in 3... 2... 1...

alg | 6 years ago
8 likes

So no change there for us - the 'semi-autonomous driver' is just like almost every other driver on the road.

1961BikiE | 6 years ago
4 likes

Yes, this basically suggests that the only place it could be used is on roads where cycles are prohibited, which in the UK means motorways.

So if it doesn't recognise cyclists, can it detect pedestrians, motorbikes or scooters?

rogermerriman replied to 1961BikiE | 6 years ago
1 like

1961BikiE wrote:

Yes, this basically suggests that the only place it could be used is on roads where cycles are prohibited, which in the UK means motorways. So if it doesn't recognise cyclists, can it detect pedestrians, motorbikes or scooters?

In spite of the name (autopilot), it's more of a cruise control plus, in that it will keep you in lane, change lane if you ask and keep its distance from cars ahead.


It doesn't have lidar etc., hence one of them 'missing' a crossing lorry - or rather, not missing it.

P3t3 replied to rogermerriman | 6 years ago
0 likes
rogermerriman wrote:

In spite of the name (autopilot), it's more of a cruise control plus, in that it will keep you in lane, change lane if you ask and keep its distance from cars ahead.

But that's what an autopilot does.

It's not an incorrect name. The concept of the autopilot on aircraft, where the name comes from, ranges from keeping the plane flying level (but not straight) all the way to flying a defined route to a specific point. The principal reason for it is that it reduces pilot mental fatigue, but the pilot still needs to be alert and able to take control at any instant.

I'm not saying it's a good thing that cars have these features, though; there are lots of problems associated with over-reliance on autopilot even in well-trained pilots. In ordinary people behind the wheel it's bound to be much more problematic. Legislation will not keep up with this kind of rapid leap in technology.

ex_terra | 6 years ago
4 likes

And the size of the problem / level of risk is perfectly illustrated by Stephen Fry - a self-admitted Tesla owner.

At this weekend's Hay Festival, Fry spoke about technology and robotics, amongst other things saying that he owned an automated car that allowed him to read a book or watch television as the car drove itself along California's freeways.*

I'm not familiar enough with California freeways to know whether you can cycle on them, but Fry seems blissfully unaware of how unreliable his car's technology is - reading a book or watching TV while the car drives itself may well result in a serious accident or death, as a handful of other trusting Tesla owners have already found to their cost.

*source: http://www.bbc.co.uk/news/entertainment-arts-40076701

RMurphy195 replied to ex_terra | 6 years ago
0 likes

ex_terra wrote:

And the size of the problem / level of risk is perfectly illustrated by Stephen Fry - a self-admitted Tesla owner.

At this weekend's Hay Festival, Fry spoke about technology and robotics, amongst other things saying that he owned an automated car that allowed him to read a book or watch television as the car drove itself along California's freeways.*

I'm not familiar enough with California freeways to know whether you can cycle on them, but Fry seems blissfully unaware of how unreliable his car's technology is - reading a book or watching TV while the car drives itself may well result in a serious accident or death, as a handful of other trusting Tesla owners have already found to their cost.

*source: http://www.bbc.co.uk/news/entertainment-arts-40076701

There's Risk and then there's Impact. Personally, I don't want to be the Impact from someone else's decision to take a Risk.

urbane replied to ex_terra | 6 years ago
0 likes
ex_terra wrote:

And the size of the problem  / level of risk is perfectly illustated by Stephen Fry - a self admitted Tesla owner.

At this weekend's Hay festival Fry spoke about technology and robotics, amongst other things saying that  he owned an automated car that allowed him to read a book or watch television as the car drove itself along California's freeways."*

I'm not familiar with California Freeways to know if you can cycle on them but Fry seems blissfully unaware at how unreliable his car's technology is - claiming to read a book or watch TV while the car drives itself may well result in a serious accident or death as a handful of other trusting Tesla owners have already found to their own cost.

*source: http://www.bbc.co.uk/news/entertainment-arts-40076701

Stephen Fry has repeatedly demonstrated that he is an r-type degenerate who, like other r-types, can't see the limitations of technology; it's like a magic toy to him.

brooksby | 6 years ago
11 likes

"Tesla makes it clear to drivers that the system is not fully autonomous and that they should keep their hands on the steering wheel and pay attention at all times." - which is, lets face it, not the average person's common sense understanding of the description "Autopilot".

Yorkshire wallet | 6 years ago
21 likes

This halfway-house technology really shouldn't be on the roads at all. It either does it all, or it does nothing if you want the driver to keep paying attention. You can't expect it to do everything 99% of the time and have the driver really paying attention 100% of the time.

danthomascyclist | 6 years ago
3 likes

This is an engineering problem that will be fixed with time. I already trust these vehicles far more than I trust humans.


Also, I'd like to reiterate: the Tesla doesn't have a problem seeing cyclists, it just has problems identifying them. The vehicle knows that something is there, and so takes evasive action as required.

MikeOnABike replied to danthomascyclist | 6 years ago
12 likes

danthomascyclist wrote:

This is an engineering problem that will be fixed with time. I already trust these vehicles far more than I trust humans.

Also, I'd like to reiterate: the Tesla doesn't have a problem seeing cyclists, it just has problems identifying them. The vehicle knows that something is there, and so takes evasive action as required.


You either work for Tesla, own a Tesla, or are Elon Musk.

jh27 replied to danthomascyclist | 6 years ago
1 like
danthomascyclist wrote:

This is an engineering problem that will be fixed with time. I already trust these vehicles far more than I trust humans.

Also, I'd like to reiterate: the Tesla doesn't have a problem seeing cyclists, it just has problems identifying them. The vehicle knows that something is there, and so takes evasive action as required.

So it 'sees' cyclists... as an inanimate obstacle?

What next? Is it going to start leaving comments on the Daily Mail about road tax and red light jumping?
