
"These are completely safe autonomous vehicles": Cyclist spots driverless car using cycle lane

Cruise insists its car would not have gone into the bike lane if there had been a cyclist in it

"Perfect! These are completely safe autonomous vehicles."

That was the reaction of one Austin cyclist to a Cruise self-driving car, without a person even in the vehicle, making a left turn into a cycle lane before continuing to travel in the infrastructure along the next stretch of road.

At the lights, once stopped, the rider pulls up alongside, revealing an empty vehicle — no driver, no passengers, just one autonomous self-driving car.

Currently operating in the evenings and overnight in San Francisco, Austin and Phoenix, Cruise's driverless taxi service works in much the same way as Uber (just without a human moving passengers from A to B): people request a ride via an app, and the company's website insists "safety is our priority. Period."

Once Fox 7 got hold of this video, Cruise released a statement insisting again that safety is its number one priority and that the company will be "reviewing our lane-mapping in that area".

Cruise also insisted the car would not have entered the bike lane if there had been a cyclist using it, but the rider involved — Robert Foster — says it seems "reckless" to allow cars making "egregious mistakes".

"They're driving like a lot of maybe less experienced drivers in Austin drive or when they take a left turn, they just do it extremely wide, not realising that's both illegal and very unsafe," he explained.

"That just seems so reckless for them to be allowing cars that can make not small mistakes, but egregious mistakes, missing a lane by 16 feet. You know, that just seems egregious out on the streets.

Cruise driverless car in Austin bike lane (screenshot Twitter/@WalkerATX)

"This is a 4,000-pound vehicle that they're testing on the city streets. There's still enough error that I'd be very disappointed if someone I was teaching to drive was driving that way."

In reply Cruise commented: "Safety is Cruise's top priority, not just for our passengers but for everyone we share the road with. Our technology is always improving and we’ll be reviewing our lane-mapping in that area."


But Foster has not been impressed by his experiences riding around the driverless vehicles; he says he has seen another driving down the middle of the road, and that they are adding to an already dangerous existence for cyclists and pedestrians.

A little under a year ago we reported two instances of YouTubers capturing footage of their Tesla vehicles in Full Self-Driving Beta (FSD) struggling to avoid danger.

The first came just weeks after Elon Musk had claimed FSD had not been responsible for a single collision since its release in October 2020, and saw the vehicle crash into a cycle lane bollard; earlier in the nine-minute video, the vehicle had run a red light.

Tesla FSD Beta crashes into cycle lane (screenshot via YouTube/AI Addict)

> Tesla using Full Self-Driving Beta crashes into cycle lane bollard...weeks after Elon Musk's zero collisions claim

Then, days later, a second YouTuber uploaded a video of their Tesla in FSD almost ramming a cyclist in San Francisco.



66 comments

chrisonabike replied to polainm | 1 year ago
0 likes

They will, if the AV parks where they want to park on the pavement / in the cycle lane...

pasley69 replied to makadu | 1 year ago
0 likes

Reading this reminded me of a trip a week ago when I was following a tradesman (a painter) up a local highway. A large tin of white paint had fallen over in his trailer and was leaking onto the road - wheee, for several km two narrow lanes were created from one normal lane. The autonomous car's AI system would have had fun with that. And then I got to thinking: is this going to be the next method for vandalism? Teenagers painting white lines on the roads, e.g. bringing an existing lane to a point? Or would this amount to terrorism if done on a high-speed freeway - imagine the potential chaos - paint in a few diagonal white lines and paint a few existing lines black.

Hirsute | 1 year ago
1 like

Someone drew a parallel with aeroplanes and automation, which has implications for autonomous cars - see the Wired article:

https://www.wired.com/story/opinion-the-plane-paradox-more-automation-sh...

OnYerBike replied to Hirsute | 1 year ago
8 likes

It's an interesting article. I would however make two counterpoints when applying the logic to cars:

Firstly, pilots receive much more extensive and ongoing training and scrutiny than drivers. Given the behaviour of many drivers, there's a much lower bar for safety to be improved overall, even if automation introduces some new risks.

Secondly, and probably more importantly, cars have an obvious failsafe mode that planes don't: you can stop a car. If there's a problem that the computer can't deal with, all it needs to do is apply the brakes. If something goes wrong with a plane, it still needs to be able to keep flying and land safely. 

Backladder replied to OnYerBike | 1 year ago
1 like
OnYerBike wrote:

Secondly, and probably more importantly, cars have an obvious failsafe mode that planes don't: you can stop a car. If there's a problem that the computer can't deal with, all it needs to do is apply the brakes. If something goes wrong with a plane, it still needs to be able to keep flying and land safely. 

Let us know how that works out for you when you do it in lane 3 on a busy motorway!

OnYerBike replied to Backladder | 1 year ago
2 likes

Brings me back to the first point: the only danger comes from other drivers who are driving carelessly. Automated cars aren't perfect, but they are pretty good at not following too closely and not crashing into the back of the car in front when it slows down/stops.

KDee replied to OnYerBike | 1 year ago
2 likes

I like the big sticker on the back of the car... "May stop quickly". Remind me, as I am sometimes very thick, but isn't that a consideration whenever following another mode of transport (even when walking)?

janusz0 replied to OnYerBike | 1 year ago
1 like

We'd like it to not crash into the back of other vehicles that might be on the road too.

Backladder replied to OnYerBike | 1 year ago
1 like

And apparently driving into a cycle lane regardless of who else might have thought they would remain on their part of the road. Good enough to avoid other cars does not equal good enough to be on the road.

pasley69 replied to Backladder | 1 year ago
0 likes

Will this be the new standard for driving licences? - Hey, you drove around for 2 hours and didn't hit anything - great, here's your licence.

IanMSpencer replied to OnYerBike | 1 year ago
1 like

Ashley Neal's description of his Tesla's assist: it's like a bad driver with good reactions.

They may not hit things if they can help it, but their aggressive and erratic manoeuvring can cause problems. You can often spot a driver planning to do something odd; would a Tesla give you the same cues?

mattw replied to Backladder | 1 year ago
4 likes

They all need Rhonda Pickering as the passenger.

pasley69 replied to Backladder | 1 year ago
0 likes

Failsafe mode on the flat, maybe; I'd like to see AI handling a necessary stop on a 1:20 grade that has oil spilled on the road. Or where it's necessary to keep on going no matter what, maybe on a road that is going under water because of rising floodwaters, or because of the approaching tornado, or the bushfire, or maybe because your angry husband is chasing you with a gun. All rare and unusual incidents of course, but any study of local newspapers will show that rare and unusual is not so uncommon.

Hirsute replied to OnYerBike | 1 year ago
0 likes

Although planes do need to keep flying, they start from a high altitude giving a lot of time to get back control. Whereas in a car, you may only have a few seconds to act, which may be less than the time required to orientate yourself to the situation.

I'm not sure you can say the brakes can always be invoked if the system is controlling the car. I'm not that confident in bug free systems.

Jetmans Dad replied to Hirsute | 1 year ago
2 likes
hirsute wrote:

... bug free systems.

No such thing. The goal is always to minimise bugs and try to ensure they do not compromise correct operation of the software. 

OnYerBike replied to Hirsute | 1 year ago
0 likes
hirsute wrote:

Although planes do need to keep flying, they start from a high altitude giving a lot of time to get back control.

Unless the plane has only just taken off, as per both of the 737 Max crashes.

hirsute wrote:

I'm not sure you can say the brakes can always be invoked if the system is controlling the car. I'm not that confident in bug free systems.

That's a fair point; I never said it would be a perfect system. But the ability to "just stop", whether that be the car's reaction or a human intervention, is still there in a way that it isn't with an aircraft.

Backladder replied to OnYerBike | 1 year ago
0 likes
OnYerBike wrote:

Unless the plane has only just taken off, as per both of the 737 Max crashes.

But that wasn't so much an aircraft failure as a failure of regulation and training, so very much like the Tesla problems.

Rezis replied to OnYerBike | 1 year ago
0 likes
OnYerBike replied to Rezis | 1 year ago
0 likes

I had actually seen that pop up on Twitter - I maintain (as did many other commenters) that the crash was the result of the following drivers' careless/dangerous driving. It's not the UK so I don't know the letter of the law over there, but I'm pretty sure that the basic principle of leaving a sufficient distance to stop safely applies.

IanMSpencer replied to OnYerBike | 1 year ago
1 like

They certainly contributed, but it does seem that the Tesla fail-safe response was not very fail-safe. Surely simply removing drive would be a better response, akin to disengaging cruise control, rather than actively stopping in a live motorway lane. If it was intelligent enough to be self-driving (albeit objecting when it detected that supervision was not present), it should be bright enough to stop on the hard shoulder or exit the freeway and stop.

When would stopping on a motorway live lane ever be the response of first resort to an issue?

Bear in mind that the 2 second rule does not relate to a safe stopping distance.

OnYerBike replied to IanMSpencer | 1 year ago
0 likes

Clearly stopping suddenly in a live lane is not ideal, and of course it shouldn't be a first resort. But as a last resort, it is (IMHO) more sensible than just cutting the power. And my original point was simply that it is a last resort option that is not available to an aircraft!

IanMSpencer replied to OnYerBike | 1 year ago
1 like

My understanding is that it was likely to be a response to a driver ignoring attention reminders. Whilst bringing the car to a halt may be an appropriate action in many urban environments, it is probably not the right choice on a freeway, where a moderate reduction in speed and a move to an inside lane, while continuing to make progress, is safer than stopping. Given that the driver has already abrogated their responsibility, passing control to an inattentive driver on a high-speed road does not seem well thought through. But that's the point, isn't it? They are struggling to get self-driving cars to cope with normal scenarios, so abnormal scenarios are a big problem.

ShutTheFrontDawes | 1 year ago
4 likes

The thing that really surprises me about this is the lack of a driver in the vehicle. The video demonstrates that the autonomous vehicle is not capable of driving to an acceptable standard, which raises two questions for me:
1) How has the risk associated with this vehicle being driven on public roads without a driver been assessed and considered acceptable?
2) How is the vehicle supposed to learn (e.g. that you shouldn't drive in a cycle lane) if there is no-one there to teach it through driver intervention? I can't believe that someone is reviewing footage and performing manual software updates. That would negate the massive benefits of machine learning.

chrisonabike replied to ShutTheFrontDawes | 1 year ago
6 likes

Sounds like drivers in Lancashire (thanks wtjs!). Apparently no-one is providing them with feedback either, when they don't tax or insure their vehicles or drive them without due care and attention. So maybe the autonomous car is being "trained" for operation there?

wtjs replied to chrisonabike | 1 year ago
8 likes

chrisonabike wrote:

Sounds like drivers in Lancashire

I think I would still be safer around an AV 'autonomised' by a Raspberry Pi Zero than near Lancashire Audi and BMW drivers.

OldRidgeback | 1 year ago
5 likes

Autonomous vehicle technology still has a long way to go. The algorithm developers have really struggled to get the vehicles to recognise pedestrians, cyclists and motorcyclists, and also to predict their behaviour.

brooksby replied to OldRidgeback | 1 year ago
7 likes

Exactly. We were supposed to have flying skateboards years ago, and we don't even have them yet, so 'properly' self-driving cars must be a long, long way away...

Jetmans Dad replied to OldRidgeback | 1 year ago
2 likes
OldRidgeback wrote:

... and also to predict their behaviour.

Even though the point is that, because cars deal with road conditions very differently to more vulnerable users (e.g. bicycles), cyclists can move in unpredictable ways.

kil0ran replied to OldRidgeback | 1 year ago
4 likes

The only solution is dedicated, segregated infrastructure.

For AVs.

brooksby replied to kil0ran | 1 year ago
10 likes
kil0ran wrote:

The only solution is dedicated, segregated infrastructure.

For AVs.

Scalextric
