A video posted to YouTube this week shows the shocking moment a Tesla car operating in Full Self Driving (FSD) Beta mode in San Francisco suddenly veered towards a cyclist, who was oblivious to the danger the vehicle placed him in.
The footage was posted to the video-sharing website by vlogger and Tesla enthusiast Omar Qazi. Immediately before the near-collision he had said, “You can actually make thousands of people drive safer — just with a software update,” and he had to grab the steering wheel to get the car back on course and avoid hitting the bike rider.
Following the close call, the driver, who goes by the name HyperChange on social media, asks “are we gonna have to cut that?” and also insists that “it wouldn’t have hit him” – although we’re not sure any cyclist would voluntarily take their chances at sharing roadspace with an autonomous vehicle capable of suddenly changing direction in this way.
As journalist Jason Torchinsky pointed out in his report on the incident on the motoring website Jalopnik:

> Omar goes on to suggest that this sort of thing is “shocking to people because it’s new,” though I may take the bold position that it’s shocking to people because it fucking turned right toward a cyclist who was clearly visible for no good discernible reason. I think maybe that’s a bigger shock than, you know, “newness.”
Qazi claims that the FSD system “functioned exactly as designed” since “it detected that there’s a potentially dangerous situation” – although as Torchinsky highlights, it was a situation entirely of the car’s making.
“Well, the whole time you’re driving on human pilot, you’re making your car avoid hitting a biker,” Qazi continues.
“You’re constantly making your car avoid hitting a biker ... but then you’re surprised that you’re doing it for one second while on FSD Beta” – leading Torchinsky to suggest that “if you’re characterising normal driving as ‘constantly making your car avoid hitting’ anything, let alone a person on a bike, then I’d have to say your fundamental view of driving is deeply, dangerously wrong.”
“This time, nobody got hurt, and it was all pretty funny,” the journalist added. “This time.”
It’s the second incident involving a Tesla in FSD Beta mode that we have reported on this week, despite the company’s CEO, Elon Musk, claiming last month that the technology had not been responsible for a single crash since its launch in October 2020.
> Tesla using Full Self-Driving Beta crashes into cycle lane bollard ... weeks after Elon Musk's zero collisions claim
But a video shot by San Jose-based YouTuber AI Addict showed his Tesla car crashing into a segregated cycle lane bollard as it made a right turn while the self-driving mode was engaged.
In a voiceover on the video, he said: “Changing lanes ... Oh ... S***. We hit it. We actually hit that. Wow. We were so close on the corner ... I can't believe the car didn’t stop.”
“Alright, YouTube, it's confirmed I have hit that pylon. It's a first for me to actually hit an object in FSD,” he added.
According to Tesla, FSD is “capable of delivering intelligent performance and control to enable a new level of safety and autonomy.”
The technology supposedly enables the vehicle to drive itself to a destination that has been input on the car’s navigation system, although the motorist has to be prepared to assume control should something go wrong.
In December, we reported how Musk had been accused of encouraging driver distraction and putting road users in danger when it emerged that Tesla owners are now able to play video games through the car’s infotainment system while the vehicle is in operation.
> Tesla owners can now play video games… while their car is moving
And commenting on an earlier version of the motor manufacturer’s software, a researcher at Stanford University in California said in 2017 that Tesla’s autonomous vehicle technology had no place being used around cyclists.
> Never use Tesla Autopilot feature around cyclists, warns robotics expert
Post-doctoral robotics researcher Heather Knight wrote that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”
In a review posted to Medium, she said: “I’d estimate that Autopilot classified ~30 per cent of other cars, and 1 per cent of bicyclists.
“Not being able to classify objects doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”
She concluded her review by saying: “Do not treat this system as a prime time autonomous car. If you forget that … bikers will die.”
From the little I watched, thanks marmotte, it was seeking to ignore bus-only lanes, and when it did swerve at the cyclist it appeared to want to be driving in a cycle lane.
It's obvious now you mention it. They'd loaded Tesla's self driving bike software by mistake!
So it's not just poor drivers thinking they're great drivers - all those cars parked on pavements and in cycle lanes genuinely believed they should be there!
I watched a few minutes before the incident: skimming past wing mirrors, excessive steering and random swerves which gave the driver the heebie-jeebies, driving in the wrong lane, failing to move off on a green light. On the other hand, it seemed to be driving like several other cars around, so obviously they haven't a high bar to aim for.
It is at
https://youtu.be/N_iQSRMzmRQ?t=1514
For those who don't want to watch an hour of inane drivel.
So my thinking is that either we'll always need a human behind a wheel to take legal responsibility when things go wrong, or legal responsibility will be on the vehicle manufacturer.
If the latter is true then we can expect these vehicles to become very risk averse, to the point that you could treat any road as a zebra crossing.
Either this ushers in a new age of public surveillance as governments demand video footage recorded by FSD cars to catch traffic offenders, and/or it ushers in a new age of active travel as people realise it's much easier to get around when all the cars stay out of your way.
Or an acceptance that "accidents" will happen...
Self driving, electric ... it still does nothing to solve congestion and the clogging up of the world's cities by cars.
Not entirely true if it drives up the use and down the cost of taxis, since they are now available 24/7.
Plus self-driving on-demand buses then become a thing.
You're naive if you think the world is going to go cold turkey on the car overnight. It's probably the work of another century to eliminate them.
"It's probably the work of another century to eliminate them."
It'll go a lot faster when the real crises we are fabricating, amongst other things through our car use, hit.
Only then we'll not just be rid of cars but also the rest of civilization.
It took Amsterdam and other Dutch cities considerably less time to drastically reduce vehicle traffic. Elimination may be too much but a great deal can be (and is being) done to make urban areas more people-friendly and less car-centric.
Self-driving cars, like tunnels, flying cars etc are not the answer. Taxis and Uber are not a solution to anything either. And if the cost of a taxi is low it means the person doing the work is poorly treated and paid.
Drawing policy conclusions from individual cases like this isn't useful at all. The question isn't whether these cars are 100% safe but whether, in the aggregate, they are safer overall than the common alternative.
If autonomous cars killed five cyclists a year, that would be bad, but it would still be a hell of a lot better than what human drivers do. It may seem unpleasant, perhaps even immoral, to think of it that way, but it's also the only mode of thinking that leads to a genuine improvement in human welfare.
If you look at the Tesla's screen where it shows your own and surrounding vehicles, it doesn't detect the cyclist. Regardless, it was trying to go into a non-car lane. It would certainly have hit the cyclist; look how far the steering wheel turned right.
The weird thing is that it *does* detect the cyclist. You can see him rendered on the screen the whole time, which makes it more and more likely that there's a problem with the FSD code.
I noticed that the forward collision warning sounded just as he took control. The emergency braking etc runs independently from the full self driving software (for obvious reasons) and works like it does with a human driving - it'll step in at the very last minute to try to mitigate an accident.
I'm hoping this guy is kicked out of the testing program for letting his friend "have a go" and publishing a video of them driving without holding the wheel.
I question the sanity in putting anything self-driving that is in a beta stage anywhere other than a closed test track. Never mind a city!
Betas are usually feature complete and considered close to release, but I think they need to consider this an alpha.
Alpha or Beta, until they are sure it can identify and avoid people keep it out of a city.
If the history of the motor vehicle has taught us anything it's that we are remarkably tolerant of mayhem inflicted at large when the temptations of money and "the new" are available. Children and animals having to "learn to get out of the way of the motorist", people and buildings being shoved out of the way, leaded petrol...
So I'd agree but it's more of a "hope". However on the "benefits" side you can read rich_cb on how they're mostly doing very well and the potential this could unlock.
Even with my cycling hat on, I think I'd prefer to have the computers in charge of the steering, after seeing Omar Qazi playing with the wheel at the start.
This car urgently needs the obligatory spike on the steering wheel.
You can see on the screen that it loses the cyclist. The car is visibly moving past and stopping. The cyclist rides past and then disappears, and doesn't reappear, and the alarm doesn't go off, until the driver takes over and the cyclist is directly in front. Car might as well have been on the phone...
At 25:35...
Thanks for this. That's a half hour of twattery avoided.
Thanks - the most useful comment on here.
Should have gone with LIDAR.
It seems to me that encountering an issue after just 25 minutes or so means it's not a rare occurrence, and shows that it's not nearly ready for unsupervised use.
I honestly don't think fully self-driving cars will be with us within the next ten years. Perhaps in motorway environments, where the number of variables is reduced, the car could be self-driving, but not in an urban environment.
Even then the fact the car can't see the cyclist suggests it also would not see a person changing a wheel on the hard shoulder.
They're already with us.
Completely self driving taxis are in operation in areas of Phoenix and San Francisco.
Tesla are miles behind Waymo and GM.
Waymo self-driving vehicles can only operate in areas where the roads have been mapped within an inch of their lives. They have been doing this for years. To make their technology available elsewhere they just have to do the same for the rest of the globe, then update the mapping every time there is a road change. It could take forever. They have a handful of test cars providing data to do that. Oh, and each car costs a fortune with its lidar everywhere.
Teslas operate to a much lower level of autonomy but they can operate everywhere and they have a fleet of millions of cars supplying data to improve their technology. And they already have cars operating with all the hardware they will need.
I think that when you predict who is closest to bringing an autonomous car to market, it is Tesla that is light years ahead.
I guess being prepared to drive at speed through anything that gets in the way would help with that.
In the race to provide fully autonomous cars the fact that Tesla have produced cars with "a much lower level of autonomy" puts them "light years ahead".
Ok.
Mapping cities takes time but it's a far from insurmountable challenge and many features are consistent from city to city. The second city will take less time to map, the third less again etc etc.
Fully autonomous? They can go anywhere? Put the destination in the satnav and go? No driver monitoring what's happening? Lastly, are they available for sale?
So I don't think they are "already with us". They are still trying to prove them.
https://www.kqed.org/news/11897647/youre-not-imagining-it-there-are-more....