A journalist from France has caught on camera the moment a Nissan driverless car passes a cyclist without leaving enough space.
The video, shot in London as Nissan showcased its driverless progress, shows how the car’s console registers the cyclist, but then fails to move over to give him space.
Tetsuya Iijima, global head of autonomous drive development at Nissan, is behind the wheel, but fails to override the car and move out either, as the video, spotted by BikeBiz, shows.
One of the French journalists in the car can be heard saying, in French: "I was a little scared for him."
Last year we reported how Adrian Lord, of the transport consultancy Phil Jones Associates, fears that once technology that prevents vehicles from hitting pedestrians and cyclists reaches our roads, it opens the door for vulnerable road users to take advantage of the near-impossibility of being injured.
He said: "Once people realise that an autonomous vehicle will stop [automatically], will pedestrians and cyclists deliberately take advantage and step out or cycle in front of them?
“If that’s the case, how long would such a vehicle take to drive down Oxford Street or any other busy urban high street?”
Meanwhile John Parkin, professor of transport engineering at the University of the West of England, told the Financial Times that much of the infrastructure being implemented to keep bikes and cars apart in inner-city environments will be made redundant by autonomous technology reaching maturity.
"When fewer cars are driven by humans, in cities at least," the professor said. "There would be less need to segregate cyclists from traffic. This would allow roads to be designed as more open, shared spaces."
47 comments
There's another video I've seen where the car doesn't recognise a street sweeper in the lane ahead and the driver has to take control to prevent a crash.
There's also this gem with a Tesla supposedly on autopilot that nails a construction barrier.
https://www.youtube.com/watch?v=fQxIhMBKblY
Just as with this article's video, what the hell was the "driver" doing?
I think that's Javascript. In which case, we're all dead.
[Engage geek mode]
It's definitely not C -- there's an object.attribute reference in the test condition. It could possibly be a C++ fragment, but Java or JavaScript is more likely.
[Engage cynic mode]
Treating humans on bikes as an object is about right, but there's a bug - the program should read as follows if replicating human driver logic:
} else {
    passingDistanceMeters = 0.5;
}
[Disengage cynic mode]
(Can't disengage geek mode, unfortunately!)
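Since geek mode is stuck on: here's a hedged guess at completing the console fragment as JavaScript. Only `object` and `passingDistanceMeters` come from the fragment the thread is discussing; the function name, the `humanDriverLogic` flag and the distances are invented purely to illustrate the joke.

```javascript
// Completing the console fragment, with the commenter's "human driver
// logic" bug in the else branch. Everything except `object` and
// `passingDistanceMeters` is invented for illustration.
function passingDistanceFor(object, humanDriverLogic) {
  let passingDistanceMeters;
  if (object.type === "cyclist" && !humanDriverLogic) {
    passingDistanceMeters = 1.5; // what the car should leave
  } else {
    passingDistanceMeters = 0.5; // the close pass seen in the video
  }
  return passingDistanceMeters;
}

passingDistanceFor({ type: "cyclist" }, false); // 1.5
passingDistanceFor({ type: "cyclist" }, true);  // 0.5
```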
Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?
I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned, and several where the cause was attributed to human actions:
https://en.wikipedia.org/wiki/Accidents_and_incidents_involving_the_Airbus_A320_family
Automation vs. Human
Maybe this last one is unfair... or maybe not. Outside of science fiction (at least as yet), computers aren't prone to emotional instability, and even then I'm willing to take my chances with Marvin the Paranoid Android from H2G2 over some of the psychos currently out there behind a wheel - HAL 9000, on the other hand...
Don't come at me with facts, this is the internet! 3.0 (post fact version)
I thought the argument that's been made is that excessive automation 'de-skills' the pilots so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error, it's that it leaves human pilots more prone to do so than they used to be.
It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.
Might be true, might not, dunno, but seems relevant to self-driving-cars, given that the people using them will probably be drunk or asleep or watching movies when they are suddenly called on to intervene.
Point being I don't think one can put much trust in self-driving cars that require the human being able to 'take over' when things get tricky. They are going to have to be able to cope on their own.
Air France 296, June 1988. Controversial, but the plane delayed the pilot's command to throttle up before hitting the trees.
Air France 447, June 2009. Blame was mixed, with the pilots not knowing how to react when the automated systems disengaged.
Air Asia 8501 (Qz8501), Dec 2014. Blamed on over-reliance on automation leading to an inability to control the aircraft without it.
Indian Airlines 605, Feb 1990. Controversial; some parties claim the crash was caused by the same throttle behaviour blamed for Air France 296.
I could go on, but I think you get the idea. Yes, the automation didn't intentionally down the plane, but the automation mixed with human pilot interaction has led to disasters. "Self driving" cars will still have humans, and I presume the cars will still have manual controls for some time, probably past our lifetimes. The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist. That type of over-dependence is what made me think of the airlines.
Please stop highlighting plane crashes.
I assume driverless cars will stop to avoid collision with a person, or anything bigger than the car/or truck being driven - e.g. elephant, cow, donkey but what about smaller things? goat, large dog, medium size dog, small dog, cat, bird, mouse, frog - where is the cutoff point? How about a young toddler, say 18 months - about as big as a mid-sized dog - or if she trips over maybe a small dog or large cat. How about kangaroos - a driver has a chance to see them coming from the side before they cross - maybe there will be predictive course analysis built in - I wouldn't bet on it since they have trouble doing this even for planes and ships.
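The "cutoff point" question above is a real one: any obstacle-avoidance system ends up encoding a threshold somewhere. A toy sketch, in which every name and number is invented for illustration (no real manufacturer logic implied):

```javascript
// Toy illustration of the cutoff-point question: somewhere in the
// stack there is an arbitrary line between "brake" and "drive on".
const BRAKE_THRESHOLD_KG = 5; // the arbitrary cutoff being questioned

function shouldBrakeFor(obstacle) {
  // Humans are always braked for, regardless of size...
  if (obstacle.isHuman) return true;
  // ...everything else falls on one side of the line.
  return obstacle.massKg >= BRAKE_THRESHOLD_KG;
}

shouldBrakeFor({ isHuman: true, massKg: 11 }); // toddler: true
shouldBrakeFor({ isHuman: false, massKg: 4 }); // cat: false
```

The point of the sketch is that a toddler and a small dog can present similar-sized obstacles, so the system has to classify *what* it sees, not just how big it is.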
Another point - who gets to face court if the automated car does something wrong? The computer programmer? The car manufacturer? Or the driver, who has to stay alert all the time with his hands on the controls and might as well be driving anyway?
Sounds great - are all cars on the road going to be separated by 90-second gaps? With assistant drivers? And traffic controllers in constant contact with drivers?
If the cyclist in question had swerved and been hit, who would be held responsible and taken to court? the driver or the computer programming team?
And if engineers are screening the results - I hope the team includes a few pedestrians, cyclists, mothers with young children, disabled pensioners. I rather suspect most highly paid engineers and computer programmers may not have the best interests of other road users in mind.
Who faces court if your tumble dryer bursts into flames and burns down a block of flats?
Driverless cars won't have steering wheels etc., that's the point of them. Everyone in the car will be a passenger.
As for things coming from the side, maybe the designers have thought of that, and will put sensors there. Jeez. The driver has two eyes at best, two ears and one brain, which are collectively dealing with a whole lot more than just driving. Driverless cars don't need these limitations, and only have to think about driving, not hitting stuff, and getting out of the way of kangaroos. The reason planes and ships have problems with this is the dearth of Australian megafauna at sea and in the sky; Oxford Street is obviously different, there are unpredictable Antipodeans bouncing about all over the bloody place.
They will also be recording everything all the time for use in evidence. A law will be made to make it illegal to deliberately obstruct the highway, if such a law doesn't already exist, and the car will have you bang to rights.
An opportunity for a test case here perhaps - if the cyclist decides to report the close pass, who will be prosecuted - Nissan, or the person in the driving seat who did not take control?
Surely that's a trick question... the answer to the question "who will be prosecuted?" is almost bound to be "no-one"!
Absolutely. Who wants a driverless car that you still have to be ready to drive at a moment's notice? I want it to drive me home from the pub and then go and pick someone else up, like a taxi without the bad tempered driver.
The big advantage that cars have over aeroplanes though is that when they stop, they just sit there rather than plummeting out of the sky. So car autopilots can just slow and stop (hopefully safely*) if they can't work out what to do, no driver intervention required. In a plane, the pilot HAS to take over or everyone dies.
* There are some situations where there's going to be a crash no matter whether the driver's human or cyborg.
Nervous flyer?
That's why most manufacturers are now looking at going straight to SAE Level 4 - high automation, with no driver intervention needed within the car's defined operating conditions.
Good article on what the levels are and the current state of development (and it's where I've lifted the quote above from):
http://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/
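For quick reference, the SAE J3016 levels the linked article covers, as a lookup table. The one-line summaries are my paraphrases of the standard's commonly quoted descriptions, not official wording:

```javascript
// SAE J3016 driving automation levels, paraphrased.
const SAE_LEVELS = {
  0: "No automation - the human does everything",
  1: "Driver assistance - e.g. adaptive cruise OR lane keeping",
  2: "Partial automation - steering and speed, driver must monitor",
  3: "Conditional automation - driver must take over when asked",
  4: "High automation - no driver needed within defined conditions",
  5: "Full automation - no driver needed anywhere, in any conditions",
};

SAE_LEVELS[4]; // "High automation - no driver needed within defined conditions"
```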