Video: Nissan driverless car in cyclist close pass

'I was a little scared for him', says nervous passenger...

A journalist from France has caught on camera the moment a Nissan driverless car passes a cyclist without leaving enough space.

The video, shot in London as Nissan showcased its driverless progress, shows how the car’s console registers the cyclist, but then fails to move over to give him space.

Tetsuya Iijima, global head of autonomous drive development at Nissan, is behind the wheel, but, as the video, spotted by BikeBiz, shows, he fails to override the car and move out either.

One of the French journalists in the car can be heard saying, in French: "I was a little scared for him."

Last year we reported how Adrian Lord, of the transport consultancy Phil Jones Associates, fears that once technology that prevents vehicles from hitting pedestrians and cyclists reaches our roads, it opens the door for vulnerable road users to take advantage of knowing they cannot be hit.

He said: "Once people realise that an autonomous vehicle will stop [automatically], will pedestrians and cyclists deliberately take advantage and step out or cycle in front of them?

“If that’s the case, how long would such a vehicle take to drive down Oxford Street or any other busy urban high street?”

Meanwhile, John Parkin, professor of transport engineering at the University of the West of England, told the Financial Times that much of the infrastructure being built to keep bikes and cars apart in inner-city environments will be made redundant once autonomous technology reaches maturity.

"When fewer cars are driven by humans, in cities at least," the professor said. "There would be less need to segregate cyclists from traffic. This would allow roads to be designed as more open, shared spaces."

47 comments

Redvee | 7 years ago
0 likes

There's another video I've seen where the car doesn't recognise a street sweeper in the lane ahead and the driver has to take control to prevent a crash.

velo-nh | 7 years ago
0 likes

There's also this gem with a Tesla supposedly on autopilot that nails a construction barrier.

https://www.youtube.com/watch?v=fQxIhMBKblY

Just as with this article's video, what the hell was the "driver" doing?

Matt_S replied to arowland | 7 years ago
1 like
arowland wrote:
handlebarcam wrote:

Looks like the Nissan software engineers have faithfully replicated the programming found in many human drivers' heads:

if (cyclist.inBikeLane == true) {
    passingDistanceMeters = 0;
} else {
    passingDistanceMeters = 1;
}

If only drivers did think in C!

I think that's JavaScript. In which case, we're all dead.

-- NORMAL --
CygnusX1 replied to Matt_S | 7 years ago
2 likes
Matt_S wrote:
arowland wrote:
handlebarcam wrote:

Looks like the Nissan software engineers have faithfully replicated the programming found in many human drivers' heads:

if (cyclist.inBikeLane == true) {
    passingDistanceMeters = 0;
} else {
    passingDistanceMeters = 1;
}

If only drivers did think in C!

I think that's JavaScript. In which case, we're all dead.

-- NORMAL --

[Engage geek mode]

It's definitely not C -- there's an object.attribute reference in the test condition; it could possibly be a C++ code fragment, but Java or JavaScript is more likely.

[Engage cynic mode]

Treating humans on bikes as an object is about right, but there's a bug - the program should read as follows if replicating human driver logic:

} else {
    passingDistanceMeters = 0.5;
}

[Disengage cynic mode]

(Can't disengage geek mode unfortunately!)
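
Since geek mode is stuck on: here is the fragment fleshed out into a complete, compilable C program, with a made-up Cyclist struct (the field name is borrowed from the joke) and CygnusX1's half-metre "fix" folded in. A sketch under those assumptions, not anyone's real control code:

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical record for the rider - the field name comes from the joke above. */
struct Cyclist {
    bool inBikeLane;
};

/* The "human driver" passing logic, bug and all. */
static double passing_distance_meters(struct Cyclist cyclist)
{
    if (cyclist.inBikeLane) {
        return 0.0; /* a painted line counts as protection, apparently */
    }
    return 0.5;     /* the "fixed" value - still nowhere near a safe pass */
}

int main(void)
{
    struct Cyclist rider = { .inBikeLane = true };
    printf("Passing distance: %.1f m\n", passing_distance_meters(rider));
    return 0;
}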

CygnusX1 replied to velo-nh | 7 years ago
1 like
velo-nh wrote:
Mungecrundle wrote:

Having experienced no fewer than 3 dangerous passes on the club ride this morning, I'd far sooner take my chances with software being in control.

How many crashes has Airbus had due to automation?  And that's without the complexities of being on a road in heavy traffic.

As a software engineer, I don't see self-driving cars being a thing anytime soon.  At least not safe ones, or ones that don't require a lot of human intervention.  Worse, this is just going to lead to drivers that will be incapable on the occasion where the self-driving doesn't work.  

Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned, and several where the cause was attributed to human actions:

https://en.wikipedia.org/wiki/Accidents_and_incidents_involving_the_Airbus_A320_family

Automation

  • 5 November 2014, Lufthansa Flight 1829. The aircraft, while on autopilot, lowered the nose into a descent reaching 4000 fpm. The uncommanded pitch-down was caused by two angle of attack sensors that were jammed in their positions. The crew disconnected the related Air Data Units and were able to recover the aircraft.

vs. Human

  • 28 July 2010, Airblue Flight 202. During a non-standard, self-created approach below the minimum descent altitude, the aircraft crashed into the ground after the captain ignored 21 cockpit warnings to pull up. 146 passengers and six crew were on board; there were no survivors.
  • 28 December 2014, Indonesia AirAsia Flight 8501. The initial cause was a malfunction in two of the plane's rudder travel limiter units. The crew ignored the recommended procedure for dealing with the problem and disengaged the autopilot, which contributed to the subsequent loss of control. All 162 on board were killed.
  • 24 March 2015, Germanwings Flight 9525. The crash was deliberately caused by the co-pilot, Andreas Lubitz, who had previously been treated for suicidal tendencies and been declared "unfit to work" by a doctor. 150 killed.

Maybe this last one is unfair... or maybe not. Outside of science fiction (at least as yet), computers aren't prone to emotional instability, and even then I'm willing to take my chances with Marvin the Paranoid Android from H2G2 over some of the psychos currently behind a wheel - HAL 9000, on the other hand...

barbarus replied to CygnusX1 | 7 years ago
1 like
CygnusX1 wrote:

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned

Don't come at me with facts, this is the internet! 3.0 (post fact version)

FluffyKittenofTindalos replied to barbarus | 7 years ago
3 likes
barbarus wrote:
CygnusX1 wrote:

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned

Don't come at me with facts, this is the internet! 3.0 (post fact version)

I thought the argument that's been made is that excessive automation 'de-skills' the pilots so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error, it's that it leaves human pilots more prone to do so than they used to be.

It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.

Might be true, might not, dunno, but seems relevant to self-driving-cars, given that the people using them will probably be drunk or asleep or watching movies when they are suddenly called on to intervene.

Point being I don't think one can put much trust in self-driving cars that require the human being able to 'take over' when things get tricky. They are going to have to be able to cope on their own.

velo-nh replied to CygnusX1 | 7 years ago
3 likes
CygnusX1 wrote:

Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?

Air France 296, June 1988.  Controversial, but the plane delayed the pilot's command to throttle up before hitting the trees.

Air France 447, May 2009.  Mixed blame with the pilots not knowing how to react to the automated systems disengaging.

Air Asia 8501 (QZ8501), Dec 2014.  Blamed on over-reliance on automation leading to an inability to control the aircraft without it.

Indian Airlines 605, Feb 1990.  Controversial; some parties claim the crash was caused by the same throttle behavior that downed Air France 296.

I could go on, but I think you get the idea. Yes, the automation didn't intentionally down the plane, but the automation mixed with human pilot interaction has led to disasters. "Self-driving" cars will still have humans, and I presume the cars will still have manual controls for some time, probably past our lifetimes. The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist. That type of over-dependence is what made me think of the airlines.

J90 replied to velo-nh | 7 years ago
0 likes

Please stop highlighting plane crashes.

pasley69 | 7 years ago
1 like

I assume driverless cars will stop to avoid collision with a person, or anything bigger than the car or truck being driven - e.g. elephant, cow, donkey - but what about smaller things? Goat, large dog, medium-sized dog, small dog, cat, bird, mouse, frog - where is the cutoff point? How about a young toddler, say 18 months - about as big as a mid-sized dog - or, if she trips over, maybe a small dog or large cat? How about kangaroos - a driver has a chance to see them coming from the side before they cross. Maybe there will be predictive course analysis built in - I wouldn't bet on it, since they have trouble doing this even for planes and ships.
Another point - who gets to face court if the automated car does something wrong? The computer programmer? The car manufacturer? Or the driver who has to stay alert all the time with his hands on the controls and might as well be driving anyway?

pasley69 replied to velo-nh | 7 years ago
0 likes
velo-nh wrote:
Mungecrundle wrote:

Having experienced no fewer than 3 dangerous passes on the club ride this morning, I'd far sooner take my chances with software being in control.

How many crashes has Airbus had due to automation?  And that's without the complexities of being on a road in heavy traffic.

As a software engineer, I don't see self-driving cars being a thing anytime soon.  At least not safe ones, or ones that don't require a lot of human intervention.  Worse, this is just going to lead to drivers that will be incapable on the occasion where the self-driving doesn't work.  

Sounds great - are all cars on the road going to be separated by 90-second gaps? With assistant drivers? And traffic controllers in constant contact with drivers?

If the cyclist in question had swerved and been hit, who would be held responsible and taken to court? The driver or the computer programming team?

And if engineers are screening the results - I hope the team includes a few pedestrians, cyclists, mothers with young children, disabled pensioners. I rather suspect most highly paid engineers and computer programmers may not have the best interests of other road users in mind.

ConcordeCX replied to pasley69 | 7 years ago
0 likes
pasley69 wrote:

I assume driverless cars will stop to avoid collision with a person, or anything bigger than the car or truck being driven - e.g. elephant, cow, donkey - but what about smaller things? Goat, large dog, medium-sized dog, small dog, cat, bird, mouse, frog - where is the cutoff point? How about a young toddler, say 18 months - about as big as a mid-sized dog - or, if she trips over, maybe a small dog or large cat? How about kangaroos - a driver has a chance to see them coming from the side before they cross. Maybe there will be predictive course analysis built in - I wouldn't bet on it, since they have trouble doing this even for planes and ships. Another point - who gets to face court if the automated car does something wrong? The computer programmer? The car manufacturer? Or the driver who has to stay alert all the time with his hands on the controls and might as well be driving anyway?

Who faces court if your tumble dryer bursts into flames and burns down a block of flats?

Driverless cars won't have steering wheels etc.; that's the point of them. Everyone in the car will be a passenger.

As for things coming from the side, maybe the designers have thought of that and will put sensors there. Jeez. The driver has at best two eyes, two ears and one brain, which are collectively dealing with a whole lot more than just driving. Driverless cars don't need these limitations, and only have to think about driving, not hitting stuff, and getting out of the way of kangaroos. The reason planes and ships have problems with this is the dearth of Australian megafauna at sea and in the sky; Oxford Street is obviously different - there are unpredictable Antipodeans bouncing about all over the bloody place.

They will also be recording everything all the time for use in evidence. A law will be made to make it illegal deliberately to obstruct the highway, if such a law doesn't already exist, and the car will have you bang to rights.

RMurphy195 | 7 years ago
0 likes

An opportunity for a test case here perhaps - if the cyclist decides to report the close pass, who will be prosecuted - Nissan, or the person in the driving seat who did not take control?

Dnnnnnn replied to RMurphy195 | 7 years ago
0 likes
RMurphy195 wrote:

An opportunity for a test case here perhaps - if the cyclist decides to report the close pass, who will be prosecuted - Nissan, or the person in the driving seat who did not take control?

Surely that's a trick question... the answer to the question "who will be prosecuted?" is almost bound to be "no-one"!

ollieclark replied to FluffyKittenofTindalos | 7 years ago
0 likes
FluffyKittenofTindalos wrote:

I thought the argument that's been made is that excessive automation 'de-skills' the pilots so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error, it's that it leaves human pilots more prone to do so than they used to be.

It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.

Might be true, might not, dunno, but seems relevant to self-driving-cars, given that the people using them will probably be drunk or asleep or watching movies when they are suddenly called on to intervene.

Point being I don't think one can put much trust in self-driving cars that require the human being able to 'take over' when things get tricky. They are going to have to be able to cope on their own.

Absolutely. Who wants a driverless car that you still have to be ready to drive at a moment's notice? I want it to drive me home from the pub and then go and pick someone else up, like a taxi without the bad tempered driver.

The big advantage that cars have over aeroplanes though is that when they stop, they just sit there rather than plummeting out of the sky. So car autopilots can just slow and stop (hopefully safely*) if they can't work out what to do, no driver intervention required. In a plane, the pilot HAS to take over or everyone dies.

* There are some situations where there's going to be a crash no matter whether the driver's human or cyborg.
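
That "slow and stop" fallback is easy to caricature in code. A toy state machine in C - every name below is invented for illustration, and no real vehicle stack is anywhere near this simple:

#include <stdbool.h>

/* Hypothetical states for a car that stops when it is unsure. */
typedef enum { DRIVE, DEGRADED, STOPPED } VehicleState;

/* Placeholder hooks a real controller would wire to sensors and actuators;
   declared extern here purely to keep the sketch self-contained. */
extern double perception_confidence(void); /* 0.0 (lost) .. 1.0 (certain) */
extern double current_speed_mps(void);
extern void set_target_speed_mps(double v);
extern void hazard_lights(bool on);

VehicleState step(VehicleState state)
{
    switch (state) {
    case DRIVE:
        if (perception_confidence() < 0.8) {
            hazard_lights(true);   /* unsure what's ahead: start bailing out */
            return DEGRADED;
        }
        return DRIVE;
    case DEGRADED:
        set_target_speed_mps(0.0); /* bleed off speed rather than guess */
        if (current_speed_mps() < 0.1)
            return STOPPED;
        return DEGRADED;
    case STOPPED:
        return STOPPED;            /* a stopped car just sits there - unlike a plane */
    }
    return state;
}

The point of the comment above, in short: the safe state for a car is "parked at the roadside", and a plane in flight has no equivalent, which is why the pilot has to take over and the car's occupant needn't.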

CygnusX1 replied to J90 | 7 years ago
0 likes
J90 wrote:

Please stop highlighting plane crashes.

Nervous flyer?

CygnusX1 replied to velo-nh | 7 years ago
0 likes
velo-nh wrote:

The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist.  That type of over-dependence is what made me think of the airlines.

That's why most manufacturers are now looking at going straight to SAE Level 4 - high automation, with no driver intervention needed within the car's operating domain.

Quote:

Jim McBride, autonomous vehicles expert at Ford, [is] focused on getting Ford straight to Level 4, since Level 3, which involves transferring control from car to human, can often pose difficulties. "We're not going to ask the driver to instantaneously intervene—that's not a fair proposition," McBride said.

Good article on what the levels are and the current state of development (and it's where I've lifted the quote above from):

http://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/
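
For quick reference, the SAE J3016 levels that article walks through, paraphrased as a C enum:

/* SAE J3016 driving automation levels, descriptions paraphrased. */
enum SaeLevel {
    SAE_LEVEL_0, /* no automation: the human does everything */
    SAE_LEVEL_1, /* driver assistance: steering or speed assisted, not both */
    SAE_LEVEL_2, /* partial automation: steering and speed; human supervises */
    SAE_LEVEL_3, /* conditional: car drives, human must take over on request */
    SAE_LEVEL_4, /* high: no human fallback needed within a defined domain */
    SAE_LEVEL_5  /* full: no human fallback needed anywhere */
};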
