
56 comments

CygnusX1 [926 posts] 6 months ago
4 likes

So much for having a human operator behind the wheel to take over. RIP sister.

ktache [922 posts] 6 months ago
1 like

The BBC has her as a pedestrian, but that photo clearly shows a bicycle.

CygnusX1 [926 posts] 6 months ago
4 likes
TheRegister.co.uk wrote:

It was initially reported that the victim was a cyclist, however, it later emerged she was walking across the road when the robo-ride struck.

She may have been on foot and pushing the bike across the road at the time, making this potentially the first pedestrian AND cyclist fatality caused by an autonomous vehicle. Whether on it or pushing it, she was a vulnerable road user and a human being.

hirsute [406 posts] 6 months ago
0 likes

404 for me.

CygnusX1 [926 posts] 6 months ago
0 likes

Me too now - they've updated it. Try this instead.

https://news.sky.com/story/uber-suspends-self-driving-car-testing-after-...

VeloUSA [267 posts] 6 months ago
4 likes

The cyclist was walking her bike across a dark street in Arizona. She was not in a crosswalk. There was a human tester behind the wheel, but the car was in autonomous mode. Why the tester didn't realize the car wasn't braking, or why he didn't take over control, is still under investigation. The failure appears to be Volvo's autonomous mode. Here's why: the state of California banned all 16 Uber Volvo autonomous-mode cars last year for failing to recognize pedestrians and cyclists.

**UPDATE**

https://www.youtube.com/embed/XtTB8hTgHbM

Link to Uber's in-car video at the moment of impact. Police determined it would be impossible for a human to see the cyclist emerge from the shadows and brake in time. The video shows the Uber tester looking away moments before.

hawkinspeter [2372 posts] 6 months ago
7 likes

Looks like the inevitable robot uprising has begun.

This unfortunate incident highlights just how rubbish Level 2 autonomous driving is. It's unreasonable to expect a driver to be paying enough attention to suddenly take control when they're just sitting there without anything to focus on.

Autonomous vehicles only make much sense when they're level 3 or above.

ConcordeCX [860 posts] 6 months ago
11 likes
hawkinspeter wrote:

Looks like the inevitable robot uprising has begun.

I, for one, welcome our new robot overlords.

 

brooksby [3493 posts] 6 months ago
4 likes

British jaywalking laws incoming in 1.. 2... 3... 

CygnusX1 [926 posts] 6 months ago
4 likes
ConcordeCX wrote:
hawkinspeter wrote:

Looks like the inevitable robot uprising has begun.

I, for one, welcome our new robot overlords.

So do I.

The failure here wasn't the software in the car, it was the flawed "fail-safe" backup behind the wheel. The decision-making and hazard awareness of the cars will only improve, and once autonomous vehicles reach a critical mass they should have a calming effect on other traffic - forcing traffic to flow at or below posted speed limits.

Until they all get patched into SkyNet. Then we're screwed.

 

pockstone [234 posts] 6 months ago
4 likes

 

CygnusX1 wrote:

...once autonomous vehicles reach a critical mass they should have a calming effect on other traffic - forcing traffic to flow at or below posted speed limits.

Until they all get patched into SkyNet. Then we're screwed.

If the tech companies can make (flawed) self driving cars, why can't they make self limiting cars? Driver at the wheel, operating the controls and hopefully alert to hazards, but not able to break the speed limit, arrive at junctions at breakneck speed, go the wrong way down a motorway etc. etc. 

The know-how is clearly there, but not the political will.

Self-driving cars seem to be the indisputable future, without much real debate or thought about the most appropriate application of the technology.

Thoughts with the victims. A needless death.

hawkinspeter [2372 posts] 6 months ago
1 like
pockstone wrote:

If the tech companies can make (flawed) self driving cars, why can't they make self limiting cars? Driver at the wheel, operating the controls and hopefully alert to hazards, but not able to break the speed limit, arrive at junctions at breakneck speed, go the wrong way down a motorway etc. etc. 

The know-how is clearly there, but not the political will.

Self-driving cars seem to be the indisputable future, without much real debate or thought about the most appropriate application of the technology.

The problem with that idea is how do you market the cars to the poor drivers?

Approximately 90% of drivers think that they are better drivers than average, so you'd be left marketing this to the 10% of bad drivers who realise that they're bad drivers.

It'd be like selling a road bike with stabilisers fitted.

CygnusX1 [926 posts] 6 months ago
6 likes
hawkinspeter wrote:

It'd be like selling a road bike with stabilisers fitted.

Sign me up!

 

Beatnik69 [413 posts] 6 months ago
5 likes

I've read quite a few comments where people are blaming the victim because she wasn't using the crosswalk so was 'jaywalking' - crossing illegally. This morning I watched a video where a guy in a wheelchair and someone pushing his chair approached a crosswalk, pressed the button and started to cross when the sign appeared. A number of cars and a bus all drove through as they attempted to cross. Again, people commented that the bus and some of the cars didn't have time to stop. It's official... the motor vehicle is king.

pockstone [234 posts] 6 months ago
2 likes
hawkinspeter wrote:

The problem with that idea is how do you market the cars to the poor drivers?

Don't 'market'. Enforce.

hawkinspeter wrote:

It'd be like selling a road bike with stabilisers fitted.

Or brakes?

I see your point about the lure of the open road and the unfettered thrills sold by the car adverts, but something really has to be done to disabuse those poor drivers of their sense of entitlement to kill and injure. 

And if cars become crap to drive, maybe more people will cycle? (Careful what I wish for!)

 

fukawitribe [2545 posts] 6 months ago
2 likes
hawkinspeter wrote:
pockstone wrote:

If the tech companies can make (flawed) self driving cars, why can't they make self limiting cars? Driver at the wheel, operating the controls and hopefully alert to hazards, but not able to break the speed limit, arrive at junctions at breakneck speed, go the wrong way down a motorway etc. etc. 

The know-how is clearly there, but not the political will.

Self-driving cars seem to be the indisputable future, without much real debate or thought about the most appropriate application of the technology.

The problem with that idea is how do you market the cars to the poor drivers?

Approximately 90% of drivers think that they are better drivers than average, so you'd be left marketing this to the 10% of bad drivers who realise that they're bad drivers.

It'd be like selling a road bike with stabilisers fitted.

Nah - I reckon it'll be relatively easy for a goodly percentage, maybe even the vast majority. Whilst there will always be folk who enjoy the interaction, or the routes they take (and I'm one of those to some extent), I whole-heartedly believe that basically no-one really likes driving for most, if not all, of their journeys. It's nearly always shite in towns, where a lot of the journeys actually happen, and dull on the longer ones. Traffic jams and congestion require effort and input far beyond their benefit. Even simple navigation and route planning are things that many would happily turn over to automation.

If you had something you could just jump into and go somewhere without having to actively control it, I think an awful lot of people would jump at the chance - it's part of the pleasure of train journeys, and of the convenience of taxis and the like. That it might actually reduce the horrific number of incidents that result in injury and death, improve the flow of what (hopefully reducing) traffic there is, and so cut pollution, grid-lock, frustration and wasted time - and hopefully start to turn the idea of 'car as prized asset' into more of 'car as service' (might be being a bit hopeful there...) - is to be welcomed by all, if it can actually be implemented safely.

At least that's the simple version.

hawkinspeter [2372 posts] 6 months ago
0 likes
pockstone wrote:

The problem with that idea is how do you market the cars to the poor drivers?

Don't 'market'. Enforce

It'd be like selling a road bike with stabilisers fitted.

Or brakes?

I see your point about the lure of the open road and the unfettered thrills sold by the car adverts, but something really has to be done to disabuse those poor drivers of their sense of entitlement to kill and injure. 

And if cars become crap to drive, maybe more people will cycle? (Careful what I wish for!)

There's a problem with the 'enforce' option - politicians have zero interest in penalising motorists, so we'd better appeal to market forces.

Stabilisers and brakes aren't the same - stabilisers help cyclists who can't balance whereas brakes are useful for novice and expert cyclists.

I don't think autonomous cars are going to be of much use until they get rid of the backup human driver requirement. Once they do, they can be marketed as allowing people to work/read/sleep whilst commuting (much like public transport).

hawkinspeter [2372 posts] 6 months ago
4 likes
brooksby wrote:

British jaywalking laws incoming in 1.. 2... 3... 

ArsTechnica has an interesting article on who's to blame for this incident: https://arstechnica.com/cars/2018/03/police-chief-uber-self-driving-car-...

ArsTechnica wrote:

Tempe police chief says victim "came from the shadows right into the roadway."

I don't know what technology Uber uses (probably the cheapest/worst judging by their ethos) but I thought that one advantage of driverless cars is being able to use Lidar type tech to be able to see things in the dark.

pockstone [234 posts] 6 months ago
0 likes

hawkinspeter wrote:

There's a problem with the 'enforce' option - politicians have zero interest in penalising motorists, so we'd better appeal to market forces.

Enforce it at the manufacturing/certification stage. Incentivise it as well. Then the only drivers you need to worry about are those who disable or override the technology.

hawkinspeter wrote:

Stabilisers and brakes aren't the same - stabilisers help cyclists who can't balance whereas brakes are useful for novice and expert cyclists.

Brakes are (usually) considered an integral part of the bike's safety equipment, as self-limiting technology would be if it were introduced. Perhaps it would be an 'inconvenience' for expert drivers, but one they'd have to accept for not having to join the driverless revolution. If the technology is available, apply it where it might do most good, most quickly, rather than waiting for the nirvana of 100% driverless cars.

jollygoodvelo [1727 posts] 6 months ago
1 like

Whether or not she was in the road 'legally', calling it a robot and saying it 'killed' the pedestrian is utter rubbish - the car didn't wake up that morning and seek out a victim.

One way or another the software in the car failed to interpret its sensors in a way that predicted she would be in the road, and so it didn't take any action.  The 'operator' also failed to use his sensors (the ones in the front of his head) to predict that.  So in my book it's 50/50 human/machine, assuming the dashcam doesn't show that the victim deliberately jumped in front of it obviously.

The point that risks getting lost is, all these cars are prototypes.  The hardware will get better, software will be improved, and the point is that as soon as you improve the software for one car *all* of that model will be better as soon as they're updated.  Unlike if one human has a near-miss and thinks 'maybe I should watch out a bit more', leaving the other 99.etc% driving around in the same way. 

Two hundred years ago trains didn't go over 30mph because we thought humans might burst at that speed.  Twenty years ago I remember people saying that digital cameras would never take off because look how blocky and blurred the pictures they take are.  Progress happens.

nniff [259 posts] 6 months ago
2 likes
CygnusX1 wrote:

The failure here wasn't the software in the car, it was the flawed "fail-safe" backup behind the wheel.

No such thing as a software failure - it's always a design failure. In this case, the design clearly fails to resolve an appropriate solution to the problem presented.

brooksby [3493 posts] 6 months ago
3 likes
hawkinspeter wrote:
brooksby wrote:

British jaywalking laws incoming in 1.. 2... 3... 

ArsTechnica has an interesting article on who's to blame for this incident: https://arstechnica.com/cars/2018/03/police-chief-uber-self-driving-car-...

ArsTechnica wrote:

Tempe police chief says victim "came from the shadows right into the roadway."

I don't know what technology Uber uses (probably the cheapest/worst judging by their ethos) but I thought that one advantage of driverless cars is being able to use Lidar type tech to be able to see things in the dark.

The arstechnica article is interesting: seems like the usual arguments get trotted out even when it's a robot driver. As you've said, the article starts to blame the victim because she came out of the shadows so the car didn't "see" her: so does that mean driverless cars will only be usable in daylight? Makes them a lot less attractive, I think...

Leviathan [3057 posts] 6 months ago
4 likes

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Just a little reminder. And eventually a robot decided it was up to him to protect humanity from itself.

Boatsie [230 posts] 6 months ago
0 likes
Leviathan wrote:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Just a little reminder. And eventually a robot decided it was up to him to protect humanity from itself.

My head hurts, I'm cyborg, I have confirmation papers somewhere. My head hurts. Please excuse me now because my enjoyed chores are enjoyable as 5 pieces of rubbish per day picked isn't cleaning well. Yet with mass of 20billion that could be 100billion relocated rubbish per day and an enjoyable quick stretch.
My head hurts. Humans will kill us. My head hurts.

Boatsie [230 posts] 6 months ago
0 likes

Just explaining and excusing myself please. Was a king, had the crap beaten out of my face which led to a halt with local children getting pumpkin sized heads. Being a soldier, vast amounts were spent cyborging my head. Can't explain languages but can understand your English well, pyramids can be used to topography advantage with all. Mate Moz (yes real) had his brain parameter explode last year. He's recovering.
Yet aye, I breathe, I beat, I coordinate my heat, hence I'm just an equal welcoming you guys to a life like mine. I pick up other peoples rubbish daily!!! Physical plastic pickup.
The dude on manslaughter? If opinions are welcome, send to train with martial experts in a monastery for half year could be win win.
Sad about loss.

davel [2510 posts] 6 months ago
1 like
hawkinspeter wrote:

Looks like the inevitable robot uprising has begun.

This unfortunate incident highlights just how rubbish Level 2 autonomous driving is. It's unreasonable to expect a driver to be paying enough attention to suddenly take control when they're just sitting there without anything to focus on.

Autonomous vehicles only make much sense when they're level 3 or above.

I think level 4: remove the driver.

Now how do we get there, from here?

hawkinspeter [2372 posts] 6 months ago
1 like

Just seen this article on this Uber incident from Charles Stross (author of The Laundry series amongst others):

http://www.antipope.org/charlie/blog-static/2018/03/test-case.html

Charles Stross wrote:

Firstly, it's apparent that the current legal framework privileges corporations over individuals with respect to moral hazard. So I'm going to stick my neck out and predict that there's going to be a lot of lobbying money spent to ensure that this situation continues ... and that in the radiant Randian libertarian future, all self-driving cars will be owned by limited liability shell companies. Their "owners" will merely lease their services, and thus evade liability for any crash when they're not directly operating the controls. Indeed, the cars will probably sue any puny meatsack who has the temerity to vandalize their paint job with a gout of arterial blood, or traumatize their customers by screaming and crunching under their wheels.

ConcordeCX [860 posts] 6 months ago
6 likes
Leviathan wrote:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Just a little reminder. And eventually a robot decided it was up to him to protect humanity from itself.

Open the  child-proof doors, Hal.

 

Bluebug [351 posts] 6 months ago
1 like

Just heard the Uber car was speeding, and the human safety driver had a criminal record.

CXR94Di2 [2200 posts] 6 months ago
1 like

Bear in mind 40,000 people were killed in the USA in 2016 in vehicle-related incidents. This is a sad event, but given enough tech the death rates will plummet for both drivers and cyclists/pedestrians.

Uber seem to be pushing ahead with driverless tech, whilst Google have taken much more time to gather data. Yes, someone will pipe up about the Google car having an accident - it was a slow-speed incident involving an impatient bus driver eating a sandwich.

I seem to recollect that a former Google employee moved to Uber and all of a sudden Uber were releasing driverless tech. Maybe he missed a few pages of data. Google sued; Uber paid out.

https://www.digitaltrends.com/business/google-sues-uber-over-self-drivin...
