In an unavoidable collision involving a robotic driverless car, who should die? That’s the ethical question automobile companies are pondering as they develop the next generation of cars.
Stanford University researchers are helping the industry to devise a new ethical code for life-and-death scenarios.
According to Autonews, Dieter Zetsche, CEO of Daimler AG, posed the dilemma at a conference: “if an accident is really unavoidable, when the only choice is a collision with a small car or a large truck, driving into a ditch or into a wall, or to risk sideswiping the mother with a stroller or the 80-year-old grandparent. These open questions are industry issues, and we have to solve them in a joint effort.”
Google’s own self-driving car gives cyclists extra space if it spots them in the lane, a choice that theoretically puts the car’s occupants at greater risk of collision, but it does so anyway. That is an ethical choice.
“Whenever you get on the road, you’re making a trade-off between mobility and safety,” said Noah Goodall, a researcher at the University of Virginia.
“Driving always involves risk for various parties. And anytime you distribute risk among those parties, there’s an ethical decision there.”
Google’s car is constantly making decisions based on information and safety risks, looping through the following questions:
1. How much information would be gained by making this maneuver?
2. What’s the probability that something bad will happen?
3. How bad would that something be? In other words, what’s the “risk magnitude”?
In an example published in Google’s patent, says Autonews, “getting hit by the truck that’s blocking the self-driving car’s view has a risk magnitude of 5,000. Getting into a head-on crash with another car would be four times worse -- the risk magnitude is 20,000. And hitting a pedestrian is deemed 25 times worse, with a risk magnitude of 100,000.
“Google was merely using these numbers for the purpose of demonstrating how its algorithm works. However, it’s easy to imagine a hierarchy in which pedestrians, cyclists, animals, cars and inanimate objects are explicitly protected differently.”
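The weighting described above is easy to illustrate. Here is a minimal Python sketch of how the three questions might combine: the payoff of a maneuver is the information it would gain minus the expected harm, where expected harm is each bad outcome’s probability multiplied by its risk magnitude. The magnitudes are the illustrative figures from Google’s patent as reported by Autonews; everything else (the function, the probabilities, the information-gain number) is a hypothetical example, not Google’s actual algorithm.

# Illustrative risk magnitudes from the patent example reported by Autonews
RISK_MAGNITUDE = {
    "hit_blocking_truck": 5_000,
    "head_on_crash": 20_000,
    "hit_pedestrian": 100_000,
}

def score_maneuver(info_gain, outcome_probabilities):
    """Trade information gained against expected harm (hypothetical sketch).

    outcome_probabilities maps outcome names to estimated probabilities.
    Expected harm = sum(probability * risk magnitude); higher scores are better.
    """
    expected_harm = sum(
        prob * RISK_MAGNITUDE[outcome]
        for outcome, prob in outcome_probabilities.items()
    )
    return info_gain - expected_harm

# Example: edging out to see past the truck gains information, but carries
# a small chance of each bad outcome (all numbers invented for illustration).
print(score_maneuver(
    info_gain=150,
    outcome_probabilities={
        "hit_blocking_truck": 0.01,
        "head_on_crash": 0.002,
        "hit_pedestrian": 0.0001,
    },
))

On these made-up numbers the expected harm of edging out is 100, so the information gained outweighs the risk; scale up the pedestrian probability and the same sum quickly flips negative, which is exactly the kind of implicit hierarchy the Autonews piece describes.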
We recently reported how Google has released a new video showing how its self-driving car is being taught to cope with common road situations such as encounters with cyclists. We’d far rather share the road with a machine that’s this courteous and patient than with a lot of human drivers.
We’ve all been there. You need to turn across the traffic, but you’re not quite sure where, so you’re a bit hesitant, perhaps signalling too early and then changing your mind before finally finding the right spot.
Do this in a car and other drivers just tut a little. Do it on a bike and some bozo will be on the horn instantly and shouting at you when he gets past because you’ve delayed him by three-tenths of a nanosecond.
But not if the car’s being controlled by Google’s self-driving system. As you can see in this video, the computer that steers Google’s car can recognise a cyclist and knows to hold back when it sees a hand signal, and even to wait if the rider behaves hesitantly.