According to Stanford University robotics researcher Heather Knight, Tesla’s semi-autonomous driving technology displays a frightening inability to recognise cyclists. After testing the feature with a colleague, Knight recommended that it never be activated around cyclists, as she believes people would otherwise be killed.
Knight and her colleague Dylan Moore are part of a research group in Stanford University’s Department of Mechanical Engineering. Fortune reports that Knight, who posted the review to Medium, holds a PhD from the Robotics Institute at Carnegie Mellon University and is currently doing post-doctoral research in social robotics.
When activated, Tesla’s Autopilot feature speeds the car up or slows it down based on what is in front of it, keeping it in its lane and following the turns of the road. Tesla makes clear to drivers that the system is not fully autonomous and that they should keep their hands on the steering wheel and pay attention at all times.
Knight is concerned that some will ignore the system’s limitations and put cyclists’ lives at risk. She writes that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”
Tesla’s Situation Awareness Display helps the human driver build a mental model of what the car sees. Knight gave this feature an A+ rating on the grounds that “it helps the driver understand shortcomings of the car, i.e., its perception sucks.”
She adds: “I’d estimate that Autopilot classified ~30% of other cars, and 1% of bicyclists. Not being able to classify objects doesn’t mean the tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”
She concludes her review with a warning: “Do not treat this system as a prime time autonomous car. If you forget that… bikers will die.”