As we move closer to the day when robot cars drive themselves around the Bay Area, safety regulators in Europe are raising grave concerns about the personal-injury potential of at least one of those vehicles. Private Wealth magazine is reporting that officials in the German Transportation Ministry have called the Autopilot function on the most popular model from Tesla Motors Inc. a "considerable traffic hazard."
The comments come from an internal report obtained by the German magazine Der Spiegel, and follow a series of tests of the Autopilot function on the Tesla Model S. Among the findings: Autopilot failed to warn test drivers when the computer found itself in a situation it could not handle; the sensors did not reach far enough rearward during an overtaking maneuver; and the emergency braking was inadequate.
This negative report comes on the heels of the death of a Model S driver in a collision with a truck this past May.
Tesla says the Autopilot feature is a "driver's assistance system" that requires oversight by the user "at all times."
Automakers and technology companies like Google claim that self-driving vehicles will revolutionize traffic safety by virtually eliminating human error, and with it personal injuries, from our roadways. It remains unclear, however, whether automated robotic cars will merely trade the deaths and serious injuries caused by negligent and careless drivers for deadly crashes caused by on-board computers, sonar sensors, and other technology that fails to live up to its makers' promises.
Those are concerns for another day. For now, we remain at risk of serious injury from bad drivers, and must take legal action when we are in fact hurt in a car crash.