Not so smart

A Tesla Model S driven by Joshua Brown, who was killed when the sedan crashed while in self-driving mode. Photo / Supplied

TESLA Motors will not face a recall or fine as a result of a fatal crash involving its Autopilot system, but US safety regulators are warning auto manufacturers and drivers not to treat semi-autonomous cars as if they were fully self-driving.

The National Highway Traffic Safety Administration says it found that the system had no safety defects at the time of the May 7 crash in Florida, and that it was primarily designed to prevent rear-end collisions rather than other crash scenarios.

Bryan Thomas, the agency’s chief spokesman, said automated driving systems still required a driver’s full attention. He warned that automakers need to keep tabs on how drivers use the technology and should design vehicles “with the inattentive driver in mind.”

The probe began in June, nearly two months after a driver using Autopilot in a 2015 Tesla Model S died when it failed to spot a tractor-trailer crossing the car’s path on a highway in Williston, Florida, near Gainesville.

Tesla’s Autopilot uses cameras, radar and computers to detect objects and automatically brake if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company has said that before Autopilot can be used, drivers must acknowledge that it’s an “assist feature” that requires both hands on the wheel at all times and that drivers must be ready to take control.

Just about every automaker has, or is working on, similar systems as the industry moves rapidly toward self-driving cars.

The investigation “helps clarify that cars are still supposed to be driven by attentive people, and if people behind the wheel aren’t attentive, it’s not the technology’s fault,” said Karl Brauer, executive publisher of Kelley Blue Book. That would help avoid the stigma that the technology causes accidents, he said.

NHTSA released guidelines last year that attempt to ensure safety without slowing development of semi-autonomous and self-driving cars. The agency says self-driving features could dramatically reduce deaths by eliminating human error, which plays a role in 94 percent of fatalities.

Thomas said NHTSA wants to encourage innovation “to get the best answer to how we use these automated systems to the best effect and saving the most lives.”

In its probe, NHTSA evaluated how the system functions and looked into dozens of other crashes involving Teslas, including one on the Pennsylvania Turnpike that injured two people.

The Florida crash killed former Navy SEAL Joshua Brown, 40, of Canton, Ohio.

Tesla said at the time that the cameras on Brown’s Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and neither the car nor Brown applied brakes.

Thomas said Brown set the car’s cruise control at 74 mph — 9 mph over the limit — less than two minutes before the crash. NHTSA’s crash reconstruction showed the tractor-trailer should have been visible to Brown at least 7 seconds before impact, enough time to react.

Detecting vehicles that cross the car’s path was beyond the capabilities of the Autopilot system, Thomas said.

Tesla said it appreciated NHTSA’s thoroughness in reaching its conclusion.

When Tesla released Autopilot in 2015, some safety advocates questioned whether the Palo Alto, California, company and NHTSA had allowed the public access to the system before testing was finished.

Consumer Reports magazine called on Tesla to drop the “autopilot” name because it could give drivers too much trust in their car’s ability to drive itself.

In September, Tesla updated the Autopilot software to rely more on radar sensors and less on cameras. The update also disabled the automatic steering if drivers didn’t keep both hands on the wheel. — AP
