“Self-driving” Tesla crash driver won’t sue – but Chubb unit might

by Dana Hull

A Texas man said the Autopilot mode on his Tesla Model S sent him off the road and into a guardrail, bloodying his nose and shaking his confidence in the technology. He doesn’t plan to sue the electric-car maker, but his insurance company might.

Mark Molthan, the driver, readily admits that he was not paying full attention. Trusting that Autopilot could handle the route as it had done before, he reached into the glove box to get a cloth and was cleaning the dashboard seconds before the collision, he said. The car failed to navigate a bend on Highway 175 in rural Kaufman, Texas, and struck a cable guardrail multiple times, according to the police report of the Aug. 7 crash.

“I used Autopilot all the time on that stretch of the highway,” Molthan, 44, said in a phone interview. “But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn’t stop -- it actually continued to accelerate after the first impact into the guardrail.”

Molthan’s experience -- while not as serious as a fatal crash that federal regulators are investigating -- still highlights the challenges ahead in determining who is to blame when semi-autonomous vehicles are involved in accidents. Insurance claims involving Tesla’s Autopilot are largely uncharted territory, in part because driver behavior is still a contributing factor.

Cozen O’Connor, the law firm that represents Molthan’s auto-insurance carrier, a unit of Chubb Ltd., said it sent Tesla Motors Inc. a notice letter requesting joint inspection of the vehicle, which has been deemed a total loss. Tesla said it’s looking into the Texas crash. Tesla stresses that Autopilot is only an assist feature -- that drivers need to keep their hands on the wheel and be prepared to take over at any time.

Fresh Focus
Tesla’s driver-assistance features, which the company calls Autopilot, have been in the spotlight in the wake of a fatal crash in Florida on May 7. Probes by the National Highway Traffic Safety Administration and the National Transportation Safety Board of the Florida crash are ongoing. After non-fatal accidents in Montana and Pennsylvania, Consumer Reports called on Tesla to require drivers to keep their hands on the steering wheel and to change the feature’s name to avoid confusion.

Scrutiny around Autopilot is heightened in part because the federal government is drafting guidelines, expected to be released this summer, for automakers racing to bring fully self-driving cars to market. Ford Motor Co., announcing plans this week to produce a fully autonomous vehicle for use by ride-hailing services, said it would avoid adding incremental technologies because they leave the driver too detached -- in “no-man’s land” -- to take over in a dangerous situation.

While Ford and Alphabet Inc.’s Google espouse an all-or-nothing approach, Tesla has introduced driver-assist technology in “beta” form for continuous improvement and frequent over-the-air software updates. Tesla’s website stresses that active sensors, GPS and high-resolution digital maps help the vehicle to stay within lanes, and that “real time feedback from the Tesla fleet ensures the system is continually learning and improving upon itself.”

Automakers including General Motors Co., Honda Motor Co. and Daimler AG have also pushed to add features that take over some of the work but require the driver to remain responsible.

Safety First
About 35,200 people were killed in U.S. auto accidents in 2015, according to NHTSA. The overwhelming majority of vehicle accidents -- 94 percent -- are due to human error. Safety regulators want to improve human behavior while promoting technology that will protect people in crashes and help prevent crashes from occurring in the first place.

Fans of Tesla’s Autopilot lament that there is no database of lives saved or accidents avoided by the technology.

“I’m disgusted that the only time Autopilot is in the news is when there are crashes,” said Diana Becker, 55, of Los Angeles, in a phone interview. “Nobody hears about the accidents that don’t happen.”

Becker recently completed a 27-day road trip throughout the West with her two children. She credits the Autopilot in her Model X with saving her family from colliding with a driver who crossed suddenly in front of them.

“I drove 400 miles a day on our road trip, and Autopilot was my second pair of eyes,” said Becker. “I depend on it.”

A Missouri man who suffered a pulmonary embolism last month relied on Tesla’s Autopilot to help him drive at least 20 miles to the nearest hospital, Slate reported.

Molthan, the Texas driver, also owns a Model X sport utility vehicle. He said he’s a big fan of Palo Alto, California-based Tesla and Chief Executive Officer Elon Musk, but his next car won’t be another Model S.

Copyright 2016 Bloomberg
