By rolling out self-driving technology to consumers more aggressively than its competitors, Tesla Motors secured a spot at the forefront of a coming industry. But that strategy could expose the company to a risk it has sought to avoid: liability in crashes.
Tesla in 2015 activated its autopilot mode, which automates steering, braking and lane switching. Tesla asserts the technology doesn't shift blame for accidents from the driver to the company. But Google, Zoox and other firms seeking to develop autonomous driving software say it's dangerous to expect people in the driver's seat to exercise any responsibility. Drivers get lulled into acting like passengers after a few minutes of the car doing most of the work, the companies say, so relying on them to suddenly brake when their cars fail to spot a hazard isn't a safe bet.
Such a concern could undermine Tesla, whose autopilot feature is central to a fatal-accident investigation launched last week by federal regulators. The National Highway Traffic Safety Administration is examining the role played by autopilot technology in a Florida collision between a Tesla Model S and a big rig. Tesla said autopilot sensors failed to detect the white truck turning in front of the Model S against a bright May sky; the crash killed 40-year-old Joshua Brown.
Were the victim's family to sue Tesla over an accident caused, or not avoided, by autopilot, one of several arguments they might make is that Tesla acted negligently by not doing what a reasonable manufacturer would do, said Stephen Nichols, an attorney in the Los Angeles office of law firm Polsinelli. The fact that others have developed similar technology but have chosen not to release it, or have branded it in ways that don't suggest automation, could leave Tesla vulnerable.

"You could say, 'Tesla, you're not doing what these other companies are doing, so you're being unreasonable,'" Nichols said.
Cases about defective product design typically hinge on whether a company sufficiently vetted its wares; in this situation, programming code that interacts with a number of components throughout the car. If the accident happened because the software was inadequate (because it couldn't spot the white vehicle on a light backdrop) and proper testing would have found the flaw, Tesla could be on the hook, said Jon Tisdale, a general partner in Gilbert, Kelly, Crowley & Jennett's Los Angeles office.

The competitive landscape bolsters his contention. "There's going to be the argument made that they are rushing to market to corner it before other manufacturers release the product, and that Tesla cut the testing short: 'they didn't do it right,'" said Tisdale, who mostly defends product liability cases.

Tesla's billionaire founder Elon Musk has said that autopilot mode is a voluntary feature, that drivers are warned of the risks and that testing it with the public makes it safer than if the company were to do it solely internally. And he's made clear since its release that drivers don't abdicate responsibility.

"The onus is on the pilot to make sure the autopilot is doing the right thing," he said in a televised interview in 2013. "We're not yet at the stage where you can go to sleep and wake up at your destination. We would have called it autonomous … if that were the case."

Consumer advocates say Tesla and other companies that insist on consumer culpability when a machine is in charge don't understand what's happening on the roads.

"On the one hand, they're saying trust us, we can drive better than you would, but on the other hand, they are saying if something goes wrong, don't ask us to stand behind our product," said Rosemary Shahan, president of the Consumers for Auto Reliability and Safety lobbying group.
"But if it's controlled by an algorithm, why should you be liable?"

Google, with its goal of producing a car that doesn't have a way for a human to take control, is one of the few companies with a different stance at the outset. The company says it would be responsible for accidents caused by its software (though how traffic tickets get handled is still unsettled).

Zoox, a Silicon Valley start-up that recently raised $200 million at a $1-billion valuation from investors, declined to comment about how it views the liability question. But the company also isn't planning to release technology that would require human intervention.

Shahan said holding companies accountable through lawsuits and regulation might stifle innovation, but it's a worthwhile tradeoff to get them to take more precautions.

"It's hard enough to not nod off when you are in control, let alone when you're in autopilot," she said. "We shouldn't trade one set of human error for another."

Brown's family has said through attorneys that they hope lessons from his crash "will trigger further innovation which enhances the safety of everyone on the roadways."

A decision on whether to file a lawsuit isn't likely until the federal inquiry is completed, and the family's focus remains on mourning, the attorneys said.