With Autopilot, its driver-assistance system offering several levels of automation, electric car maker Tesla has so far made more negative headlines than positive ones. The list of accidents that are, or could potentially be, related to the system keeps growing. Some of them have been fatal for the car's occupants.
The Washington Post was the first to publish a detailed reconstruction of one such incident from 2022. It may have been the first fatal accident involving the highest level of assistance, the so-called Full Self-Driving mode, i.e. fully autonomous driving, which is still only available as a beta version.
The accident occurred in Evergreen, Colorado. According to the report, Hans von Ohain, a Tesla employee and the driver of the vehicle, and Erik Rossiter were returning home after playing golf when the car veered off the road, hit a tree and burst into flames.
The Washington Post cites an emergency call center recording in which the caller says von Ohain was using the Tesla's "self-driving feature" at the time, which caused the car to "simply drive off the road." Rossiter, who was in the passenger seat and survived the crash, confirmed that von Ohain was using Full Self-Driving. At this point, however, that cannot be verified, as the car's data records were completely destroyed in the fire. Tesla declined to answer questions about the incident, according to the Washington Post.
Von Ohain died in the crash, unable to get out of the burning car. The newspaper reported the cause of death as "smoke inhalation and thermal injuries." An autopsy also revealed that he had a blood alcohol level of 0.26 percent, more than three times the legal limit.
Courts take a close look at Tesla’s Autopilot
Tesla notes on its website that Autopilot is designed for “a careful driver who keeps their hands on the wheel and can take control at any time.” The report suggests that both human and technical error may have been at play in this case.
On the one hand, as the Washington Post writes, it is clear that von Ohain was in no condition to drive with such a blood alcohol level. On the other hand, the analysis of the accident showed that the car was not free of technical faults: rolling tire marks on the road suggest that the motor was still powering the wheels after the vehicle hit the tree. According to the lead investigator, the dynamics of the impact and the way the car left the road without any sign of a sudden maneuver clearly point to the active driver-assistance function being at fault.
The question of whether drivers or automakers are to blame for such accidents is increasingly being taken up by the courts. A California court recently considered a 2019 accident in which a Tesla Model 3 driving in Autopilot mode veered off the road, killing the driver and seriously injuring two passengers. The court found that the company bore no responsibility for the incident. According to the Washington Post, at least nine more such cases are set to go to trial this year.
“Elon Musk claims that the car can drive itself and that it is even better than a human,” the newspaper quotes the accident victim’s widow as saying. “We have been sold a false sense of security.”
Source: Der Spiegel