Tesla Autopilot Lawsuits

Investigations | News
July 7, 2023

When you purchase a Tesla, it comes with the standard Autopilot system. You have the option to upgrade to the Full Self-Driving (FSD) package. Despite what the names suggest, neither feature removes the need for a driver. Both are intended as driver-assist features.

While these systems may help you drive in some circumstances, that assistance appears to come at a cost. According to a recent NHTSA report, driver-assist features in Tesla vehicles have been at least partially responsible for more than 700 accidents since 2019.

By comparison, the manufacturer with the next-highest total, Subaru, had driver-assist systems linked to only 23 accidents during the same period.

Is Tesla Autopilot Safe?

The evidence suggests that these features are not safe, especially compared to the driver-assist features of other manufacturers. Why are Tesla Autopilot features involved in so many more accidents than driver-assist features in other vehicles? One possible cause is marketing.

Tesla actively pushes new owners to add FSD, and its marketing downplays the fact that it is only a driver-assist feature. Even the name suggests that drivers can zone out and let the car drive itself. This marketing has already led to Tesla being fined, sued, and criminally investigated over misleading self-driving claims.

Another possibility is that the Autopilot software simply isn’t well designed. There are at least a few signs that this might be the case. In February 2022, nearly 54,000 Tesla vehicles with the FSD feature were recalled over what was described as a software glitch. This defect caused Teslas to roll through stop signs.

If other portions of the software are similarly defective, the FSD feature may not be safe enough for use, even while a driver is monitoring road conditions. And it definitely wouldn’t be safe enough for drivers who are sleeping or otherwise not paying attention to the road.

Yet another possible reason for these accidents is that Tesla deactivated the radar sensors in its vehicles in 2021. These sensors were designed to help prevent collisions, so it isn’t clear why they were turned off. The company claims it did so to make way for better technology, but that technology has yet to be fully implemented.

Regardless of the intention, turning off those sensors decreases the ability of self-driving features to identify hazards. This puts drivers, passengers, and others at risk. This is the type of behavior that may lead to a Tesla Autopilot lawsuit.

How Dangerous Are Tesla Self-Driving Features?

Any car accident is dangerous for the people in the vehicle. However, accidents caused by Tesla Autopilot features are more likely to be fatal than other accidents.

Roughly one of every 200 accidents in the U.S. results in a fatality. The rate of Tesla Autopilot deaths is close to one in 40. This puts Tesla’s self-driving features on par with motorcycles in terms of fatalities.

While some Tesla vehicles have been recalled, investigations into the danger of these features are not yet complete. For example, the NHTSA is still investigating problems with the software that led to Teslas striking emergency vehicles on the edge of the road. When these investigations are complete, more Teslas may be recalled.

What to Do if You Own a Tesla

Even though Tesla Autopilot is standard in every vehicle, you don’t have to use it. If you are concerned that the feature will cause an accident, turn it off. You aren’t required to use driver-assist features.

However, turning a feature off doesn’t guarantee it will cease functioning. One of the downsides of modern vehicles is that you don’t have complete control over their software. If Tesla Autopilot takes control of your vehicle when you don’t want it to, you should contact an experienced product liability attorney at McEldrew Purtell immediately.

What to Do After a Tesla Autopilot Crash

Even if you don’t own a Tesla, you could still be a victim of these features. If you get into an accident with a Tesla, you should consult a car accident lawyer as soon as possible. There is a chance the other driver was using a self-driving feature, but you will likely need a lawyer’s help to uncover that information.

If they were using a self-driving feature, that might affect liability in the accident. It may also mean you have the option to file a claim against Tesla as well as the driver of the vehicle. Your product liability attorney will investigate your case and determine your options.

Autopilot Is a Misleading Term

Both “autopilot” and “full self-driving” suggest that the vehicle can drive itself without any supervision from a driver. However, Tesla and the government agencies that regulate these vehicles publicly state the opposite: these are driver-assist features, and they can only be used safely by a licensed driver who is paying attention to the road.

If you own a Tesla, use these features responsibly to avoid being liable for an accident. And if you see someone driving a Tesla who appears to be ignoring the road and relying on these features, give them plenty of leeway and alert the authorities when it is safe to do so. The last thing you want is to end up in a crash caused by someone else’s driver-assist feature.

If you or someone you know was injured or killed as a result of using Tesla’s Autopilot system, contact us to discuss your legal options. You may be entitled to financial compensation.