The National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla’s Autopilot system after a series of avoidable car accidents. Both the accidents and the investigation raise pressing questions about the use of partial driver-assistance features in vehicles.
NHTSA Investigates Tesla’s Autopilot System
After years of inaction, on August 13, the NHTSA announced that it had opened an investigation into crashes caused by Tesla’s Autopilot system. The scope of the investigation is narrow, though: it covers only 11 crashes since 2018 in which the Autopilot system drove the car into first responder vehicles parked at the scene of a crash or other incident.
Those accidents have resulted in 17 injuries and one fatality.
The announcement is in stark contrast to the NHTSA’s hands-off approach in years past. Tesla’s Autopilot system has been connected to numerous crashes since its release in 2014.
Criticism of Driver Attentiveness Monitoring is Misplaced
Much of the criticism that Tesla has faced over its Autopilot system has centered on its lax methods of ensuring that the driver is still paying attention and is ready to override the system.
Autopilot is a driver-assistance feature that can steer, accelerate, and brake without the driver’s input. It uses cameras and other sensors to detect road hazards. However, it does not make the car fully self-driving. The owner’s manual tells drivers to keep their hands on the wheel while Autopilot is activated so that they can step in if the system does something wrong. That instruction, though, is not enforced, and drivers have found numerous ways to get around it. Infamously, one Tesla driver was arrested for engaging Autopilot and then riding in the backseat of the vehicle.
The focus on Tesla’s poor driver monitoring, though, is based on the idea that a human driver would reliably be able to step in, override the Autopilot system, and prevent a car crash. This seems unlikely, at best.
Tesla drivers who have the Autopilot feature engaged are likely to assume that it has things under control. This feeling of security in the system grows stronger over time as they watch Autopilot successfully manage traffic. Even drivers who keep their hands on the wheel and their eyes on the road will come to trust the Autopilot feature. When it approaches a hazard that it will not avoid, like a parked emergency vehicle, even attentive drivers may not realize that they are about to crash until it is too late to do anything about it.
Car Accident Lawyers at the Smith Law Office Serve Victims in St. Joseph
Driver-assistance features like Autopilot are a big step toward self-driving vehicles. They are also imperfect, and expecting human drivers to step in and correct them in real time is naïve.
If you or a loved one has been hurt when a Tesla Autopilot system failed to avoid a car accident, call the personal injury attorneys at the Smith Law Office at (816) 875-9373 or contact them online.