Following a series of crashes involving Tesla cars colliding with fire trucks and emergency vehicles, the National Highway Traffic Safety Administration (NHTSA) has ordered the automaker to hand over data about its Autopilot driver-assistance system.
In a letter released earlier this year, NHTSA told Tesla to produce documentation regarding:
- How its Autopilot system works.
- How Tesla ensures drivers pay attention to the road.
- Whether there are limits to when Autopilot can be turned on.
- Records of vehicle sales, complaints involving Autopilot, and litigation involving Autopilot crashes.
The request comes as regulators continue to investigate a series of crashes in which Tesla vehicles failed to detect stopped emergency vehicles with flashing lights and other roadway hazards, including at least eight crashes that resulted in 10 deaths. Tesla has until October 22, 2021 to produce the information and faces fines of up to $115 million if it fails to comply.
While the investigation is ongoing, it serves as an important reminder to Tesla owners that driver-assistance systems like Autopilot do not excuse them from their obligation to pay attention to the road and operate their vehicles safely.
Autopilot & Personal Injury Lawsuits
Though Tesla and other automakers are quickly pushing vehicles with sophisticated self-driving software onto public roads, autonomous vehicle technology is still in its infancy.
According to NHTSA, vehicles with automated driving systems (ADS) capable of handling the entire driving task will one day become a reality. Currently, however, most vehicles with purported “self-driving” technology use advanced systems that assist drivers only with certain tasks. Examples include technology that:
- Helps drivers avoid lane drifting and unsafe lane changes.
- Warns drivers of vehicles or objects when backing up.
- Assists drivers in braking automatically when vehicles ahead stop or slow.
NHTSA notes that no vehicle currently available for sale is truly “self-driving.” Every vehicle in the U.S., the agency states, still “requires the full attention of the driver at all times for safe operation.”
For victims injured in collisions involving vehicles with “self-driving” or driver-assist technology, this means there are still grounds to pursue claims against drivers who failed to operate their vehicles safely, especially if it can be shown that those drivers were distracted, intoxicated, or otherwise negligent.
In some cases, accident victims and families of those who died in fatal crashes may have grounds to pursue personal injury or wrongful death actions against automakers that produce vehicles with faulty technology. In cases involving Tesla Autopilot crashes, for example, these claims typically allege a lack of effective safeguards to ensure drivers keep their eyes on the road and hands on the wheel while using the system, as well as a failure to limit the use of self-driving technology in certain areas, such as local roads. As with any car accident case, determining who can be held liable for damages is a task that demands meticulous investigation.