Autonomous vehicles may be slated as the future of our roads, but concerns about new Tesla software are raising questions about how much control humans will have over self-driving technology and whether they’ll be able to use it to endanger others.
As reported by Streetsblog NYC, the software in question is Tesla’s “Full Self Driving Mode” beta. Released in October 2021, it included a new feature that allows drivers to select one of three custom driving profiles – “chill,” “average,” or “assertive” – which determine how aggressive the vehicle is when deploying autonomous features for things like stops, yellow lights, lane changes, and following distance.
Although it was initially released without much attention, an article published by The Verge in January 2022 revealed that the software’s “assertive” mode resulted in vehicles taking several risky actions. These included tailgating, performing unsafe passing maneuvers, and rolling through certain stops – all of which are illegal in most U.S. states.
The article was a major revelation for safety advocates who argue that Tesla shouldn’t be able to program its vehicles to break laws, even if there are minor variances from jurisdiction to jurisdiction.
And while Tesla fans argue that “assertive” and “average” modes may program vehicles to make rolling stops only at “optional stops,” like when pulling out of a parking lot, the reality is that stops are often not optional in those locations. Moreover, rolling stops can be extremely dangerous. According to the National Safety Council, vehicle accidents in parking lots and parking garages cause an estimated 500 deaths and 60,000 injuries every year, mostly involving pedestrians.
Ultimately, the debate over Tesla’s Full Self Driving Mode may be just another iteration of the age-old debate over rolling stops and similar driving behaviors. Though some may say rolling stops are okay in certain locations or when “no one is around,” many others – including the experts – agree that it’s always a bad idea.
That’s certainly the case for standard vehicles, whose drivers can easily overlook people, vehicles, or hazards. But it’s also the case for self-driving cars, as there’s no guarantee a vehicle’s technology will be able to spot and react to sudden hazards in time to prevent a crash, or that the technology will function properly all the time. That’s a pertinent concern at a time when autonomous technology is still working through its bugs.
The Perecman Firm, P.L.L.C. has been fighting for victims of motor vehicle accidents and negligence since 1983. We’re always tracking developments in autonomous vehicle technology and the law so we can provide our clients with the most up-to-date representation possible.