In a recent interview with the American Association for Justice’s (AAJ) Trial Magazine, an Arias Sanguinetti founding partner shared her extensive knowledge of autonomous vehicles, which are transforming the legal and automotive landscapes.
A Los Angeles personal injury lawyer and an industry leader with years of experience in autonomous technology, Elise Sanguinetti offers a deep dive into the challenges, risks, and evolving legal questions associated with these innovations.
Understanding the Evolution of Autonomous Vehicles
As autonomous vehicle technology rapidly advances, it’s essential to understand the various levels of autonomy that are already on the road.
Elise explains that the Society of Automotive Engineers classifies vehicles into levels ranging from 0 (no autonomy) to 5 (fully autonomous).
Many drivers are already familiar with the driver-assistance systems found in modern cars, such as accident avoidance features, automatic braking, and lane-keeping assistance. These systems, which fall under Levels 1 and 2, are basic forms of assistance designed to support the driver without completely taking over.
However, higher-level systems — like Tesla’s Autopilot, Ford’s BlueCruise, and Cadillac’s Super Cruise — are pushing the boundaries of what’s possible. These systems go beyond mere assistance, allowing the vehicle to take partial control, but the driver is still expected to remain alert and ready to intervene if necessary.
Levels 3 through 5 cover vehicles that operate with progressively greater autonomy. While the shift toward fully autonomous vehicles may eventually eliminate the need for a human driver, it also brings unique challenges, both in the technology itself and in how it is regulated.
The Risks of Driver-Assistance Systems
One critical issue Elise highlights is the risk posed by overreliance on driver-assistance systems. While these technologies are designed to improve safety and convenience, they can lead to dangerous situations if drivers become disengaged or overly reliant on them.
For example, phantom braking — where a vehicle suddenly brakes due to a false reading from its sensors — can lead to rear-end collisions. Similarly, lane-assist features can sometimes cause the vehicle to overcorrect, leading drivers to react in ways that increase the risk of an accident.
In her practice, Elise has seen firsthand how these systems can fail, resulting in accidents that could have been avoided with better driver engagement or more reliable technology. The combination of human oversight and automated control can be dangerous, especially when the driver fails to stay alert or intervene at a critical moment.
How Autonomous Vehicle Manufacturers Can Improve Safety
Elise also delves into the responsibility of manufacturers to design systems that keep drivers engaged and ready to take control when necessary. While some vehicles are equipped with interior cameras to monitor driver attention, she points out that these systems are not always effective.
Manufacturers must ensure that their cars include clear and actionable reminders to keep drivers alert, which Elise believes is currently lacking in many vehicles on the market.
Regulators are starting to take action in response to these concerns. For example, in December 2023, the National Highway Traffic Safety Administration (NHTSA) prompted Tesla to issue a recall over inadequate driver monitoring in its Autopilot system.
Despite these measures, Elise stresses that more needs to be done to ensure that autonomous vehicle technologies live up to their promises of safety.
Proving Liability in Autonomous Vehicle Accidents
Proving liability in accidents involving fully autonomous vehicles is highly complex. The vast amount of data generated by these vehicles — ranging from video footage to detailed sensor readings — can provide insights into what went wrong.
However, accessing this information can be difficult, especially when manufacturers are reluctant to share it.
Elise is currently representing a client who was involved in a crash with a fully autonomous vehicle. She notes that the challenge often lies not in proving fault but in obtaining and making sense of the vast amounts of data generated during the accident.