Transportation law is rapidly evolving with the advent of automation in the trucking sector. With 20 states allowing some form of automation, each with its unique set of rules and no overarching federal regulations, navigating this new terrain can be challenging for trucking companies. High-profile accidents involving automated vehicles (AVs) have garnered media attention, further complicating the issue. The Federal Motor Carrier Safety Administration (FMCSA) is moving toward enacting new regulations for the highest levels of automated driving systems (ADS). However, the FMCSA is not conducting further rulemaking for lower-level ADS involving a safety driver in a commercial motor vehicle (CMV). The present Federal Motor Carrier Safety Regulations (FMCSR) will continue to apply in those situations.
A review and assessment of the laws adopted by New York, Pennsylvania, Arizona, and Texas reveals a spectrum of state-level oversight, with states moving rapidly toward full adoption of ADS in the continued absence of federal regulation of CMVs using ADS. We assess the risks of CMV ADS and the requirements of these state regulatory schemes. We also propose best practices for CMV carriers and risk managers seeking to reduce risk and prepare for the next era of litigation involving full and driver-assisted ADS, as carriers pursue the benefits of ADS, including probable reductions in equipment downtime and relief from driver shortages.
Adoption of CMV ADS and Early Indications of Risk
ADS operates the vehicle using sensors, cameras, LiDAR (light detection and ranging), Global Positioning System (GPS) navigation, and mapping software to control driving actions. SAE International (formerly the Society of Automotive Engineers) has established a taxonomy of driving automation ranging from Level 0 (no automation) to Level 5 (full automation) (SAE J3016); this scheme has been adopted by the FMCSA. Level 1, the lowest level of automation, provides a single driver-assistance feature, such as adaptive cruise control or lane-keeping assistance. Level 2 involves combined vehicle control of steering, acceleration, and braking while the driver remains engaged and ready to take over the wheel. Level 3 approaches full automation but only in limited circumstances, such as highway driving; a driver is still required for local travel and must take over when road conditions demand. Level 4 involves full automation but confines the vehicle to a defined operational domain, such as particular geographic regions or roadways, and avoids adverse weather conditions. At Level 5, the vehicle can operate autonomously under any circumstances and on any route.
Levels 1 through 4 ADS are currently being deployed in select areas by CMV carriers. Levels 1 and 2 are the most prevalent today, with carriers using adaptive cruise control and lane-departure warnings to aid drivers on their routes. Companies are testing Levels 3 and 4 ADS in specific geographic areas or on particular routes. Level 3 is the highest level that still requires a driver; it gives the ADS extensive control of the CMV. In contrast, Levels 4 and 5 do not require a driver, which is a particular focus of current rulemaking.
Levels 4 and 5 ADS are being researched and developed in selected geographic areas. In 2019, the autonomous trucking company TuSimple began testing a Level 4 ADS on a 1,200-mile route between Phoenix, Arizona, and Dallas, Texas. The company used a combination of cameras, radar, and LiDAR sensors to enable its trucks to navigate highways and make deliveries without human intervention. The CMVs still carried safety drivers and engineers to ensure the ADS operated safely. Notably, the FMCSA got involved when a viral video showed a TuSimple vehicle in Arizona suddenly veering left on a highway and striking a concrete median (no injuries resulted). The truck had followed a computer-generated command issued several minutes earlier, which the safety driver tried to counter-steer. The FMCSA closed the investigation without penalties, and testing of the Level 4 ADS continues, with improvements based on the event, according to TuSimple. Still, as reported by the Wall Street Journal, TuSimple’s safety drivers “have flagged concerns about failures in a mechanism that didn’t always enable them to shut off the self-driving system by turning the steering wheel” (Kate O’Keeffe & Heather Somerville, Self-Driving Truck Accident Draws Attention to Safety at TuSimple, Wall St. J. (Aug. 1, 2022)).
Waymo, a subsidiary of Google’s parent company, Alphabet, has been testing Level 4 ADS since 2017. On May 5, 2022, a Waymo Via truck with a safety driver was pushed off the roadway by a non-ADS CMV in a hit-and-run accident outside Dallas, Texas. Waymo’s ADS and safety driver were determined not to be at fault. However, such incidents are red flags for the fledgling CMV ADS industry and raise questions regarding risk management, best practices, and the future of CMV ADS litigation.
Risks inherent in CMV ADS include sensor limitations (poor weather degrading LiDAR or radar performance), sensor fusion challenges (errors arising from integrating data from multiple sensor types), inadequate mapping (incorrect or outdated maps resulting in improper navigation), software bugs (ADS failure caused by faults in the software code), hardware failure (malfunctions in sensors, actuators, or onboard computers), cybersecurity threats (hacking and malware used to control or disrupt the vehicle), and edge cases (unusual or unpredictable situations, such as construction zones or unique traffic patterns). As the industry grows, these risks must be addressed.
The 2018 Uber crash in Tempe, Arizona, the first-ever pedestrian fatality involving a self-driving car, promptly led to the revocation of Uber’s authorization to test its ADS in Arizona. The accident inflamed public sentiment that testing ADS on active city streets could be too dangerous to allow. Arizona’s state government had sought to attract technology companies, garnering investment and employment opportunities, by keeping regulation to a minimum. Soon after, a video from the Uber vehicle was released, showing the driver looking down and not at the road. In total, the driver had traveled 3.67 miles while not looking at the road. At 9:59 pm, six seconds pre-impact, the ADS identified the pedestrian while the vehicle was traveling 43 mph. Due to limited lighting, however, it classified the pedestrian first as an unknown object, then as a bicycle, and could not determine which direction the pedestrian was headed. At 1.3 seconds pre-impact, the ADS determined that the brakes needed to be applied, but Uber had disabled the vehicle’s ability to brake on its own.
The Uber accident illustrates the liability questions raised by an ADS-involved collision. A settlement resolved the pending litigation for this accident, so we do not know the basis of the alleged liability in this event. The facts alone, however, show a potent mix of risks. An accident such as this would most likely involve allegations of traditional negligence against the driver for failing to pay attention, vicarious liability against the employer for that driver’s negligence, direct negligence for disabling the ADS’s braking capability, and product liability (e.g., sensor limitations in low lighting).
Police body-camera footage released after the accident revealed further issues with ADS adoption. Specifically, the responding law enforcement officers appeared ill-equipped to investigate this type of accident, which was a new phenomenon. They hesitantly read Miranda rights to the driver while attempting to determine the cause of the accident: Was it the person or the machine? The event led Uber to shut down its Arizona operations, laying off 300 employees (Matt McFarland, Uber Shuts Down Self-Driving Operations in Arizona, CNN (May 23, 2018)).