Autonomous Vehicles: Is the Future Here?
June 23, 2023 Feature
By Shannon Davis

Autonomous vehicles hold immense promise for the future of automobile litigation and for the automobile industry. However, the technology has not yet advanced to the point of full autonomy. Right now, there are too many variables, both technological and human, including driver distraction. This article explores some of these problems and closes with case studies showing how extreme they can become.
What Is an Autonomous Vehicle?
When most of us think about autonomous vehicles, we picture the flying cars of The Jetsons or Back to the Future. In reality, autonomous or “self-driving” vehicles are “those in which operation of the vehicle occurs without direct driver input to control the steering, acceleration, and braking and are designed so that the driver is not expected to constantly monitor the roadway while operating in self-driving mode.” See National Association of Insurance Commissioners. That definition is not the whole story, however. The United States Department of Transportation breaks automation into several levels:
- Level 0: Momentary Driver Assistance (e.g., lane departure warning)
- Level 1: Driver Assistance (e.g., lane keeping assistance)
- Level 2: Additional Assistance (e.g., highway pilot)
- Level 3: Conditional Automation: the system handles all aspects of driving, but the driver must remain available to take over
- Level 4: High Automation: the system handles all aspects of driving, and the driver is effectively a passenger
- Level 5: Full Automation: the system handles all aspects of driving under all conditions; no driver is needed
Most new cars today have Level 0 or Level 1 features, but it is when we venture beyond momentary and immediate assistance that autonomous vehicles’ problems arise, as the sketch below summarizes.
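For readers who think in code, here is a minimal sketch of the taxonomy above. The level names follow the Department of Transportation list; the attention check is an added simplification of the distinction the article draws, not an official rule:

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """DOT levels of driving automation, as listed above."""
    MOMENTARY_DRIVER_ASSISTANCE = 0   # e.g., lane departure warning
    DRIVER_ASSISTANCE = 1             # e.g., lane keeping assistance
    ADDITIONAL_ASSISTANCE = 2         # e.g., highway pilot
    CONDITIONAL_AUTOMATION = 3        # system drives, driver must stay available
    HIGH_AUTOMATION = 4               # driver is effectively a passenger
    FULL_AUTOMATION = 5               # no driver needed, all conditions

def driver_must_pay_attention(level: AutomationLevel) -> bool:
    # At Levels 0-2 the human is still driving; at Level 3 the human must be
    # ready to take over on request; only at 4-5 can the human truly disengage.
    # (A single boolean is, of course, a simplification.)
    return level <= AutomationLevel.CONDITIONAL_AUTOMATION

if __name__ == "__main__":
    for lvl in AutomationLevel:
        print(lvl.value, lvl.name, "attention required:", driver_must_pay_attention(lvl))
```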
Oh, the Humanity!
To start, humans are human, which means we make errors and our attention fades. This is true whether we are driving a car, riding in an autonomous one, or watching videos on our phones. With Levels 0-3, a human must be available to intervene immediately when a road hazard or something similar appears. Autopilot requires the driver to recognize a hazard, grab the wheel, and safely navigate around it, all within about 1.5 seconds (the average driver’s reaction time, per www.driveincontrol.org), and that assumes the driver is fully paying attention. If the driver happens to be watching a favorite viral video or blog series, that reaction time only grows. Humans are easily distracted; in fact, 3,142 people were killed in distracted-driving crashes in 2020. Drivers must stay focused on the road and its obstacles, yet autopilot invites the driver to disengage while the car is in motion, and longer reaction times can lead to more crashes.
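To put that 1.5-second figure in perspective, here is a back-of-the-envelope calculation. The 65 mph highway speed and the 2.5-second “distracted” reaction time are illustrative assumptions, not figures from the article:

```python
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def reaction_distance_ft(speed_mph: float, reaction_time_s: float) -> float:
    """Feet traveled before the driver even begins to respond."""
    feet_per_second = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR
    return feet_per_second * reaction_time_s

# Attentive driver: 1.5 s average reaction time at an assumed 65 mph.
print(round(reaction_distance_ft(65, 1.5)))   # ~143 feet, roughly nine car lengths
# Distracted driver: an assumed 2.5 s reaction time at the same speed.
print(round(reaction_distance_ft(65, 2.5)))   # ~238 feet before any braking starts
```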
Fear is like an emotional roadmap for the technology we might embrace, and 70 percent of drivers are afraid of self-driving cars, CBS reports. Humans are notorious for rejecting technology they fear, which could lead to more than one outcome: the technology advances quickly, or fear prolongs its arrival. Either way, technology has not outpaced our distractions behind the wheel. Finally, the trolley problem is real when we talk about autonomous cars and driving on autopilot, and only humanity can truly answer it when real lives are in danger.
Technology Is Advancing, but We Aren’t There Yet!
Humans aside, the technology behind autonomous vehicles is not yet advanced enough to let drivers safely tune out. Levels 3-5 are decades away. Among the current technological issues facing autonomous vehicles are parking challenges and failures to detect hazards.
Parking is one of the problems that will take time to solve. An autonomous vehicle will know the destination you are trying to reach, but it cannot account for a passenger who wants to be dropped off at a particular part of the building, or for obstructions blocking a specific entrance. The vehicles may also sidestep the need for parking by circling the block, which in turn causes congestion (intended and unintended), as Adam Millard-Ball has suggested.
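Millard-Ball’s point can be made concrete with a rough comparison. Every number below (operating cost per mile, cruising speed, hourly parking rate) is an assumption chosen only to illustrate the incentive, not data from his study:

```python
def hourly_cruising_cost(cost_per_mile: float, cruising_speed_mph: float) -> float:
    """Cost of circling the block for one hour instead of parking."""
    return cost_per_mile * cruising_speed_mph

# Illustrative assumptions: $0.50 per mile to operate, crawling at 10 mph.
cruise_cost = hourly_cruising_cost(0.50, 10)   # $5.00 per hour of circling
downtown_parking_rate = 8.00                   # assumed hourly garage rate
print(cruise_cost < downtown_parking_rate)     # True -> circling looks "cheaper,"
                                               # which is how congestion piles up
```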
Hazards, and the ability to recognize them, have been an ongoing problem for autonomous vehicles, whether the hazard is to life, on the road, or an emergency. What if, for instance, a passenger has a medical emergency? One open question is how police can stop a car running on autopilot so that emergency services can reach the passenger. “Autopilot” cars have been known to slam into objects the system did not recognize, causing deaths and injuries. Furthermore, severe weather, like rain or snow, can keep the vehicle from accurately judging depth or recognizing hazards. See Tomorrow.IO. As every driver knows, hazards and objects appear on the road suddenly and regularly. Until the technology can account for that, autonomous vehicles won’t reach Level 3.
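A toy sketch shows why degraded sensing forces a hand-back to the human below Level 4. This is not how any particular manufacturer’s system works; the confidence values, weather penalty, and threshold are invented purely for illustration:

```python
def autopilot_decision(perception_confidence: float,
                       weather: str,
                       threshold: float = 0.9) -> str:
    """Toy policy: degrade confidence in bad weather, then decide what to do."""
    # Assumed penalty: rain or snow makes depth and hazard estimates less reliable.
    if weather in ("rain", "snow"):
        perception_confidence *= 0.7
    if perception_confidence >= threshold:
        return "continue in self-driving mode"
    # Below Level 4 there is no machine fallback: the human must take over,
    # which is exactly where distraction and reaction time become the problem.
    return "alert driver and request takeover"

print(autopilot_decision(0.95, "clear"))  # continue in self-driving mode
print(autopilot_decision(0.95, "snow"))   # alert driver and request takeover
```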
A Case Study, If You Please
Although technology has not overcome some of its biggest issues (to be honest, neither has humanity), people are still driving autonomous vehicles on autopilot. On February 18, 2023, a car suspected to be in autopilot mode crashed into a fire truck, killing the driver and leaving the passenger in serious condition. The fire truck sustained $1.4 million in damage; the trucks were there to shield the work crew, but either the technology or the driver failed to recognize the danger in time. Another suspected autopilot crash occurred on February 17, 2023, in China, where the car hit several other cars and a bicycle before coming to a stop; one person was killed. Finally, on Thanksgiving 2022, a vehicle suspected to be operating on autopilot caused an eight-car pile-up on the San Francisco Bay Bridge. In that instance, the vehicle signaled left, merged left, and then braked, causing the crash. Nine people were injured, but fortunately there were no fatalities.
Finally, Ford has applied for a patent that would allow an autonomous vehicle to repossess itself. Although the feature has not been implemented, it could give corporations a huge advantage over the average consumer. Understandably, corporations want to be paid for their products, but people still need the opportunity to correct their mistakes. If a single mother misses a payment, how will she get to work to earn the money to catch up?
The Future Is Here, Sort Of
Autonomous vehicles will be the future. However, technology is not where it needs to be … yet.