The public awaits the day when we can have a few drinks with friends, summon our car with the press of a button or a voice command, and get home safely. But how far are we from being able to do just that? Are we there yet?
Self-driving cars offer hope for a safer future, but they bring a slew of ethical and legal challenges with them. The question of their implications for a DWI or DUI arrest ranks high on the list of concerns. It boils down to determining the individual’s role in the operation of a vehicle, and some states, such as Texas, don’t have a clear definition.
There is no legal definition of “operating” in the penal code. Most people are surprised to learn this when confronted with it in jury selection, the first place this crucial topic must be broached at trial. In the jury charge, the judge will not provide a definition of operating; the jurors will have to make that decision individually, and then as a whole, to reach a verdict. That leaves the term open to interpretation.
Seemingly, we are getting very close, as drivers have been filmed sleeping at the wheel. But while Tesla’s technology is very advanced, its cars are still not purely autonomous.
Uber Already Crashed a Self-Driving Car
Let’s take the case of the accident in Tempe, Arizona, involving a self-driving Uber car. The other driver was clearly at fault, failing to yield to Uber’s self-driving Volvo SUV and landing it on its side. Two other cars were damaged. Police didn’t charge the Uber driver but cited the other driver for a moving violation.
#UPDATE: No injuries yet reported in an accident involving a self-driving #uber, captured by @fresconews user Mark Beach in Tempe, AZ. pic.twitter.com/kmizvRD5WP
— Fresco News (@fresconews) March 25, 2017
Unlike other accidents involving self-driving cars, this one was obviously more serious; after all, it takes a great deal of force to flip a car, let alone an SUV. Thankfully, no one was injured. What it shows is that accidents are unavoidable even with self-driving cars. And in Arizona, human intervention is required at least once per mile, which raises another question.
How Do You Define a Self-Driving Car?
Let’s consider the question of driver intervention. It’s the law in Arizona. Other states, like California, require a steering wheel and pedals with the expectation that a driver could take over. It’s uncertain whether the Uber driver would have had enough time to act, though, since the car was in self-driving mode.
First, let’s consider what makes a vehicle self-driving. There are six degrees of automation, according to SAE International. Level 0 means the human driver does all the driving. Level 5 means that a car can do everything a human can behind the wheel, and that’s a tall order. The current technology is at Level 2, so we have a way to go yet.
In general, the SAE J3016™ levels and definitions are:
- Level 0 – No Automation: The full-time performance by the human driver of all aspects of the dynamic driving task, even when enhanced by warning or intervention systems
- Level 1 – Driver Assistance: The driving mode-specific execution by a driver assistance system of either steering or acceleration/deceleration using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task
- Level 2 – Partial Automation: The driving mode-specific execution by one or more driver assistance systems of both steering and acceleration/deceleration using information about the driving environment and with the expectation that the human driver performs all remaining aspects of the dynamic driving task
- Level 3 – Conditional Automation: The driving mode-specific performance by an Automated Driving System of all aspects of the dynamic driving task with the expectation that the human driver will respond appropriately to a request to intervene
- Level 4 – High Automation: The driving mode-specific performance by an Automated Driving System of all aspects of the dynamic driving task, even if a human driver does not respond appropriately to a request to intervene
- Level 5 – Full Automation: The full-time performance by an Automated Driving System of all aspects of the dynamic driving task under all roadway and environmental conditions that can be managed by a human driver
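The deciding questions at each level — who controls the vehicle, who monitors the environment, who serves as the fallback — can be sketched as a simple decision ladder. The flags and the `classify` helper below are illustrative simplifications for this article, not part of the SAE J3016 standard itself:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def classify(steering: bool, accel: bool, system_monitors: bool,
             system_is_fallback: bool, all_conditions: bool) -> SAELevel:
    """Map a vehicle's capabilities onto an SAE J3016 level.

    All flags are hypothetical simplifications of the standard:
      steering / accel    -- does the system handle that control task?
      system_monitors     -- does the system monitor the driving environment?
      system_is_fallback  -- can it cope if the human ignores a takeover request?
      all_conditions      -- does it work in all roadway/weather conditions?
    """
    if not (steering or accel):
        return SAELevel.NO_AUTOMATION
    if steering != accel:          # only one of the two control tasks
        return SAELevel.DRIVER_ASSISTANCE
    if not system_monitors:        # both tasks, but the human still monitors
        return SAELevel.PARTIAL_AUTOMATION
    if not system_is_fallback:     # human must answer takeover requests
        return SAELevel.CONDITIONAL_AUTOMATION
    if not all_conditions:
        return SAELevel.HIGH_AUTOMATION
    return SAELevel.FULL_AUTOMATION

# A Level 2 system steers and accelerates, but the human still monitors:
level = classify(steering=True, accel=True, system_monitors=False,
                 system_is_fallback=False, all_conditions=False)
print(level.name)  # PARTIAL_AUTOMATION
```

Notice that the legally loaded questions — who monitors, who is the fallback — are exactly the ones that separate Levels 2 through 4.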
But can it really take the driver totally out of the picture? And when does that happen? There exists an inescapable human element that state laws have recognized. The George Jetson vision of a vehicle operating without any operator is not the reality. You have to remember all that goes into driving.
Think about beginning a trip, picking a place to go, and starting the car. All these things involve human action. So what does that mean for our drunk Uber driver? The question of a DUI or DWI then rests on defining what operating means.
So What Does Operating a Vehicle Mean?
There is some legal precedent to answer that question. Wisconsin, for example, has a precise statutory definition of operating: the physical manipulation or activation of any of a vehicle’s controls.
(a) “Drive” means the exercise of physical control over the speed and direction of a motor vehicle while it is in motion.

(b) “Operate” means the physical manipulation or activation of any of the controls of a motor vehicle necessary to put it in motion.
Going back to Arizona law, human intervention is required for self-driving cars. It’s evident that had the Uber occupant been drunk, he or she likely would have received a DUI.
The courts have also considered this question: what is the law when someone who is drunk is sitting in a vehicle that is not in motion? In the case of the Uber accident, the driver was behind the steering wheel. Even with a legal statute, the courts’ interpretation has remained cloudy.
There have been rulings on both sides of the issue, leaving it up to the court’s discretion. In light of the seriousness of these charges, a definitive scope is crucial. But you may wonder where the problem lies.
You have to consider at what point control exists. Texas law focuses on the individual’s control of the vehicle. It doesn’t matter whether that means a steering wheel or a control panel: if a human uses it, the human can operate the car. Where the automation begins or ends isn’t important.
But is it necessary for a human to do anything? That’s another gray area. If a self-driving car malfunctions, human intervention is essential, which is likely the thinking behind the Arizona law. No one can predict whether a malfunction might happen, and the requirement means a human can act if needed.
If you take a look at the following infographic, you can see how even a single disengagement while driving could be a huge concern.
You will find more statistics at Statista.
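The headline metric in such disengagement statistics is miles driven per human takeover. As a minimal sketch of how that rate is computed — the figures below are made up for illustration, not taken from any actual report:

```python
def miles_per_disengagement(miles: float, disengagements: int) -> float:
    """Average autonomous miles driven between human takeovers."""
    return miles / disengagements

# Hypothetical report figures: (manufacturer, autonomous miles, disengagements)
reports = [
    ("Maker A", 635_868, 124),
    ("Maker B", 20_125, 1_023),
    ("Maker C", 1_971, 482),
]

for maker, miles, count in reports:
    rate = miles_per_disengagement(miles, count)
    print(f"{maker}: one human takeover every {rate:,.1f} miles")
```

Even a fleet averaging thousands of miles per takeover cannot promise that an occupant will never need to intervene, which is exactly the legal sticking point.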
Does the ability of the individual to act in case of malfunction mean the same thing as operation?
The law suggests that the ability to intervene is an essential part of the self-driving experience. It comes down to a choice: if someone who is drunk chooses to get behind the wheel, he has broken the law.
What About the Ethical Considerations?
That raises more questions for our drunk Uber occupant. One involves the identification of self-driving cars. Let’s consider what it means if the law says it’s legal for a drunk driver to be in a self-driving vehicle. Where does the law stand if an accident occurs with a sober driver? And what about the passenger’s rights?
This dilemma puts manufacturers in a tricky position too. A major selling point of these cars is driver safety. But what about the safety of others? In the Uber case, two other cars were damaged. Then, there is the consideration of liability. Manufacturers will drive the technology to protect their interests without proper legal direction. The DUI is the proverbial tip of the iceberg.
What Are the Limitations of Self-Driving Vehicles?
By far, human error causes most accidents, as in the Arizona case. And it’s something that is hard to code around in the operation of a self-driving car. As things stand now, the driver plays an active role in the operation of a vehicle, which suggests that a DWI or DUI arrest is possible.
As the technology evolves, the role of the manufacturer comes into play. It’s possible that their potential liability will steer the course of technology. That may influence how the legal question of driver operation and impairment is answered.
According to Business Insider, there are six scenarios self-driving cars still can’t handle, as described by John Dolan, principal systems scientist at Carnegie Mellon’s Robotics Institute:
1. Driverless cars struggle going over bridges.
Raffi Krikorian, Uber’s engineering director, recently told Bloomberg that its driverless cars struggle going over bridges. That’s because Uber has meticulously mapped roads so that the driverless car can compare what it’s seeing with what is supposed to be there, helping it avoid objects and pedestrians. Bridges offer few such environmental cues, leaving the car with little to compare against.
2. Self-driving cars also struggle to “see” in inclement weather.
“Heavy snow and rain tend to confuse LiDAR sensors and also cameras,” John Dolan, principal systems scientist at Carnegie Mellon’s Robotics Institute, told Business Insider. “So you end up having some problems.”
3. Driverless cars struggle on roads without clear lane markings.
Tesla CEO Elon Musk vented about this problem to a group of reporters in October, according to the Washington Post. At the time, he pointed to the lack of clear lane markings on Interstate 405 near Los Angeles International Airport.
When driverless cars can’t distinguish the lanes, it is nearly impossible for them to drive or change lanes safely. Andrew Ng, chief scientist at Baidu, wrote in a Wired post that it will be necessary to make “modest changes to our infrastructure” for driverless cars to be successful on our streets.
4. Driving in cities is much harder for autonomous cars than cruising on the highway.
Dolan said it can also be difficult for a driverless car’s GPS to get an accurate fix in cities.
“If you’re trying to do urban driving and depending on GPS to a large extent, then when you get into areas where there are a lot of tall buildings it’s hard to receive the GPS signal and you’ll have drop outs,” he said.
5. Robot cars can’t interact the same way humans can, which is problematic.
Dolan said the same issue applies to merging on a highway or changing lanes. There are many ways we convey intention that disappear when there’s literally no driver in the front seat.
“We convey intentions in a way that result in natural interactions, rather than what you would call robotic interactions that would unnerve or frustrate a human being,” he said.
6. Driverless cars can also have trouble in high-speed driving situations.
Dolan noted that when human drivers try to merge onto roads with cars traveling at higher speeds, they tend to inch forward to make sure it’s ok. Often, people will pull out in front of traffic under the assumption that cars will slow down for the merge, he added.
But a driverless car probably wouldn’t take that risk: if it projected the velocity of the upcoming car, it would pull back to avoid a crash, he said.
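That projection amounts to a conservative gap-acceptance rule. A minimal sketch, assuming made-up speeds and safety margins — the `gap_is_safe` helper and its thresholds are hypothetical, not drawn from any real vehicle’s controller:

```python
def gap_is_safe(gap_m: float, oncoming_speed_mps: float,
                merge_time_s: float, margin_s: float = 2.0) -> bool:
    """Conservative merge check: accept the gap only if the oncoming car
    would still be at least margin_s seconds away after the merge finishes."""
    time_to_arrival = gap_m / oncoming_speed_mps  # seconds until it reaches us
    return time_to_arrival >= merge_time_s + margin_s

# A human might inch forward and take this 90 m gap at highway speed;
# a controller projecting the oncoming car's velocity declines it.
print(gap_is_safe(gap_m=90.0, oncoming_speed_mps=30.0, merge_time_s=4.0))  # False
```

The point Dolan makes survives the arithmetic: a gap a human would negotiate by signaling and inching forward fails a strict projected-velocity check, so the robot waits.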
The Future of Self-Driving Cars
It’s almost certain that manufacturers will put themselves in the driver’s seat. If there’s a question of human involvement, why not get rid of that factor? That is precisely the road that Google and NASA have taken. Together, they are creating vehicles without human drivers.
Their research is looking at developing vehicles without steering wheels or manual controls. That’s important, since those controls are what make driver operation possible in the first place. It could pave a pathway for distracted, or perhaps even impaired, occupants. Technology continues to make new roads, and it’s clear that researchers are considering all the legal implications.
The answer to the question rests on two things.
First, the law must define operation. It must decide where and when it begins.
Second, it must take into account the evolving technology. What is clear is that technologists are already grappling with these issues. It will behoove the law to follow the same course.