The crash of a driverless Tesla has made big news and has helped generate additional, necessary discussion about the role such vehicles will play in our lives.
There is little doubt that autonomous vehicles will be a part of our streets and highways – the push is too broad and too strong to stop the effort. There is also little doubt that driverless vehicles will have a huge impact on our economy. For example, what will happen to our 3.5 million truck drivers when driverless trucks are able to operate on our nation’s roadways?
But the crash of the Tesla raises a potential liability issue that has received little public discussion: how does a person who believes that he or she has been injured – or that a family member has been killed – by a software or hardware glitch in an autonomous or semi-autonomous vehicle get access to the information needed to determine whether that belief is true?
Here is the problem: today’s vehicles record a collection of information about the operation of the vehicle. For example, in the event of an air bag deployment, one can access the “black box” and determine, for the seconds before the deployment, the speed of the vehicle, whether the seat belt was buckled, whether the brakes were being applied, what was happening with the steering wheel, and lots of additional information. This data can be used to help determine the cause of the collision.
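To make the point concrete, the pre-crash snapshot an event data recorder captures can be pictured as a simple record of timed samples. This is only an illustrative sketch – the field names and values below are invented, not any manufacturer’s actual format (real EDR contents vary by maker and are governed in part by NHTSA’s rule at 49 CFR Part 563):

```python
from dataclasses import dataclass

@dataclass
class EdrSample:
    """One pre-crash sample from a hypothetical event data recorder.

    Field names are illustrative only; actual EDR formats differ
    by manufacturer.
    """
    seconds_before_deployment: float
    speed_mph: float
    brake_applied: bool
    seat_belt_buckled: bool
    steering_angle_deg: float

# A made-up five-second pre-crash record, newest sample last.
record = [
    EdrSample(5.0, 62.0, False, True, -1.5),
    EdrSample(4.0, 62.0, False, True, -1.0),
    EdrSample(3.0, 61.0, False, True, 0.0),
    EdrSample(2.0, 58.0, True, True, 4.0),
    EdrSample(1.0, 49.0, True, True, 12.0),
]

# The kind of question an investigator asks of this data:
# was the driver braking at any point before impact?
braking_before_impact = any(s.brake_applied for s in record)
print(braking_before_impact)  # True in this invented example
```

This is the “what happened” layer of the analysis – speed, braking, belt use – and it shows why the data is so valuable in litigation, and why access to it matters.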
Some vehicle manufacturers allow a relatively broad group of trained people to download the data held in the “black box.” Other manufacturers severely restrict who is permitted to download that information. For example, in one recent Tennessee truck wreck case, we had to pay $4,000 to have a manufacturer-approved company download the data for use in litigation against the trucking company. The cost of accessing this data almost always exceeds $1,000.
Driverless cars will have “black box” information available, and my guess – and it is a pure guess – is that very few people currently have the ability to access that data. That makes some sense: the state of the industry is such that access to the data should be carefully controlled.
But the real issue in all motor vehicle wrecks is “why” the wreck occurred. Today we look for human error, roadway conditions, mechanical or design defects, and so on. With driverless vehicles it will also be necessary to determine whether there was a defect in the software that controlled the vehicle. That means there must be some way for a person who contends that the vehicle’s software caused a wreck to gain access to that software and have it evaluated by a trained software engineer. Accessing the “black box” of a driverless vehicle may help a claimant understand “what” happened, but it is unlikely to tell the claimant “why” the wreck occurred, i.e., whether a programming or other error caused or contributed to the collision.
Do you really think Tesla is going to permit a claimant to have access to the software that controls the operation of the vehicle? Nope – won’t happen. Tesla will claim that the software is proprietary and highly confidential and thus will refuse to share it pre-suit. If a lawsuit is filed, it will still refuse to produce the information, even if a protective order is offered, because the data is simply too valuable to risk having it fall into a competitor’s hands. A judge may eventually rule that the information must be produced, but the cost of securing the information, and then the cost of analyzing it, will be huge – and all of that money must be spent before it is known whether a defect even exists.
All of the above is to say this: as we discuss driverless vehicles, we must address how to handle the issues that will arise when those vehicles get into wrecks and hurt or kill people. We need a system to address these concerns before these vehicles are generally offered to the public for use on the roads. The system needs to anticipate the early use of the vehicles, the transitional phase (when we have a mix of traditional and autonomous vehicles on the roads), and the likely long-term phase (when traditional vehicles will be prohibited on public roads). We also need to address what special rules should apply to Level 2, Level 3 and Level 4 vehicles.
The goal of all this should be a system that recognizes that advances in technology must not undermine the basic notion that those who cause harm should be responsible for the harm they cause. The claims resolution system ultimately adopted must not impose excessive costs on consumers seeking to determine how a wreck occurred, while at the same time offering an appropriate level of protection for a manufacturer’s legitimate concerns about proprietary software.