As with many new technologies, there is usually a lag between a technology's introduction and the development of regulation and a legal framework to ensure its safe use.
We have seen the advent of 'auto-pilot' and intelligent cruise control in vehicles and, as we move towards fully autonomous vehicles, multiple countries around the world are starting to develop associated regulations. A number of questions need to be answered around liability: who is responsible if a self-driving car crashes?
If there is an accident and the other driver is considered at fault, the existing laws seem to work: liability rests with the at-fault driver. Things become more difficult if the self-driving car is considered at fault. In that case, who should be liable?
Should the car manufacturer, who designed the car and the software that drives it, be held responsible? If so, deployment of new technology in this space could stall, as technology providers face liability costs and become reluctant to take on the risk. Alternatively, the human in the car who is in charge of the system could be considered at fault, as is currently the case. However, as cars move towards full autonomy, this argument will become harder to make.
Given that self-driving cars should have much lower accident rates, will insurance companies offer a discount to drivers who use self-driving capabilities in their vehicles? Or, with connected cars, can we expect a pay-as-you-go insurance model, with different per-mile charges for human driving versus machine driving?
There are larger issues ahead for self-driving cars and regulation. One is known as the Trolley Problem (https://en.wikipedia.org/wiki/Trolley_problem): if an autonomous vehicle is faced with two scenarios, both of which would result in a fatality, how does the machine decide which is the lesser of two evils? This raises a number of very thorny issues. A human will make an impulsive decision in the moment and act accordingly, but a machine will have been pre-programmed. If there are resulting fatalities, will the author and approver of the code ultimately be responsible for the actions of the vehicle, and therefore the outcome?
There is still a long way to go before a legal framework covers these various scenarios, but we are now at the beginning of that path.