New social norms, ethics and legal frameworks often take time to catch up with technological progress.
When we all started carrying mobile phones, it took years for social etiquette to become fully formed. It took longer still for laws to come into force controlling how we use our phones whilst driving.
We are only now starting to see the thinking around self-driving cars evolve. For example, there is a need to define and encode solutions to the trolley problem: how should an autonomous vehicle act when forced to choose between two bad outcomes, each of which may involve harming humans? And who will be legally liable for the harm caused?
Society will work this out over time, but it could take years to build up the legal precedent and legislation that encapsulate the thinking. Legal frameworks may also end up varying from region to region, requiring cars to alter their algorithms as they cross borders.
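If rules do diverge by jurisdiction, the decision logic itself effectively becomes configuration that the vehicle swaps as it moves between regions. A minimal sketch of that idea, where every region name and policy field is hypothetical and purely for illustration (no real regulation or vendor API is implied):

```python
# Hypothetical sketch: region-dependent decision policy for an
# autonomous vehicle. All names, regions, and rules are invented
# for illustration only.
from dataclasses import dataclass

@dataclass(frozen=True)
class CollisionPolicy:
    # What the vehicle prioritizes when every available action causes harm.
    prioritize_occupants: bool
    swerve_allowed: bool

# Illustrative only: two jurisdictions with conflicting requirements,
# forcing the same car to change behaviour at the border.
POLICIES = {
    "region_a": CollisionPolicy(prioritize_occupants=True, swerve_allowed=False),
    "region_b": CollisionPolicy(prioritize_occupants=False, swerve_allowed=True),
}

def active_policy(region: str) -> CollisionPolicy:
    """Look up the policy for the vehicle's current jurisdiction."""
    return POLICIES[region]
```

The point of the sketch is not the specific rules but the architecture: ethically loaded behaviour ends up parameterised by location, which is itself a new kind of engineering and auditing burden.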
The problem of social norms and socially acceptable behaviour is not new, but it becomes more complex, and the stakes become higher, as we automate more devices that can cause significant harm when incorrectly controlled.
Self-Driving Mercedes-Benzes Will Prioritize Occupant Safety over Pedestrians

The technology is new, but the moral conundrum isn't: a self-driving car identifies a group of children running into the road. There is no time to stop. To swerve around them would drive the car into a speeding truck on one side or over a cliff on the other, bringing certain death to anybody inside. To anyone pushing for a future for autonomous cars, this question has become the elephant in the room, argued over incessantly by lawyers, regulators, and ethicists. Mercedes-Benz, the world's oldest carmaker, no longer sees the problem, similar to the question from 1967 known as the Trolley Problem, as unanswerable. Rather than tying itself into moral and ethical knots in a crisis, Mercedes-Benz simply intends to program its self-driving cars to save the people inside the car.