Driverless cars will need clear decision-making rules, says new report

The ‘trolley problem’ is a dilemma often posed by industry observers who worry about driverless cars having to make tough ethical decisions.

For example, the thinking goes, a driverless car about to crash may have to choose between keeping to its current path and killing its passengers, or taking evasive action likely to kill nearby pedestrians.

When confronted with such difficult consequences, they ask, what should the vehicle be programmed to do?
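
To see why the dilemma is often framed as a programming question, consider the following deliberately naive Python sketch. It is purely hypothetical: the function, its inputs and the casualty counts are invented for illustration and appear nowhere in the report or in any real vehicle software.

    # Hypothetical sketch of the 'moral algorithm' critics imagine.
    # Nothing here reflects the report or real AV software.
    def choose_action(casualties_if_stay: int, casualties_if_swerve: int) -> str:
        """Pick whichever manoeuvre is expected to harm fewer people."""
        if casualties_if_swerve < casualties_if_stay:
            return "swerve"          # evasive action, endangering pedestrians
        return "stay on course"      # current path, endangering passengers

    # Example: two passengers at risk versus three pedestrians
    print(choose_action(casualties_if_stay=2, casualties_if_swerve=3))
    # -> "stay on course"

As the experts quoted below make clear, this kind of explicit casualty-weighing is exactly what real systems are not expected to perform.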

However, new research published by law firm Gowling WLG finds that concerns about the trolley problem may have been overstated.

Most of the experts interviewed for the firm’s report – The Moral Algorithm: How to set the moral compass for autonomous vehicles – agreed that autonomous vehicles (AVs) will never be programmed to make such distinctions.

Despite this, the research calls for a set of harmonised safety regulations to help guide the development of driverless decision-making – for example, deciding when a driverless car can break the rules of the road or how assertively the vehicle should be designed to act when dealing with other road users.
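
What might such a harmonised rule look like in practice? Below is a minimal, hypothetical sketch in Python of one possible "permitted rule-breaking" check; the scenario, names and threshold are all invented for illustration and are not drawn from the report.

    # Hypothetical sketch: when may an AV cross a solid line to pass an
    # obstruction? The names and threshold are invented; under harmonised
    # rules, a regulator rather than a manufacturer would set such values.
    MIN_CLEAR_ONCOMING_M = 200.0  # assumed minimum clear oncoming distance

    def may_cross_solid_line(lane_blocked: bool, oncoming_clear_m: float) -> bool:
        """Permit a normally forbidden manoeuvre only under explicit conditions."""
        return lane_blocked and oncoming_clear_m >= MIN_CLEAR_ONCOMING_M

    # Example: lane blocked by a parked van, 250 m of clear oncoming road
    print(may_cross_solid_line(lane_blocked=True, oncoming_clear_m=250.0))  # True

The point of harmonisation is that a threshold like the one sketched above would be the same for every manufacturer, set and audited centrally rather than chosen vehicle by vehicle.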
 
The report delivers a number of recommendations, including:

  • the creation of an independent regulator tasked with balancing the legal, safety and commercial aspects of autonomous vehicles;
  • consideration of how regulation governing both driverless vehicles and the road network can be implemented across the different tiers of UK government;
  • consideration of how to facilitate co-operation between different sector participants, to ensure that autonomous vehicle development takes place rapidly and that connected and autonomous vehicles can interact with each other;
  • the development of a policy setting out how the moral algorithm should operate in major safety situations;
  • a programme of public education and consultation, in co-operation with the industry, to increase public awareness of, and trust in, autonomous vehicle technology and its social benefits.

“It is important not to equate regulation with a burden,” said Stuart Young, a partner at Gowling WLG. “It can, in fact, facilitate new markets and important developments. 

“Unless completely new legislation is implemented that accommodates new products in advance of their being produced, regulatory uncertainty is likely to impose huge additional risks on the companies producing them.”

Tim Armitage, Arup’s UK Autodrive Project Director, said: “As with any complex new technology, AVs cannot be specifically programmed to respond to every possible scenario. This simply isn’t practical when a machine is expected to interact with humans, in a complex environment, on a day-to-day basis.  

“AVs will drive to the speed limits and will not be distracted from the task of safe driving; they will make practical decisions based on their programming, but they cannot be expected to make moral decisions around which society provides no agreed guidance. 

“To allow AVs to demonstrate their capacity for practical decision-making in complex environments, and to begin to establish public trust through contact, the first step is allowing testing in relatively simple and well-defined environments. 

“Of course, regulation will need to keep up, so, to echo Stuart’s sentiments, it is vital that the legal industry act now to help create a realistic and viable route to market for AVs.”

Gowling WLG is one of the members of the UK Autodrive consortium, which is seeking to establish the UK as a global hub for the development of autonomous vehicle technologies and to integrate driverless vehicles into urban environments.

The research involved interviews with industry specialists and representatives from the consortium, as well as desktop research and analysis.