Do driverless cars pose a social dilemma over which lives to save?


This month, driverless cars seem to have created something of a dilemma when it comes to safety.

Researchers have highlighted the fact that autonomous vehicles are programmed with a set of safety rules, and that it is not difficult to construct a scenario in which those rules come into conflict. Imagine, they say, a situation in which a driverless car must either hit a pedestrian or swerve in such a way that it crashes and harms its passengers. What should it be instructed to do?

A recently published study, co-authored by a Massachusetts Institute of Technology (MIT) professor, shows that the public is conflicted over such scenarios and takes an inconsistent approach to the safety of autonomous vehicles.

In a series of surveys taken last year, the researchers found that people generally take a utilitarian approach to safety ethics: in other words, they would prefer autonomous vehicles to minimize casualties in situations of extreme danger. That would mean, for example, having a car with one rider swerve off the road and crash to avoid a crowd of 10 pedestrians. But at the same time, the survey respondents said they would be much less likely to use a vehicle programmed that way.

Essentially, it seems people want driverless cars that are as pedestrian-friendly as possible — except for the vehicles they would be riding in themselves.

“Most people want to live in a world where cars will minimize casualties,” said Iyad Rahwan, an associate professor in the MIT Media Lab and co-author of a new paper outlining the study. “But everybody wants their own car to protect them at all costs.”

The result is what the researchers call a ‘social dilemma’, in which people could end up making conditions less safe for everyone by acting in their own self-interest.

“If everybody does that, then we would end up in a tragedy … whereby the cars will not minimize casualties,” Rahwan added.

The researchers conducted six surveys, using Amazon's online Mechanical Turk crowdsourcing platform, between June 2015 and November 2015.

The results consistently showed that people take a utilitarian approach to the ethics of autonomous vehicles. For instance, 76 percent of respondents believed it would be more moral for an autonomous vehicle to sacrifice one passenger rather than 10 pedestrians, should such a circumstance arise.

But the surveys also revealed a lack of enthusiasm for buying or using a driverless car programmed to avoid pedestrians at the expense of its own passengers. One question asked respondents to rate the morality of an autonomous vehicle programmed to crash and kill its own passenger to save 10 pedestrians; the rating dropped by a third when respondents considered the possibility of riding in such a car.

Similarly, people were strongly opposed to the idea of the government regulating driverless cars to ensure they would be programmed with utilitarian principles. In the survey, respondents said they were only one-third as likely to purchase a vehicle regulated this way, as opposed to an unregulated vehicle, which could presumably be programmed in any fashion.

“This is a challenge that should be on the mind of carmakers and regulators alike,” the scholars wrote.

Of course, the real-world performance of driverless cars, and how these vehicles will share the road with other users, remain to be seen. But the points raised in this recent study are certainly interesting, and they highlight a very real and very important social dilemma.

What do you think? I’d love to hear your thoughts about driverless cars and this social dilemma raised by researchers.


Tweet us: @One_More_Second


The paper, “The social dilemma of autonomous vehicles,” has been published in the journal Science. The authors are Jean-François Bonnefon of the Toulouse School of Economics; Azim Shariff, an assistant professor of psychology at the University of Oregon; and Rahwan, the AT&T Career Development Professor and an associate professor of media arts and sciences at the MIT Media Lab.

Image: Courtesy of the researchers.