Toward ethical standards for autonomous vehicles

October 02, 2019 10:09
US prosecutors decided that Uber was not “criminally liable” for the death of a pedestrian last year. Photo: Reuters

Artificial intelligence (AI) has entered various areas of human activity, including autonomous driving, health care, environmental protection, energy conservation, investment and financial management. AI is good at the rational analysis of data and information, but human decisions are shaped by non-rational considerations. How can we "educate" machines so that they can work and think more like humans?

Nature, the world's leading science journal, published the results of a massive survey called the Moral Machine at the end of last year. The aim of the survey was to understand how people think a driver should decide in the event of an unavoidable accident, an attempt of sorts to establish ethical standards for self-driving. The survey gathered more than two million online responses from around the world.

Although the companies developing autonomous driving insist that AI is a more dependable driver than humans and that road accidents would be greatly reduced as a result, doubts about the technology intensified when an Uber self-driving car struck and killed a pedestrian in the United States in March last year. The fatal crash stoked public concerns about AI safety as well as the ethical standards governing the technology.

This large online survey presented 13 scenarios, each involving casualties. For example, how should a car react when about to hit a pedestrian who is violating traffic rules? Between a young person and an elderly one, or between a rich person and a poor one, who should be sacrificed when a choice has to be made? Who matters more, the driver, the passenger or the pedestrian? Between a large group of people and a small one, who should be spared?

According to the survey, regardless of their age, gender or country of residence, most people would sacrifice pets rather than humans, and individuals rather than groups of people.

Beyond these points of consensus, however, moral choices are not universal. In this 18-month survey spanning 130 countries, each with at least 100 respondents, the responses fell into roughly three groups:

1. North America and several European countries where Christianity is the dominant religion;

2. Countries such as Japan, Indonesia and Pakistan, with strong Confucian or Islamic traditions;

3. Central and South America, as well as France and former French colonies.

Compared with the second group, the first group prefers to sacrifice an elderly person in order to save a young one. In the case of a pedestrian who violates traffic rules, respondents in countries that value discipline and have strong government institutions, such as Japan and Finland, tend to sacrifice the unruly pedestrian. In the same scenario, people in countries with less developed social order, such as Nigeria and Pakistan, choose not to.

Social and economic status also has an apparent impact. When choosing between a homeless person and an executive, there is no obvious tendency in Finland, where the gap between rich and poor is narrow. But in countries with significant economic disparity, like Colombia, people will sacrifice the person of lower social status. The third group is particularly protective of women and young people, but not of the disabled.

No one knows whether this survey will end up guiding the companies that develop autonomous vehicles. What also remains to be seen is how the law will come to address these ethical questions.

In March this year, US prosecutors decided that Uber was not "criminally liable" for the death of the pedestrian. Whether the ruling will be cited by other countries in the future, we will have to wait and see.



Adjunct Professor, Department of Computer Science, Faculty of Engineering; Department of Geography, Faculty of Social Sciences; and Faculty of Architecture, The University of Hong Kong