15 August 2018
Algorithms designed by data scientists inevitably reflect human biases, exposing their companies to risk. Photo: Reuters

Businesses shouldn’t ignore risks associated with AI

2018 will be a milestone year in the application of artificial intelligence (AI), according to Gartner.

By 2020, 20 percent of citizens in developed economies are expected to use AI to help them make daily decisions, and up to 40 percent of customer enquiry and consultancy services will be handled by virtual assistants.

AI is expected to create 2.3 million jobs in 2020, but at the same time the technology will eliminate 1.8 million positions.

Apart from its economic impact, the technology will also have a deeper influence on society and politics.

Last year, MIT and Harvard spent US$27 million researching the ethics and governance of AI.

Facebook’s data leak scandal has highlighted the moral and societal risks that arise once AI is used in business.

There are numerous advantages of using AI, such as boosting competitiveness and efficiency, as well as improving the understanding of customer behavior.

But algorithms are designed by data scientists, and they inevitably reflect human biases. Such biases could put the companies involved at risk.

Tarnished brands, legal liability and customer mistrust are among the possible downsides.

To avoid such risks, a company has to review and clarify its principles and incorporate its ethical values into the design of its AI applications.

The argument that customers are willing to give up data privacy for convenience is no longer valid.

This article appeared in the Hong Kong Economic Journal on April 18

Translation by Julie Zhu



Venture Partner of Sequoia Capital China, former head of the data committee and vice president at Alibaba Group.
