The right way to regulate digital harms
As the European Commission’s recent Digital Services Act demonstrates, lawmakers around the world are scrambling, with good reason, to address the extremism, disinformation, and manipulation that have consumed the digital ecosystem, distorted public discourse, and deepened polarization in recent years. And yet their efforts carry risks. Just as rules governing online domains can bolster democracy by promoting inclusive, informed debate, they can also be abused to inhibit freedom of expression.
Fortunately, international human rights law offers a set of principles that can guide regulation in a way that addresses toxic content while promoting freedom of expression. To help illuminate this process, our organization, the Global Network Initiative (GNI), recently brought together experts from across industry and the human-rights community to examine scores of content-regulation initiatives in more than a dozen countries, and provide relevant recommendations.
The first human-rights principle that must be applied is “legality,” which emphasizes the need for clear definitions adopted through democratic processes. Such definitions are missing in Tanzania, for example, which instead has rules barring online content that “promotes annoyance,” among other vague harms. When it is not clear what content is and is not allowed, governments can exploit the ambiguity to maximize their power to restrict speech; users cannot know what constitutes lawful conduct; and courts and companies struggle to enforce the rules fairly.
Another vital principle is “legitimacy,” which dictates that governments may limit expression only for specific compelling reasons, such as the rights of others, public health, and public order. The principle of “necessity” then demands that restrictions be tailored to fulfill those legitimate goals and be proportionate to the interest being protected. No regulation should be adopted if a less speech-restrictive rule could do the job.
A human rights-focused approach helps to prevent disproportionate consequences. In this respect, the European Union’s proposed regulation on preventing the dissemination of terrorist or extremist content online misses the mark. The regulation would require companies of all types and sizes to remove terrorist content within one hour and to introduce proactive measures to filter such material. The dominant companies can afford to comply, but such rules would raise barriers for innovative new players seeking to enter the market, and would result in the disproportionate removal of all sorts of acceptable content.
But companies themselves can and should apply rules that advance human rights, regardless of government regulation. Here, transparency, due process, and accountability are essential.
For starters, social media companies must be much more forthcoming about how they regulate content. This means sharing as much information as possible publicly, and providing legitimately sensitive information to regulators and independent experts through vetted access regimes or multi-stakeholder arrangements, similar to the one GNI has created for sharing information about company responses to government demands.
With this information, governments can ensure that intermediaries moderate content consistently and fairly. To this end, regulators, given appropriate resources and expertise (and, ideally, engaging experts and users’ rights advocates), should be tasked with providing guidance for and oversight of content-moderation systems. At the same time, companies should be required to introduce mechanisms that give users greater control over what they see and share.
Ultimately, however, responsibility for moderating sensitive content should not fall solely on private companies. Instead, governments should put democratically accountable organs, like courts, in charge. France’s Law Against the Manipulation of Information, while imperfect, seeks to do that, providing an expedited process for judges to review alleged election-related disinformation. That way, companies are not the ones making these difficult, politically sensitive determinations. By contrast, the country’s Constitutional Council recently struck down a French hate speech law, in part because it circumvented the country’s court system.
No matter how clear the rules and how efficient the moderation systems, regulators and companies will make mistakes. That is why the final piece of the puzzle is dispute resolution and remedy. Companies should allow users to appeal content moderation decisions to independent bodies, with special consideration for vulnerable groups, like children, and those serving the public interest, like journalists. Governments and regulators should also be subject to transparency and accountability mechanisms.
Toxic content hardly began with the Internet. But online platforms have made it easier than ever to disseminate such content further and faster. If we are going to limit its spread without crushing freedom of expression, we need clear and comprehensive regulatory approaches based on human-rights principles. Otherwise, even rules designed with the best intentions could end up silencing the vulnerable and strengthening the powerful. That is a lesson the world’s authoritarians know all too well.
The article is co-authored by Jason Pielemeier, Policy Director of the Global Network Initiative.
Copyright: Project Syndicate