Freedom of Expression
COGOV: Rebalancing Juggling Balls of Democracy against Disinformation

The politics of democracy are dynamic and full of vitality, but like juggling, they require precise balance to keep the balls from crashing to the ground. On one side of the balance is the state, with its vast resources and organizations, yet in need of the legitimacy to govern. On the other side are the people, who claim to be the masters and hold the ultimate power, but who often generate a great deal of noise. This leaves us with two dilemmas: we expect the government to be competent, yet want to be free of its hold on us; we encourage social diversity, yet want to succeed together.

Modern-day constitutionalism offers checks and balances through the separation of powers, while parliamentary politics and media freedom ensure a diversity of views and help consolidate public opinion. Democracy relies on these mechanisms to build mutual trust so that the balls suspended in flight do not fall.

Freedom of speech is, without a doubt, a crucial cornerstone of this trust mechanism, and speech must be transparent and authentic. The danger of disinformation is obvious: saying the wrong thing or lying is often easier than telling the truth; hence the saying that "a lie can travel halfway around the world while the truth is putting on its shoes." We have traditionally relied on parliament and the media as gatekeepers to maintain the delicate balance of this ecosystem.

The Internet has brought about enormous changes. Receiving information, expressing opinions, and even mobilizing the masses no longer require parliament or the media: anyone can become a public opinion leader, or even part of the media itself, and a single hashtag on a social network can start an avalanche.

Mutual trust, already fragile in itself, has become all the more vulnerable. Organized operations to spread untruths exploit the weaknesses of social media, such as information overload and algorithmic opacity. Disinformation spreads quickly, feeding continuously on similar content, reinforcing existing views, and forming alliances of shared opinion. The environment of online discourse is becoming increasingly segregated, the foundations of mutual trust are being gradually eroded, structures of power and trust are collapsing, and the juggling balls of democracy lie scattered on the ground.

The spread of disinformation has become the Achilles' heel of democratic good governance. The intuitive response is to introduce protections through legal control. Singapore, for example, has passed the Protection from Online Falsehoods and Manipulation Act, which gives the government the power to order individuals or the media to remove disinformation. Another case in point is Germany's NetzDG, which requires major social media platforms to self-censor as a check and balance and to remain transparent to the public. Yet doesn't handing the power of censorship to the government simply create a new problem in place of the old one? Turning to multinational companies will not improve things either: business operations and algorithms lack transparency and checks and balances even more severely than states do, so putting machines in the role of governor is not an option.


It is said that problems are best addressed at their source. Since the problem arises in the flow of information from the media to the community, it is only by empowering civil society to take part in rebuilding trust that social norms can be co-created, that power can be conferred on communities anyone may enter and leave at any time, and that the risk of abuse from the concentration of power can be minimized.

The relentless outpouring of junk mail that once flooded every mailbox nearly led to a catastrophic email crisis. Yet this problem was solved not through powerful legislative control, but through community initiatives that prompted email providers to give users the power to mark junk mail. Mail that enough people had marked was intercepted by the system and moved to a separate folder, without any messages being deleted. If a message was in fact useful, it could still be found again and marked as clean. Power was therefore not concentrated in the hands of any one person or business, and users retained their autonomy.
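As a minimal sketch of that mechanism, the Python snippet below models crowd-sourced junk-mail marking under an assumed flag threshold (three recipients, a number chosen purely for illustration): flagged messages are moved aside rather than deleted, and any user can restore a message they still find useful.

```python
from dataclasses import dataclass, field

FLAG_THRESHOLD = 3  # assumed for illustration: flags needed to move a message aside

@dataclass
class Message:
    sender: str
    subject: str
    flagged_by: set = field(default_factory=set)
    in_junk_folder: bool = False

    def mark_as_junk(self, user: str) -> None:
        """A recipient flags the message as junk; nothing is ever deleted."""
        self.flagged_by.add(user)
        if len(self.flagged_by) >= FLAG_THRESHOLD:
            self.in_junk_folder = True  # moved aside, still retrievable

    def mark_as_clean(self, user: str) -> None:
        """A recipient who finds the message useful can restore it."""
        self.flagged_by.discard(user)
        if len(self.flagged_by) < FLAG_THRESHOLD:
            self.in_junk_folder = False


msg = Message(sender="offers@example.com", subject="You have won!")
for user in ("alice", "bob", "carol"):
    msg.mark_as_junk(user)
print(msg.in_junk_folder)  # True: filtered by collective marking, not by decree

msg.mark_as_clean("carol")
print(msg.in_junk_folder)  # False: autonomy stays with the users
```

The point of the design is that filtering emerges from many individual judgments rather than from a central authority, and every judgment remains reversible.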

Taiwan has taken a similar route in responding to disinformation. Rather than relying primarily on legal control, it urges major social media platforms and information tool operators such as Facebook, LINE, and Google to jointly sign self-regulatory codes of practice. In addition to establishing protections on the technical front, increasing the oversight and transparency of political advertising, and strengthening media competence across all age groups, it is important to cooperate with third parties to build an independent, transparent, and fair supervision mechanism.

Third-party fact-checking organizations in Taiwan, such as the Taiwan FactCheck Center, MyGoPen, and Rumor & Truth, are mostly independent private initiatives. Because they have limited manpower and funding, opening up to public participation is inevitable. Much of the technical development and fact-checking work is carried out by volunteers, for example through the Cofacts project. Public participation allows these third parties to keep a healthy distance from the government and businesses they work with, and to maintain internal checks and balances.
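To make the pattern concrete, here is a schematic Python sketch of crowd-sourced fact-checking in the spirit of Cofacts; it is not the real Cofacts API or data model, and the message texts and function names are invented for illustration. Forwarded messages are matched against a volunteer-maintained database of replies, and anything unknown is queued for volunteers to review.

```python
# Schematic illustration only: the pattern of crowd-sourced fact-checking,
# not the actual Cofacts API or data model.
community_replies = {
    # forwarded message text -> volunteer-written reply (invented examples)
    "Drinking hot water prevents infection": "No evidence supports this claim.",
}
volunteer_queue: list[str] = []

def check_message(text: str) -> str:
    """Look up a forwarded message; queue it for volunteers if unknown."""
    if text in community_replies:
        return community_replies[text]
    if text not in volunteer_queue:
        volunteer_queue.append(text)
    return "No reply yet; the message has been queued for volunteer review."

def contribute_reply(text: str, reply: str) -> None:
    """Any volunteer can add a reply, keeping the database community-owned."""
    community_replies[text] = reply
    if text in volunteer_queue:
        volunteer_queue.remove(text)

print(check_message("Drinking hot water prevents infection"))
print(check_message("A new banking scam is circulating"))      # queued
contribute_reply("A new banking scam is circulating",
                 "Confirmed scam; do not click the link.")
print(check_message("A new banking scam is circulating"))
```

Because both the lookups and the contributions are open to anyone, no single gatekeeper decides what counts as true; the replies themselves remain visible and contestable.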

The same logic of human-centered service design also applies to the cultivation of media competence. Rather than dogmatically teaching students or the public how to sort through the Internet's relentless tide of information, it is far better to help everyone recognize that, online, we can all act as media, and to learn to place each piece of information in a deeper context.

Because the government's organizations and resources are vast, it is incumbent on the government to provide timely, accurate, and easy-to-understand information to the public, and to allow third parties to fact-check it. The Executive Yuan adopts the "2-2-2 principle": each ministry is expected to issue a clarification within 20 minutes, in no more than 200 words and with 2 images (the majority of clarifications should be completed within 1 hour of the disinformation appearing). Of course, meme engineering, that is, "packaging the message in such a way that you cannot help but be tempted to share it," crucially allows accurate information to be disseminated more quickly. If accurate information can immediately follow disinformation, it can gain just as much traction, a strategy we call "humor over rumor".
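As a rough sketch of how the 2-2-2 targets might be checked in practice, the Python snippet below validates a draft clarification against the three figures stated above (20 minutes, 200 words, 2 images); the function and field names are assumptions made for this illustration, not part of any actual government system.

```python
from datetime import datetime, timedelta

# The "2-2-2" targets described above; names and structure are illustrative.
MAX_MINUTES = 20
MAX_WORDS = 200
MIN_IMAGES = 2

def meets_222(spotted_at: datetime, published_at: datetime,
              clarification: str, image_count: int) -> bool:
    """Check a draft clarification against the three 2-2-2 targets."""
    on_time = published_at - spotted_at <= timedelta(minutes=MAX_MINUTES)
    concise = len(clarification.split()) <= MAX_WORDS
    illustrated = image_count >= MIN_IMAGES
    return on_time and concise and illustrated

spotted = datetime(2020, 3, 1, 9, 0)
draft = "A short, accurate, easy-to-understand clarification ..."
print(meets_222(spotted, spotted + timedelta(minutes=15), draft, 2))  # True
```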

In 2019, the United Nations Secretary-General's High-level Panel on Digital Cooperation released a "Declaration of Digital Interdependence," emphasizing the need to establish a decentralized, common governance framework ("COGOV"). To rebuild the democratic ecosystem of the digital age, the two-way relationship of checks and balances between the government and the people must become a three-way relationship in which the government, businesses, and the people work together and depend on one another. Only in this way can we resist the invasion of disinformation and regain the ecological balance a democratic society needs, giving the juggling balls in the air the energy they need to fly freely.

Audrey Tang is the Digital Minister of Taiwan.

This article was first published on https://pdis.nat.gov.tw/en/blog/
The German version is available here: https://www.freiheit.org/taiwan-demokratie-heisst-jonglieren