
BWIRE: Why big techs must address platform regulation

Without deliberate, serious engagement with the issue, the result will be violations of freedom of expression

by Tabnacha Odeny

Opinion, 17 June 2024 - 12:45

In Summary


  • Any further delay by big techs in engaging more meaningfully with this problem will, unfortunately, see countries coming up with laws and administrative codes that stand in the way of freedom of expression and access to information.
  • Content regulation, information integrity and responsible use of digital platforms have gained currency as communities deal with the expanded space for freedom of expression and access to information.
Victor Bwire is the Deputy Chief Executive Officer and Programmes Manager at the Media Council of Kenya.

Disinformation, especially on digital media platforms, has become a global concern, mostly in the North, much as the wave of radicalisation and violent extremism has in the South. The threat it poses to nations leaves tech companies with no option but to step in urgently to ensure responsible information sharing.

Nations are calling disinformation, or Foreign Information Manipulation and Interference (Fimi), a global crisis, just like climate change, radicalisation and the world financial crises affecting humanity. A modest approach to regulating the digital ecosystem cannot solve the problem, and merely citing community rules and removing disinformation or hate speech is no longer convincing or adequate.

Any further delay by big techs in engaging more meaningfully with this problem will, unfortunately, see countries coming up with laws and administrative codes that stand in the way of freedom of expression and access to information.

It is worrying that without deliberate and serious engagement by the big techs, especially in managing the design elements of digital information flows, the result will be violations of freedom of expression, as countries frustrated by the evasive response of platform owners develop cybercrime and disinformation laws.

It is a done deal: countries see disinformation as a big threat to democracy, national development, global peace, elections and health, and governments have made up their minds. If the big techs are not more proactive and resolute on platform regulation, government control or regulation will take hold.

Can the big techs stand up and be counted, and engage with other players, including academics, civil society, regulators and governments, to ensure a system of digital regulation is developed that does not allow infringement of freedom of expression?

Remember, beyond the demand for more deliberate platform governance, media outlets and content owners are already up in arms, demanding fair pricing and compensation for the content of theirs that big techs use.

There is serious concern that big techs have remained aloof and non-committal to the threats posed by disinformation shared on their platforms, and by extension even to the quality information they pick from media outlets that spend a lot generating professional content.

Freedom of expression is always the first casualty in conflicts and crises, and we are worried that once disinformation reaches what nations are calling a global crisis, panic responses, including extreme laws and actions, will follow, with adverse effects on freedom of expression.

Already, countries in the global north have come up with national responses, including laws on foreign information manipulation and interference framed through national security lenses, with obvious implications for freedom of expression.

Content regulation, information integrity and responsible use of digital platforms have gained currency as communities deal with the expanded space for freedom of expression and access to information.

In addition to the generally accepted global professional standards on the regulation of information outlets, including journalists, some issues that platform providers had thought well handled through community rules seem disturbing and are thus attracting attention.

Most important is the issue of harmful content, including but not limited to hate speech, disinformation and insults.

It is true that laws cannot keep up with developments in technology and, when not applied responsibly, can in many cases stifle innovation; many have therefore argued against coming up with new laws to manage the platforms.

Many freedom of expression advocates have argued against laws such as social media taxes, computer and cybercrime laws and digital media laws, because already existing regulations are enough to deal with the situation, so long as sufficient media and digital literacy work is conducted within communities.

We do not need to criminalise the platforms because of the people who misuse them, but we must tell their owners that it is their responsibility to make the platforms safe and civil places.

In any case, media houses ensure their TV stations, newspapers, radio stations and own platforms are safe, and take responsibility for what they carry.

Platform providers cannot remain aloof to the challenges that come with the innovations and must be robust in working with others to ensure responsible use of their platforms. Information shared must be useful to the communities.

They must invest in research, public education and digital and media literacy for the good of the communities they serve, and they must join coalitions that are working to ensure responsible, constructive use of information shared on the platforms.

Content regulation is not aimed at regulating innovation and technology; rather, discussions are converging on ensuring that information shared is within agreed international and national laws, to minimise conflict with the law, and it should not be seen as piling on regulations that restrict freedom of expression.

Through support from UNESCO and the EU, players in the sector have formed a national coalition of content regulators in Kenya, to steer the country away from reliance on the existing laws and regulations and ensure platforms are not misused to share harmful and irresponsible content.

This might be a new concept of regulated self-regulation of content, one that may be strengthened by pure self-regulation and/or the co-regulation being practised in some areas.

Knowledge of copyright laws, and now of AI and related developments, will be critical to our communities as we move forward.

What matters is not the complications of how AI works and the science behind it, but general literacy on how it can be misused, for instance to spread information during elections that lacks integrity, or misinformation that messes with democratic processes in the country.

It is gaining ground that we cannot talk about information integrity and public access to useful information if the information people are receiving is misinformation, and the principle of information diversity suffers greatly if a few people dominate the platforms.

Big techs, much as they are private, must be aware that they are big players in public information spaces and must be responsible for their platforms' contribution to global peace.


© The Star 2024. All rights reserved