Emily Hart
Ya es hora. Tackling Big Tech to build peace and security in Colombia


New podcast about social media companies and how they contribute to violence in Colombia. Big Tech is undermining free speech, democracy, and security - how can we demand better?

Happy Wednesday, all!

Following on from the podcast I shared last week about Big Tech and human rights, I wanted to share a brand new podcast project: Ya Es Hora! Desenredando redes para construir paz en Colombia.

It’s a closer look at social media usage in Colombia and how it undermines attempts to build peace and democracy. We ask how we can demand and create a better system - one which might allow us to enjoy freedom of expression and information while being better protected from illegal content.

I wrote, produced, and directed this Spanish-language podcast for human rights group Article 19 - my first podcast project in my second language! We spoke to the FLIP’s Luisa Fernanda Isaza-Ibarra, Mabel Quinto from ACIN, and Juliana Hernández De La Torre, director of Artemisas.

And if you’re not comfortable in Spanish - I got you! Below is a version of the script in English! I also wrote and presented a whole series on human rights and social media (in English) earlier this year - check that out here -

Have a great week, everyone!


15% of the day. Three and a half hours. That’s how long the average adult internet user in Colombia spends on social media each day - the 4th highest figure on Earth.

And that time is spent nose-to-nose with some of the biggest businesses on the planet: with Meta - the owner of Facebook and Instagram, Alphabet - the owner of Google and Youtube, and with X and with TikTok.

These are the companies that hold the keys to the Internet, the gatekeepers we feed more and more data during all of this screen time. What we see on social media is dictated by the accounts we interact with, what we react to and dwell on, and what our personalised algorithm knows about us…

But that’s only a part of it: the precise nature of the algorithms remains opaque to us. So what exactly are the processes by which content goes viral, or gets downrated, or disappears entirely into the digital black hole? Or stays up, even when it shouldn’t? When its disappearance constitutes arbitrary censorship, or its continued visibility enables hate speech?

**

In Colombia, 94% of the adult population uses social media - mostly TikTok, Facebook, and Instagram. The reach of these platforms grew by millions of people in the last year alone. And in all of these spaces, censorship, disinformation, stigmatisation, polarisation, and misogyny are rife.

The rules which govern what we see, often known as ‘community standards’, are often vague - content moderation processes are scattergun and content curation processes even more so. This affects us all, whether we’re using social media for personal expression, journalism, activism or professional purposes.

A small number of huge businesses - mostly based in the global north - dictate the terms largely without oversight or accountability, with the aim of keeping our eyes on our feeds, and monetising our time. It’s a marathon race to profit from our attention, our data, and our content.

Content which sits at political - or emotional - extremes is often bumped up and promoted within our various feeds because it drives engagement, and content moderation is chronically underfunded and unaccountable - especially in places, cultures, or languages outside of priority markets.

So much is still opaque, but we know this much: our enjoyment of fundamental rights - personal expression and public debate - is being shaped and too often violated by private companies which have shown little interest in what their global standards mean outside the global-north, English-speaking contexts for which social media were designed. This has profound implications for human rights in Colombia.

The processes that do exist (and we still have very little clarity on what those are) are dictated from offices far from Colombia, so users here end up with already-vague guidelines applied without local knowledge or nuance.

Many of these processes are automated and implemented at huge scale without human review - yet human review is essential for interpreting content and deciding whether it breaks the law or community guidelines, an interpretation which should be made in the context of Colombia’s cultural sensitivities, beliefs, and value systems.

**

This country is a clear case study of the human rights risks posed by social media platforms in contexts outside the global north - as well as the threats they pose to democracy and the safety of vulnerable groups.

Colombia’s political context - particularly when it comes to security - is uniquely complex: attacks and disinformation which appear online have very real effects, both in our sensitive so-called ‘post-conflict’ environment and in the extremely precarious situations in which many activists and journalists work. The silencing of voices on- and offline is intertwined, and it is time to move towards a solution.

More activists are murdered here than in any other country in the world: the 181 social leaders assassinated last year show us how urgent the problem is in Colombia. Many of these activists were working on land issues or environmental protection, mostly in rural areas where state protection is scarce and the territorial presence of armed groups is heavy. A disproportionate number of these high-risk groups for physical attack are from ethnic communities.

Harassment and stigmatisation of activists - often involving disinformation and discriminatory content - compound both their visibility and their vulnerability. This raises the risk of real violence, as well as eroding support for activists within their communities and in political discourse. Dangerous, false, and discriminatory content is surprisingly difficult to have removed from these platforms.

**

Meanwhile, content which is crucial to our public debate - posts about our political situation, our ongoing peace processes, and our historical justice - has been disappearing from these platforms.

Certain words cause posts and information to disappear, swallowed by the ether - terms absolutely central to public debate in our so-called ‘post-conflict’ era - like combatant or femicide. Automatically-applied algorithms downrate and even hide content or accounts containing chosen words, forcing people to creatively misspell or disguise these words (adding numbers, letters, or blank spaces) in order to continue the discussion.

The effects of this kind of behind-the-scenes manipulation are far from trivial: the Colombian Ministry of Culture reported a near-total disappearance from social media of the voices of FARC guerrillas during the peace process. Journalists or users documenting human rights violations in times of protest and social upheaval have also seen their content restricted or even accounts blocked.

We are not only in a key period of implementation of the 2016 Peace Accords with the FARC guerrilla; the current government is also engaged in ongoing peace negotiations with numerous armed groups. Public information and civic participation around these processes are crucial - and much of both will inevitably happen online.

**

Colombia’s current historical moment is a crucial one: the online world will be key to peace-building. We need online spaces to allow open and free debate, without disinformation or content which increases risk of retaliation - by stigmatising and lying about the actions, funding, or intentions of those speaking up or speaking out, exposing them to violence.

Public debate, especially in a country like Colombia, should not have its terms set by private entities in far-off countries, or algorithms which busily set about automatically wiping content off the internet based on blanket rules which fail to comprehend the complexity of our reality.

How can people trust or invest in the construction or implementation of any peace process if the debate is being distorted by far-flung figures, out-of-reach companies, and automated algorithms? Where are our voices in that process? Making social media function fairly and effectively is not easy, but the stakes couldn’t be higher.

**

The opacity and censorship of content curation are of huge and urgent concern: users need to better understand content moderation and curation practices like downranking and shadowbanning, which are often even harder to see or track than content takedowns or account suspensions.

The companies which run these feeds are largely faceless and customer services are virtually non-existent, with enquiries and appeals alike often going unanswered. Despite huge influence and a key role in the exercise of our fundamental rights, we have no proper way to hold these companies accountable, either as individuals or a society.

Some social media companies do release transparency reports, but the data is not complete enough to identify trends or understand the scale and impact of content moderation - particularly at a national level.

Meta and Alphabet claim to have local offices in Colombia that go beyond marketing purposes; they say they have policy officials and good relations with civil society, while X says it has teams trained to consider diversity and context.

But there is no information about the teams that moderate local content or where they are located. Companies have developed direct channels with some local civil society organisations, but only in special circumstances, for example during the 2021 civil unrest or the 2022 elections here in Colombia.

No platform is prepared to provide the level of information on content moderation practices in Colombia that would help users obtain meaningful understanding of the local application of community rules, nor are they willing to provide the transparency that would help to track the number or nature of government takedown requests.

**

Context is key to understanding any speech or act of expression, from the cultural and social to the historical, economic, political. So far, efforts towards tailored content moderation have been negligible: if these companies don’t assess content within its context, how can they hope to adequately moderate or curate it?

Regulation of speech - in newspapers, broadcast media, and town squares - has traditionally been the role of government: a combination of democratic function, public security, and human rights which fell squarely under the remit of the state.

Now that private companies are writing their own rules and then deciding to what extent to comply with them and how, where does the state fit in? What are its ongoing responsibilities in this sphere? How can a government like Colombia’s engage with these companies? And should it?

These systems of control of what we say and see affect us all - but they do not affect us all equally - in fact they can compound invisibility and vulnerability. This can be seen in particularly stark terms in Colombia, where physical violence against those who speak out is all-too-common, particularly those who speak out against the armed groups which occupy large swathes of our countryside, much of which suffers chronic state absence and profound insecurity.

One of the key issues here is that these systems of rules do not incorporate any understanding of vulnerability: those who are targeted by online attacks are often also vulnerable to real-world violence.

Stigmatised groups, including journalists and activists, are more exposed to discriminatory narratives if they are women, LGBTQ+, indigenous, or Afro-descendent - which can compound vulnerability to threats and violence, especially if they live in rural territories. The connection between stigmatisation by political leaders and citizens alike and the persistence of armed violence in the country has been documented in detail by Colombian NGO Indepaz.

**

Content moderation affects freedom of expression, but we still lack consensus on how to tackle these huge companies, or bring the way they run their algorithms in line with human rights standards.

The best way to find consensus is conversation: diverse interests can be brought together to find the way forward. We need the state, the private sector, and civil society to join forces to demand increased transparency from platforms, research the impacts of their policies, and better understand the ‘black hole’ into which so much content gets sucked.

It’s time to include human rights in the conversation - it’s in all our interests to make that happen. In this crucial moment in Colombia’s history, as we struggle to leave a half-century of conflict behind us, we need to be able to learn and debate on our own terms, without wondering if there are voices and information missing, or having our words disappear into the abyss - or being threatened online for our opinions.

**

The podcast was made by ARTICLE 19 and produced by Christopher Hooton and Emily Hart. The podcast was presented by Maria Juliana Soto and written by Emily Hart.

The podcast was funded by UNESCO under its EU-funded project “Social Media 4 Peace” - for more on the project, check out this report.

The ideas and opinions expressed in this podcast are those of the speakers; they are not necessarily those of UNESCO and do not necessarily reflect the view of the European Union.
