Why we need a Global Coalition for Tech Justice

Alexandra Pardal, Sherylle Dass, Mai El-Sadany on the stage at RightsCon launching the Global Coalition for Tech Justice

In our session at RightsCon 2023, we heard how digital threats turn into real-world harms. And we launched the new Global Coalition for Tech Justice to help protect elections from the harms – disinformation, hate speech, manipulation and abuse – facilitated by Big Tech.

Over the last year, Digital Action has been convening and listening to civil society organizations across the globe about how Big Tech has harmed them, their democratic processes and human rights.

Big Tech needs to invest in protecting elections in global majority countries

Over 70 countries are holding elections in 2024, a once-in-a-century occurrence. And in recent years, elections have increasingly been shaped by online campaigning.

Anne Ikiara, Executive Director of Digital Action, explained why we’re focusing our efforts on 2024.

“Since Digital Action started in 2019, we’ve supported EU organisations during elections and to influence the precedent-setting Digital Services Act. But we know social media companies are causing global harm, so we’ve been working with partners across the world to shape a new coalition and campaign to ask Big Tech to protect elections in 2024 and beyond.

“Without a serious push, tech companies won’t take any serious measures to protect democracy or people. Meta, for example, has invested 87% of its global budget for classifying false or misleading information in the US, despite most of Meta’s users living outside the US. Tech companies should invest in moderating the many languages beyond English in which people post, as well as in understanding local contexts. We are working together to protect people and elections, not Big Tech companies.”

Micro-targeting to stoke civil unrest

Micro-targeting has proliferated, largely unchecked, in many countries in the last few years. While the data social media companies capture about you is in theory used to add content matching your interests to your feed, it also pushes content to you that is more and more extreme. Sherylle Dass, Regional Director at the Legal Resources Centre, gave an example of how social media platforms amplify unrest in a country through micro-targeting. She explained how in South Africa in July 2021, a network of the president’s supporters spread online incitement to real-world violence against specific ethnic groups. This type of political micro-targeting was also used following the 2022 Kenyan elections.

Big Tech doesn’t pick up on country-specific online harms due to a lack of content moderators who understand a language’s subtleties and cultural context

Mai El-Sadany, Executive Director of the Tahrir Institute for Middle East Policy (TIMEP), gave us her perspective on the MENA region’s experience of online harms and elections. “Elections are rarely free and fair in the MENA region, but it’s still important to be part of this discussion about Big Tech and democracy, because it’s a region where online harms have doubled and tripled before, during and after elections.”

“During the Arab Spring, citizens used social media platforms to organize themselves, even coordinating blood donations through social media. In Sudan, people are still using Twitter to help each other with emergency needs. And yet governments are weaponizing social media and online spaces. For example, they have created troll factories targeting activists posting online, used surveillance tech, and spread hate speech. Governments control the internet space, and this has made it harder for activists to organise and fact check. In the MENA region, social media platforms are a new arena for power plays, and yet we do not know the extent of online harms – and very often the tech companies don’t know either.”

One of the reasons that Big Tech is so in the dark about online harms in the MENA region is their own lack of content moderation.

“All these inequalities in the online world have spilled over into the offline world. We see an underrepresentation of languages like Arabic, spoken by hundreds of millions of people, which does not have enough content moderators. The Arabic language has many dialects, and social media companies are not investing enough money and resources to moderate them. There’s a need for cultural context understanding, so that we end up with content moderation policies that are representative of cultural peculiarities. This approach should be the same for all countries in the global majority.”

Big Tech has a responsibility to fact check election-related content

In Brazil, the 2022 presidential election was undermined by protests and roadblocks around the country by supporters of the losing candidate, Bolsonaro. Bolsonaro’s supporters used social media to spread misinformation about electoral fraud, motivating the protesters. This escalated into the storming of Brazil’s National Congress in January 2023.

Joao Brant, Secretary for Digital Policies for SECOM/Presidency of the Republic of Brazil, said “People have the right to free and fair elections. People have the right to criticize their governments. But we also want to ensure the protection of elections.

“With a flood of disinformation around elections, who will judge what is factual or not? In Brazil, the Superior Electoral Court acted alone to protect the election process. This was vital.

“Platforms should do better to ensure that they are protecting the trust that people can have in a democracy.”

Big Tech digital safety teams who’ve been laid off can be allies for civil society

Christian Cirhigiri, Digital Peacebuilding Policy Officer for Search for Common Ground, said “We need to think about the major layoffs at tech companies and how the ex-employees can be our allies. How do we start strategizing, engaging people who were laid off, and building alliances with them? Not all tech workers are bad; they might be allies and help us.”

“There is a divide between the global north and the global south when it comes to using tech and social media during elections. If we don’t understand the division and polarisation we won’t be able to give recommendations for the online monitoring of elections. We recommended the EU have a committee to monitor online disinformation during elections there.”

Why we need a Global Coalition for Tech Justice

Alexandra Pardal, Director of Campaigns at Digital Action, concluded the session by setting out why we’ve built a Global Coalition for Tech Justice.

“As we’ve heard today, social media platforms have the potential to negatively impact rights and democratic values globally. They enable the spread of disinformation. Our coalition and our campaign, the #yearofdemocracy, are a call to put human rights above profit, and to protect elections above clicks and ad revenue.”

Sherylle added: “We in the global majority are facing an inequity dilemma. We need to save and safeguard democracy through this coalition and campaign, and we are going to call on social media companies to make investments. We need to join the fight together.”

Join the Global Coalition for Tech Justice.
