Mobilising investors to secure Big Tech disclosures in 2024’s #yearofdemocracy

By Alexandra Pardal, Interim Co-Executive Director of Digital Action

(This blogpost has been published by the Investor Alliance for Human Rights) 

Recently I spoke to an audience of 170 European and US investors at an Investor Alliance for Human Rights webinar on global inequity in efforts to protect people and elections from tech harm. We’re now partnering with the Investor Alliance to mobilise investors to secure Big Tech disclosures after a very disappointing first quarter for global platform safety.

Although we're currently in the largest election megacycle of the internet age, the world's biggest online platforms and messaging services aren't rising to the occasion. Instead, Google, Meta, X and ByteDance are largely taking a business-as-usual approach.

This is happening in the context of mass tech layoffs and increasingly intense and serious threats to election integrity and human rights, including more prolific and deceptive disinformation campaigns waged with AI-generated content. Aside from Meta, which made a single reference to one new global election policy, blog posts from Google, YouTube and TikTok on 2024 elections talked solely about this November's US presidential election. The dozens of elections taking place in some of their largest and most fragile markets were ignored. The companies' latest commitments to work on AI show how reactive Big Tech policy-making is: deepfakes have been a known threat to information integrity for years.

Companies are barely investing in harm assessment and mitigation outside the US and English-language content. In recent years, we've seen online hate speech fuelling genocide against the Rohingya in Myanmar and violence against communities in India, Ethiopia and Kenya, orchestrated doxxing and disinformation in Tunisia's slide to authoritarianism, and the online mobilisation of mobs storming Brazil's state buildings (in scenes resonant of the US Capitol attacks). All of this was boosted by algorithms, design features and content moderation systems within the control of the companies whose platforms we use, invest in and advertise on.

Insufficient transparency and oversight tools are also undermining election integrity efforts in real time. YouTube's latest failure to provide transparency around political ad spending in Pakistan's election is preventing scrutiny of potential breaches of campaign finance laws in a highly contested election that international observers said was neither free nor fair.

These problems are multi-country and endemic to how the platforms function. Left unaddressed, they also pose immediate commercial risks, borne by advertisers whose reputations are damaged by association with harmful or illegal content. All of these tech harms can be traced to the decisions and investments, or lack thereof, of the tech companies themselves, including insufficient investment in localised policies, content moderation and enforcement. But the harms are also having a cumulative effect across the globe, as part of a long-term trend: the declining legitimacy of democratic governments and institutions and a rise in authoritarianism and illiberalism are undermining the rules-based international order on which global businesses thrive.

2024 offers a once-in-a-generation opportunity to address the deficiencies in Big Tech business models and governance, and to strengthen the connection between ad tech design, democratic rights and the prevention of violence. That's why the investment community is so critical in a context where Big Tech companies face little accountability in large parts of the world.

Investors can ask for disclosures to enable honest conversations about how Big Tech companies can do better at protecting people and elections. The Global Coalition for Tech Justice, a network of over 200 organisations and experts worldwide convened by Digital Action, sent ten key asks to tech companies last July; none responded adequately. We're calling on investors to request full disclosures of the steps the companies have taken this year, including:

  • Risk assessment and mitigation relating to 2024 elections in markets where they have users.
  • Trust and safety spending globally, regionally and nationally, including the number of content moderators per language/dialect/region in Africa, the Middle East, Asia and Latin America.
  • Information on the provision of the full spectrum of harm mitigation tools and measures for all elections, not just the US, operational at each stage of the electoral process, including post-ballot, where risks of post-election violence loom.
  • Substantive information on the extent to which they have increased local expertise and stakeholder engagement, and resourced partnerships with fact-checkers, independent media, civil society and other bodies that protect electoral integrity.
  • Publication, where permitted by law, of all contacts with governments, and of all government and institutional requests to suppress speech and demands for surveillance.
  • The existence of policy exemptions for politicians, plans to eliminate them, and steps to ensure fact-checking of political ads.
  • Transparency of political advertising for oversight and legal compliance.
  • Strengthening of public oversight and transparency, including free or truly affordable data access and training for researchers, civil society, independent media and election monitors.

We believe that investors can play a crucial role this year by leveraging their influence in favour of equitable and effective platform safeguards. Two billion people are depending on it for the protection of their democratic and human rights in this #yearofdemocracy.
