“It is more than the sum of its parts, that the art of the arrangement is the point of the volume. The stories must make a pattern, say together more than each says alone.”
The above quote is from this article that likens an anthology of stories to a bouquet of flowers: apparently, the word ‘anthology’ originates from the Greek for ‘flowers’. It stuck with me, because at Digital Action, we seek to help civil society not just in the stories they tell, but also in the pattern in which those stories are told, so that they are greater than the sum of their parts.
This ‘art of the arrangement’ was the driving thought — though less poetically put — behind a recent project we did around the Digital Services Act (DSA). The European Commission (EC) had opened a public consultation on the DSA, seeking input from technologists, national governments, civil society and private sector companies to feed into the sausage-making of this legislation. Our goal: expanding the anthology of stories told in that consultation, especially those representing protected and/or marginalised groups. We hoped civil society could shape the way the European Commission thinks about Big Tech and platform regulation; the more diverse the contributors, the more inclusive the DSA will be, leading to a safer, more equitable online democratic experience for everyone.
This blog is intended to share our learning from this project:
- Naming the barriers for civil society to participate in these democratic processes
- How civil society can collaborate effectively
- How Digital Action as an organisation can act as a true ally, through the organisational decisions we make
A brief introduction to the DSA
On December 2nd, the EC will publish its draft proposal for the DSA, a series of legislative instruments. In the words of European Digital Rights (EDRi): the DSA is a “unique opportunity to improve the functioning of platforms as public space in our democratic societies, to uphold people’s rights and freedoms, and to shape the internet as an open, safe and accountable infrastructure for everybody.” It will set platform regulation for 27 countries, and will also set a global precedent in internet regulation, so it’s imperative that the EU gets it right. A strong, unified voice from engaged civil society actors who care about democracy and promote a human rights-based approach was, and remains, critical.
Digital Action’s contribution
In general, our approach to the movement for platform accountability and governance is not to be the policy experts or the spokespeople: it is to unlock collaboration. That’s why we don’t make public policy statements of our own. To build trust and prove we’re an honest broker of collective action, we’re not interested in profile — but in supporting the tactics that have the highest impact.
With the DSA we saw our added value as enabling a diverse range of civil society groups to respond to the EC’s public consultation, championing steps that would strengthen democracy. More crucially, we wanted to centre the experiences of those whose societal marginalisation is compounded by the technological status quo, ensuring that their lived experience is included alongside digital rights organisations.
Working with independent human rights consultant Iverna McGowan, now Director of Center for Democracy and Technology in Europe, we:
- Identified organisations representing those who face disproportionate harms from the status quo of big tech, but that lacked the capacity or expertise to contribute confidently to the consultation.
- Asked these organisations whether the DSA consultation was on their radar, and what support they would need to feel confident enough to respond, and therefore to improve the consultation.
- Ended up assisting 21 organisations. Iverna provided bespoke advice about the process and the opportunity, and developed policy guides for non-tech-policy-expert audiences.
These supporting documents were a mechanism to share the research and recommendations of civil society organisations (CSOs) already well-versed in the tech and democracy space, especially where there was greater alignment on the policy solutions. These works included the call organised by European Partnership for Democracy for universal transparency of ads by default; EDRi’s thoughtful and detailed answering guide; and a call by thirteen organisations for algorithm inspection and accountability. It also included work by Article 19, Access Now, Center for Democracy and Technology, Data for Black Lives, Panoptykon Foundation, Stiftung Neue Verantwortung and Ranking Digital Rights, to name but a few.
What we learned
Based on feedback from the organisations we engaged through this project, we landed on a number of insights. We want to share these lessons with the world as a small but hopefully useful contribution to strengthening and diversifying collaboration — and we welcome feedback in return.
The art of the arrangement: more than the sum of our parts
- Rippling effect: the emergence of civil society unity on some key issues had powerful results. For example, the UN High Commissioner for Human Rights echoed this messaging to the European Commission President on the topic of the DSA.
- Two shoulders to stand on: while this project enabled some organisations to submit responses that otherwise wouldn’t have materialised, the organisations we engaged said that the parallel (and coordinated) work by EDRi was also valuable — the bricklaying efforts added to one another, rather than competing for space on the wall.
- Decentralised collaboration: a number of civil society organisations worked with each other on tech policy topics for the first time, in a way that looks set to continue beyond this project — without requiring any ‘centralised’ coordination.
Barriers to wider, more diverse participation in tech policy consultations
- An uninviting process: a number of CSOs cited the cumbersome, complex and intimidatingly long questionnaire from the Commission as a reason they could not engage further. This, coupled with the fact that no supporting workshops were held alongside the process, meant it was hard for non-experts to see how they could fit in.
- Further marginalisation of the marginalised: although CSOs representing protected groups and marginalised people expressed the highest level of interest and engagement with the project, it appears that in the end, they were among the least likely to actually submit a consultation response. Effectively, the marginalisation of certain communities seems to be compounded by existing processes.
- Gulf of expertise: there was a strong response to the initial outreach, and agreement that the DSA is essential to democracy and, more broadly, to their work. However, many CSOs did not feel comfortable enough in the policy area to formulate their own positions or to respond without additional support. For instance, all of the CSOs working on migration responded to the initial outreach and acknowledged the significant link between their main priorities and the impact of disinformation. Ultimately, they said they lacked the capacity to carry out research, and therefore the expertise to make the case. Meanwhile, CSOs with policy experts in the digital rights space struggled with demands on their time that exceeded their capacity.
- Lack of funding: this is of course linked to the above barrier of lack of expertise. While many CSOs had no ambition to hire dedicated staff on tech policy, four CSOs said that they would ideally like to hire a staff member who could bring this expertise in-house, but they lacked the resources to do so. Some had been rejected for funding, and others found the available funding too prescriptive and narrowly focused. One organisation expressed concern at the enthusiasm of private companies to fund this work in Brussels, which at times put organisations in a difficult bind.
It is hard to overstate the level of mobilisation and coordination needed for civil society to stand a chance of acting as a counterweight to the Goliath-like lobbying power of Big Tech in Brussels. The power asymmetries between private sector interests and civil society organisations are further exacerbated by tech companies hiring tech policy experts away from civil society.
Lessons for Digital Action
- Trust is (often) personal: there was a very high correlation between organisations knowing Iverna or Digital Action staff personally and their interest in engaging with the project. At a time when we are particularly deprived of human connection, we will need to make a special effort to expand the conversation about technology and democracy beyond traditionally engaged groups. To do this we must listen, share opportunities, and work openly.
- The longest bridges take the longest to build: as partners remind us, the DSA will take years. Interventions should therefore snowball, not be flashes in the pan. And of course, herein lies an opportunity: we can use this time to build trusting relationships with these CSOs, and to learn how we can shift power to them. It is in this way that we can help transform the conversation about technology, something that cannot be achieved through a multi-page questionnaire alone.
Digital Action is only 18 months old. We are learning — from mistakes as well as successes. This project provided important insights into how CSOs typically excluded from these conversations respond to such consultations, and, more crucially, into the barriers that prevented their engagement in the process.
We concluded that more inclusion work is necessary so that the DSA is rooted in a broader understanding of the systemic social, political and economic inequalities that disproportionately impact marginalised people’s democratic experience online. These learnings will inform our intended work on these issues, holding ourselves to account in centring lived experience — and expertise — in the collective effort to strengthen democratic rights in a digital age.
Layla Wade was the Community and Campaign Engagement Lead at Digital Action.