Challenging Big Tech Power: Dominican Republic’s Move Toward Rights-Based Regulation


By Vladimir Cortés Roshdestvensky

The Dominican Republic is quietly looking to make history. While tech giants flex their muscles against regulation worldwide, Dominican lawmakers are crafting legislation that could reshape digital governance across the Americas. The move puts the Caribbean nation in line with regional powers such as Brazil, where the Supreme Court recently ruled that platforms like Meta, X, and Google will not be liable if they demonstrate that they took steps to swiftly remove illegal content such as hate speech, racism, and incitement to violence.

The question is whether smaller nations can withstand the inevitable corporate backlash.

The proposed Law on Freedom of Expression and Audiovisual Media represents something increasingly rare: a framework that aims to protect rights without stifling speech. Instead of regulating content, it seeks to regulate processes. Instead of empowering censors, it aims to empower users.

This vision rests on a simple but powerful premise: users should understand how platforms make decisions about their content, have meaningful ways to challenge those decisions, and retain agency over their digital participation. Rather than treating transparency and innovation as opposing forces, the proposed legislation recognizes them as complementary. Platforms that operate with clear processes and genuine accountability are more likely to earn users’ trust and foster robust digital ecosystems. The framework seeks to ensure that whatever decisions platforms make follow fair, transparent processes that respect users’ fundamental rights to expression and due process. It offers a rights-based foundation, but as tends to be the case, foundations still need careful architecture.

The devil lies in the (institutional) details

The legislation also aims to create a new regulatory body, INACOM (Instituto Nacional de Comunicación), with sweeping powers across multiple sectors: traditional audiovisual media, digital platforms, cinema, and public performances. This broad mandate could create both opportunities and risks that require careful calibration.

The challenge is that different media require different regulatory approaches. While broadcast television and radio, which use scarce public spectrum, legitimately warrant content oversight, digital platforms should be subject primarily to procedural regulation focused on transparency, due process, and users’ rights. International human rights law recognizes these distinctions, and the current draft attempts to address them by dedicating a specific chapter to digital platforms (Chapter IV). However, this structural differentiation could be undermined by INACOM’s overarching mandate, which grants the institute uniform sanctioning powers across all sectors, potentially subjecting digital platforms to the same content-based oversight designed for traditional broadcast media.

The Dominican bill also recognizes internet access as a fundamental right and prohibits prior censorship. But here’s what makes it groundbreaking: it seeks to demand genuine transparency about how platforms moderate content. Companies would need to explain their automated systems, provide accuracy rates, and offer meaningful appeals processes. Users must be notified promptly, in clear and accessible language, when their content is removed or their accounts are restricted. This focus on due process and the right to defense shifts the balance toward users, not platforms. It also pushes for algorithmic audits, another move that puts user rights front and center.

Still, some parts need refining. The bill mentions access to source code, but the real challenge lies elsewhere: understanding what these systems do, not just how they’re built. How much content do they suppress? Who signs off? Do users ever win their appeals? As the UN Special Rapporteur on Freedom of Expression has pointed out, meaningful transparency means helping people understand how AI and algorithmic content moderation systems shape their online experience, not just publishing technical jargon.

That’s the kind of detail that turns lofty principles into actual safeguards.

Concerns over free expression remain

Not everyone is convinced. In May, dozens of journalists and citizens marched in the Dominican Republic’s capital, Santo Domingo, warning that the bill could become a tool for censorship under the guise of protecting privacy and dignity. Critics fear that vague wording and a powerful new regulator might suppress independent reporting, even though the law forbids prior censorship. The current draft includes vague concepts like “messages that manifestly denigrate human dignity” and grants INACOM discretionary power to determine what constitutes violations. Such ambiguous language, combined with broad sanctioning authority (the current draft allows the institute to impose fines of up to 200 minimum wages and suspend services for up to 90 days for administrative infractions), could open the door to arbitrary enforcement. The solution lies in precision: clearly defining procedural obligations while removing subjective content-based standards from administrative purview.

A more targeted approach would differentiate INACOM’s powers by sector: full regulatory authority over broadcast media that use public spectrum, procedural oversight for digital platforms focused on transparency and due process, and age-classification systems for cinema and public performances. This differentiated framework would align with international best practices while reducing the risk of overreach. This matters especially for marginalized groups. Content moderation disproportionately affects women, Afro-descendant communities, Indigenous peoples, LGBTI+ populations, and journalists. 

INACOM’s institutional design itself warrants scrutiny. With members appointed through political processes and serving renewable two-year terms, the institute’s independence could be compromised by electoral cycles and partisan pressures. Extending terms to five or seven years, diversifying appointment mechanisms, incorporating civil society oversight, and ensuring genuine financial autonomy would strengthen the regulator’s independence and effectiveness.

Recognizing this pushback isn’t a weakness; it’s a sign the debate is alive. It also underscores why clear safeguards and public oversight must remain central.

A shift toward meaningful accountability?

For tech giants accustomed to self-regulation and opacity, the bill could bring us a step closer to a fundamental shift toward platform responsibility. The approach aligns with the kind of rights-respecting frameworks that these companies have spent billions trying to undermine. Rather than operating in a regulatory vacuum where platforms make content decisions without meaningful oversight or user recourse, the legislation would establish transparency requirements and due process protections that hold these companies accountable for their moderation practices.

The pattern is consistent: tech companies frame any regulatory effort as an attack on innovation and free speech, regardless of the actual legislative content. They leverage market power and political connections to crush regulatory efforts before they gain momentum. This coordinated offensive against regulation illustrates techno-feudalism at work, where private companies control digital infrastructure, shape public debate, and dictate policy outcomes. Unlike Europe, Latin America remains one of the last frontiers where Big Tech operates with minimal restrictions.

Regulatory success requires more than good intentions — it demands institutional design that can withstand both corporate resistance and political pressure. The Dominican proposal, with targeted refinements, could provide that blueprint.

Back in February, the U.S. government made its position clear: it would defend its tech companies from what it described as “overseas extortion” and unfair penalties. Rather than engage with international calls for stronger oversight of digital platforms, the administration framed foreign regulations as politically motivated attacks on American innovation. This marked a shift toward a more defensive, deregulatory posture, one that prioritizes protecting corporate interests over participating in the global push for accountability. The Trump administration’s weaponization of tariffs against countries challenging American corporate interests also adds complexity to an already difficult regulatory environment.

It is in this geopolitical climate that the Dominican Republic’s effort takes shape — a moment when platform concentration has reached unprecedented levels. A handful of companies now filter the flow of information for billions of people, with direct consequences for democratic life. Big Tech doesn’t just dominate markets; it bends the policy environment in its favor, often dictating the terms of engagement before laws are even written.

For Latin America, this represents more than regulatory experimentation. It’s a demonstration that the region can lead in rights-based digital governance, despite economic and political costs. The Dominican Republic may be small, but its model could prove influential across a region crafting alternatives to unchecked platform power.

If the law succeeds, its ripple effects could extend far beyond the Caribbean. If corporate and geopolitical pressure proves too intense, it will serve as a cautionary tale about the real costs of challenging Big Tech’s global dominance.

The Dominican Republic is putting forward a vision where transparency, due process, and user empowerment can coexist with innovation and free expression. Realizing this vision requires surgical precision in institutional design: empowering INACOM where regulation is legitimate, while constraining it where free expression must be protected.

That’s a stand worth protecting from those who profit from the status quo. Whether that stand holds will say a lot, not just about one country’s law, but about who gets to write the rules of the digital age.

