In June 2024, the then Speaker of the Chamber of Deputies, Arthur Lira (Progressive Party – PP), shelved Bill 2630/2020, known as the “Fake News Law,” claiming the bill had been “contaminated” by the narrative that it would introduce censorship into Brazilian law. In other words, it had been tainted by misinformation spread by the far right, the evangelical caucus, and Big Tech lobbyists. In its place, Lira created a working group to address platform regulation, a tactic often used when there is no real intention of solving a problem. Nearly a year later, this group has held zero meetings, heard zero guests, and submitted zero motions.
Instead of the high-level debate Lira had promised, an alternative proposal was put forward by federal deputies Silas Câmara (Republicans), then leader of the evangelical caucus, and Dani Cunha (Union Brazil), daughter of the impeached and convicted former Speaker Eduardo Cunha.
This new proposal, now Bill 4691/2024, consists of 11 pages and 22 articles—compared to the 48 pages and 60 articles of the original Bill 2630—with only 5% overlap between the two. As such, much of the public deliberation that informed Congressman Orlando Silva’s report on the original bill was discarded by the authors of Bill 4691.
From the original text, the new bill retains essentially Articles 7 and 8, which require platforms to “identify, analyze, and diligently assess systemic risks” and to adopt “reasonable, proportional, and effective” mitigation measures. In practice, to meet this objective, algorithm-based services would need to:
- Adapt the design, characteristics, or operation of services, including systems and interfaces;
- Modify terms of use and the criteria and methods for applying them;
- Adjust content moderation processes, including the speed and quality of handling reports, and when necessary, remove content;
- Test and fine-tune algorithmic systems, including those for prioritization, recommendations, and online advertising;
- Strengthen internal processes, resources, testing, documentation, or oversight;
- Adapt interfaces to provide more information to users;
- Take specific measures to protect the rights of children and adolescents.
These provisions were already present in Bill 2630. However, in the original version, they were accompanied by obligations—under the concept of “duty of care”—that required platforms to proactively moderate content in cases involving the following infractions:
- Crimes against the Democratic Rule of Law;
- Acts of terrorism and preparations for terrorism;
- Inducement, encouragement, or assistance in suicide or self-harm;
- Crimes against children and adolescents, or incitement to commit such crimes, or glorification of criminal acts or perpetrators;
- Racism;
- Violence against women;
- Public health violations, such as failing to execute, obstructing, or resisting health measures during a declared Public Health Emergency of National Importance.
Ineffective law
By removing the duty of care, Bill 4691 becomes toothless, stripped of effective mechanisms to curb the spread of disinformation and other harmful online behaviors. The moderation obligations in the new proposal, such as those in Article 10, largely mirror practices that platforms have already followed for years, as they are essential to their business operations. Furthermore, Bill 4691 omits Chapter III of Bill 2630, which required platforms to notify users when their content was restricted or removed and to adopt “due process” for moderation decisions, including protocols for appeals.
It is ironic that these provisions have been dropped, given that the censorship feared by the so-called defenders of absolute free speech already exists—but it is exercised solely by the platforms themselves. Over the years, many cases have emerged of posts being removed on platforms like Instagram or YouTube either by mistake or with no explanation. Bill 2630 would have expanded safeguards for users’ rights.
The new proposal also lacks other important advancements from Bill 2630, such as transparency in recommendation algorithms; access to platform data for researchers; external audits; recognition of top government officials’ profiles as public information; creation of an electoral crime of disinformation, among others.
At the same time, Bill 4691 revives identification provisions for internet users that security agencies and banking lobbyists have sought to implement for two decades, dating back to when former Senator Eduardo Azeredo pushed for user registration laws. It also imposes a contribution of 5% of platform revenue to the Universalization Fund for Telecommunications Services (FUST). Although this might seem positive at first glance, it ultimately benefits the platforms: by financing expanded connectivity, the fund enlarges the user base for their services.
Moreover, while Bill 2630 designated the Brazilian Internet Steering Committee (CGI.br) as the regulatory body, Bill 4691 assigns that role to Anatel (the National Telecommunications Agency) and the National Data Protection Authority (ANPD). Notably, in April 2025, the president of Anatel publicly offered the agency to serve as the internet regulator during a Congressional Communications Council hearing, where he also advocated for the passage of Bill 4691. That same month, Bill 4557/2024—authored by Silas Câmara—began moving through Congress, proposing to place CGI.br under Anatel’s control. This prompted CGI.br to issue a public statement rejecting the proposal.
Such a change would be disastrous. While CGI.br has a history of critical engagement and defending citizens’ digital rights, Anatel—like other regulatory agencies in Brazil—has been criticized for poor oversight and instances of regulatory capture.
Looking at the broader picture—the Big Tech lobbying among far-right and evangelical lawmakers to kill Bill 2630; the introduction of Bill 4691 by Bolsonaro-aligned deputies; the attempt to subordinate CGI.br to Anatel; and Anatel’s self-appointment as Brazil’s internet regulator—it is evident that the public interest is in check in the ongoing debate over digital platform accountability.
Whether this scenario will end in checkmate, delivered by lawmakers compromised by lobbyists and with a vested interest in maintaining informational chaos, is a question only civil society can answer.
*Machine translation proofread by Janaína da Silva.