By Daniella Caverni
I, the author of this article, belong to a generation for which the pinnacle of technology was the school mimeograph machine: the one that handed us freshly printed tests with the unmistakable smell of alcohol and fresh ink. We lived in a time when news arrived in the morning paper, and photos were taken on film with a maximum of 36 exposures, after which we waited and prayed for days for the development to turn out well. And development really was the word: the film could be ruined, or we could all look awful, and nobody cared, because it was just a piece of paper. Conversations happened face to face, with pauses, silences, and glances. I am from the generation that played in the apartment building, ran in the street, and only went home when mom or dad shouted from the window that it was time.
I also belong to the generation that saw technology be born, grow, and win us over. It is impossible not to fall in love with it because, besides making our lives easier, it brings people closer and, yes, expands our horizons. It democratized access to knowledge, broke down geographical barriers, and allowed us to work, study, and create on unimaginable scales. It was, and continues to be, an instrument of progress, inclusion, and social transformation. The passion for this advancement is natural: there is beauty in human ingenuity.
Today, it is no longer a matter of imagining the impact of technology on my generation or on all those that follow us; it is a reality. Children and teenagers grow up in front of screens that educate, entertain, and influence them. The digital universe is where many build or strengthen friendships, seek references, and express emotions. In this scenario, technology is not just a tool: it is a living environment. And how do we deal with this in a healthy way? Responsive use presupposes awareness, presence, and critical thinking.
Responsive use and human readiness for technology
Responsive use presupposes awareness, presence, and critical thinking: three elements that, interestingly, are also at the heart of what the corporate world has been calling "People AI Ready." The concept conveys the idea that, for technology to reach its positive potential, it is necessary to prepare people, not just systems. Being "AI-ready" is not about mastering algorithms, but about understanding their limitations, impacts, and responsibilities. In other words, it is about training individuals capable of working with technology ethically, effectively, and autonomously.
A digitally prepared and literate human being is as important as the technology itself because it shifts the focus of the debate: the question is not only what technology can do, but how, when, and by whom it is used. The real challenge lies not in the machine and the system that integrates it, but in the human maturity to incorporate it into daily life with discernment and responsibility. Those who use technology need to understand what it does, how it does it, and why it does it. This critical awareness is what differentiates creative and responsible use from destructive use.
A society, or even an organization, can invest millions in software, applications, and/or artificial intelligence systems, but if the people who use them are not prepared to interpret results, question automated decisions, or recognize ethical dilemmas, the result will be inefficient and sometimes dangerous. Technology, detached from human values, tends to reproduce biases, reinforce inequalities, and amplify vulnerabilities.
It is at this point that the concept of People AI Ready relates directly to the General Data Protection Law, or LGPD (Law 13.709/2018), and to the Digital Statute of Children and Adolescents, or ECA Digital (Law 15.211/2025). These regulations establish principles, limits, and responsibilities essential to data protection and safe digital coexistence. However, no regulation, no matter how comprehensive, is sufficient if it is not accompanied by human awareness of its purposes.
Regulation defines the external boundaries of behavior; human readiness defines the internal content of the choices within them. The former protects through norms; the latter, through conscience. Together, they form the necessary balance between legal obedience and ethical maturity, between the duty to comply and the capacity to understand. One without the other is incomplete, because laws can order conduct, but only education and reflection can form values and, consequently, make laws truly effective.
In this scenario, regulation gains strength when combined with education, applied ethics, and the development of socio-emotional skills. Preparing people for the use of technology means educating them not only in technical mastery, but also in understanding the impacts of their choices, and in recognizing the risks, limitations, and vulnerabilities that permeate digital life.
In a social and educational context, responsive use encourages literacy not only in the technical use of systems, but also in their emotional and ethical impact. It is not enough to know the tools' features; it is necessary to understand how they shape perceptions, behaviors, and relationships. Human readiness, therefore, is not technical—it is moral, cultural, and relational. It presupposes empathy, critical thinking, and self-regulation. These skills are essential in an increasingly persuasive, automated, and emotionally engaging digital environment.
Digital responsibility and the technology lifecycle from the perspective of the General Data Protection Law (LGPD)
Digital responsibility begins before the technology is born, in the initial idea. Each system, application, or algorithm carries with it not only functions but also intentions, that is, ultimately, human decisions. The General Data Protection Law (Law No. 13.709/2018) emerges precisely to remind us that innovation without responsibility is a form of modern negligence. It transforms ethical principles into legal duties, requiring that privacy protection be planned, implemented, and maintained as an integral part of the technology lifecycle.
By stipulating (Article 46) that data processing agents must adopt security measures from the product or service design phase to its execution, the legislation introduced a paradigm shift: privacy ceases to be a final stage of the process and becomes an integral part of the technological architecture. The legal duty also becomes a design obligation, an invitation for ethics and data protection to be considered as essential, not optional, functionalities.
The real challenge is transforming compliance with the LGPD (Brazilian General Data Protection Law) into a principle of innovation. When security and privacy are incorporated as design values, the result is not a limited product, but a more reliable and sustainable one. Legal responsibility becomes a competitive advantage: systems that respect privacy inspire trust, and trust is the new most valuable asset in the digital economy.
The LGPD (Brazilian General Data Protection Law) is not just about data: it's about people and fundamental rights. Its central objective is to guarantee that the processing of personal information is done with transparency, legitimate purpose, and respect for informational self-determination (Article 2). These principles directly relate to the concept of responsive use, as they require active awareness from the agent who collects, processes, or uses data. Being "AI-ready" from the perspective of the LGPD means understanding that legal compliance depends both on the technical architecture of the systems and on the ethical culture of the people who operate them; therefore, periodic training is essential.
Human readiness is the link between formal compliance with the rule and the effectiveness of its values: data protection only exists when there is also awareness of what it means to protect and why to protect.
And this is where data governance comes in: the set of structures, processes, practices, and values that guide how an organization makes decisions, assigns responsibilities, controls risks, and remains accountable to stakeholders. These instruments translate principles and values into practices, and policies, when properly communicated and understood, into behaviors. Governance establishes roles, flows, and control mechanisms that ensure the data lifecycle is guided by ethical, transparent, and compliant criteria. It therefore does more than comply with yet another law and avoid sanctions: it demonstrates commitment and responsibility towards those whose data underpin the business activity itself.
Similarly, the data subject plays an active role in this ecosystem. Exercising their rights is not merely a bureaucratic practice; it depends on collective maturity and a real understanding of the value of privacy. Digital education is, therefore, a pillar of social governance: the more informed the data subject, the more balanced the relationship between those who process personal data and those to whom it belongs. A data protection culture is only consolidated when governance and awareness go hand in hand, forming a virtuous cycle of shared responsibility.
Technology can be audited, but only humans can truly be educated. The LGPD (Brazilian General Data Protection Law), by requiring principles such as accountability and transparency (article 6, X), recognizes that true digital security stems from human behavior. Responsive development and use is, therefore, the practical translation of these principles: an invitation to shared responsibility between norms, technology, and conscience. Ethical and responsible development is what materializes the ideal of the LGPD: the balance between innovation and human dignity. And this responsibility is fully shared among technology developers, users, and the holders of personal data.
The Digital Statute for Children and Adolescents and the new paradigms of protection
ECA Digital is the term used to describe Law No. 15.211/2025, which amends the Statute of Children and Adolescents (ECA) to establish new rules for the protection of children and adolescents, now in the digital environment.
The law, enacted in September 2025, adapts the ECA (Brazilian Statute for Children and Adolescents), which dates from 1990, to the contemporary challenges and risks of the internet, guaranteeing the full protection of the rights of children and young people in the online environment as well. This legislation brings many challenges and will demand a series of adaptations, adjustments, and learning from both users and platforms.
One of the main objectives of this legislation is to extend the doctrine of integral protection, a principle of the original ECA (Brazilian Statute for Children and Adolescents), to the virtual environment, recognizing, once again, children and adolescents as subjects of rights who need specific guarantees, but now in the digital environment.
The Digital ECA applies to any information technology product or service that is directed at, or likely to be accessed by, children or adolescents in Brazil, regardless of where it is located, manufactured, offered, or operated. Furthermore, the legislation defines "likely access" based on criteria such as attractiveness to minors, ease of access, and risk to privacy, security, or biopsychosocial development.
The best interests of children and adolescents and their comprehensive protection remain the guiding principles of the legislation. This means that protection ceases to be a legal addendum and becomes a technological requirement, imposing on companies the responsibility of structuring their systems to prevent violations before they occur. The Digital ECA (Brazilian Statute for Children and Adolescents) advances by understanding that protecting children and adolescents in the virtual environment requires the implementation of technical solutions incorporated from the conception of digital products and services.
Among the main technical measures planned are:
- Reliable age verification
The law requires digital service providers to implement secure age verification methods, prohibiting mere self-declaration. Age verification therefore becomes a technical and legal requirement for the operation of platforms accessible to minors—including social networks, online games, apps, and websites with potential appeal to children. The logic is simple: without correctly identifying the audience, there is no way to guarantee effective protection.
- Parental supervision and usage control
The Brazilian Digital Statute for Children and Adolescents (ECA Digital) mandates that services aimed at children and adolescents must offer accessible parental supervision tools, enabling parents and guardians to configure access levels, restrict contact, limit purchases, monitor usage time, and receive risk alerts. These mechanisms must be clear, easy to use, and free, avoiding transferring costs to families. More than just control, the goal is to strengthen dialogue between parents and children, so that the digital environment is shared consciously and with presence.
- Secure design and privacy architecture
Within the same concept brought by the LGPD (Brazilian General Data Protection Law), the law reinforces the principles of "privacy by default" and "privacy by design." This mandates that systems aimed at minors collect only the strictly necessary data, for legitimate purposes, storing it securely and for a limited time. Furthermore, behavioral profiling and the use of children's data for targeted advertising, emotional monetization, or consumption prediction are prohibited.
From a technical standpoint, this means reviewing recommendation algorithms, engagement metrics, and business models based on massive data collection.
- Transparency and algorithmic explainability
The Digital ECA establishes that platforms must inform, in an understandable way, how their algorithms work. This includes explaining content recommendation criteria, information filtering and prioritization, as well as offering channels for contesting and human review of automated decisions. This requirement for "algorithmic explainability" brings the Digital ECA closer to the LGPD (article 20) and the emerging debate on the right to understand the logic of automated decisions, strengthening the pedagogical dimension of the regulation.
- Blocking harmful practices and simulated gambling
Another relevant technical innovation is the prohibition of random reward mechanisms (so-called loot boxes) in electronic games accessible to minors, as well as any form of gamification based on behavioral addiction. These practices, although presented as entertainment, operate on principles of intermittent reinforcement similar to those of gambling, inducing impulsivity and compulsive spending. With this prohibition, the legislator recognizes that the architecture of the digital product also communicates values—and that designing safe environments is as important as punishing illegal conduct.
The Digital ECA (Statute of Children and Adolescents) is predominantly technical in nature. The measures it imposes are instruments that translate, into technological language, the duty of comprehensive protection already enshrined in the Statute of Children and Adolescents. Understanding the ECA, in force since 1990, is essential to understanding the essence of the Digital ECA, as this new legislation does not establish a new doctrine, but rather expands and updates the existing one, projecting comprehensive protection into the digital environment and the contemporary challenges of connected childhood.
The technical dimension, therefore, is the means; the purpose should be seen as educational and preventative. Protecting children and adolescents in the digital environment means not only limiting risks, but also recognizing that, in a world of multiple accesses and constant connections, the dangers become as sophisticated as the technologies themselves.
Conclusion
I began this article by recalling how I experienced, and still experience, technological evolution. It's beautiful to see the world transform and, with it, to realize how each of us is impacted and challenged to learn and grow every day. For those who, like me, saw the mimeograph give way to touchscreens, this process is fascinating and, at the same time, demanding: adapting is a constant exercise in curiosity and humility. It is, without a doubt, more challenging for my generation than for younger people, who are born immersed in this digital universe.
Legislation, in turn, is a direct reflection of this progress—attempting to keep pace, through legal language, with the speed of technological and social changes shaping our time. All of us—adults, children, and adolescents—need to learn as soon as possible that every click is a choice and every share is an act of self-exposure. Developing this awareness prepares citizens for ethical coexistence online, capable of balancing freedom, privacy, and care, expression and respect. Technology can open infinite windows of knowledge, but it is education that teaches us to look through these windows with discernment and care for others.
Digital education is both the starting and ending point for protection in the virtual environment. More than teaching how to use tools, it teaches how to understand the world mediated by them, how to recognize risks, respect limits and duties, and exercise rights responsibly.
Times have changed, and with them, the ways we learn, coexist, and protect ourselves have also changed. If the challenge before was waiting for photographic film to be developed, today it is dealing with the speed of content that spreads in seconds. Society's needs have transformed and continue to transform every day, at an ever faster and once unimaginable pace. Just between us: how wonderful to be able to witness so much development.
Daniella Caverni, Lawyer, partner at EFCAN Advogados and Leader of the Data Protection Working Group of the Brazilian Association of Software Companies (ABES).
Notice: The opinion presented in this article is the responsibility of its author and not of ABES (Brazilian Association of Software Companies).
Article originally published on the It Portal website: https://itportal.com.br/desafios-eticos-e-juridicos-para-o-uso-responsivo-das-tecnologias/