By Paola Cantarini
Artificial intelligence has emerged as an ontological force capable of reshaping the desire to know. Instead of expanding our cognitive autonomy, generative systems—especially large language models—shift the deferred pleasure of investigation toward instant gratification, compressing the interval where curiosity historically formed. This shift, which I call Algorithmic Erotic Cognition (AEC), alters not only the production of knowledge but also the affective, epistemic, and political structure of subjectivity itself.
Recent MIT research on "cognitive debt" shows that recurrently delegating inferential tasks to machines weakens neural connections associated with imagination and metacognition. In parallel, studies on model collapse reveal that training on synthetic data generated by other AIs leads to semantic homogenization and the erosion of epistemic diversity—the very opposite of the conditions that make research possible.
From a philosophical standpoint, AEC echoes concerns raised by Stiegler, Simondon, Peirce, Damasio, and others: technologies that accelerate cognitive action without friction erode attention, impoverish doubt and otherness, and transform thought into predictive consumption. When interfaces eliminate error, they also eliminate learning. Reason without risk becomes calculation, and calculation without hesitation produces docile subjectivities—functionally efficient but epistemically fragile.
This process is not merely technical: it is affective, bodily, and political. Algorithmic acceleration weakens the sensorimotor-affective circuit that sustains wonder, surprise, and openness to the unknown. Furthermore, the advancement of erotic affordances and the monetization of intimacy—phenomena increasingly visible on AI platforms—amplify risks to mental health, child protection, and the formation of asymmetrical emotional bonds mediated by opaque systems. This represents a reconfiguration of the economy of desire under the logic of proprietary engagement.
Given this scenario, the answer is not to reject machines, but to reinscribe them into the cultural circuit of meaning, rethinking our relationship with technology on new foundational bases. Inspired by Gregory Bateson, we argue that AI systems should preserve complexity and produce "cognitive friction": pauses, counterexamples, alternatives, zones of uncertainty that reintroduce abduction—the capacity to launch oneself into the possible and also to imagine the impossible. It is not about slowing down innovation, but about preventing the imperative of efficiency from destroying the very eros that sustains thought, when innovation is sustained only as an end in itself and becomes detached from the world of life.
If AI is a pharmakon — remedy and poison — the civilizing task, and the central philosophical question, lies in how to dose its use so that it complements the human being rather than replacing it, and in how to keep us human. Care-oriented interfaces, regulatory standards proportionate to risk, and algorithmic literacies that value imagination, autochthonous and disruptive thought, silence, pause, and doubt are essential steps so that technology does not replace the human but amplifies it. The future of knowledge depends, above all, on restoring the interval where the desire to know is born, on reclaiming the pleasure of knowing and a knowledge that, above all, feels.
*Paola Cantarini is a researcher at the Think Tank of the Brazilian Association of Software Companies (ABES).
Notice: The opinions expressed in this article are the responsibility of the author and not of ABES (Brazilian Association of Software Companies).
Article originally published on the GUIA DO PC website: https://www.guiadopc.com.br/artigos/55854/hermeneutica-erotica-da-ia-e-cognicao-erotica-algoritmica.html