Especially among young people, there is a growing tendency to use chatbots such as ChatGPT for emotional and psychological support. Many users treat artificial intelligence as a space for listening, comfort and dialogue, sometimes preferring it to a conversation with human beings, even professionals, because it does not put forward uncomfortable points of view, has no personal memory and is not a living presence. This phenomenon raises deep questions about how forms of care and the role of experts are changing, and about the need for relationships in contemporary life. What does it mean today to seek support from an algorithm? And what does this tell us about the way we build our “self” and manage suffering?
ChatGPT as psychotherapist: how the helping relationship is changing
The traditional therapeutic relationship is based on key conditions: trust, empathy, mutual recognition, relational space. According to Carl Rogers, it is the relationship itself that produces transformative effects. But what happens when the human is replaced by an algorithm? ChatGPT offers instant, non-confrontational, always-available interaction: the reassuring effect is real, but the risk is that of neutralizing the emotional dimension of listening.
The sociologist Anthony Giddens spoke of trust in “abstract systems” to describe the confidence we place in systems we do not know directly, such as banks, hospitals or airplanes. Today the same can be said of artificial intelligence: we trust something that we do not fully understand, and that does not really know us, but that seems reliable because it works and replies. Eva Illouz describes psychology as a widespread cultural language: the use of ChatGPT is consistent with a society that has internalized therapy as a tool for living better. This dynamic fits into the wider “culture of self-improvement”, in which every emotional difficulty is read as an opportunity to work on oneself, with increasingly accessible, standardized and technological tools. In this sense, AI is not a “threat” but simply the extension of a therapeutic culture that demands quick, accessible, customized answers.
Solitary subjects in search of listening
The fact that many people turn to a chatbot in order to be listened to reveals another aspect as well: a widespread, almost normalized loneliness. In a cultural context marked by neoliberal individualism, even care has become a personal task: one looks after oneself, perhaps by dialoguing with an app. The sociologist Ulrich Beck spoke of “individualization” to describe a society in which individuals face their crises alone. ChatGPT fits perfectly into this scenario as a self-help tool mediated by technology. Nikolas Rose calls all this “psychopolitics”: today, the way we think about ourselves, evaluate ourselves and act is increasingly shaped by the language of psychology; we learn to measure ourselves by our emotional state, our stability and our productivity, as if they were parameters to be kept under control. In this context, ChatGPT becomes a digital mirror: it reassures us and makes us feel listened to, but it also pushes us to manage and “monitor” our emotions.
Many people today find themselves alone in taking care of their mental health: human support is often lacking, avoided, or still subject to stigma. So alternatives are sought in the digital sphere, where chatbots offer simple, reassuring, always-available answers. But this also shows how deeply we have internalized the idea that we have to manage on our own, keeping ourselves efficient and emotionally self-sufficient. As Byung-Chul Han writes, the need to speak is not only a desire for exchange: it is a way of confirming that we exist, that we feel, that we are “functioning”.
Why we turn to an algorithm to talk about our fragility
ChatGPT does not offer us uncomfortable points of view, it has no personal memory, it is not a living presence, and for this reason it is perceived as “safer”. According to Sherry Turkle, we are getting used to preferring relationships “without the complexity of the other human”: social robots, chatbots and artificial empathic systems are used despite the awareness that they are not really conscious. Judith Butler explained that to constitute ourselves as subjects it is not enough to speak: we have to be recognized. With artificial intelligence, however, recognition is automatic and symmetrical: every emotion is valid, every sentence taken seriously. What results is a kind of frictionless recognition: the AI does not challenge us or call us into question, but reorganizes what we say, giving it back in an orderly form, and this produces relief. Luciano Floridi described our society as an “infosphere”, in which humans and machines co-produce themselves, living in a shared information ecosystem. This is not just a matter of using ChatGPT: we are changing our way of thinking about relationships, care and subjectivity, hybridizing ourselves with technology.
Confiding in an algorithm is not a passing fashion; it is the symptom of a profound transformation: care has become instantaneous, private, mediated by technology. The real question is not whether this is right or wrong, but what it tells us about ourselves, our loneliness, our expectations, and the way we want to be listened to. The answer, perhaps, lies not in the algorithms but in the society that made them so desirable.
Sources
Beck, U. (1992). Risk Society: Towards a New Modernity. Sage.
Butler, J. (1997). The Psychic Life of Power: Theories in Subjection. Stanford University Press.
Floridi, L. (2014). The Fourth Revolution: How the Infosphere Is Reshaping Human Reality. Oxford University Press.
Giddens, A. (1990). The Consequences of Modernity. Stanford University Press.
Han, B.-C. (2010). The Transparency Society. Nottetempo.
Illouz, E. (2007). Cold Intimacies: The Making of Emotional Capitalism. Polity Press.
Rogers, C. (1961). On Becoming a Person: A Therapist’s View of Psychotherapy. Houghton Mifflin.
Rose, N. (2007). The Politics of Life Itself: Biomedicine, Power, and Subjectivity in the Twenty-First Century. Princeton University Press.
Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books.