Tech giants want us to fall in love with artificial intelligence
Joanna Stern, a technology journalist at the Wall Street Journal, smiles and asks Rick Osterloh, Google's Senior Vice President of Platforms and Devices, a pointed question: "It seems that you in Silicon Valley have this obsession with recreating the movie Her. But it doesn't end well for humans, I don't know if you remember that." Osterloh's answer is a little embarrassed: "Well, that's certainly not our goal. We want artificial intelligence to be as useful to humans as possible."
What’s new from Google, OpenAI and Apple
The context of this interview is the presentation of the new Google Pixel 9, Google's flagship smartphone, a device that, needless to say, ships with a series of AI functions built in. One of the central pieces of the presentation was Gemini Live, the evolution of Google's artificial intelligence.
Gemini Live is a voice assistant capable of conversing naturally with the user. Its voice is more believable, able to modulate tone and intonation to get as close as possible to human speech.
It is a win for Google: the first major company to actually ship this evolution of AI as an assistant. OpenAI is working on it too; in May it presented GPT-4o, a new model capable of speaking credibly with users (with a voice resembling Scarlett Johansson's, but that's another story). And Apple should launch something similar in September: with iOS 18 and Apple Intelligence, it plans a significant evolution of the much-maligned Siri.
In short, this is the direction Silicon Valley seems to have taken: the near future of artificial intelligence is voice. An assistant that does something useful and is also pleasant to talk to.
What are the risks?
And here we come back to the beginning of this article, and to the film Her, which tells the love story between a man and his voice assistant. The main risk is precisely the one described by director Spike Jonze: that people become too attached to artificial intelligence.
OpenAI itself says so in a recent safety report, admitting that this type of voice could lead some users to develop an emotional attachment to the chatbot. The warning appears in the GPT-4o System Card, a technical document describing the risks the company identified for the model, along with information on the safety tests it ran and the measures taken to minimize those risks. "It's our last day together," one of the participants in the AI safety testing reportedly told GPT-4o.
According to the Massachusetts Institute of Technology, "we are faced with a huge experiment underway in reality, without knowing exactly what impact these AI chatbots will have on us as individuals or on society." After all, as credibility increases, so does the risk of the so-called ELIZA effect: the tendency of people to attribute emotional meaning or intelligence to answers produced by computer programs. And so we develop bonds: artificial intelligence, after all, is always there and seems to understand our desires and needs. Or at least it appears to, all the more so when it is equipped with a human, living voice.
The goal: to build a relationship with the object
The phenomenon of AI companions already exists. There are dedicated apps, such as Replika and Character.AI, whose explicit goal is to provide chatbots, personalized or not, to combat loneliness and keep people company. And they work, especially in the United States: at the time of writing, Character.AI is the fourth most downloaded entertainment app in the US. It is this model, that of the relationship, that Silicon Valley seems to be looking for.
Because think about it: what happens when we get attached to someone or something? We keep coming back, we seek contact. If it is an object, we keep using it: it becomes, one way or another, part of our days. It may be a coincidence, but that is precisely the problem of generative artificial intelligence today, of ChatGPT, Gemini or Claude: everyone has tried it, but not so many use it every day or find an actual use for it.
Is this the – very risky – key?