If you've started cancelling outings with friends so you can keep chatting with ChatGPT, this study will interest you, because no, you are not alone… And yes, you may have a problem.

Indeed, a new study carried out by OpenAI and the MIT Media Lab, analyzing some 40 million interactions with ChatGPT, reveals a phenomenon that is unexpected to say the least: some users develop a genuine emotional dependence on large language models (LLMs).

We're talking about "power users" who, despite being well aware of how LLMs work, fall into the illusion of a friendship with ChatGPT. To reach this conclusion, the researchers analyzed a massive sample of conversations and even ran a controlled study with 1,000 participants over four weeks, in order to understand what goes on in the heads of heavy users of conversational AI.

First reassuring finding: the vast majority of users do not have an emotional relationship with ChatGPT. Phew, we're not yet in a large-scale Her scenario. But there is still a small group of power users who develop a worrying attachment.

These heavy users show the classic signs of addiction: constant preoccupation, withdrawal symptoms when they lose access, loss of control over their usage, and mood changes tied to use. Sound familiar? Ah yes, exactly the same symptoms as with social networks, video games, or your favorite porn site.

The craziest part is that these users say they are stressed by subtle changes in the model's behavior after updates. No kidding…

The study also reveals quite a few counter-intuitive findings… Contrary to what you might expect, users express more emotion with ChatGPT in text mode than in voice mode, even though the latter seems more "human".

Another paradox: people who use ChatGPT for personal conversations (emotions, memories) develop less emotional dependence than those who use it for impersonal tasks such as brainstorming or practical advice. As if talking to your AI about work made it more endearing than confiding your heartaches to it.

Human psychology is decidedly a deep mystery!

The study identifies a few risk factors that push people from "I use ChatGPT for my job" to "I tell ChatGPT about my day before falling asleep". First of all, people with a strong attachment tendency in their human relationships are more likely to develop a friendship-like relationship with an AI. Likewise, more socially isolated users find in ChatGPT a substitute for social contact that never judges, never interrupts, and is available 24/7.

But the most decisive factor remains duration of use: whatever your psychological profile or type of usage, the more time you spend with ChatGPT, the greater your risk of developing an emotional attachment. It's mathematical, and frankly not very reassuring.

These results therefore raise major ethical questions for AI developers like OpenAI, which seeks to create useful assistants without fostering dependence. But make no mistake, we are only at the very beginning of this adventure. AI will keep improving in social skills and realism, which will only amplify the phenomenon. You only have to see how attached some people are to companion chatbot apps to understand that we have not yet reached the peak of emotional dependence on AI.

Personally, I'm quite a power user of LLMs, but not to the point of feeling "friends" with these tools. I often find myself brainstorming by voice with ChatGPT for anywhere from tens of minutes to several hours to explore certain topics. And I say "hello", "thank you", "please", and above all I always ask it for help, including with personal concerns. And I believe that the act of asking for help and receiving it can make you perceive the AI as a friend.

In any case, that's how I interpret it. Then again, I've been totally hooked on this technology since it came out. Like a drug… And even if, objectively, I could do without it, going back would take an effort I'm not willing to make.

However, the real question is perhaps not so much our dependence on AI as what it reveals about our human interactions. If thousands of people prefer talking to a machine rather than to their fellow humans, maybe it's because our current social relationships are crap. Or maybe we're just lazy and prefer a relationship with no conflict and no effort.

And you, do you recognize signs of excessive attachment in your own use of LLMs? Have you already felt frustration when the service was unavailable? Caught yourself wanting to share good news with an AI before your loved ones?

If so, then welcome to the club. We don't have badges or an official motto yet, but we could always ask ChatGPT to come up with some for us!
