A couple's new experiment with ChatGPT : NPR

One recent evening, my new boyfriend and I found ourselves at a birthday party.
I accused him of giving in to his anxious thoughts.
“It's hard to get out of my head,” said David. “The mental spiral is sometimes part of the nature of being sensitive; there's an emotional overflow that comes with it.”
“Well, spiraling is bad,” I shot back, unleashed.
Our different communication styles fueled the tense exchange. While I tend to be practical and direct, he is contemplative and conceptual.
I felt we could benefit from a mediator. So I turned to my new relationship consultant, ChatGPT.
AI enters the chat
Almost half of Gen Z uses artificial intelligence for dating advice, more than any other generation, according to a recent national survey by Match Group, which owns the Tinder and Hinge dating apps. Anecdotally, I know women who have consulted AI chatbots about both casual and serious relationships. They gush about their crushes, upload screenshots of long text threads for dissection, assess long-term compatibility, resolve disagreements and even work through breakups.
Kat, one of my friends who uses ChatGPT to vet dating prospects, told me she has found it fairly objective. When emotions might otherwise get in the way, the chatbot has helped her hold to her standards.
“I feel like it gives better advice than my friends most of the time. And better advice than my therapist,” said Kat, who asked to go by her first name only for fear that her use of AI could compromise future romantic prospects. “With friends, we're all running around with our heads cut off when it comes to emotional situations.”
At a time when apps have already upended the old ways of finding connection and intimacy, it seems ironic to add another layer of technology to dating. But could Kat be onto something? Perhaps a seemingly neutral AI is a smart tool for solving relationship problems, free of human baggage.
For journalistic purposes, I decided to dive into the trend.
Let's see what ChatGPT has to say about this …
Working from the theory that couples should seek counseling before major problems arise, I proposed to my boyfriend of less than six months that we turn to an AI chatbot for advice, evaluate the bot's feedback and share the results. David, an artist who is always up for a good experimental project (no last name for him either!), was game.
Our first foray into ChatGPT-mediated couples counseling began with a question suggested by the bot to spark a discussion about the health of our relationship. Did David have resources to help him manage his stress and anxiety? He did: he was in therapy, exercised, and had supportive friends and family. That reference to his anxiety then sent him off on a tangent.
He mused about being a “sensitive artist type.” He ventured that women, who might like the idea in theory, don't really want to deal with emotionally sensitive male partners.
“I'm supposed to be unflappable but also emotionally vulnerable,” said David.
He was opening up. But I accused him of spiraling, projecting assumptions and monologuing.
While he chewed on big ideas, I tried to steer the conversation back to our interpersonal friction. That's where ChatGPT came in: I recorded our conversation and uploaded the transcript to the bot. Then I asked a question. (Our chats have been heavily edited for brevity; it talks a lot.)
David was incredulous. “That sounds like a cliché,” he said.
A detour, I thought. I went back to ChatGPT and read:
It was a damning summary. Was I, as ChatGPT suggested, shouldering a professional-burnout level of emotional labor at this early stage of the relationship?
Pressing for objectivity
A human brought me back to reality.
“It might be true that you were doing more emotional labor [in that moment] or at the individual level. But there is a huge bias,” said Myra Cheng, an AI researcher and computer scientist at Stanford University.
The material on which large language models (LLMs), like ChatGPT, Claude and Gemini, are trained – mostly the internet – carries a “huge American and white and male bias,” she said.
And that means all the attendant cultural tropes and biases are present, including the stereotype that women perform a disproportionate share of emotional labor at work and in relationships.
Cheng was part of a research team that compared two datasets of personal advice: one written by humans responding to real-world situations, and a second made up of judgments produced by LLMs in response to posts on Reddit's “Am I the A**hole?” advice forum.
The study found that LLMs consistently display higher levels of sycophancy – excessive agreement with, or flattery of, the user – than humans do.
For soft-skill matters such as advice-giving, sycophancy in AI chatbots can be particularly dangerous, Cheng said, because there is no guarantee the advice is sound. In one recent case revealing the dangers of a sycophantic bot, a man experiencing manic episodes said ChatGPT's affirmations had kept him from seeking help.
So, seeking something closer to objectivity from the biased bot, I changed my prompt.
There it was again: I was stuck doing the emotional labor. I accused ChatGPT of still lacking balance.
“Why do you get ‘clear communication’?” David asked me, as if I had chosen those words myself.
At this point, I asked Faith Drew, a licensed marriage and family therapist based in Arizona who has written on the subject, for advice on how to bring ChatGPT into my relationship.
This is a classic case of triangulation, according to Drew. Triangulation is a coping strategy in relationships in which a third party – a friend, a parent or an AI, for example – is brought in to ease tension between two people.
There is value in triangulation, whether the source is a bot or a friend. “AI can be helpful because it synthesizes information very quickly,” said Drew.
But triangulation can go wrong when you lose sight of your partner in the equation.
“A person checks out and tries to get answers on their own: ‘I'm just going to talk to the AI,’” she said.
The bot may not even be able to hold me accountable if I don't feed it all the necessary details, she said. In that case, triangulation is valuable, she said, only “if we ask the bot the right questions, like: ‘What is my role in the conflict?’”
The breakthrough
Seeking neutrality and accountability, I recalibrated my chatbot once more. “Use language that doesn't assign blame,” I instructed. Then I sent it the following text from David:
I feel like I'm being accused of not listening before I even get a chance to listen. I am making myself available and open and vulnerable to you.
“What is missing on my side?” I asked ChatGPT.
After plenty of flattery, it finally replied:
I found its response simple and revealing. What's more, it was accurate.
He had been picking up a lot of slack in the relationship lately. He made me dinner when work kept me late and set aside his own projects to indulge me in long, AI-prompted conversations.
I thought about a point Drew made about the importance of putting in the work in our relationships, especially in uncomfortable moments, instead of relying on AI.
“Being able to sit in distress with your partner, that's real,” she said. “It's OK not to have the answers. It's OK to be empathetic and not know how to fix things. And I think that's where relationships are really special, where AI could never be a replacement.”
Here's my takeaway. ChatGPT had only a narrow view of our relationship and its dynamics. Relationships are fluid, and the chatbot can capture only a snapshot. I called on AI in moments of tension, and I could see how that reflex might feed our discord rather than help repair it. ChatGPT could be hasty to pick sides and often decided too quickly that something was a pattern.
Humans don't think or behave in entirely predictable patterns. And chemistry is an important factor in compatibility. If an AI chatbot cannot sense the chemistry between people – feel it, recognize that magical thing that happens in three-dimensional space between two imperfect people – it's hard to trust the machine with something as important as relationships.
A few times, we both felt that ChatGPT gave objective and creative feedback, offered a valid analysis of our communication styles and defused certain disagreements.
But it took a lot of work to make it worthwhile. In the end, I would rather invest that time and energy – what ChatGPT might call my emotional labor – in my human relationships.