Sex & Relationships

Couples are using AI to fight — and win — arguments: ‘ChatGPT says you’re insecure’

ChatGPT can be used for virtually anything, and in the world of romance, people have used the artificial intelligence tool to plan their weddings, write their wedding vows, score matches on Tinder and find online companionship — even if they’re happily married.

Now couples are going further than using AI to nourish their relationships — they’re using ChatGPT to help them win arguments.

One person took to Reddit to share that their girlfriend uses the platform “every time we have a disagreement.”


“AITAH [Am I The A–hole] for saying she needs to stop?” the person asked in the AITAH subreddit.

The 25-year-old man explained that he and his 28-year-old girlfriend have had a couple of big arguments as well as a few smaller disagreements in their eight months of dating. And every time they have a disagreement, the girlfriend “will go away and discuss the argument” with ChatGPT — even sometimes while still in the same room.

He continued to say that when she does this, “she’ll then come back with a well-constructed argument breaking down everything I said or did during our argument.”

“I’ve explained to her that I don’t like her doing so as it can feel like I’m being ambushed with thoughts and opinions from a robot. It’s nearly impossible for a human being to remember every small detail and break it down bit by bit but AI has no issue doing so,” the user wrote.


He said that whenever he expresses his feelings about the situation, he’s told by his girlfriend that “ChatGPT says you’re insecure” or “ChatGPT says you don’t have the emotional bandwidth to understand what I’m saying.”

“My big issue is it’s her formulating the prompts so if she explains that I’m in the wrong, it’s going to agree without me having a chance to explain things,” he wrote.

Many people in the comments agreed with the user, noting that ChatGPT is “biased to the user’s input.”


“It’s literally programmed to tell you exactly what you want to hear. Discuss her actions with ChatGPT from your perspective and it’ll do the exact same thing to her,” one commenter said. “Show her how it’s biased and only serves as an artificial form of self-validation.”

Someone else added, “I noticed that…it is programmed to reinforce your position. It’s machine learning to an absurd degree, but still machine learning. It asks people to rate the responses. She thinks it’s impartial because it’s a robot, but it’s a robot programmed to tell people what they want to hear.”

One user even fed the man’s situation into ChatGPT to ask whether he was indeed the a–hole, and ChatGPT itself said: “While AI can be helpful for many things, it shouldn’t replace genuine, human-to-human conversations that are nuanced, emotional, and require empathy…While AI can provide thoughtful input, it’s not a substitute for emotional intelligence and understanding the complexity of relationships.”

The AI tool also noted that “As you mentioned, the way she frames her prompts affects the advice or feedback she gets. If she primarily explains the situation in a way that favors her side, the response will likely mirror that. This makes it a one-sided tool rather than a fair mediator.”


Others joked that the man should tell his girlfriend that ChatGPT told him he should break up with her.

“Respond with ChatGPT until she gets the point,” someone quipped.

“Tell her you consulted with ChatGPT and it told you to break up with her,” another said.

“NTA. Tell her to put this prompt in: ‘How do I break up with my girlfriend?'” a user joked.