Should you use ChatGPT to win an argument? I spoke to mental health and relationship experts to find out
Can AI help resolve conflict, or will it just make things messier?

We’ve all been there, struggling to find the right words in a difficult conversation, frustrated by an argument we can’t quite get our heads around, or convinced we’re right but unable to make our case. If you’ve ever considered turning to ChatGPT for help, you’re not alone.
One of the top reported uses of ChatGPT is writing emails. But plenty of people are taking this further, feeding not just emails but text messages and even voice notes into AI to analyze intent, craft persuasive responses, and try to ‘win’ arguments.
On the surface, this makes sense. AI can strip away messy emotions, offer a neutral perspective, and help structure arguments more clearly. ChatGPT has already been used to mediate legal disagreements and engage with conspiracy theorists, and it can be highly persuasive in helping people see different perspectives.
But does that mean it can help you ‘win’ a debate with your friend, boss, or partner? Not so fast. Just as we explored whether ChatGPT could replace a therapist, it turns out that while AI can provide clarity, relying on it to resolve conflict can make things messier, not easier.
I spoke to mental health and relationship experts about AI’s benefits, pitfalls, and limits as a neutral referee.
Why you shouldn’t rely on ChatGPT to resolve conflict
Let’s start with the obvious: ChatGPT isn’t human, and therefore it lacks genuine emotional intelligence.
“Conflicts and deep arguments need active listening, empathy, and understanding different viewpoints, which AI lacks,” says Dr. Michael Kane, a psychiatrist and chief medical officer at Indiana Center for Recovery. “Authentic communication depends on full participation in which all of the contributors feel heard and valued, something machines are incapable of providing.”
Jordan Conrad, a psychotherapist and researcher at Madison Park Psychotherapy who focuses on the application of artificial intelligence in the mental health field, tells me that if you feel the need to ask ChatGPT to help you ‘win’ an argument, you might have already lost.
“The biggest problem with using AI to 'win' arguments is that it demonstrates that you are not 'in it' together – you are trying to win, not resolve the issue. That is a big red flag,” Conrad explains. “As corny as it sounds, in relationships, arguments have to be framed as 'you and me vs. the problem,' not 'you vs. me.' If one person 'wins' and the other 'loses,' then you both lose.”
Disagreements should be about finding resolution, not scoring points. AI doesn’t change that dynamic; if anything, it can make things worse.
This reminds me of a viral Reddit post from last year, where a man claimed his girlfriend would pause arguments to ask ChatGPT for help. And (would you believe it?) somehow, it always seemed to side with her.
“Couples need to find a way for both people to walk away feeling good about the disagreement. If one person dominates the other, you’re creating a feeling of disconnection,” Conrad explains.
There’s also a bigger issue here: relying on AI to navigate conflicts could weaken our critical thinking skills over time. “AI over-reliance in emotionally charged situations may impair an individual’s ability to feel, process emotions, and hone vital talking skills,” says Kane. The more we outsource conflict resolution to machines, the less we engage in the messy but necessary human process of learning to communicate effectively.
Beyond that, AI may sound confident, but that doesn’t mean it’s right. We already know it can get things wrong, even hallucinate (make up information), because it generates responses based on probabilities rather than deep understanding. “It is not at all clear, at this stage, that AI can genuinely win an argument and not simply do what it is programmed to do, which is provide the statistically most likely word in a sequence,” Conrad explains.
AI doesn’t ‘think’ the way we do; it just predicts the next most likely word based on its training. That can be useful, but it’s also a major limitation, especially in emotionally complex situations. “A calculator does not actually know arithmetic, it just solves the problem, and similarly, Microsoft Word does not actually know English grammar,” Conrad tells me.
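To make that point concrete, here’s a deliberately toy sketch of what “picking the statistically most likely next word” means. The probabilities below are invented for illustration (real models learn them from vast amounts of text and work with far richer context), but the selection step is the same idea:

```python
# Toy illustration of next-word prediction. The probabilities here are
# made up purely to show the mechanism: the model picks the likeliest
# continuation without any grasp of what the words mean.
next_word_probs = {
    "I love": {"you": 0.62, "it": 0.21, "pizza": 0.17},
}

def predict_next(context: str) -> str:
    """Return the statistically most likely next word for a context."""
    probs = next_word_probs[context]
    return max(probs, key=probs.get)

print(predict_next("I love"))  # prints "you" - a statistical pick,
                               # not an expression of feeling
```

The point isn’t that ChatGPT literally uses a lookup table like this (it doesn’t), but that the output is a probability-driven continuation, not understanding – which is exactly why the same three words can mean such different things in different mouths, as Conrad explains next.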
Even seemingly simple phrases like "I love you" can carry vastly different meanings depending on context, something AI struggles to interpret. “A romantic partner and a drunken stranger saying the same words convey entirely different meanings. Likewise, a person with a history of trauma will interpret ‘I love you’ differently from someone with secure attachments,” Conrad says. AI lacks the depth to truly mediate between two people with different life experiences and emotional responses.
Why ChatGPT can sometimes be helpful
Despite its flaws, AI can be used to help during arguments and disputes, to a point.
One major advantage is that it can help you organize your thoughts. High-stress situations can make it difficult to articulate what we mean, and AI can assist with that. “I think it's okay to use ChatGPT to organize your thoughts or get your points across more clearly, especially when you’re trying to express something complicated or when emotions might get in the way,” says Michelle English, a Licensed Clinical Social Worker and Executive Clinical Manager at addiction treatment center Healthy Life Recovery.
“For example, if you’re preparing for a tough conversation, ChatGPT can help you outline your message in a calm and clear way, so everything you want to convey is included and the message comes across effectively,” she tells us.
It can also serve as a useful brainstorming tool. “It's helpful for brainstorming ideas, finding the right words, or getting additional insight on something,” says English. Whether you need to think through different approaches to an argument, generate counterpoints, or explore alternative ways to phrase something, AI can provide a starting point.
AI can sometimes offer a neutral perspective. When emotions run high, a third party can help reframe things – but finding someone truly impartial (and willing to drop everything to mediate) isn’t always easy. While ChatGPT doesn’t fully understand emotions, it can at least take some of the heat out of a situation.
“ChatGPT can be a great tool to help reframe your points in a calmer, less accusatory tone. If you’re feeling emotionally charged, ask ChatGPT to help identify and articulate the other person’s viewpoint so you can approach the conflict with more empathy,” says Amanda LaMela, an Advanced Clinical Intern at PAD Mental Health.
Finding a middle ground
As with so many AI applications, ChatGPT isn’t a replacement for human communication; it’s a tool. Like therapy, using AI for support is fine, but outsourcing the entire emotional process? Not so much.
“A more effective approach would be using AI as a tool to provide objective information that helps discussion while also preserving the human element of communication,” says Kane.
For me, context is everything. A disagreement with a romantic partner, where honesty and authenticity matter most? I wouldn’t dream of using AI. Navigating a workplace conflict where I want to double-check my tone? Maybe.
Even then, it might not just be about context – it’s about the nature of the argument itself. Using AI as a mediator in a relationship feels wrong to me. But if a couple kept circling the same unresolved issues, both clouded by emotions and past experiences, could AI help? Maybe. Especially for those who can’t afford relationship counseling.
Still, that assumes AI has some all-knowing wisdom. And we know it doesn’t. It’s prone to hallucination and responds only as well as the prompts you feed it – this makes me worry it could be extremely biased in its views, depending on who asks what and how.
So where does that leave us? I like LaMela’s advice to think of this like cooking. Bear with me.
“Using ChatGPT to win an argument is a bit like seasoning your cooking with salt,” she says. “Use it thoughtfully, and it can enhance clarity, smooth out bitterness, and help avoid burning bridges. But too much reliance on AI, and you're just serving up something artificial and gross.”
So, should you use ChatGPT to win an argument? Probably not. But can it help you communicate more clearly and approach discussions with more balance? Maybe. The trick is knowing when to use it, what to ask, and when to step away from the screen and have a real, vulnerable, messy conversation.
You might also like
- I asked ChatGPT to work through some of the biggest philosophical debates of all time – here’s what happened
- OpenAI confirms 400 million weekly ChatGPT users - here's 5 great ways to use the world’s most popular AI chatbot
- I've become a ChatGPT expert by levelling up my AI prompts – here are my 8 top tips for success
Becca is a contributor to TechRadar, a freelance journalist and author. She’s been writing about consumer tech and popular science for more than ten years, covering all kinds of topics, including why robots have eyes and whether we’ll experience the overview effect one day. She’s particularly interested in VR/AR, wearables, digital health, space tech and chatting to experts and academics about the future. She’s contributed to TechRadar, T3, Wired, New Scientist, The Guardian, Inverse and many more. Her first book, Screen Time, came out in January 2021 with Bonnier Books. She loves science-fiction, brutalist architecture, and spending too much time floating through space in virtual reality.