Should your child talk to a chatbot? Experts warn AI mental health apps may do more harm than good
Experts urge caution as AI therapy apps emerge for children’s mental health

As AI mental health tools surge in popularity, experts are speaking out about a worrying new trend: chatbots and mood-tracking apps marketed for children's emotional wellbeing, despite being designed for adults.
These AI-powered apps, some of which simulate therapeutic conversations, are being positioned as cheaper, more accessible alternatives to traditional therapy.
But while they may seem like a quick fix in a stretched mental health system, experts say the risks for children may outweigh the benefits.
Why parents should pause before downloading
“Children are particularly vulnerable,” says Dr Bryanna Moore, Assistant Professor of Health Humanities and Bioethics at the University of Rochester Medical Center, speaking to MQ Mental Health. “Their social, emotional, and cognitive development is just at a different stage than adults.”
Currently, most AI mental health tools aren’t developed with children in mind – and aren’t regulated to suit their needs. But as tech developers eye up the children’s market, Dr Moore warns there’s a serious lack of understanding about what kids actually need from mental health support.
“No one is talking about what is different about kids – how their minds work, how they're embedded within their family unit, how their decision-making is different,” she says.
Why AI can’t replace a real therapist
One of the main concerns is that AI tools simply can't understand or respond to the complexities of a child’s life – especially when it comes to their home environment, friendships or family relationships, all of which are central to their mental health.
Unlike human therapists, who factor in these dynamics and often involve the family in treatment, AI apps work in isolation.
That means they could miss key signs of distress – or fail to pick up on more serious issues that need urgent intervention.
There’s also the risk that children may begin to see these apps as more than just tools.
Research shows younger children, in particular, may treat AI as a sentient being, forming emotional attachments to their devices rather than seeking support from real people.
So, should you let your child use one?
It’s understandable that parents might consider AI support – especially given the long waiting lists for child and adolescent mental health services (CAMHS).
However, experts say any tech-based solution should be treated with caution and never replace professional care.
Instead, Dr Moore and other mental health advocates are urging tech companies and policymakers to invest in child-specific regulation, research, and design before rolling out AI mental health apps to children.
The bottom line? Until the tech catches up with children’s needs, nothing beats real, human connection – especially when it comes to our kids' mental health.
Authors

Ruairidh is the Digital Lead on MadeForMums, where he works with a team of talented content creators and subject-matter experts.