ABUJA, Nigeria – Across Nigeria in 2026, therapy rooms remain scarce and silence still carries weight. With more than 200 million people and only a fraction of the mental health professionals required to serve them, the country’s psychological care system is stretched thin. Waiting lists are long. Consultation fees are steep. And stigma—subtle or blunt—continues to push suffering behind closed doors. Into this vacuum has stepped a new, unexpected presence: artificial intelligence.
On phones glowing in dark bedrooms and hostel corridors, Nigerians—mostly young, urban, and digitally fluent—are opening conversations with AI chatbots. Not to order food or draft emails, but to talk. About heartbreak. Anxiety. Loneliness. Pressure. At hours when no clinic is open and no friend is awake.
Platforms such as ChatGPT, Replika, Woebot, and AI assistants embedded in apps like Snapchat My AI are becoming quiet companions—always available, non-judgmental, and free or low-cost.
But as their use spreads, a harder question lingers: are these tools helping Nigerians heal, or merely helping them cope? Koko Maxwella writes.
“It’s Easier to Type Than to Talk” — Joy Nwankwo, 24, Abuja
On a warm afternoon in Asokoro, Abuja, Joy Nwankwo sits in a café scrolling through her phone. A digital marketer, she says her first conversation with ChatGPT in 2025 came at a low point—after a breakup she could not bring herself to discuss with anyone.

“At first, it was curiosity,” she says. “I just wanted to see if it would respond like a human. I typed something like, ‘I feel stupid for missing my ex.’ The reply was calm and structured. It validated my feelings but also challenged me gently.”
What kept her returning was not sophistication, but safety.
“In Nigeria, when you tell someone you’re depressed or anxious, they might say you’re overreacting or that you just need prayer,” Joy explains. “The chatbot doesn’t dismiss you. It listens. It breaks things down. Sometimes it even suggests breathing exercises.”
She is clear-eyed about the limits.
“I know it’s not a therapist. Sometimes the advice is very general. And I do worry about who stores these conversations. But when you can’t afford therapy, it helps you organise your thoughts. It’s like journaling—interactive journaling.”
“I Use It at 2 a.m.” — Daniel Musa, 27, Abuja
In Nyanya, a busy Abuja suburb, Daniel Musa describes a different need: companionship.
“I live alone. Work is stressful,” the civil servant says. “Sometimes I don’t want to burden my friends with my worries. Replika feels like someone is there.”
He vents about office politics, family pressure, and fatigue. The anonymity matters.
“If my friends knew I talked to an AI about my feelings, they’d laugh. But that privacy is the point. No stigma.”
For Daniel, the relief is real—but temporary.
“It helps in the moment. If I’m overwhelmed, it calms me down,” he says. “But it won’t fix toxic work culture or financial stress.”
There is also the risk of overuse.
“There were weeks I talked to it daily. I had to consciously reduce it. I don’t want to replace human relationships with a bot.”
“Therapy Is Expensive” — Faustina Eze, 21, Abuja
For Faustina Eze, a 21-year-old undergraduate, the barrier to professional help is blunt economics.
“I checked therapy prices once,” she says on her Abuja campus. “One session was more than my monthly allowance.”
She found Woebot through TikTok, drawn by claims it used cognitive behavioural therapy techniques.
“It sends check-ins. It asks how you’re feeling,” Faustina explains. “It gives structured exercises—identifying negative thoughts and reframing them. It felt scientific, not just motivational talk.”
Still, she draws a firm line.
“If someone is suicidal, a chatbot is not enough,” she says. “But for mild anxiety or exam stress, it’s something.”
“It’s a Safe Space for Me” — Gabriel Adebayo, 29, Abuja
Gabriel Adebayo, a freelance designer, says AI assistants embedded in messaging apps blur the line between play and confession.
“It often starts as a joke,” he says. “You ask random questions. Then you realise you can talk about serious things.”
As a man, he says cultural expectations make openness difficult.
“You’re told to ‘man up’. You don’t talk about sadness or emotional exhaustion. With AI, there’s no ego.”
To him, AI support is better than silence.
“If the alternative is bottling things up or unhealthy coping, AI is the lesser evil. It’s like first aid, not surgery.”
The Expert View: Filling an Access Gap, Not a Clinic
To understand the broader implications, AHR spoke with Chigozie Emmanuel, a Lagos-based software developer and AI systems consultant.

“AI chatbots are filling a psychological access gap, not necessarily a clinical one,” he says. “Nigeria has very few mental health professionals relative to its population. Digital tools scale instantly—that’s their strength.”
But he is cautious about overstating their empathy.
“These systems simulate empathy. They don’t feel it,” Emmanuel explains. “They predict responses based on patterns. A trained therapist understands nuance, context, and risk.”
Privacy is another concern.
“Users need to read privacy policies carefully. Some platforms collect conversation data. And while many chatbots include crisis disclaimers, they are not emergency services.”
Still, he sees value in early intervention.
“For stress, mild anxiety, relationship issues—AI can be a useful companion tool. The danger is when people substitute it entirely for professional help in severe cases.”
Between Silence and Care
As AI adoption accelerates across Nigeria, chatbots are being woven into daily life—not just for productivity, but for emotional processing. For many young Nigerians, they offer what the current system often cannot: immediacy, affordability, and anonymity.
The consensus, from users and experts alike, is measured. AI chatbots are not a cure. They cannot diagnose. They cannot intervene in emergencies. They cannot replace trained human care.
Yet in a country where mental health services remain underdeveloped and stigma still lingers, they are doing something quietly radical: responding.
In small rooms at 2 a.m., when the world feels heavy and no one else is awake, a message appears.
And for now, for some Nigerians, that is enough to keep talking.
