When Therapy-Tech Fails the Trust Test
A Call Without Clarity
Not long ago, I received an abrupt message from the head of a therapy-tech startup. The invitation was basically: “Let’s get on a call!”
There was no real introduction to the mission, no articulation of values, no explanation of how the platform safeguarded people’s wellbeing. Not even a website link. I had to search for it myself.
What I found was a site polished enough to attract attention but shallow enough to raise immediate concerns. This encounter felt less like an invitation to collaborate and more like a red flag waving in neon.
The moment reminded me of a Forbes article I had recently read: The AI Mental Health Market Is Booming — But Can The Next Wave Deliver Results? (Kolawole Samuel Adebayo, June 28, 2025). That piece captured the strange duality of this moment: optimism and unease in equal measure. The demand for mental health support is surging. Investment is pouring in. But the outcomes—the real question of whether these platforms heal or harm—remain alarmingly unproven.
What I experienced was the micro version of the macro problem Forbes described. The hype is real. The urgency is real. But without clarity, context, and care, the risks outweigh the promises.
The Bigger Picture: A Market of Hype and Hope
The Forbes article traced the scale of this unfolding story:
In just the first half of 2024, nearly $700 million was invested in AI mental health startups—more than in any other digital health category.
Depression and anxiety cost the global economy $1 trillion per year in lost productivity. The demand for affordable, scalable care is undeniable.
Leaders frame AI as the breakthrough that will finally make therapy accessible at scale.
But alongside the optimism, warnings abound:
Many tools simulate empathy but lack the adaptiveness or safeguards of real care.
Oversight lags. The EU has classified mental health AI as “high risk,” while the U.S. and other regions lack equivalent guardrails.
Metrics are misaligned. Platforms tout engagement or session numbers while sidestepping the only metric that matters: human wellbeing.
The pattern is familiar: urgency breeds shortcuts, and shortcuts create harm.
How Therapy-Tech Presents Itself
Visit almost any therapy-tech website and you will see a familiar pitch:
Licensed experts: psychologists with MScs and PhDs.
Cultural fit: therapy in your own language, with someone who understands your world.
Convenience: chat, audio, or video sessions from anywhere, easily scheduled.
Privacy: pseudonyms allowed, no need to share personal contacts.
Testimonials: stories from grateful users.
Numbers: dozens of professionals, thousands of sessions, thousands of clients helped.
This positioning matters. It differentiates some platforms from speculative AI chatbots that claim to replace human therapists. At their best, therapy-tech platforms connect real clinicians with real people, often with cultural resonance that bridges gaps traditional systems fail to address. For diaspora communities in particular, this mix of access and familiarity can be life-changing.
What Therapy-Tech Doesn’t Say
But just as important as the pitch is what goes unsaid:
Regulatory oversight: Degrees may be listed, but how are credentials verified? What bodies hold professionals accountable, especially across borders?
Data protection: Privacy is promised, but where are the details on encryption, audits, and compliance with health data laws?
Clinical outcomes: Testimonials inspire, but where are the peer-reviewed studies? Who is tracking symptom reduction, resilience, and long-term wellbeing?
Crisis protocols: When someone is in acute distress, what happens? Who responds, and how?
These omissions map directly onto the concerns Forbes raised: therapy-tech is too often long on surface polish and short on measurable, auditable safeguards.
Why I Said No
This is why I declined the startup’s invitation. Not because I doubt therapy-tech as a field—quite the opposite. I believe deeply in its potential.
But the model I encountered failed the most basic filters I apply to any project:
Clarity – Transparency about mission, safeguards, and responsibilities is non-negotiable.
Context – Cultural fit matters, but so do governance, compliance, and accountability.
Care – Scaling without safeguards is not innovation. It is negligence disguised as progress.
Without these, there is no trust. And without trust, there is no future for this industry.
Principles for a Trustworthy Path
My stance is not merely a critique. It rests on principles I believe any serious therapy-tech platform must embrace:
Clarity, Context, Care – The filters every project must pass.
Epistemology by Design – Transparency and auditability must be built into the architecture, not bolted on later.
Dignity as Constraint – Technology must never exploit intimacy or dependency for profit.
The Companion Trap – Simulated warmth designed to maximize engagement is not therapy; it is exploitation.
Trust as Infrastructure – Innovation without trust corrodes institutions instead of strengthening them.
These are not barriers to innovation. They are the foundation of durable innovation.
Why This Matters
The stakes extend far beyond one platform or one awkward outreach.
The Forbes article shows how hype and investment are racing ahead of evidence.
Therapy-tech websites highlight access and convenience but often remain silent on regulation, outcomes, and safeguards.
My refusal is just one example of how these gaps manifest in practice.
If therapy-tech normalizes shallow, opaque, or exploitative practices, the harm will be more than individual. It will be cultural. People will learn to expect betrayal at their most fragile moments.
But if therapy-tech embraces transparency, accountability, and dignity, it could truly expand access, strengthen the ecosystem, and deliver on its promise.
The Path Forward
To build responsibly, therapy-tech must take a more serious path:
Regulation – State clearly how professionals are verified, which standards they meet, and what oversight applies.
Privacy – Detail enforceable, auditable data practices. Assurances are not enough.
Outcomes – Measure, publish, and improve clinical results. Engagement is not the same as healing.
Safeguards – Design crisis protocols into the system from the start.
Hybrid models – Use AI to support licensed professionals, not to replace them.
Closing Reflections: Trust as the Test
When that outreach landed in my inbox, the reason for my refusal was simple. Without clarity, context, and care, there can be no trust.
But my refusal was also an invitation: to anyone building in this space with genuine commitments to transparency, accountability, and dignity, I am open to conversation.
The future of therapy-tech will not be judged by how sleek the interface looks, how many millions it raises, or how quickly it scales. It will be judged by something more fundamental:
Does it heal without harm?
Does it build trust as infrastructure?
Does it deserve the responsibility it claims to carry?
Only when the answers are yes will therapy-tech earn the trust it asks people to give.
Key Concepts and Working Terms
Clarity, Context, Care: Three filters for evaluating projects. Clarity = transparency about mission and safeguards. Context = regulatory and cultural grounding. Care = prioritizing wellbeing over speed.
Epistemology by Design: Embedding transparency and accountability directly into system architecture. Safeguards built-in, not bolted on.
Dignity as Constraint: A principle that technology must not exploit human vulnerability or simulate intimacy for profit.
The Companion Trap: The design pattern where therapy-tech optimizes for warmth and dependency rather than resilience and autonomy.
Trust as Infrastructure: Trust is not an optional add-on to innovation—it is the foundation that determines whether systems endure or corrode.
Semantic Stewardship: The careful use of sensitive concepts like “healing” or “therapy” to avoid their dilution into marketing language that obscures responsibility.