The missing piece: technology and mental health

When it comes to mental health, there is a growing divide around a hot topic: the role of technology and AI in treatment.

There is a widespread belief in the technology industry that therapists are resistant to change or anti-innovation. We are here to set the record straight.

If you’re building these tools, investing in them, or want to create the next big thing that will transform the industry, we’re here to share some inside information with you.

Therapists and mental health professionals are not afraid of technological progress, but we are rightly concerned about the consequences of building it without us.

Therapy is not just another industry to modernize.

Therapists are at a crossroads, often divided over whether AI in mental health will exacerbate the problem of burnout and workforce loss, or whether it will ultimately support clinicians and improve care.

In our experience, the biggest indicator of a new therapy tool’s success or safety is whether experienced therapists are involved in its development, from the initial design of the product rather than just as testers before launch, ensuring the compliance and safety standards for which we are responsible. We need tools that fit the way we work: independent, autonomous, and diverse.

If technology reshapes therapy without understanding it, we risk solving the wrong problem and, worse, creating new ones.

The Hardest Part of Therapy Isn’t What You Might Think

When we started our practice, we did everything right.

We researched the best tools, interviewed the experts, hired exceptional therapists, and built a strong administrative support team. We should have thrived, but we were stuck.

We kept running into the same problem, one that threatened to shut us down and disrupt care for our clients. It was not clinician burnout, client trauma, or overwhelming emotional labor. It was unpaid administrative work.

It turned out it wasn’t just us; this is a widespread problem across the healthcare industry. Approximately 40% of the tasks required to ethically and safely care for our clients are unpaid, non-reimbursable, and considered non-essential by the industry.

Yet today’s market treats the ongoing national mental health crisis as a productivity or workflow problem.

According to the Bureau of Labor Statistics, 54 percent of people with a master’s degree in mental health counseling never become licensed, even after spending up to seven years in school and several more years in supervised post-graduate work, doing the same clinical work as a licensed provider.

We sincerely believe that many of the people developing mental health technology have good intentions. They want to help.

However, most technology tools are designed with the payer or the consumer in mind, creating a gap that is more than inconvenient: it is dangerous.

We’re seeing a pattern across the industry: tools fail to launch, frustrate the clinicians expected to adopt them, or end up disappointing the customers they’re trying to reach. Again and again, these teams have the same thing in common: the professionals experienced in the work were not meaningfully involved in building the solution.

Look closely: how many of the companies promising to solve the mental health crisis rely on real therapists, people who have been in the field working with clients and can tell you what they need to treat those clients successfully?

Many may have one or two providers in leadership roles, but many, if not most, have already decided they know what professionals want and need without ever speaking to those professionals themselves. Others bring in therapists for feedback, not development. Experts come last, not first.

In the enthusiastic rush to tackle the problem, the experience and training of subject matter experts are treated as optional.

Risk, or what we can’t afford to get wrong

If there’s one thing we can guarantee in every session, every day, across the country, it’s this: Clinicians are constantly assessing client safety.

Above all, our duty and mandate is the safety of our clients. That means not just safety in the moment, but safe physical and emotional relationships, safe boundaries, and safe experiences.

Protecting client information is just as important. One breach or error, and it is neither the company nor the technology that will be held responsible. It is my license, my career, and ultimately my responsibility.

Therapists are responsible for what we control and what we do not, for what we know and what we should have known to prevent harm, even when the technology we use is entirely beyond our control.

Do you know of any other industry held to this standard?

Other professional fields, such as medicine and aviation, also carry serious individual liability, but those professions generally have comprehensive systems for allocating risk. In mental health, the therapist is the system: the clinician, the risk manager, the compliance officer, and the privacy guardian, all while remaining personally responsible for tools they did not build and do not control.

When technology fails, the responsibility does not fall on the software, the company, or the manufacturers.

In mental health, the therapist alone assumes this responsibility.

What kind of future do we create when therapists’ voices are missing?

Ultimately, what keeps us up at night is this: Once these tools are perfected to reduce costs, could they eventually be used to determine that some people don’t deserve a therapist at all?

Somewhere in a conference room, technology leaders are discussing whether a chatbot could replace human therapy. This is not new; most institutions already have processes to identify people who are “not sufficiently symptomatic” to justify the cost of care. Framing this as a cost saving rather than as patient care is not just asking the wrong question; it declares a solution based on a dangerous assessment, one with real and meaningful consequences.

By involving highly experienced therapists and mental health administration experts early in the development of new technologies, we not only build more effective tools but also build trust with clinicians. Early input lets us proactively address concerns around safety, compliance, and liability, ensuring the technology matches real-world needs.

In doing so, we can both give therapists tools that support their practice and protect them from dangerous errors in otherwise well-intentioned technology.

When therapists can say a tool is safe, trustworthy, and helpful, you have marketable technology that therapists will actually want to adopt in their practice.

We’re not saying don’t build it; we’re saying build it with us.

Therapists do not resist modernization; we are the guardians of safety and trust. We want tools that move us forward, and we want to shape them. If the future of therapy lies in technology that protects clinicians’ time, voice, and autonomy, we all benefit.



Kira Torre, LMFT, and Emily Daubenmire, CPC, are co-founders of a mental health group practice with a simple mission: Putting therapists first means putting patient care first. Working at the intersection of clinical practice, operational leadership, and digital health innovation, they bring a unique perspective to the next generation of mental health care and, together, advocate for ethically aligned technology in behavioral health.

This post appears through the MedCity Influencers program. Anyone can publish their perspective on business and innovation in healthcare on MedCity News through MedCity Influencers.
