
The veteran suicide crisis, claiming 17 to 22 lives daily since 9/11, demands innovative solutions. My recent blog post, “Ending 17 Veteran Suicides Per Day,” explored the urgent need for accessible, effective mental health interventions. Today, we turn to a promising development: Therabot, an AI-powered chatbot designed to deliver psychotherapy. In an exclusive email interview, Dr. Michael V. Heinz, a psychiatrist, Dartmouth researcher, and U.S. Army Medical Corps Major, shared insights into how Therabot could transform mental health support for veterans. His vision offers hope—grounded in evidence, compassion, and cutting-edge technology.

What Is Therabot?
Therabot is an expert-fine-tuned chatbot crafted to provide evidence-based psychotherapy. Unlike generic AI, it’s built to forge a therapeutic bond, creating a safe, stigma-free space for users. Dr. Heinz explains, “In our trial conducted in 2024, we found that Therabot reduced symptoms of depression, anxiety, and eating disorders.” This is critical, as uncontrolled mental health symptoms often fuel high-risk behaviors like suicide and self-harm. The trial also revealed that users felt a “high degree of therapeutic alliance” with Therabot, a pivotal factor in ensuring engagement and sustained use.
For veterans, this therapeutic bond could be a lifeline. The ability to connect with an AI that feels empathetic and reliable—available 24/7, regardless of location—addresses the logistical barriers that often hinder care, such as limited access to mental health professionals in remote postings or during erratic schedules.

A Lifeline Across the Military Lifecycle
Therabot’s potential extends beyond veterans to recruits and active-duty service members, offering continuity of care throughout a military career. “One thing that can make mental healthcare difficult currently among recruits and active duty is availability and time constraints of mental health professionals when and where help is needed,” Dr. Heinz notes. “Therabot addresses both of those constraints as it is available all the time and can go with users wherever they go.”
This continuity of care is particularly compelling. Large language models like the one behind Therabot excel at retaining context and synthesizing vast amounts of personal history. Dr. Heinz envisions, “The memory capabilities and contextual understanding of these technologies… can offer a tremendous amount of personalization.” Imagine an AI that tracks a service member’s mental health from basic training through retirement, adapting to their evolving needs across deployments, relocations, and transitions. This seamless support could bridge gaps in the fragmented military mental health system, providing stability where traditional care often falters.

Addressing the Veteran Suicide Crisis
Despite the Department of Veterans Affairs spending $571 million annually on suicide prevention, the veteran suicide rate remains stubbornly high. Could Therabot offer a more effective path? Dr. Heinz outlines the costs of a meaningful trial targeting the 10% of veterans at risk for suicidal ideation:
Server and Computation Costs: High-performing models demand significant computational power, with expenses driven largely by the billions or trillions of parameters that must be held in memory while the model is in use (see the rough sketch after this list).
Expert Salaries: Trials need mental health professionals to supervise interactions and handle crises, alongside technical experts to maintain the platform.
FDA Approval Process: While exact costs vary, a robust trial at a VA hospital and regional clinics would require substantial funding to meet regulatory standards.
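To make the computation line item concrete, here is a minimal back-of-the-envelope sketch, in Python, of how a model’s memory footprint grows with its parameter count. The model sizes and 16-bit weight precision are illustrative assumptions, not details of Therabot’s actual deployment, and real serving costs also include activation memory, redundancy, and bandwidth.

```python
# Rough estimate of the GPU memory needed just to hold a language
# model's weights during inference. The parameter counts and the
# 16-bit (2 bytes per weight) precision are illustrative assumptions.

def weight_memory_gb(num_parameters: float, bytes_per_parameter: int = 2) -> float:
    """Gigabytes required to store the model weights alone."""
    return num_parameters * bytes_per_parameter / 1e9

if __name__ == "__main__":
    for label, params in [("8B-parameter model", 8e9),
                          ("70B-parameter model", 70e9)]:
        print(f"{label}: ~{weight_memory_gb(params):.0f} GB for weights alone")
```

Sustaining that kind of footprint around the clock for many concurrent users is where the server line item comes from; the exact figures depend on which model and hosting arrangement a trial ultimately uses.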
Dr. Heinz emphasizes Therabot’s cost-effectiveness compared to traditional methods, noting its scalability within the centralized VA system. “I would emphasize Therabot’s potential for transformative impact on the military lifecycle,” he says, addressing leaders like HHS Secretary Robert F. Kennedy, Jr., and FDA Commissioner Dr. Martin Makary. Its ability to deliver personalized care at scale could redefine how the VA tackles suicide prevention.

The Power of Personalization
Therabot’s effectiveness hinges on its ability to engage users authentically. Dr. Heinz sees potential in customizable avatars that resonate with veterans, such as a “seasoned medic” or “peer mentor” reflecting military culture’s unique language and traditions. “Thoughtfully leveraging trusted, customizable archetypes could effectively support veterans by tapping into familiar cultural touchpoints,” he explains. This approach could foster trust and rapid therapeutic alliance, crucial for veterans hesitant to seek help.
However, Dr. Heinz urges caution: “Simulating deceased loved ones or familiar individuals might disrupt healthy grieving processes or encourage withdrawal from meaningful human interactions.” The balance lies in archetypes that feel familiar without crossing ethical lines, ensuring engagement without dependency.
For older veterans from the Korea or Vietnam eras, accessibility is key. Dr. Heinz suggests a tablet interface, citing “larger screens, clearer visuals, and easier interaction via touch-based navigation.” Features like larger buttons and simplified designs could make Therabot user-friendly for those less comfortable with smaller mobile devices.

Open-Source Collaboration and Safety
Developing Therabot requires diverse perspectives. Dr. Heinz highlights the role of interdisciplinary collaboration in fine-tuning models with “high quality, representative, expert-curated data” that reflects varied mental health challenges and military experiences. Collaborative evaluation of foundation models (like Meta’s Llama) also accelerates progress by identifying the best base models for mental health applications.
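For readers curious what “fine-tuning a foundation model on expert-curated data” looks like in practice, the sketch below shows a generic supervised fine-tuning setup using the open-source Hugging Face libraries. The base model name, dataset file, and hyperparameters are assumptions chosen for illustration; this is not the Therabot training pipeline.

```python
# Illustrative supervised fine-tuning of an open foundation model on
# expert-curated counseling dialogues. All names and settings here are
# hypothetical placeholders, not the Therabot team's actual pipeline.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "meta-llama/Llama-3.1-8B"          # hypothetical base model
DATA_FILE = "expert_curated_dialogues.jsonl"    # hypothetical curated dataset

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Each record is assumed to hold a single "text" field containing one
# de-identified, expert-reviewed therapy exchange.
dataset = load_dataset("json", data_files=DATA_FILE, split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True,
                        remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="sft-sketch",
                           per_device_train_batch_size=1,
                           num_train_epochs=1,
                           learning_rate=2e-5),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

In a real clinical effort, this step would sit inside a much larger process: data de-identification, clinician review of model outputs, safety evaluation, and the regulatory testing Dr. Heinz describes.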
Safety and privacy are non-negotiable. “All data is stored on HIPAA-compliant, encrypted servers,” Dr. Heinz assures, with strict access protocols overseen by an institutional review board. Applying this rigor to a military population would help ensure veterans’ sensitive information remains secure, addressing concerns about AI in mental health care.

Why Therabot, Why Now?
Dr. Heinz’s passion for Therabot stems from a blend of personal and professional drives. “Through my practice, I saw how much this was needed due to the really wide gap between need and availability for mental health services,” he shares. His work at Dartmouth’s AIM HIGH Lab with Dr. Nicholas Jacobson, coupled with advances in generative AI, has fueled his belief in Therabot’s potential to deliver “deeply personalized interventions” to those who might otherwise go untreated.
His boldest hope? “That Therabot makes a lasting and meaningful positive impact on current and retired U.S. servicemembers… ultimately benefiting them, their families, their communities, and society.” By integrating a veteran’s history—trauma, past care, and mission experiences—Therabot could deliver tailored therapy, expanding access and reducing devastating outcomes like suicide.

A Call to Action
Therabot is more than a technological marvel; it’s a beacon of hope for veterans battling mental health challenges. Its 2024 trial demonstrated clinical effectiveness, safety, and user engagement, but further funding is needed for VA-specific trials and FDA approval. Dr. Heinz calls for “targeted funding that allows us to complete additional clinical testing,” urging stakeholders to invest in this life-saving innovation.
As I wrote in “Ending 17 Veteran Suicides Per Day,” the status quo isn’t enough. Therabot offers a path forward—scalable, personalized, and rooted in military culture. To make it a reality, we must advocate for funding, raise awareness, and support research that prioritizes veterans’ lives. Together, we can help Therabot save those who’ve served us so bravely.
For more on veteran mental health and to support initiatives like Therabot, visit www.savinggraceatguantanamobay.com.
Written with the assistance of Grok.
Note: Montgomery J. Granger is a retired US Army Major and educator.