Stanley Vashovsky is a New York City healthcare entrepreneur and operator known for pairing technology with clinical workflows to improve patient experience. Over more than three decades, Stanley Vashovsky has founded and led ventures that applied data analytics, location-based dispatching, and mobile services at scale. As cofounder of DocGo, he helped build proprietary platforms for medical transportation and population health programs, including tools such as ShareLink that offered real-time transparency for patients and caregivers. Earlier, he founded Medcare, later joining Philips Healthcare as Vice President of Innovations after an acquisition, and led a turnaround at Health Systems Solutions with a subsequent exit. He has also volunteered as a paramedic and supported frontline EMS staff through compensation and equity initiatives. That background informs a balanced view of how AI can assist behavioral health by improving access and efficiency while recognizing the limits that require human judgment.
Understanding the Benefits and Limitations of AI in Mental Health
In recent years, developments in artificial intelligence (AI) have impacted many aspects of daily life, and hold the potential to revolutionize entire industries. This includes the possibility of broad adoption of AI tools to help transform the delivery of healthcare.
While the promise and potential of AI in medicine are vast, the technology also presents a unique set of considerations, particularly when it comes to behavioral health services. Behavioral health care in the US has faced challenges dating back many decades, long before the advent of AI. Despite medical advances in a number of areas, a person’s ability to access quality behavioral health care is still largely determined by where they live, what type of insurance coverage they have, and their socioeconomic status. These hurdles prevent many people in need from accessing vital support services.
AI has given health system leaders tools to address a number of these challenges. AI can, for example, facilitate screening and symptom monitoring, and can help individuals access mental health support between visits with a primary clinician. Many American businesses use AI chatbots to engage with customers; mental health care providers can use similar tools to provide 24/7 support powered by large language models (LLMs).
While companies like OpenAI and Google provide scalable LLMs, organizations such as Ellipsis Health have developed more specialized services capable of analyzing vocal biomarkers that can help mental health professionals key in on an individual’s mental state. While helpful, these tools are not infallible, nor can they replicate the sensitivity and nuance of a trained mental health services provider. For instance, LLMs cannot interpret body language. Studies suggest AI tools provide the best results when interacting with individuals living with mild to moderate depression or anxiety symptoms.
An increasing number of companies are releasing digital cognitive behavioral therapy (CBT) tools capable of addressing symptoms of depression. The most advanced tools can produce results comparable to those of short-term, human-delivered therapy. While AI shows great promise in the field of mental health care, it bears repeating that AI is still a developing technology with many limitations. Behavioral health is a broad medical field encompassing complex, nuanced conditions, many of which carry high-risk symptoms. AI is neither effective nor recommended for individuals living with post-traumatic stress disorder or severe depression. Although health professionals can train LLMs to monitor for keywords and phrases, they should not rely on AI to pick up on suicidal ideation, self-harm, or other warning signs that pose an immediate risk to patient safety.
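The keyword-monitoring limitation described above can be made concrete with a minimal sketch. The function name, keyword list, and example messages below are hypothetical, chosen only to illustrate why simple keyword matching misses indirect expressions of risk; this is not any vendor's actual screening method, and real systems are considerably more sophisticated while still requiring clinician oversight.

```python
# Hypothetical sketch of naive keyword-based risk flagging, illustrating
# why such monitoring alone cannot be trusted to catch at-risk patients:
# it matches only literal phrases and misses indirect language.

RISK_KEYWORDS = {"suicide", "kill myself", "self-harm", "end my life"}

def flag_for_review(message: str) -> bool:
    """Return True if the message contains any listed risk keyword."""
    text = message.lower()
    return any(keyword in text for keyword in RISK_KEYWORDS)

# Caught: explicit phrasing matches a keyword.
print(flag_for_review("I have been thinking about self-harm"))     # True

# Missed: the same risk expressed indirectly slips past keyword
# matching, which is why a trained clinician must make the call.
print(flag_for_review("I just don't see a reason to keep going"))  # False
```

The second message carries clear clinical significance yet produces no flag, which is precisely the gap the article warns about: keyword filters can assist triage, but they cannot substitute for human judgment.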
Traditional CBT services and other forms of therapeutic care are rooted in a foundation of trust and personal connection that develops over time between a person and their care provider. AI tools are not yet capable of replicating this connection and therefore cannot function as the central care resource for humans in need of mental health services. Put more clearly, decades of studies suggest that the human-to-human interactions that occur during therapy sessions play a significant role in positive treatment outcomes, something even the most knowledgeable and sensitive AI system cannot currently replicate.
In closing, while artificial intelligence offers exciting possibilities for expanding access and improving efficiency in behavioral health, its role must remain supportive rather than substitutive. The future of effective behavioral health care will depend on thoughtfully integrating AI tools into a system still grounded in human empathy, clinical judgment, and trust. By combining the scalability and precision of technology with the compassion and insight of trained professionals, the healthcare community can create a more responsive, equitable, and person-centered model of behavioral health care—one that harnesses innovation without losing the essential human connection at its core.
About Stanley Vashovsky
Stanley Vashovsky is a New York City resident and healthcare entrepreneur focused on technology that improves patient experience. He cofounded DocGo, which evolved from Ambulnz, and helped develop proprietary platforms that support medical transportation, mobile health, and population health initiatives. Earlier, he founded Medcare and later served as Vice President of Innovations at Philips Healthcare following an acquisition. He led a turnaround at Health Systems Solutions before its sale in 2015. A volunteer paramedic, he has advocated for improved compensation, benefits, and equity participation for frontline EMS staff and contributes thought leadership on healthcare topics.

