Apr 3

Chatbots Prescribing Psych Meds: Here's The Catch.

Originally reported by The Verge

A fundamental question, posed by some psychiatrists, is precisely what problem this new initiative aims to resolve.

Utah has authorized an artificial intelligence system to prescribe psychiatric medications independently of a human physician, only the second time such clinical prescribing authority has been granted to AI anywhere in the country. While state authorities champion the initiative as a way to cut healthcare costs and ease treatment shortages, medical professionals have raised concerns, calling the system opaque, inherently risky, and unlikely to expand mental health access for the most vulnerable populations.

Unveiled last week, the year-long pilot program allows Legion Health's AI chatbot to renew certain psychiatric medication prescriptions in limited scenarios. The San Francisco-based startup offers Utah residents “fast, simple refills” via a $19 monthly subscription. While the program is slated to commence in April, the company is currently managing a waitlist.

The program's design is intentionally restricted, both in the range of medications it encompasses and the eligibility criteria for patients. As per Legion's agreement with Utah’s Office of Artificial Intelligence Policy, the chatbot is authorized to renew only 15 specific lower-risk maintenance medications, all of which must have been previously prescribed by a human clinician. These include commonly used drugs for anxiety and depression such as fluoxetine (Prozac), sertraline (Zoloft), bupropion (Wellbutrin), mirtazapine, and hydroxyzine. Patients must also demonstrate clinical stability, meaning individuals with recent changes in dosage or medication, or a psychiatric hospitalization within the past year, are disqualified. Furthermore, patients are required to consult with a healthcare provider after every 10 refills or six months, whichever occurs first.
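The eligibility criteria above amount to a hard gate that every request must pass before the chatbot can act. The following is a minimal illustrative sketch of that gate; all names, fields, and the abbreviated medication list are hypothetical, not Legion's actual code.

```python
# Hypothetical sketch of the pilot's eligibility gate as described in the article.
# Names and structure are illustrative only, not Legion's implementation.
from dataclasses import dataclass

# Subset of the 15 approved lower-risk maintenance medications named in the article.
APPROVED_MEDS = {"fluoxetine", "sertraline", "bupropion", "mirtazapine", "hydroxyzine"}

@dataclass
class RefillRequest:
    medication: str
    previously_prescribed_by_clinician: bool
    recent_dose_or_med_change: bool   # recent changes disqualify the patient
    hospitalized_past_year: bool      # psychiatric hospitalization in past 12 months
    refills_since_last_visit: int
    months_since_last_visit: int

def eligible_for_ai_refill(req: RefillRequest) -> bool:
    """Return True only if every published criterion is met."""
    if req.medication not in APPROVED_MEDS:
        return False
    if not req.previously_prescribed_by_clinician:
        return False
    if req.recent_dose_or_med_change or req.hospitalized_past_year:
        return False
    # A human provider visit is required after 10 refills or six months,
    # whichever occurs first; past that point the AI may not renew.
    if req.refills_since_last_visit >= 10 or req.months_since_last_visit >= 6:
        return False
    return True
```

Note how every criterion is a conjunctive requirement: failing any single check routes the patient back to a human clinician rather than to an automated refill.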

Crucially, the system is not permitted to issue new prescriptions or manage medications necessitating rigorous clinical monitoring, such as those requiring blood tests. Controlled substances are also prohibited, thereby excluding numerous ADHD medications. The exclusion extends to benzodiazepines for anxiety, antipsychotics for conditions like schizophrenia and bipolar disorder, and lithium – often regarded as the benchmark treatment for bipolar disorder – effectively placing many intricate psychiatric cases beyond the purview of this pilot.

To use the system, patients must actively opt in, verify their identity, and provide proof of an existing prescription, typically through a photo of the label or pill bottle. Subsequent steps involve questions about their symptoms, medication side effects, and efficacy. The system also screens for potential red flags by asking about suicidal thoughts, self-harm, severe reactions, and pregnancy. Should any responses fall outside the pilot's low-risk parameters, the case must be escalated to a human clinician before any refill is dispensed. Both patients and pharmacists retain the option to request a human review.
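The escalation logic described above can be sketched as a simple routing rule: any red-flag answer, or any manual review request from a patient or pharmacist, diverts the case to a human clinician. Function and field names below are hypothetical, chosen only to mirror the article's description.

```python
# Hypothetical sketch of the screening-and-escalation step described in the
# article; names are illustrative, not Legion's implementation.

# Red flags the article says the system screens for.
RED_FLAGS = ("suicidal_thoughts", "self_harm", "severe_reaction", "pregnancy")

def route_refill_request(answers: dict) -> str:
    """Route a screened request: escalate on any red flag or review request,
    otherwise proceed toward an automated refill."""
    # Patients and pharmacists can always request a human review.
    if answers.get("human_review_requested"):
        return "escalate_to_clinician"
    # Any affirmative red-flag answer mandates escalation before dispensing.
    if any(answers.get(flag, False) for flag in RED_FLAGS):
        return "escalate_to_clinician"
    return "proceed_to_refill"
```

As the psychiatrists quoted below note, a gate like this is only as good as its inputs: it sees exactly the answers the patient types, with none of the contextual cues a clinician would observe.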

In their announcement of the pilot, state officials asserted, “By safely automating the renewal process for maintenance medications, we are allowing patients to get the care they need much more quickly and affordably.” They further suggested that, over time, this initiative could enable healthcare providers to “focus their time on more complex, higher-risk patient needs” and help mitigate the shortages that have left half a million Utah residents without adequate mental health services. Yash Patel, Legion cofounder and CEO, presented the program with even broader vision, characterizing it as a world-first innovation poised to significantly expand healthcare access and herald “the beginning of something much bigger than refills.”

However, psychiatrists remain largely unconvinced. Dr. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, told The Verge he believes the “advantages of an AI-based refill system may be overstated.” He also suspects the tool “will not increase access for those who are most in need of care,” noting that prospective patients would already need an established treatment plan with a psychiatrist to use the service.

“It would be better if there were greater transparency, more science, and more rigorous testing before people are asked to use this.”

Dr. Kious posits that such automation could inadvertently foster an “epidemic of over-treatment” within psychiatry, leading some patients to continue medication longer than medically necessary. Dr. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and professor of psychiatry at Harvard Medical School, echoed a similar apprehension. He highlighted that while long-term psychiatric medication use benefits some individuals, others may achieve better outcomes by gradually reducing or discontinuing their prescriptions. “They require more active management, changes, and careful consideration,” Dr. Torous stated, emphasizing the difficulty of achieving this level of nuanced care when refill check-ins are delegated to a chatbot.

A more profound concern revolves around the chatbot's capacity to safely automate even the seemingly routine aspects of psychiatric care. Dr. Torous underscored that prescribing extends beyond mere drug interaction checks, questioning whether any AI system today “can understand the unique context and factors that go into a person’s medication plan.” Dr. Kious concurred, stating: “This is something that could be safe in principle, but it all depends on the details.” These concerns are compounded by how new these systems are and how opaque they remain to outside scrutiny. “It feels a bit like alchemy right now,” he remarked, reiterating, “It would be better if there were greater transparency, more science, and more rigorous testing before people are asked to use this.”

Furthermore, immediate safety concerns persist. Dr. Kious highlighted the potential for the chatbot to overlook critical information during screening, perhaps by failing to ask pertinent questions, or if a patient misinterprets a side effect or provides inaccurate information. He also noted that some patients might intentionally provide answers they believe the system expects to expedite care. While acknowledging that psychiatry significantly relies on self-report, a challenge not exclusive to chatbots, he pointed out that human clinicians typically access broader contextual information. When interacting with patients, he explained, he observes not only their verbal responses but also their non-verbal cues and overall presentation. Although patients can also mislead human providers, Dr. Kious suggested that a chatbot system might inadvertently simplify the process for patients to manipulate their responses to achieve a desired outcome.

Dr. Torous also pointed to more apparent safety risks, which resonate with real-world experiences of chatbot deployment. Legion’s chatbot represents Utah’s second foray into AI prescribing, following a broader primary care pilot with Doctronic launched last December. Within weeks of its activation, security researchers successfully manipulated Doctronic’s system into disseminating vaccine conspiracy theories, providing instructions for synthesizing methamphetamines, and inappropriately tripling a patient’s opioid dosage. State officials affirm that the more targeted program with Legion is specifically tailored to address “the state’s mental health shortage.”

Legion asserts that the pilot operates within stringent safeguards. Beyond its “conservative eligibility gates,” the agreement with Utah mandates the submission of detailed monthly reports and a thorough review by human physicians of the initial 1,250 requests, followed by periodic sampling of approximately 5 to 10 percent of subsequent requests.

Arthur MacWaters, Legion cofounder and president, informed The Verge that “risks exist in any remote care model, whether AI-assisted or fully human-led,” emphasizing that the company’s “workflow does not rely on a single self-reported answer to unlock treatment.” He outlined key safeguards, including the pilot’s strict limitations on medications and patient eligibility, integrated AI safety screens, pharmacist engagement, and the option for clinician escalation. “We see this as critical to expand access to hundreds of thousands of people in Utah who live in mental health shortage areas, as well as an important proving ground for AI in medicine,” MacWaters stated.

While MacWaters refrained from commenting on potential additional use cases, medications, or expansion into other states, he expressed the firm's enthusiasm for “what the future holds.” He also declined to provide a timeline for Legion's expansion strategies, despite both he and Legion publicly signaling broader aspirations beyond Utah. Legion's refill website states the service will be available “nationwide 2026,” and MacWaters has previously suggested it “will be in every state very very quickly.”

For the psychiatrists interviewed, the initiative fundamentally prompts a basic question: What specific problem is Legion genuinely addressing? Dr. Kious noted that established patients frequently do not require an appointment for a prescription refill, explaining that most psychiatrists are typically “happy to refill prescriptions for free and without an appointment” unless there are concerns about the patient's well-being or the medication poses significant risks. Ironically, these are precisely the complex cases that Legion's AI system is explicitly designed to exclude.

“I would personally avoid it for now,” advised Dr. Torous, suggesting that if an individual has established an effective treatment plan, it is likely best to maintain care with their current clinician.

Editorial Staff, Editor

The Editorial Staff at AIChief is a team of professional content writers with extensive experience in AI and marketing. Founded in 2025, AIChief has quickly grown into the largest free AI resource hub in the industry.
