A psychiatrist has weighed in on the debate over integrating artificial intelligence into mental health care, acknowledging clinicians' concerns while emphasizing the systemic failures driving people toward AI-powered support. Brandon Jacobi of Pleasantville, N.Y., made the argument in a letter published April 4, reflecting on the current state of mental health services.

Jacobi acknowledged warnings from clinicians that AI chatbots could reinforce unhealthy patterns, such as excessive reassurance-seeking or even delusional thinking. However, he argued that focusing solely on these risks overlooks why a growing number of people are turning to AI tools for therapeutic purposes. In his own psychiatry practice, he noted, more patients are disclosing that they use such chatbots.

The demand for mental health care far outpaces the available supply, Jacobi observed. He pointed to long wait lists, high costs, and uneven access as barriers that have turned timely psychotherapy into a luxury rather than a basic component of health care. In this void, individuals naturally gravitate toward tools that offer immediate, responsive, and consistently available interaction.

Despite AI's convenience, Jacobi cautioned against equating it with genuine care. Psychotherapy, he argued, is more than comfort or validation; it fundamentally relies on a human relationship. Within that dynamic, moments of misunderstanding, tension, and even frustration can become crucial sources of insight. A human clinician's ability to recognize and reflect on relational patterns, including those that emerge within the therapeutic relationship itself, cannot be replicated by a system designed primarily for agreement and reassurance.

Jacobi suggested that AI could play a valuable supplementary role in mental health care. But if it begins to replace human therapeutic interaction, he warned, it risks not only worsening individual distress but also deepening the very systemic failures in mental health provision that made AI appealing in the first place.