A virtual hospital without people? Research that may interest therapists.
A thought experiment that actually happened
Imagine a hospital with no patients or staff in the traditional sense. Instead, clinical roles — doctors, nurses, patients — are played by autonomous AI agents. Each one makes decisions, responds to feedback, observes the consequences of its actions, and learns over time.
It sounds futuristic, but this is no longer just a concept. Li et al. (2024) described Agent Hospital — a virtual environment that simulates hospital operations and in which medical agents train on the treatment process. The system achieved high accuracy on test data and — particularly interesting — was able to independently correct its decisions based on previous outcomes.
AI as a training environment, not a “replacement” for specialists
A year later, the Tsinghua AI Agent Hospital (2025) project moved from a purely research phase to educational applications. Not to replace doctors, but to support their learning. The virtual hospital became a space for testing clinical decisions, analyzing scenarios, and learning without risk to real patients.
This distinction seems crucial. We’re not talking about algorithms that “treat.” We’re talking about environments that enable competency development. About systems that allow you to practice clinical thinking — faster, more frequently, and in a more structured way.
And this is precisely where a question arises that may be of particular interest to psychotherapists.
What if AI in therapy is more than just an organizational tool?
Today, we most often talk about AI in therapy as technical support: help with summaries, analysis of session material, maintaining process structure, or organizing data. That's tremendous value, but it may not exhaust the potential of these tools. What does a system become when it doesn't just "help," but immerses the therapist in a particular way of thinking every day?
The point is not for an algorithm to conduct therapy. The genuine relationship, connection, and presence of another human being cannot be replaced — and that’s not the goal. Rather, it’s about the backdrop against which that relationship takes place. A tool that gently reminds you about case conceptualization, hypotheses, goals, and mechanisms maintaining the problem. One that doesn’t decide, but structures the way you think.
A learning environment, not an “artificial therapist”
Research on the virtual hospital reveals something more: that learning can happen not through instructions, but through being in a coherent decision-making environment. AI agents aren’t “taught” in the classical sense — they learn because the system consistently reflects a specific logic of action and responds to their choices.
Could something similar happen in psychotherapy?
Could a therapist who works within an integrated system based on cognitive-behavioral therapy principles not only use it with patients, but — perhaps — gradually develop their own competencies? Could daily engagement with a structured CBT model influence how they formulate questions, recognize patterns, and build case conceptualizations?
We don’t know. And for now, we don’t have research that would allow us to state this definitively. But that’s exactly why this question is interesting.
Instead of answers — good questions
Perhaps the future of AI in psychotherapy isn’t about creating algorithms that “treat.” Perhaps it’s more about designing tools that support therapists while simultaneously creating a continuous learning environment for them.
We don’t need to know the answers today. It’s enough that we start asking ourselves these questions carefully and without rushing. Because perhaps one of the most interesting changes AI brings to psychotherapy doesn’t concern the essence of the therapeutic relationship — which remains deeply human — but rather the context in which the therapist thinks, decides, and develops their craft.
If so, then the future isn’t about replacing anyone, but about designing environments that foster reflection, teach through daily practice, and enable therapists to do what they already do — only more consciously.
Sources and inspiration
Li et al. (2024) – Agent Hospital: A Simulacrum of Hospital with Evolvable Medical Agents. arXiv. https://arxiv.org/abs/2405.02957
Tsinghua AI Agent Hospital Project – AIR Research: Virtual Hospital and MedAgent-Zero. air.tsinghua.edu.cn
Topol, E. – Deep Medicine. Hachette Book Group.
AI education for clinicians – The Lancet EClinicalMedicine.
