[Image: An over-the-shoulder view of a person wearing a maroon blazer and a headset, sitting at a desk and looking at a laptop. On the laptop screen is a video call showing a cartoon woman in a yellow blazer, looking stressed with her head in her hands. Adobe Stock image edited with Generative AI.]

Using AI to train future mental health clinicians

The CISA AI Task Force sat down with Laura Jimenez Arista, clinical associate professor and director of undergraduate training in the School of Counseling and Counseling Psychology. Jimenez is a licensed psychologist who teaches graduate and undergraduate courses at ASU’s Polytechnic and Tempe campuses, supervises the graduate Spanish practicum, and oversees the undergraduate clinical internship. In this post, Jimenez discusses how AI is being integrated into the training of future clinicians in the School of Counseling and Counseling Psychology. 

Question: What inspired you to integrate artificial intelligence into counselor training, and what gaps in traditional training methods are you hoping to address with this tool?
Answer: We were inspired to integrate AI because we recognize the need for scalability in counselor training, especially with our brand-new Master of Counseling online program. Traditional training often relies on scheduled in-person practice or peer simulations, which limit opportunities for students to practice consistently. With AI, students — whether online or in-person — can practice basic counseling skills anytime, not just during class or scheduled simulations. This provides more flexibility and ensures that all students have ample opportunities to develop foundational skills.

Q: Can you tell us more about how your AI simulation works — for example, how it mimics a client’s responses or helps trainees practice specific counseling skills?
A: Our AI uses a publicly available client script that’s widely recognized in counselor training programs. The trainee takes on the role of the counselor, while the AI plays the client, following the script. This setup allows students to practice fundamental counseling skills, such as active listening, empathy and appropriate questioning, in a controlled and repeatable way.

Q: What do you see as the biggest benefits and challenges of using AI in developing empathy, active listening and other core counseling skills?
A: One of the greatest benefits is that AI provides an accessible, low-pressure environment where students can practice core skills repeatedly and receive feedback. They can also encounter a variety of client scenarios that might not appear in traditional training. A challenge, however, is that AI can’t fully capture the complexity and unpredictability of human emotion. While it’s excellent for building foundational skills, real-world clinical experience remains essential. The key is using AI as a complement to, not a replacement for, traditional methods.

Q: How do you envision this kind of AI tool evolving in the future? Could it eventually complement or even transform how counseling students gain clinical experience?
A: I see AI becoming increasingly sophisticated, offering dynamic simulations that adapt to student responses and provide personalized feedback. In the future, it could expose students to rare or complex cases that are hard to replicate in traditional training. While AI will never replace real human interaction, it can significantly enhance skill development, confidence and readiness before students enter actual clinical settings, ultimately complementing and transforming the way students gain clinical experience.

What we're reading right now

Integrating AI and LLMs into Counseling Education: Ethical and Inclusive Practices
American Counseling Association (2025)
This new ACA resource outlines best practices for counselor educators integrating AI and large language models into their teaching. It highlights ethical considerations, inclusive pedagogy and practical guidance for ensuring AI complements human supervision in counselor training.

Artificial intelligence in mental health care
American Psychological Association (2024)
The APA explores how AI is reshaping mental health training and practice by enabling scalable simulations, adaptive learning tools and client-response modeling. The article stresses that AI can enhance access and consistency in training, but human judgment, empathy and ethics remain central to clinical care.

Mental health practitioners’ perceptions and adoption intentions of AI-enabled technologies: an international mixed-methods study
Julia Cecil et al., BMC Health Services Research (2025)
This international mixed-methods study surveyed mental health professionals on their readiness to adopt AI tools. Respondents expressed both optimism about improved efficiency and concern about ethical implications, reliability and the loss of human connection — mirroring many of Jimenez’s reflections.

New AI tools for mental health research and treatment
Google Health Blog (2025)
Google’s research team outlines new applications of AI in clinical training, assessment and treatment support. The post emphasizes responsible design, privacy and the continued importance of human oversight — aligning with Jimenez’s emphasis on using AI as a training supplement, not a replacement.

New study warns of risks in AI mental health tools
Stanford News (2025)
Stanford researchers warn that AI mental-health chatbots can reinforce stigma and give unsafe advice in crises, such as offering harmful suggestions to users expressing suicidal thoughts. Researchers conclude these tools aren’t ready to replace therapists but could assist with low-risk tasks like journaling or scheduling, if used carefully.