Active Listening in Healthcare - AI Roleplay
Tutor notes
When we lend a curious, non-judgemental ear to patients and let them air their thoughts freely, we gain a better understanding of their needs and help them to feel truly heard. The result is greater receptiveness to treatment options and a more positive care experience during a stressful time in their lives.
But listening actively and with intent takes practice. Many of us listen with the primary goal of responding, rather than first seeking to understand. To close this gap in our conversations, we must know how to ask good open questions, reflect back what we hear, and validate feelings with empathy.
Our Active Listening in Healthcare roleplay is an AI-powered, free-flowing conversation simulator where learners can practice these key active listening skills with a virtual patient and receive tailored feedback for improvement.
About this resource
Key learner outcome and goals
Learning outcome
Practice your active listening skills with a virtual patient to build confidence and professional competency
Learning goals
- Ask good open questions and seek clarification to build understanding
- Use paraphrasing and summarising to indicate you’re listening
- Demonstrate empathy to validate emotions and show you care
- Communicate with good eye contact and open body language
Disclaimer about AI
This roleplay simulator uses an LLM (large language model) to generate the virtual character’s responses, guide the direction of the conversation, and write personalised feedback for the learner. While we have designed the prompts around the learning outcomes and goals of this simulator, and with strict conversation boundaries designed to prevent unintended or inappropriate use, please be aware of the following limitations:
- Like all large language models, the LLM is prone to ‘hallucinate’, meaning it may generate information that appears factual but is incorrect, misleading, or entirely fabricated.
- It is impossible to anticipate everything that users may enter into the customisation fields or say during the conversation. Unintended or misguided uses of the conversation customisation, deliberate attempts to manipulate the system, and repeatedly unpredictable answers from the learner may therefore cause unanticipated outputs from the LLM that are beyond our control.
In short, please be aware that we cannot guarantee all AI-generated content will be accurate, appropriate, or aligned with educational objectives in all circumstances.
We appreciate any feedback about the performance of our simulators. We work hard to design and improve our LLM-based roleplays to give you a personalised, yet safe and impactful learning experience.
Characters and environment

Florence

Private patient room
Brandon
The scenario and virtual patient in this Active Listening in Healthcare roleplay can be customised in BSGO for a unique and personalised experience:
- Learner role – what healthcare profession will the learner roleplay in this conversation?
- Patient avatar – choose between Florence and Brandon to match your scenario
- Medical history – establish the AI patient’s wider care needs
- Patient’s current concern – to give the conversation a focus and direction
- Patient’s personal background – to bring their personality to life!
- Patient’s current mood – how will the AI patient behave when the learner first talks to them?
- Transcript submission – toggle whether learners must submit their transcript for instructor review
Upon entering the simulator, learners will see the customised instructions in a text pop-up for guidance on what to expect and their goals during the conversation. Then, they’ll meet the virtual patient and practice putting their active listening skills into action.
The conversation flows back and forth with exchanges between the AI-powered virtual patient and the learner. After the AI patient shares their concern, the learner then has an opportunity to respond in their own words, then the patient replies, and so on. The patient’s responses dynamically adapt to progress the conversation based on the learner’s input.
If the learner navigates the conversation with a keen ear and thoughtful answers, they’ll elicit positive responses from the patient and constructively progress the conversation, eventually reaching a stage where the patient feels heard, their worries ease, and they’re open to the learner’s advice.
Conversely, judgemental, dismissive, or self-centred responses may cause the conversation to devolve into hurt feelings and resistance from the patient, until they eventually withdraw from speaking openly to the learner any further.
Once the conversation comes to a natural conclusion, a pop-up panel offers LLM-generated, personalised feedback on the learner’s performance and use of fundamental active listening skills. Short text summaries explain where they excelled and where they could improve in future, using examples from their transcript.
Learner performance is measured based on these key verbal and non-verbal active listening skills:
- Paraphrasing and summarising – reflecting back what the patient shared with them to demonstrate understanding.
- Empathy – using empathetic and validating statements that show caring and recognition of the patient’s emotions.
- Open questions and clarification – seeking to open up the conversation and gain a better grasp of the patient’s wants and needs.
- Eye contact – making frequent, but not excessive, eye contact to convey engagement and empathy.
- Body language – adopting a relaxed, open posture that creates a calm and welcoming aura.
The learner can also view the full conversation transcript in this pop-up panel for further self-reflection on where they excelled and where they fell short.