AI prompting mistakes in healthcare can undermine even the most advanced tools – and you may not even realise you’re making them.
AI tools have incredible potential to improve healthcare – from streamlining admin tasks to enhancing patient outcomes. Yet even the best tools rely on one key factor: the quality of your prompts.
When used effectively, AI can support decision-making, education and communication. But when prompts are vague, misdirected or lacking context, the results can be irrelevant or even misleading.
Here are five common AI prompting mistakes in healthcare – and how to avoid them to get more useful, accurate results.
1. Using vague prompts
Vague prompt:
“Help me improve my patient communication skills.”
This prompt is too broad, leaving the AI unsure of your exact needs, which could result in general or irrelevant advice.
Better prompt:
“Suggest three ways I can improve my patient communication with elderly patients who have chronic conditions and hearing loss.”
Why it matters:
Vague prompts lead to vague answers. The more specific your prompt, the more useful and actionable the AI’s response will be.
2. Not providing enough context
Vague prompt:
“Summarise this research article.”
This prompt doesn’t specify the target audience, the key takeaways or the intended use of the summary, so the output is likely to be broad or irrelevant.
Better prompt:
“Summarise this research article for time-poor GPs in under 150 words. Focus on the three most important clinical takeaways, and highlight how the findings could influence everyday practice.”
Why it matters:
Context and constraints help AI focus. The more guidance you give, the more usable the output becomes.
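The elements above – audience, length, focus – can be thought of as building blocks you bolt onto a bare task. As a rough illustration, here is a small Python sketch of that idea; the function and its parameters are hypothetical, not part of any AI vendor’s API.

```python
# Illustrative sketch only: assemble a structured prompt from the elements
# this article recommends (audience, word limit, focus, tone). All names
# here are hypothetical helpers, not a real AI tool's API.

def build_prompt(task, audience=None, word_limit=None, focus=None, tone=None):
    """Combine a bare task with optional context and constraints."""
    parts = [task]
    if audience:
        parts.append(f"Audience: {audience}.")
    if word_limit:
        parts.append(f"Keep it under {word_limit} words.")
    if focus:
        parts.append(f"Focus on: {focus}.")
    if tone:
        parts.append(f"Tone: {tone}.")
    return " ".join(parts)

prompt = build_prompt(
    "Summarise this research article.",
    audience="time-poor GPs",
    word_limit=150,
    focus="the three most important clinical takeaways",
)
print(prompt)
```

The point is not the code itself but the habit it encodes: every constraint you add is one less decision the AI has to guess at.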
3. Taking AI output at face value
Vague prompt:
“Write an FAQ for patients about our new telehealth service [insert background information].”
This prompt doesn’t specify the tone, structure or content needed, which can lead to an incomplete or unsuitable response.
Better prompt:
“Write a first draft of an FAQ for patients about using our new telehealth service. Limit to 5 questions and answers, keep each response under 60 words, use a reassuring tone and plain language (year 7 reading level) [insert background information].”
Why it matters:
AI is a first-draft partner – not the final word. Always revise and tailor to your audience and clinical standards.
4. Using jargon or overly complex language
Vague prompt:
“Create an onboarding document for multidisciplinary staff on EHR interoperability frameworks.”
This prompt uses technical language without considering the diverse backgrounds of the audience, which could result in overly complex or confusing content.
Better prompt:
“Write a 300-word plain English guide for new staff about how our electronic health records integrate with other systems. Keep the tone supportive and friendly. Use short sentences, no acronyms and include a 3-point summary at the end.”
Why it matters:
Simple, clear language helps ensure the AI produces content that’s accessible and understandable for everyone, including staff with varying levels of technical expertise.
Plain language isn’t just better for patients – it’s better for AI prompting, because it forces you to clarify your own objectives. Clarity leads to better outcomes.
5. Not iterating or refining prompts
Vague prompt:
“Write a staff newsletter article about workplace wellbeing.”
This prompt is vague and open-ended. The AI may produce something generic that doesn’t fit your organisation’s tone, format or purpose.
Better approach:
“Write a 250-word draft for a staff newsletter article about our new workplace wellbeing program. Focus on mental health resources and flexible work options. Use a friendly, encouraging tone, include a short intro and three subheadings, and end with a call to action to visit our wellbeing portal.”
Why it matters:
The first AI output is rarely the best. Once you review the draft, refine your prompt to clarify structure, tone or content. This iterative approach helps you shape content that’s relevant, accurate and engaging.
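The review-then-refine loop above can be sketched in a few lines of Python. This is a minimal illustration only: `generate()` stands in for whichever AI tool you use (here it simply echoes the prompt so the example runs), and `refine()` shows how review feedback folds back into the next prompt.

```python
# Illustrative sketch of iterative prompt refinement.
# generate() is a hypothetical stand-in for a real AI call.

def generate(prompt):
    # Placeholder: a real tool would return a drafted article here.
    return f"[draft based on: {prompt}]"

def refine(prompt, feedback):
    # Fold what you noticed in the review back into the prompt.
    return f"{prompt} Revision note: {feedback}"

prompt = "Write a staff newsletter article about workplace wellbeing."
draft = generate(prompt)

# After reviewing the first draft, tighten structure and length, then retry.
prompt = refine(prompt, "Limit to 250 words and use three subheadings.")
draft = generate(prompt)
print(draft)
```

Each pass through the loop narrows the gap between what you asked for and what you actually needed.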
Final thoughts: Avoiding AI prompting mistakes in healthcare
AI won’t replace health professionals – but it can support your work when used effectively. Avoiding these common AI prompting mistakes in healthcare will help you get more relevant, accurate and practical results.
Want to go deeper? Explore my AI Prompt Writing for Healthcare CPD course – designed to help health professionals use AI tools responsibly and confidently.