A Huge Number of Doctors Are Already Using AI in Medical Care
One in five UK doctors use a generative artificial intelligence (GenAI) tool – such as OpenAI's ChatGPT or Google's Gemini – to assist with clinical practice. This is according to a recent survey of around 1,000 GPs.
The Impact of AI in Medical Care
Doctors reported using GenAI to generate documentation after appointments, help make clinical decisions and provide information to patients – such as comprehensible discharge summaries and treatment plans. Considering the hype around artificial intelligence coupled with the challenges health systems are facing, it's no surprise that doctors and policymakers alike see AI as key to modernising and transforming our health services.
The Challenges of Implementing AI in Healthcare
But GenAI is a recent innovation that fundamentally challenges how we think about patient safety. Traditionally, AI applications have been developed to perform a very specific task. For example, deep learning neural networks have been used for classification in imaging and diagnostics, and such systems have proven effective in analysing mammograms to support breast cancer screening.
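As a rough illustration of what such a narrow, task-specific system looks like, here is a minimal sketch of a small convolutional classifier that maps a greyscale scan to a single "suspicious or not" score. It assumes PyTorch, and the architecture, input size and names are illustrative only, not any real clinical system:

```python
# Illustrative sketch of a task-specific image classifier: a small
# convolutional network producing a single suspicion score for a scan.
# Architecture and input size are assumptions for demonstration purposes.
import torch
import torch.nn as nn

class TinyScanClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel greyscale input
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, 1)     # assumes 64x64 input images

    def forward(self, x):
        x = self.features(x)
        x = x.flatten(1)
        return torch.sigmoid(self.classifier(x))         # score in [0, 1]

# One hypothetical 64x64 scan, batch size 1
scan = torch.randn(1, 1, 64, 64)
score = TinyScanClassifier()(scan)
print(f"suspicion score: {score.item():.2f}")
```

The point of the sketch is the narrowness: the model answers exactly one question about one kind of input, which is what makes such systems comparatively straightforward to validate.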
However, GenAI is not trained to perform a narrowly defined task. These technologies are based on so-called foundation models, which have generic capabilities. This means they can generate text, pixels, audio, or a combination of these.
Challenges with GenAI Implementation
One major hurdle to using GenAI in healthcare is the phenomenon of "hallucinations" – nonsensical or untruthful outputs based on the input provided. Hallucinations occur because GenAI works on the principle of likelihood – predicting which word is most likely to follow in a given context – rather than on "understanding" in a human sense. The result is output that can sound entirely plausible while being wrong, and this plausibility is another reason it's too soon to safely use GenAI in routine medical practice.
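To see why likelihood-based prediction can produce confident but unverified answers, here is a toy sketch in Python; the probability table and the medical prompt are entirely invented for illustration:

```python
# Toy illustration of "prediction by likelihood": the model picks the most
# probable continuation for a prompt, with no notion of whether it is true.
# The probabilities below are invented purely for demonstration.
next_word_probs = {
    "the recommended dose is": {
        "500mg": 0.46,    # plausible-sounding, but not checked against any source
        "250mg": 0.31,
        "unknown": 0.23,
    }
}

def continue_text(prompt: str) -> str:
    candidates = next_word_probs[prompt]
    # Choose whichever continuation is statistically most likely in this context.
    best = max(candidates, key=candidates.get)
    return f"{prompt} {best}"

print(continue_text("the recommended dose is"))
# -> "the recommended dose is 500mg" -- fluent and plausible, but nothing in
#    the procedure verifies that the answer is correct.
```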
Ensuring Patient Safety with GenAI
Another reason it's too soon to use GenAI in healthcare is that patient safety depends on interactions with the AI: how well it works can only be judged in a particular context and setting. Before these technologies can be used more broadly in healthcare, safety assurance and regulation will need to become more responsive to developments in where and how they are used.
Developers of GenAI tools and regulators need to work with the communities using these technologies to develop tools that can be used regularly and safely in clinical practice.