Thoughts on AI Scribes for Clinical Documentation in NZ Healthcare

Artificial intelligence has been used in healthcare for many years, but new generative AI scribes are being adopted quickly, with significant interest from NZ clinicians in copilots such as Nabla and Heidi (or similar) to reduce the amount of time practitioners spend writing up their clinical documentation.

These AI assistants are designed to ease the workload on medical practitioners by capturing audio recordings of a consultation between a practitioner and a patient, then using a combination of speech-to-text transcription and large language models (LLMs) to produce a structured medical note for the patient's medical records. Some AI assistants now claim to expand on this with single-click, instant second opinions on diagnosis and treatment, drawing on past patient notes or lab reports available during the consult.

These large language models are nothing new, and neither are the speech-to-text engines, which have been around in products like Dragon for over 30 years. What is new is the ease with which start-ups anywhere in the world, with just a few developers, can now create their own LLMs or plug into existing ones and make "AI" scribes to sell, or provide free versions of, to NZ and the world. With such rapid developments in AI technology, the proliferation of providers, and the buzz they have created, some tough questions need to be asked.

Practices and the media need to place greater emphasis on the medico-legal issues, especially with start-ups that lack the wealth of clinical experience that established worldwide vendors have built up over many years. Investigating the potential privacy and misdiagnosis risks before implementing these scribes is essential.
A recent University of Otago survey, conducted from February to March, gathered insights from nearly 200 GPs on their experiences with GenAI scribe tools used for clinical notes. Fewer than 60% of these Kiwi GPs had asked for patient consent before using an AI scribe during consultations. The survey also highlighted patient consent and data privacy as critical issues, with only 65% of the surveyed GPs having thoroughly reviewed the tools' terms and conditions before use.

Obtaining explicit patient consent before using an AI scribe or assistant in a consultation is essential, but with many scribes used in NZ holding their data overseas, questions also arise about data sovereignty and about whether the depth, breadth, and appropriateness of the statistical data used to train these models match our Kiwi market.
Mutaz Shegewi, a senior research director with IDC's Worldwide Healthcare Provider Digital Strategies group, is reported in a recent Computerworld article as saying: "Clinicians may also become de-skilled as over-reliance on the outputs of AI diminishes critical thinking. Large-scale deployments will likely raise issues concerning patient data privacy and regulatory compliance. The risk for bias, inherent in any AI model, is also huge and might harm underrepresented populations."
AI is known to have its own statistical biases, with sex and racial biases well documented within LLMs. GenAI models are inherently prone to bias because they are trained on datasets that may disproportionately represent certain populations or scenarios.
Bioethicist and lawyer Rochelle Style, in a recent eHealthNews article by Rebecca McBeth, states: "New Zealand has very poor health literacy, and probably even poorer AI health literacy. So, there is an issue about being able to first know what you are consenting to and giving patients real choice as once these tools become embedded, what will happen if people do not want their consultations recorded and summarised?"

The sales techniques used to promote these products also need to be questioned. Many clinicians naturally feel alone and a bit uncertain at times; a sales pitch about second opinions being instantly available during your consultation is an incredibly powerful and attractive proposition. But valid second opinions depend very much on the data they are drawn from.

In the same eHealthNews article, Style is reported as saying, "I still think there is a bit of puffery in what some of the platforms are claiming …", adding that there are potential legal risks if clinicians do not carefully review all documentation.

It is undeniable that the wheels of change are in motion; Pandora's box has been opened. And like so many technologies that have come before, it is up to users to determine how AI will be used and whether the results are for the greater good. As with anything new, we must do our research and pick partners carefully, looking at longevity in the industry, investment in R&D, and a history of success. We must also figure out how to use the tools effectively, counsel patients and clinicians on appropriately safe and informed use, and stay educated on developments.

From our perspective at Voice Power NZ Ltd, we believe in the 'fundamental theorem of informatics', which holds that the combination of human and computer together should outperform either on their own. This is true for AI as well; assistants and scribes that listen in the background and produce full clinician notes are not the complete solution, as they miss the vital step of the clinician's own reflective narrative.

For over 20 years, Voice Power staff have been working alongside NZ clinicians with AI-driven Dragon speech recognition software, empowering clinicians to create clear, actionable clinical documentation in their own words and with their own experience: documenting at the speed of sound, with your own knowledge driving the clinical narrative.

Nuance Dragon solutions have millions of users across more than 10,000 healthcare organisations, delivering proven savings and ROI case studies in NZ and throughout the world. Microsoft acquired Nuance in 2021 for a record $20 billion and has now combined it with its large investment in OpenAI to produce the award-winning Dragon® Ambient eXperience™ (DAX).

DAX Copilot combines proven AI and large language models (LLMs) and is backed by Microsoft's scale, strength, and security. It is award-winning AI built on rich clinical data sets, anchored in 1B+ minutes of medical dictation and 15M+ annotated ambient encounter notes annually. It is established on the same platform on which the existing Dragon Medical One (widely used throughout NZ) sits, automating clinical tasks and notes, all from one vendor, on one platform (Microsoft Azure).
DAX Copilot will be coming to APAC and NZ in the future, and we will let customers know as soon as dates are confirmed. Microsoft has already opened its first hyperscale data centre in New Zealand, helping ensure sensitive data is kept on our shores. The extra time for DAX to deploy here helps ensure data sovereignty and that the statistical data used to train the models match our regional and specifically Kiwi needs.

In the meantime, we encourage clinicians to try the available AI scribes with care. But remember: it's your voice and your words that patients and downstream healthcare professionals need to hear.

David Shepherd
Director – Voice Power NZ Ltd

See Nuance – DAX Copilot Demo Video here

Articles that I drew upon that are worth a good read:

22/11/24 – Blog: "Would you like a second opinion with your AI scribe?" by Kate McDonald, Pulse+IT

6/11/24 – "GP survey raises patient consent concerns over AI scribe use" by Rebecca McBeth, eHealthNews.nz

3/12/24 – "Will AI help doctors decide whether you live or die?" by Lucas Mearian, Computerworld

12/12/24 – "Microsoft opens first hyperscale data centre in New Zealand"


Copyright © 2024 VOICE POWER NZ LIMITED. All rights reserved.
