In 2017, Dr Ingrams (LMC Chief Executive) wrote an article about whether AI would replace GPs, which is available to read here. His conclusion was that AI would enhance our work but could never replace GPs: “I look forward to the improved efficiency and safety that continuing development of computer systems will bring. But these will not save clinical time and the replacement of doctors by the likes of a Star Trek emergency medical hologram will remain science fiction.”
AI scribes are now being used increasingly frequently in general practice. This has happened in a vacuum of central advice or local support, and was described to me by a fellow informatician as a ‘Wild West’. Against this background, it is important that practices, as data controllers, do not allow AI scribes to be used without first considering the potential risks, having a plan to mitigate them, and formalising their approach.
There is a clear disconnect between this lack of guidance and support from DHSC and the stated position of Health and Social Care Secretary Wes Streeting, who says that AI is an important part of the future of the NHS and that “We are bringing our analogue NHS into the digital age.”
AI scribes are software solutions that use artificial intelligence to transcribe and document consultations and/or write letters and referrals. They aim to lighten the administrative burden on GPs and allow them to focus more on patients. They utilise speech recognition and natural language processing to create detailed clinical notes.
This guide provides very high-level guidance about what practices need to consider.
NHS England has recently published guidance regarding the use of AI, which although helpful, is aimed more at Trusts/larger organisations.
GPC has issued some high-level guidance for LMCs which looks at the possible risks, benefits and what to consider (the main contents are attached to the bottom of this section). They propose to produce a more detailed practical guide for practices soon.
LMC Checklist
Before your practice starts to use an AI scribe, we advise that you consider the following:
- Decide whether you wish to (read the GPC guide below first):
o Allow individual clinicians to use an AI scribe, or
o Roll it out to all (or the majority of) clinicians, or
o Not allow anyone in your surgery to use one.
- Decide which of the many AI scribe products you wish to use.
- Complete a DCB0160 assessment and a DPIA (your DPO should advise you on these; if you use the MLCSU, pre-made DPIAs are available on their portal https://primarypoint.co.uk).
- Decide on your consent model:
o Individual consent (most relevant if only one or two clinicians are using the scribe), and whether this is in writing or oral only.
o Opt-out model. Advise patients by:
▪ A notice on your website, and
▪ Personalised messages (via AccuRx) to all patients in advance and on any message confirming appointments, and
▪ Notices in the waiting room and consulting/clinical rooms.
- Decide whether you will allow/encourage locums to use an AI scribe and, if so, advise them of your protocol, including which AI scribe your practice supports (if you decide not to use AI scribes at all, or not to allow locums to use them, make this clear in your locum handbook).
- Provide training for clinicians who will be using it.
- Inform the ICB if you plan to use or are already using an AI Scribe.
- Develop a protocol covering all of the above and consider adding AI scribes to your practice risk register.
GPC Guidance:
Overview
This document, published by NHS England on 27th April, provides guidance on the deployment of AI-enabled ambient scribing products including advanced ambient voice technologies (AVTs) used for clinical or patient documentation and workflow support in health and care settings. It is of relevance to GPs aiming to implement a specific product. This is the first in a series of documents to be published over the next six months. An AI Ambassadors network will be established to support best practice and sharing of insights.
What are AI-enabled Ambient Scribing Products?
These are speech recognition and natural language processing (NLP) systems that:
- Record and transcribe conversations between clinicians and patients during consultations.
- Use AI algorithms to generate structured clinical notes, such as SNOMED-coded entries.
- Can auto-populate sections of the patient record, with clinician approval.
- Can generate outputs in the form of medical letters or other documentation.
- Can recommend actions such as onward referral.
Some tools include features such as:
- Automatic summarisation of discussions based on text transcripts.
- Intelligent prompts for missing clinical information.
- Real-time transcription during the consultation.
Potential Benefits for GPs
Reduced Clinician Workload
- Save time otherwise spent on typing or dictation.
- Potential to reduce burnout related to administrative overload.
Improved Patient Engagement
- Enable GPs to dedicate more time to providing care rather than documenting it.
- May enhance rapport and patient satisfaction, with the GP solely focused on the patient rather than the computer screen during consultations.
Improved Consistency and Quality
- Potentially more comprehensive and standardised notes, improving the quality of the patient record and supporting clinical decision making through up-to-date, detailed, real-time accurate records.
- Assists with clinical coding and claiming for work carried out.
- Improvements in operational efficiency and potential cost savings by reducing administrative workload and improving data quality.
- Intelligently automating workflow, promoting scalability and interoperability across health care settings.
Key Governance and Safety Considerations
Clinical Responsibility and Legal Liability
- NHS organisations (including GP practices) may still be liable for any claims arising out of the use of AI products particularly if it concerns a non-delegable duty of care between the practitioner and the patient. This is a complex and largely uncharted area, with limited case law to provide clarity. Clear and comprehensive contracting arrangements with suppliers setting out their roles, responsibilities and liability can mitigate this risk.
- Even though AI creates the draft, GPs must validate, correct, and sign off on all content as the final output is legally and clinically attributable to the practitioner.
- Practices must complete the DCB0160 documentation (including the related safety case, hazard log and monitoring framework) and a Data Protection Impact Assessment (DPIA), and ensure that the supplier has completed the DCB0129.
- If a product is registered with the MHRA, training for intended users may be necessary to meet safety requirements.
- The Yellow Card reporting mechanism must be used whenever a medical device does something unexpected or gets something wrong. Most AI scribe products will be classed as medical devices.
- CQC inspection teams will consider whether:
– Technologies are safely deployed (practices need a DPIA).
– Staff have been trained.
– There are reporting processes for incidents.
– There is innovative practice which improves patient care.
Patient Consent and Information
- Consent must be explicit and informed. Key points to explain to patients:
o What the AI is recording and why.
o How their data is stored and for how long.
o Their right to refuse recording or withdraw consent.
- Consider using visual signage or digital consent forms. Make information available to patients in public areas, on practice websites and on social media channels.
- Notes and recordings may need to be disclosed to the patient in the event of a subject access request (SAR). What the AI scribe records may diverge from what was said by the GP/healthcare professional, and at present it is unclear who will be responsible for what if full records are not maintained. The practice should familiarise itself with the document retention policy, privacy notices and DPNs of the potential providers.
Data Protection (UK GDPR) Compliance
- Engage with information governance and cybersecurity support early to ensure legal and regulatory requirements are met. Seek help from your local ICB to ensure tools are compliant with:
o UK GDPR and the Data Protection Act 2018.
o The NHS Data Security and Protection Toolkit, CREST and Cyber Essentials Plus certification.
- Use the least amount of data necessary to deliver the function.
- Be transparent around how information is used and shared in relation to ambient scribing products including data storage, encryption, and retention.
- NHS England has developed guidance for IG professionals to assist with meeting the requirements of data protection legislation when implementing AI-enabled technologies. Further guidance will be released in due course.
- The ICO provides guidance on data protection and UK GDPR compliance for those adopting AI technologies, and specific guidance on using generative AI that processes personal data.
Digital Technology Assessment Criteria (DTAC)
The use of generative AI for further processing, such as summarisation, would likely qualify a product as a medical device (requiring it to be registered with the MHRA). A product is a medical device if its intended purpose falls under the definition of a medical device, for example if it informs or drives medical decisions and care. This includes incorporating a diagnosis or prognosis within its outputs, triaging and stratifying, or carrying a prescriptive function such as managing or recommending treatments. Products that solely generate text transcriptions are not likely to be classed as medical devices.
Ambient scribing tools must meet DTAC standards for medical device regulation, including:
- Clinical safety (compliant with DCB0129/0160).
- Technical security.
- Interoperability with existing systems (e.g., EMIS, SystmOne).
- Accessibility and usability.
Bias and Inaccuracy Risks
- AI tools may generate inaccurate or fabricated content.
- Risk of embedded algorithmic bias (e.g., misinterpretation of accents or clinical context such as mistaking a patient’s hypothetical statement for a confirmed diagnosis, misgendering, gaps in documentation leading to compromised care).
- Products that use Generative AI can introduce unique cybersecurity challenges.
- Clinical staff must identify and correct errors before notes are committed to record.
- Ensure ongoing quality assurance and monitoring, such as error reporting processes, service usage monitoring, and comparison between reported and observed metrics.
Quick Implementation Guide
- Assign a Clinical Safety Officer and identify key risks
– Consider technical risks (e.g. output errors, system unavailability) and clinical hazards (e.g. incorrect context or information).
– Be aware that new functions may be introduced unintentionally or through user-provided instructions.
– An appendix to the guidance provides actions for technical and product teams leading AI adoption, including a series of detailed questions to help clarify the specific features and functionalities of ambient scribing products. The answers should be sought from the supplier, technical experts, and clinical safety, IG and IT teams before implementation.
- Complete the DCB0160 documentation and a Data Protection Impact Assessment (DPIA)
– Develop a safety case, hazard log, and monitoring framework. If you do not have the right tools and capabilities to comply with these standards, seek help from your local ICB.
- Plan for appropriate integration
– Ensure integration with your IT infrastructure, systems and workflows.
- Ensure appropriate controls
– Consider legal and regulatory requirements. Ensure compliance with all applicable information law, the Data Security and Protection Toolkit (DSPT), and Medical Device regulations where applicable.
– Ensure users review any product outputs before taking further action.
- Implement your monitoring framework
– Ensure ongoing audits of clinical documentation, and reviews of incident reports and system performance.
If you would like any more help or information, please contact the LMC.