Events Calendar

NextGen UGM 2025
2025-11-02 - 2025-11-05    
12:00 am
NextGen UGM 2025 is set to take place in Nashville, TN, from November 2 to 5 at the Gaylord Opryland Resort & Convention Center. This [...]
Preparing Healthcare Systems for Cyber Threats
2025-11-05    
2:00 pm
Healthcare is facing an unprecedented level of cyber risk. With cyberattacks on the rise, health systems must prepare for the reality of potential breaches. In [...]
MEDICA 2025
2025-11-17 - 2025-11-20
10:00 am - 5:00 pm
40474 Düsseldorf
Expert Exchange in Medicine at MEDICA – Shaping the Future of Healthcare. MEDICA unites the key players driving innovation in medicine. Whether you're involved in [...]

Some physicians are using AI to create medical records. What do you need to know?

EMR Industry

Consider this. You’ve worked up the nerve to approach a general practitioner about an uncomfortable issue. You take a seat. The doctor says, “I’m using my computer to record our appointment before we start. Because it’s AI, it will compose a letter to the specialist and a summary for the notes. Is that acceptable?”
What? Our medical records are written by AI? Why would we want that?

Records are crucial for safe and effective medical treatment. Clinicians must keep accurate records to maintain their registration, and health services must have reliable record-keeping systems to be accredited. Records are also legal documents, and they may be crucial in court cases or insurance claims.

However, writing things down or dictating letters and notes takes time. During patient visits, clinicians may split their focus between keeping accurate records and communicating with the patient. They sometimes have to finish records after hours, when their days are already long.

Health care practitioners of all stripes are understandably excited about “ambient AI” and “digital scribes.”

How do digital scribes work?
This isn’t your typical transcription program, where software simply records a dictated letter word for word.

Not so with digital scribes. They use generative AI similar to ChatGPT (or occasionally GPT-4 itself): large language models with generative capabilities.

Using a specialized sensitive microphone, or the microphone on a phone, tablet, or computer, the application silently captures the conversation between patient and clinician. The AI then converts the recording into a verbatim transcript.

After receiving instructions, the AI system uses the transcript to generate a clinical note and/or letters for other physicians, which are then ready for the clinician to review.
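The two steps above, verbatim transcription followed by an instructed generative rewrite, can be sketched in a few lines of Python. Everything here is a stand-in for illustration: the function names, the prompt, and the output format are assumptions, not any real scribe product's API.

```python
# A minimal sketch of the digital-scribe pipeline described above.
# All names and formats are illustrative assumptions.

def transcribe(utterances):
    """Stand-in for speech-to-text: joins recognised, speaker-labelled
    utterances into a verbatim transcript."""
    return "\n".join(f"{speaker}: {text}" for speaker, text in utterances)

def draft_clinical_note(transcript, instructions):
    """Stand-in for the generative step. A real scribe would send the
    transcript plus instructions to a large language model; here we just
    mark the output as a draft awaiting clinician review."""
    header = "DRAFT NOTE (pending clinician review)"
    return f"{header}\nInstructions: {instructions}\n{transcript}"

# Simulated consultation audio, already split into utterances.
utterances = [
    ("Doctor", "What brings you in today?"),
    ("Patient", "A persistent cough for the past two weeks."),
]

transcript = transcribe(utterances)
note = draft_clinical_note(transcript, "Summarise as a brief clinical note.")
print(note)
```

The key design point the article raises is visible even in this toy version: the generated note is only a draft, and a clinician must review it before it enters the record.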

Most doctors are not well-versed in these technologies. They are experts in their field, not AI specialists. “Let AI take care of your clinical notes so you can spend more time with your patients,” the promotional materials claim.

Imagine yourself as the clinician. You might well respond, “Yes, please.”

How are they controlled?
The Australian Health Practitioner Regulation Agency has published a code of practice covering digital scribes, and the Royal Australian College of General Practitioners has released a fact sheet. Both caution doctors that they remain accountable for the content of their medical records.

While some AI applications are regulated, many digital scribes are exempt. It therefore often falls to health services or individual physicians to assess whether a scribe is safe and effective.

What is the current state of the research?
There is little real-world data or evidence on how effective digital scribes are.

In a ten-week pilot study, researchers tracked 9,000 physicians in a large Californian healthcare system using a digital scribe.

Some doctors liked the scribe: they worked fewer hours and communicated better with patients. Others never started using it at all.

The scribe also made errors, such as recording the wrong diagnosis, or noting that a test had been performed when it still needed to be done.

So how should we handle digital scribes?
The recommendations of the first Australian National Citizens’ Jury on AI in Health Care are a good place to start, because they outline what Australians want from AI in healthcare.

Building on those recommendations, here is what to consider about digital scribes before your next visit to a clinic or emergency department:

1. You should know whether a digital scribe is in use.

2. Only scribes designed specifically for medical use should be employed in clinical settings. General-purpose generative AI tools (such as ChatGPT or Google Gemini) are not appropriate for healthcare settings.

3. You should be able to consent to, or decline, the use of a digital scribe. Any relevant risks should be disclosed to you, and you should be free to accept or refuse.

4. Clinical digital scribes must meet strict privacy requirements. You have a right to privacy and confidentiality in your medical care, and a clinical note typically contains far less material than the full transcript of a consultation. So ask:

    • Are your appointments’ transcripts and summaries processed in Australia or another nation?
    • How are they protected from prying eyes (are they encrypted, for instance)?
    • To whom are they accessible?
    • What is their purpose (do they train AI systems, for instance)?

Is human oversight sufficient?
Generative AI systems are prone to error, fabrication, and misinterpretation of patient accents, yet they frequently present these mistakes in a highly convincing manner. Meticulous human verification is therefore essential.

Insurance companies and IT vendors tell doctors that they must (and should) review every summary or letter. But it’s not that simple. Busy practitioners can become over-reliant on the scribe and simply accept its summaries. Clinicians who are tired or inexperienced may assume the AI is right and their own memory is wrong, a phenomenon known as automation bias.

Some have proposed that these scribes should also produce summaries for patients. Health records are not our property, but we usually have a right to access them. Patients may be more inclined to look at their health record if they know a digital scribe is being used.

Clinicians have always been responsible for taking notes about our embarrassing problems, and the confidentiality, security, privacy, and quality of those records have always been crucial.

Perhaps in the future, digital scribes will lead to better records and better interactions with our providers. For now, though, we need solid evidence that these tools can work in real clinics without compromising quality, safety, or ethics.