Dr. Matthew Hitchcock, a family physician in Chattanooga, Tennessee, has an AI assistant.
It records patient visits on his smartphone and summarizes them for treatment plans and billing. He does some editing of what the AI produces and finishes his daily documentation of patient visits in about 20 minutes.
Dr. Hitchcock would spend up to two hours writing these medical notes after his four children had gone to bed. “It’s a thing of the past,” he said. “It’s great.”
ChatGPT-style AI is coming to healthcare, and the big vision of what it can bring is inspiring. Every doctor, enthusiasts predict, will have a super-intelligent assistant that will make suggestions to improve care.
But more mundane applications of artificial intelligence will come first. A major goal will be to ease the crushing burden of digital documentation that physicians must produce, entering lengthy notes into electronic medical records necessary for treatment, billing, and administrative purposes.
For now, the new AI in healthcare will be less of a genial partner than a tireless scribe.
From leaders at major medical centers to family doctors, there is optimism that health care will benefit from the latest advances in generative AI, a technology that can produce everything from poetry to computer programs, often with human-level fluency.
But medicine, doctors emphasize, is not a wide-open field for experimentation. AI’s tendency to occasionally produce fabrications, so-called hallucinations, might be amusing elsewhere, but not in the high-stakes realm of healthcare.
This makes generative AI, they say, very different from AI algorithms already approved by the Food and Drug Administration for specific applications, such as scanning medical images for cell clusters or subtle patterns that suggest the presence of lung or breast cancer. Doctors are also using chatbots to communicate more effectively with some patients.
Doctors and medical researchers say regulatory uncertainty and concerns about patient safety and litigation will slow the adoption of generative AI in healthcare, especially its use in diagnosis and treatment plans.
Doctors who have tried the new technology say its performance has improved significantly over the past year. And the medical note software is designed so that doctors can check the AI-generated summaries against the words actually spoken during the patient visit, making the output verifiable and fostering trust.
“At this point, we have to choose our use cases carefully,” said Dr. John Halamka, president of the Mayo Clinic Platform, which oversees the healthcare system’s adoption of artificial intelligence. “Reducing the paperwork burden would be a huge win in itself.”
Recent studies show that doctors and nurses report high levels of burnout, causing many to leave the profession. High on the list of complaints, especially for primary care physicians, is the time spent on electronic health record documentation. This work is often carried over into the evenings, the off-hours doctors call “pajama time.”
Generative AI looks like a promising weapon to combat the physician workload crisis, experts say.
“This technology is rapidly improving at a time when health care needs help,” said Dr. Adam Landman, chief information officer of Mass General Brigham, which includes Massachusetts General Hospital and Brigham and Women’s Hospital in Boston.
For years, doctors have used various types of documentation aids, including speech recognition software and human transcribers. But the latest AI does much more: it summarizes, organizes and tags the conversation between doctor and patient.
Companies developing this kind of technology include Abridge, Ambience Healthcare, Augmedix, Nuance, which is part of Microsoft, and Suki.
Ten doctors at the University of Kansas Medical Center have been using generative AI software for the past two months, said Dr. Gregory Ator, an ear, nose and throat specialist and the center’s chief medical informaticist. The medical center plans to eventually make the software available to its 2,200 physicians.
But the Kansas health system has shied away from using generative AI in diagnosis, concerned that its recommendations may be unreliable and that its reasoning is not transparent. “In medicine, we cannot tolerate hallucinations,” said Dr. Ator. “And we don’t like black boxes.”
The University of Pittsburgh Medical Center is a test site for Abridge, a startup co-founded and led by Dr. Shivdev Rao, a practicing cardiologist who also served as an executive in the medical center’s venture arm.
Abridge was founded in 2018, when large language models, the technology engine of generative AI, were emerging. That technology, Dr. Rao said, opened the door to an automated solution to the paperwork burden in health care that he saw all around him, even for his own father.
“My father retired early,” Dr. Rao said. “He just couldn’t type fast enough.”
Today, Abridge software is used by more than 1,000 physicians across the University of Pittsburgh Medical Center system.
Dr. Michelle Thompson, a family physician in Hermitage, Pa., who specializes in lifestyle and integrative care, said the software has freed up nearly two hours in her day. Now she has time to take a yoga class or sit down for a family dinner.
Another benefit is a better visit experience for the patient, Dr. Thompson said. There is no typing, note-taking or other distraction during the appointment. She simply asks patients for permission to record their conversation on her phone.
“AI has allowed me, as a physician, to be 100 percent present for my patients,” she said.
The AI tool, Dr. Thompson added, has also helped patients become more engaged in their own care. Immediately after the visit, the patient receives a summary, available through the University of Pittsburgh Medical Center’s online portal.
The software translates medical terminology into plain English at about a fourth-grade reading level. It also provides a record of the visit with “medical moments” color-coded for medications, procedures and diagnoses. The patient can click on a colored tag and hear that portion of the conversation.
Studies show that patients forget up to 80 percent of what doctors and nurses say during visits. The recorded and AI-generated visit summary, Dr. Thompson said, is a resource her patients can return to for reminders to take medication, exercise or schedule follow-up visits.
After the appointment, doctors receive a summarized clinical note for review. It includes links to the transcript of the doctor-patient conversation so the AI’s work can be verified. “It really helped me build confidence in AI,” Dr. Thompson said.
In Tennessee, Dr. Hitchcock, who also uses Abridge software, has read reports of ChatGPT scoring high on standardized medical tests and heard predictions that digital doctors will improve care and solve staffing shortages.
Dr. Hitchcock has tried ChatGPT and is impressed. But he would never consider loading a patient record into the chatbot and asking for a diagnosis for legal, regulatory and practical reasons. For now, he’s grateful that his evenings are free, no longer mired in the tedious digital paperwork required by the American healthcare industry.
And he sees no technological cure for the health care workforce shortage. “AI is not going to fix this anytime soon,” said Dr. Hitchcock, who is looking to hire another doctor for his four-physician practice.