It’s 9 p.m. I’ve been home for a few hours. The dishes are done, the children are in bed, even the dogs are asleep. I open my laptop to do the two things common to many of us: finish clinic notes and sort through the inbox of patient questions, requests, and communications.
For those of you to whom this does not apply, I applaud you. But the statistical majority of physicians do bring work home. In those moments, it feels as if technology has certainly not made our lives better or easier.
On one hand, it has taken accessibility to an extreme. Over a decade ago, when I took calls on a pager, there was a certain threshold to trigger a phone call. I acknowledge the bar was low, and I am intentionally setting aside my nostalgia for the pre-80-hour-work-week era. If nothing else, the conversation seemed to have some substance: a caring human voice on the other end of the line, trying to do right by the patient, cautiously weighing how aggravating a 3 a.m. phone call would be against what the patient needed.
And, of course, you could not bring the paper charts home. The paper chart was like a phone booth; it mostly worked in one direction. But if you wanted to receive something from it, you had to sort of wait around.
By contrast, the patient portal is like an interstate truck stop, complete with dull fluorescent lighting, bad food, and gum stains in the parking lot. It’s open 24 hours a day and filled mostly with sadness.
“I had surgery three months ago on my knee and today I got this weird pain in my left third toe. Could this be related to my surgery? What should I do?”
As a result, the improved access to physicians promised by electronic communications has driven burnout through compassion fatigue. It’s as though we have given patients concierge service, but the concierge has been working for seven days straight, mostly for free, and has a day job.
The result is predictable. We are so digitally connected that we are disconnected. I find it challenging to rouse the compassion and caring with which I approached my day in the clinic seeing patients face to face, and I have no doubt this comes through in my responses.
“Your toe bone is not connected to your knee bone,” I find myself wanting to write.
Now there are data to support this observation. A recent study comparing the responses of physicians to chatbots demonstrated that mock patients preferred the bots' replies 80% of the time AND rated them as more empathetic.
The fact that a bot is out-empathizing a doctor says one thing about the power of the bot, but something else about the compassion fatigue of the physicians.
Like many of us, I have now experimented with using AI to generate various documents: patient information, instruction sheets, letters for insurance companies, letters of recommendation. Doximity’s DocsGPT has been a great resource for rapidly filling my website with information targeted to patients on various types of knee pathology, and I have been impressed with the quality of information.
Previously, I enlisted medical students to research a topic and then create background information based on a set template. I would then edit this information and add my own experience, practice patterns, and preferences. It is a time-consuming process that has been ongoing for two years, but it has provided a good opportunity for students and has helped me educate patients.
By contrast, I trialed replacing the students with DocsGPT. Anecdotally, the results have matched the study cited above. The machine provided better information overall (though not perfectly accurate) in a much shorter time. I took the same cyborg approach, collaborating with the bot to edit its responses and put my own twist on the documents, overlaying my own experience and treatment approaches to various conditions as well as some images of my own cases. Using the machine is faster and easier, but it deprives the students of an opportunity they benefit from.
I will say that referring patients to the relevant information online has saved me time in the clinic.
“If you go to my website, I have a whole section here on your problem and my approach to it. If you would like to learn more it is all right here, but in your particular case I would like to recommend …”
The other interesting finding of the doctors-versus-chatbots study was that the preferred bot responses were not always longer; they were simply more efficient at conveying concern and addressing the problem.
I see this not as a critique but as an opportunity. Let’s consider one more example.
“Doctor, I’m sorry to bother you, but I have been having some itching after my surgery. I’m concerned, what is going on?”
What if I could write (or better yet, speak) a terse response that addressed the medical issue at hand?
“It’s because opioids.”
And DocsGPT can dress this up into a well-drafted, properly punctuated email that reads like a 19th-century romantic novel. Here is the bot's answer to one such prompt:
“Thank you for reaching out and sharing your concerns about experiencing itching after opioid use in the postoperative period. It's not uncommon for some patients to experience pruritus, or itching, following the use of opioids. This reaction can be attributed to the fact that opioids can stimulate certain receptors in the brain that can cause itching. However, this side effect is usually benign and temporary. Nevertheless, if you find the itching to be bothersome or persistent, you can trial use of Benadryl or wean off the opioids. If we need to adjust your medication regimen or suggest other methods to alleviate this discomfort, please give us a call at the office. Please don't hesitate to reach out if you have any other questions or concerns.”
Are not both the patient and the physician better off?
By removing the pomp and circumstance of the encounter behind a layer of text that does not carry the emotions of human expression, I can work in tandem with the bot to address the issue and get myself home to my own family in a more timely fashion.
In studies of physician burnout, time with friends and family is one of the most frequently cited salves for depersonalization. As was true for the patient, letting the robot handle the repetitive, simple tasks allows the doctor to focus on being human, not with the patient, but with his own family.
And so, AI collaboration may improve compassion for the patient and compassion fatigue for the doctor, ironically, by removing humans from the emotional exchange.
Rather than being concerned about whether machines will replace our jobs, we can choose which parts of our jobs we want the machines to replace. In this brave new world, AI may ultimately make us all feel better about being further apart than ever before.
What part of your job would you replace with AI? Share in the comments.
Brian Gilmer, MD is an orthopaedic surgeon in Reno, NV, and Mammoth Lakes, CA. He has used AI to help build the EncycloKNEEdia at briangilmermd.com, but did not use it to draft this editorial. Brian was a 2022–2023 Doximity Op-Med Fellow, and continues as a 2023–2024 Doximity Op-Med Fellow.
Animation by Diana Connolly