Will AI Take My Nursing Job?

Nursing jobs appear safe for the foreseeable future, but hospitals around the country are using AI to lighten nurses' workloads.
By Evan Castillo, Reporter
Published on June 12, 2023
Edited by Darlene Earnest, Editor & Writer
Image Credit: Solskin / DigitalVision / Getty Images
  • AI is not a threat to nursing jobs for the foreseeable future. Rather, hospitals are seeking to lighten nurses' administrative workloads with AI.
  • Health systems are partnering with AI companies to more easily convert audio conversations into written documentation.
  • ChatGPT can "hallucinate" facts and provide false information, according to research.
  • Google and Mayo Clinic are collaborating to use AI solutions that can help bring data from different databases, documents, and intranets to clinicians faster.

Artificial intelligence (AI) has been incubating in the healthcare world for some time, through clinical tools embedded in systems like electronic health records.

But AI as we know it today has led many workers, from screenwriters to farmworkers, to wonder whether it poses a threat to their jobs. Could it be a threat to nurses? Could AI take your nursing job?

Based on recent research, nurses appear safe from the AI onrush. In fact, their lives could be about to get easier.

While AI is nowhere near capable of "taking over" the human decision-making, empathy, and direct physical care nurses provide to patients, it could be poised to reduce their administrative burdens.

According to experts, AI could help automate records, note-taking, and documentation and bring information from different databases to nurses even faster. Experts also want large language models like ChatGPT and GPT-4 to help give clinicians quick answers when they're unsure of a diagnosis.

But first, they have to make sure it doesn't make up facts.

Voice-to-Documentation Tools Relieve Nurses' Workloads

According to a report from Modern Healthcare, the University of Kansas (KU) health system is partnering with medical AI company Abridge to implement an AI tool that summarizes audio from clinical conversations with patients while omitting irrelevant audio.
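
Abridge hasn't published its pipeline, but the general transcribe-then-summarize pattern the report describes can be sketched with off-the-shelf tools. The example below is a hypothetical illustration, not Abridge's implementation; it uses OpenAI's public speech-to-text and chat APIs as stand-ins, and the file name and prompt wording are assumptions.

```python
# Hypothetical transcribe-then-summarize sketch; not Abridge's system.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Step 1: convert the recorded clinical conversation to text.
with open("patient_visit.wav", "rb") as audio_file:  # assumed file name
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: summarize the transcript, asking the model to drop small talk
# and other clinically irrelevant audio.
summary = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": "Summarize this clinician-patient conversation as "
                       "a clinical note. Omit small talk and anything "
                       "clinically irrelevant.",
        },
        {"role": "user", "content": transcript.text},
    ],
)

print(summary.choices[0].message.content)
```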

KU is optimistic about finishing implementation by the end of the year. Still, the most prominent challenge is training clinicians to use AI.

The BayCare Health System in Florida is partnering with healthPrecision for a similar product called Medical Brain. This mobile application helps nurses by translating verbal information into written documentation in patients' medical records.

The app will also prompt nurses if they overlook an opportunity for more assessments or treatments and will remind them if a patient follow-up is recommended or required.
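
healthPrecision hasn't detailed Medical Brain's internals, but the kind of rule-based prompting described here can be illustrated with a simple check of documented items against a care protocol. Everything in this sketch (the conditions, expected assessments, and messages) is invented for illustration.

```python
# Invented illustration of rule-based nurse reminders; not Medical
# Brain's actual logic.

# Assessments a care protocol might expect for a given condition.
EXPECTED_ASSESSMENTS = {
    "post-op": {"pain score", "wound check", "vital signs"},
    "diabetes": {"blood glucose", "foot exam", "vital signs"},
}

def missing_assessment_prompts(condition: str, documented: set[str]) -> list[str]:
    """Return reminders for expected assessments not yet documented."""
    expected = EXPECTED_ASSESSMENTS.get(condition, set())
    return [
        f"Reminder: '{item}' has not been documented for this patient."
        for item in sorted(expected - documented)
    ]

# A post-op patient with only vital signs charted so far.
for prompt in missing_assessment_prompts("post-op", {"vital signs"}):
    print(prompt)
```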

AI Is Unreliable for Diagnoses, for Now

AI can still make errors when examining clinical data such as scans and medical imagery. And according to a December Pew Research Center study, 60% of U.S. adults would feel uncomfortable if their healthcare provider relied on AI for their medical care.

"Obviously, there's a lot of energy and a lot of concern," Greg Ator, MD, chief medical informatics officer at the University of Kansas Health System, said in the Modern Healthcare report. "People just get way out in front of their skis on some of these technologies."

A study by Stanford Health Care's chief data scientist, Nigam Shah, tested ChatGPT's ability to answer clinical questions in agreement with "known truth" answers; fewer than 20% of the model's responses met that standard.
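
The article doesn't detail how agreement was scored; the sketch below shows how a "known truth" evaluation might be tallied in principle, assuming each model answer can be labeled as agreeing with a reference answer. A real clinical evaluation would rely on expert review rather than string matching, and these questions and answers are illustrative only.

```python
# Minimal sketch of scoring model answers against "known truth" references.
# Questions, answers, and the agreement check are illustrative assumptions.

known_truth = {
    "First-line treatment for anaphylaxis?": "epinephrine",
    "Normal adult resting heart rate range?": "60-100 bpm",
}

model_answers = {
    "First-line treatment for anaphylaxis?": "epinephrine",
    "Normal adult resting heart rate range?": "50-120 bpm",  # disagrees
}

def agrees(model_answer: str, reference: str) -> bool:
    # Crude stand-in for expert review: case-insensitive containment.
    return reference.lower() in model_answer.lower()

matches = sum(agrees(model_answers[q], a) for q, a in known_truth.items())
print(f"Agreement with known truth: {matches / len(known_truth):.0%}")
```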

The biggest challenge to implementing large language models is their tendency to "hallucinate" facts. Hallucinations happen when the AI makes up a fact or citation and presents it as truth.

Shah said these hallucinations carry varying degrees of potential patient harm, the largest being increased physician confusion. If a clinician were to consult the AI without knowing the correct answer to their question, they would be less likely to trust the tool moving forward.

Shah said models could reduce disagreements between their responses and known truths if they provided citations or other backup for their summaries. Humans can also tone down the AI's "creativity" by setting parameters that limit its generative ability.
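
In practice, "toning down creativity" typically means lowering sampling parameters such as temperature. Here is a minimal sketch using OpenAI's chat API, with the model choice and prompt as assumptions:

```python
# Sketch of limiting a model's generative "creativity" via sampling
# parameters, while asking it to back up its summary with citations.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",
    temperature=0,  # deterministic, least "creative" sampling
    top_p=1,
    messages=[
        {
            "role": "user",
            "content": "Summarize current first-line treatments for "
                       "hypertension. Cite the guideline or source for "
                       "each claim.",
        },
    ],
)

print(response.choices[0].message.content)
```

Note that prompting a model to cite sources does not guarantee the citations are real; models can hallucinate references too, which is why Shah emphasizes verifiable backup.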

According to Shah, implementing AI in the clinical field will take a staggered approach, starting with answering patient messages like, "What time is the clinic open?" and "Is this covered by my insurance?"

"As long as we can ensure factual accuracy and it's not hallucinating, why should a human have to answer that?" Shah said in his report. "We might go a little bit into answering patient messages, then we can go into medical scribing, and we can keep going up the value chain. Then we might reach diagnosis, and the ability to provide a second opinion or a summary of the literature.

Shah said it's challenging to conduct traditional five-year randomized controlled trials because these models change every two weeks.

Google is trying to tackle diagnostic challenges and bring complex patient data to physicians more quickly through a collaboration with Mayo Clinic announced June 7. According to Google Cloud, the company is bringing its Enterprise Search in Generative AI App Builder (Gen App Builder) to Mayo Clinic. The tool unifies data across documents, databases, and intranets to make information easier to search, analyze, and identify.
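
The specifics of the Mayo Clinic integration aren't public, but Gen App Builder's enterprise search is exposed through Google Cloud's discoveryengine client. The sketch below assumes that client and uses placeholder project and data store IDs; treat it as an approximation rather than Mayo Clinic's setup.

```python
# Approximate sketch of querying a Gen App Builder (Vertex AI Search)
# data store; project, location, and data store IDs are placeholders.
from google.cloud import discoveryengine_v1 as discoveryengine

client = discoveryengine.SearchServiceClient()

serving_config = client.serving_config_path(
    project="my-project",        # placeholder project ID
    location="global",
    data_store="clinical-docs",  # placeholder data store ID
    serving_config="default_config",
)

request = discoveryengine.SearchRequest(
    serving_config=serving_config,
    query="imaging and lab history for patient 12345",  # example query
    page_size=10,
)

# Results can span documents, databases, and intranet pages that were
# ingested into the data store.
for result in client.search(request):
    print(result.document.name)
```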

The announcement came the same day Google said Gen App Builder can now support Health Insurance Portability and Accountability Act (HIPAA) compliance.

"Our prioritization of patient safety, privacy, and ethical considerations, means that generative AI can have a significant and positive impact on how we work and deliver healthcare," Cris Ross, Mayo Clinic's chief information officer, said in the press release.

"Google Cloud's tools have the potential to unlock sources of information that typically aren't searchable in a conventional manner, or are difficult to access or interpret, from a patient's complex medical history to their imaging, genomics, and labs. Accessing insights more quickly and easily could drive more cures, create more connections with patients, and transform healthcare."