ChatGPT Is More Like Penicillin Than Precision Medicine

— It has useful healthcare applications but can’t replace human sensibility

by Meghan FitzGerald, DrPH, MPH, RN

An artificial intelligence (AI) arms race was just kicked into high gear with OpenAI's mass release of the free language model ChatGPT. The chatbot has already surpassed 100 million monthly users, making it one of the fastest consumer app adoptions in history, with a company valuation of nearly $30 billion to go along with it.

Like providers and investors in healthcare, the ChatGPT chatbot is trained on extensive amounts of data, learning patterns that help it predict what comes next in a sequence. But like its AI predecessors, questions remain about the app's capacity for empathy, ethics, and critical thinking.

As a health policy and management professor, healthcare investor, health company board member, author, and volunteer provider in healthcare, I was curious where and how ChatGPT would be helpful to my portfolio of work. I downloaded the application, took the tutorial, and started hacking away at healthcare use cases.

My first warm-up question was: "Who is the best orthopedic surgeon in the U.S.?" ChatGPT was appropriately careful, answering, "This is a difficult question to answer as there are many excellent orthopedic surgeons so it's best to consult with your primary care physician."

I asked for a review of the latest guidelines for treating community-acquired pneumonia (CAP); the response was much more targeted and accurate, but unsourced.

ChatGPT was quick to generate answers to direct questions but struggled to go deep on the real-world application of those answers. It was less accurate when probed on the stages of chronic kidney disease. There was no option to give feedback or add links to peer-reviewed references to help with quality improvement. ChatGPT wasn't helpful with deal sourcing or valuation work; in fact, the app wouldn't even put a value on itself.

In healthcare, clinical judgment, ethics, and treating patients as individuals are critical parts of patient care. It's also why so many tech companies have failed in healthcare: they don't account for the necessary human interaction, which limits the success of their products. Many seniors living alone at home need a human to provide care, and literally to drive them to care, versus a product that just delivers one component of care electronically.

The most concerning result (or exciting, depending on the user) was when I plugged in a prior exam from my “Business of Healthcare” class, and ChatGPT passed the exam (binary multiple choice) in less than 5 minutes.

At the end of the day, we want students to grow intellectually and to distill information so they can apply what they learn to the real world. Using AI to pass an exam is far less valuable than a graduate's ability to comprehend data and use judgment to apply it ethically and correctly. To be fair, there are subjects in school, like algebra, chemistry, and pathophysiology, that do require some level of memorization. The concern is the short-circuiting of the academic foundation required for higher scholarly learning and advancement in certain fields.

All in all, my experience with ChatGPT was analogous to penicillin: a powerful invention, widely used to treat a range of problems (infections), but not useful in all cases (some bacterial strains or viruses), and carrying a risk of overuse and possible negative effects for the user. If automated responses are acted upon without consideration for generalization, bias, or transparent authorship, it's a slippery slope. In fact, today I received an email from an employer stating that they do not support ChatGPT and are "looking into tools to detect its use" to ensure content fidelity.

The potential benefits of ChatGPT or similar applications in healthcare are wide-ranging. It takes an average of 17 years for research evidence to reach clinical practice; more efficient use of data could accelerate innovation and improve care delivery while saving the healthcare system $450 billion a year.

On the safety side, improved accuracy in e-prescribing medication orders could help decrease medication errors, which harm at least 1.5 million people annually, with morbidity and mortality costs running $77 billion per year. In terms of experience, administrative friction around appointment booking and billing processes is ripe for improvement.

The field of AI, and applications like ChatGPT, has an opportunity to help healthcare users (researchers, providers, students, caregivers, investors) construct and query data to get answers faster, which aligns with the goals of the quadruple aim.

Healthcare has a dichotomous need: to address mundane administrative tasks (billing) while advancing R&D analytical firepower (curing) at the bedside. AI can be a powerful partner in enhancing efficiency and reducing cost, but I predict it will never replace the critical need for judgment, ethics, and sensibility.

To date, the best in healthcare has been a symbiotic partnership of humans and technology.

Meghan FitzGerald, DrPH, MPH, RN, is an adjunct professor with the Columbia University Mailman School of Public Health, and a private equity investor.

Disclosures

FitzGerald has no investments in AI chatbots.
