
Emerging Scholars Blog

InterVarsity's Emerging Scholars Network


Science Corner: Healthcare for All

October 30, 2019 by Andy Walsh

Blood pressure checks are standard care; deciding who to prioritize for more complex treatment is not straightforward. (Photo by rawpixel)

Amid all the conversation about how best to pay for healthcare in the United States, we also need to discuss how best to treat patients with those dollars. As with so many areas of our lives, algorithms trained via machine learning are becoming part of the treatment process. Machine learning techniques look for patterns in data, even to the point of finding patterns their programmers did not expect or know about. Sadly, one common pattern in many data sets is racial bias. Often when an algorithm is meant to replicate human decision making, it winds up replicating the biases of the humans as well. A recent study revealed exactly this sort of racial bias in an algorithm used to prioritize healthcare delivery: black patients were receiving less care than white patients with equivalent needs.
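
To make that mechanism concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the synthetic data, the penalty applied to one group, and the logistic model are my assumptions, not the system from the study), but it shows how a model trained to imitate biased decisions learns the bias along with everything else.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# True severity of illness is identically distributed in both groups.
group = rng.integers(0, 2, n)
severity = rng.normal(0.0, 1.0, n)

# Historical human decisions: refer sicker patients, but with a systematic
# penalty applied to group 1. This is the bias hiding in the training data.
referred = (severity - 0.5 * group + rng.normal(0.0, 0.5, n)) > 0.5

# Train a model to replicate the human decisions, with group as a feature.
X = np.column_stack([severity, group])
model = LogisticRegression().fit(X, referred)

# Score new patients whose severity is identical (fixed at the average):
# the model has learned to refer group 1 less, just as the humans did.
for g in (0, 1):
    X_new = np.column_stack([np.zeros(1000), np.full(1000, g)])
    print(f"group {g}: mean referral probability "
          f"{model.predict_proba(X_new)[:, 1].mean():.2f}")
```

Note that simply dropping the group column would not reliably fix this: any feature correlated with group membership can carry the same signal.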

In this case, prior healthcare costs were used as a proxy for need. All else being equal, patients who have already incurred higher costs for the year are more likely to have serious health needs today, possibly because of underlying chronic conditions or risk factors. On that basis, prior costs make sense to include in an assessment of which patients to refer for additional care or resources. But rarely is all else equal. Here, black patients and white patients differed in their prior healthcare utilization and treatment, producing an overall trend in which a black patient with a given condition had incurred less cost than a white patient with the same condition, even though their actual needs might have been the same. Whether because of differences in utilization, in how patients were treated, in access to care, or some other cause, prior cost turned out to be an unreliable measure of current risk or need.
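
The proxy problem is subtler and worth a sketch of its own, because a predictor can be perfectly accurate about cost and still systematically under-refer one group. The numbers below (the 30% cost gap, the top-10% referral cutoff, the shape of the need distribution) are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

group = rng.integers(0, 2, n)    # 0 and 1: two stylized patient groups
need = rng.gamma(2.0, 1.0, n)    # true health need, same distribution in both

# Cost tracks need, but group 1 incurs roughly 30% less cost at the same
# level of need (utilization, access, and treatment differences).
cost = need * np.where(group == 1, 0.7, 1.0) * rng.lognormal(0.0, 0.2, n)

# The program refers the top 10% of patients ranked by cost. We assume a
# perfect cost predictor: the bias comes from the proxy, not model error.
referred = cost >= np.quantile(cost, 0.90)

for g in (0, 1):
    mask = group == g
    print(f"group {g}: referral rate {referred[mask].mean():.1%}, "
          f"mean need among referred {need[mask & referred].mean():.2f}")
```

Run this and the second group is referred less often, while the patients from that group who do get referred are sicker on average: the pattern described above, less care at equivalent need.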

From the sound of it, the company providing the algorithm has been fairly open to revising its software to be less biased, albeit while maintaining that there was no problem in the first place. Hopefully a genuine resolution will be achieved that brings more equity to healthcare delivery going forward.

Of course, this problem is not restricted to healthcare algorithms. It has emerged in a variety of settings, including law enforcement. NPR recently covered biases in methods used to identify potential spies. (H/T to Tom Grosh for the link.) The article also has a good summary of the general problem.

And since the biased data training those algorithms comes from biased human decisions, we’ve got work to do beyond just improving our machine learning practices. For example, folks are making an effort to highlight the scientific contributions of a more representative array of scholars. I’m not sure how I missed Gladys West’s induction into the Air Force Space and Missile Pioneers Hall of Fame last year, but I came across her story recently. She developed computer algorithms for processing satellite data that became the foundation for GPS technology. This seemed like as good an opportunity as any to make sure you didn’t miss her story either.

You also don’t want to miss the opportunity to participate in a video conference with Bob Kaita, Dave Vosberg, Ciara Reyes-Ton and Hannah Eagleson tomorrow (10/31/19) at 12pm EDT. More details on how to participate here.

Andy Walsh

Andy has worn many hats in his life. He knows this is a dreadfully clichéd notion, but since it is also literally true he uses it anyway. Among his current metaphorical hats: husband of one wife, father of two teenagers, reader of science fiction and science fact, enthusiast of contemporary symphonic music, and chief science officer. Previous metaphorical hats include: comp bio postdoc, molecular biology grad student, InterVarsity chapter president (that one came with a literal hat), music store clerk, house painter, and mosquito trapper. Among his more unique literal hats: British bobby, captain’s hats (of varying levels of authenticity) of several specific vessels, a deerstalker from 221B Baker St, and a railroad engineer’s cap. His monthly Science in Review is drawn from his weekly Science Corner posts — Wednesdays, 8am (Eastern) on the Emerging Scholars Network Blog. His book Faith across the Multiverse is available from Hendrickson.

