Software infusing racism into U.S. health care – Why planning, diversity, and training matter

I came across this article by Casey Ross over at STAT about how software introduces racism into U.S. health care, and it got me thinking about how this could be possible. Here is what they found:

A STAT investigation found that a common method of using analytics software to target medical services to patients who need them most is infusing racial bias into decision-making about who should receive stepped-up care. While a study published last year documented bias in the use of an algorithm in one health system, STAT found the problems arise from multiple algorithms used in hospitals across the country. The bias is not intentional, but it reinforces deeply rooted inequities in the American health care system, effectively walling off low-income Black and Hispanic patients from services that less sick white patients routinely receive.

These algorithms are running in the background of most Americans’ interaction with the health care system. They sift data on patients’ medical problems, prior health costs, medication use, lab results, and other information to predict how much their care will cost in the future and inform decisions such as whether they should get extra doctor visits or other support to manage their illnesses at home. The trouble is, these data reflect long-standing racial disparities in access to care, insurance coverage, and use of services, leading the algorithms to systematically overlook the needs of people of color in ways that insurers and providers may fail to recognize.
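
To make that mechanism concrete, here is a minimal sketch with synthetic, made-up numbers. This is not the actual software STAT describes; it only illustrates how ranking patients by predicted spending can shortchange a group whose recorded costs are lower because of access barriers, not because they are healthier.

import random

random.seed(0)

def make_patient(group):
    # Same illness distribution for both groups, but group B has historically
    # used fewer services, so its recorded past spending is lower.
    illness = random.uniform(0, 10)                # true need
    access = 1.0 if group == "A" else 0.6          # access factor (assumed for illustration)
    past_cost = illness * 1000 * access            # recorded spending reflects access, not need
    return {"group": group, "illness": illness, "past_cost": past_cost}

patients = [make_patient("A") for _ in range(500)] + [make_patient("B") for _ in range(500)]

# A cost-based risk score: rank patients by recorded spending and refer the
# top 10 percent to a care-management program.
patients.sort(key=lambda p: p["past_cost"], reverse=True)
referred = patients[:100]

for g in ("A", "B"):
    n = sum(1 for p in referred if p["group"] == g)
    print(f"Group {g}: {n} of 100 referrals")

Run this and group A receives nearly all of the referrals even though both groups are equally sick, because the proxy (cost) tracks access to care rather than need.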

Actually, I am not all that surprised this occurred, because my gut was telling me that most software developers are either predominantly white or from another country and therefore 1) are less familiar with the racial nuances of U.S. culture or 2) don't have a fundamental understanding of how our industries may work differently from their industries back home. To make sure I was not just stereotyping a job role, I looked for data on software engineers and came across the Computer Software Engineering data from DATA USA, put together by Deloitte and Datawheel. The data suggest that non-resident aliens earn the most college degrees in the field, but that the most common race/ethnicity and gender combination is white male. The article does state that the bias in the software is unintentional. Given the demographics of who may be developing this software, I can certainly see how this happens, but that does not make it right, and it is definitely avoidable. I believe there are three systemic issues in the software industry that lead to these outcomes.

The first has to do with a lack of diversity. There is no doubt that the job pool in software engineering lacks diversity. But software development doesn't happen only at the coding level; there are multiple areas in which diversity could be applied. From project management to quality control, adding diversity to these roles leads to more questions being raised about how the software should function. The project team, for example, is responsible for crafting the deliverables and functionality; without diversity on that team, issues like the one described above become more likely. Companies should also invest in their communities and in school programs that help minority students gain the knowledge and skills to ultimately pursue degrees in software engineering.

The second has to do with planning. Many software companies get a general idea of what the customer or industry needs by looking only at high-level information. The customer may have incomplete information or biases of their own (conscious or unconscious). The customer may also lack full information about their own customers because of a thin history, especially when engaging with an audience that is less likely to use the services in the same way. In the case of this article, some rural towns have far less history on minority patients than on white patients, which would certainly lead to a misunderstanding of the decisions the software would have to make based on the given parameters. Exploration of the needs of the other stakeholders affected by these tools is therefore necessary. Even a modest amount of well-constructed research on how the software's decisions might affect minorities differently would likely have surfaced these problems, so long as the developer did not simply take the client's word for it. By identifying the issue in advance, these algorithms could have been designed to preempt it. In health care, catching these issues could have significant, favorable ramifications for patient outcomes that have long plagued our system (for more, see this article at the National Institutes of Health website about how minorities experience worse outcomes in health care). The sketch below shows the kind of check such planning could include.
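
Here is a rough sketch of a disparate-impact style check that a planning or quality control team could fold into acceptance testing: compare how often each group is flagged for extra care and mark any group whose rate falls well below the highest. The group labels, the 0.8 threshold, and the data format are my own assumptions for illustration, not an industry standard.

from collections import defaultdict

def selection_rates(records):
    # records: iterable of (group, selected) pairs -> {group: selection rate}
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparity_check(records, threshold=0.8):
    # Flag groups whose selection rate is below `threshold` times the highest rate.
    rates = selection_rates(records)
    top = max(rates.values())
    return {g: (rate, rate >= threshold * top) for g, rate in rates.items()}

# Example with made-up numbers: 30 of 100 group A patients flagged vs. 12 of 100 group B.
records = [("A", i < 30) for i in range(100)] + [("B", i < 12) for i in range(100)]
for group, (rate, ok) in disparity_check(records).items():
    print(f"{group}: rate={rate:.2f} {'OK' if ok else 'REVIEW'}")

A simple report like this does not prove bias on its own, but it forces the team to ask why the rates differ before the software ships.
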
The third issue is likely a lack of training on diversity or unconscious bias. Unconscious bias is real. Training that helps people better understand others can make them significantly more aware of these kinds of differences and thus lead to greater care being taken when addressing them. In conjunction with a diverse project and development team, companies can stamp out some of these issues ahead of time. Doing so leads to better products, better services, and a better standard of care for all. There are many other factors at play here, such as the culture of the organization, but by addressing planning, diversity, and training, we can start making real progress for all.

*We are neither sponsored by nor have any affiliation with these organizations.

