Posted on 2025-05-01, 00:00. Authored by Kenya S. Andrews
When individuals are unseen, the critical elements necessary for fair and accurate (i.e., just) decision-making are missing, leading to a distorted version of their identity that reinforces systemic inequities and exacerbates the marginalization of vulnerable groups. This misrepresentation perpetuates systemic injustices, creating barriers that deny opportunities and deepen inequality. Increasingly, decisions in life-critical contexts, such as healthcare and resource allocation, are being delegated to algorithmic decision-makers built on artificial intelligence (AI) and machine learning (ML). However, these algorithms and the data that inform them are often shaped by biases embedded in historical data, further entrenching these inequities. In turn, these biases narrow decision spaces, disproportionately affecting marginalized communities and perpetuating cycles of harm and exclusion.
Specifically, I quantitatively uncover previously hidden disparities in medical settings and identify patterns of unjust silencing in emergency care, emphasizing the need for justice-oriented decision-making frameworks that incorporate an intersectional perspective. I demonstrate the benefits of this approach and use causal discovery to construct, to my knowledge, the first causal graphs (structural causal models, SCMs) in this domain. Guided by the causal analysis of these SCMs, I apply rule-based and context-based methods to modify clinical notes and reduce these disparities, empirically evaluating how refining textual inputs (through empathy injection, word removal, and rephrasing) can improve patient visibility in medical settings and enhance recognition by both large language models (LLMs) and physicians. I then develop a human-in-the-loop approach that helps those who write and edit clinical text make such modifications directly, rather than relying solely on algorithmic approaches, ensuring accountability, explainability, and lasting changes in word choice. Additionally, I analyze disparities in access to life-critical goods between disadvantaged and advantaged populations, applying optimization techniques to design targeted interventions that reduce inequities and advance justice-oriented algorithmic solutions from grassroots to global levels. These analyses examine how such interventions shape decision-making, refine predictive accuracy, and mitigate biases in real-world applications. By addressing these computational challenges, this research promotes the proper visibility of marginalized communities within decision-making systems through algorithmic justice.
History
Advisor
Mesrob I. Ohannessian
Department
Computer Science
Degree Grantor
University of Illinois Chicago
Degree Level
Doctoral
Degree name
PhD, Doctor of Philosophy
Committee Members
Lu Cheng
Elena Zheleva
Brian Ziebart
Jakita Thomas