
Case Study: The Distilling of a Biased Algorithmic Decision System through a Business Lens

Published: May 31, 2022
Category: Bibliography
Authors: A M Gamundani, C Chauhan, C Dorsey, D Baur, K Ingram, M Hickok, T O'Brien
Countries: USA
Language: English
Types: Performance Analysis, Population Health
Settings: Health Plan

Abstract

Technological advances embedded within algorithmic decision systems are being deployed every day. Some of these investments are unquestionably worthwhile, while others prioritize the commercialization of technology over societal impact. This article uses a real-world case from the healthcare sector to demonstrate the design and governance shortfalls of an algorithmic tool across its lifecycle. The healthcare sector sits on a mine of data, making it one of the most lucrative fields for big data–based analytics. However, the sector is also particularly sensitive to the quality of data and of algorithmic design decisions, making it paramount for all stakeholders to develop, deploy, and implement algorithmic tools safely and ethically. Otherwise, these systems can have detrimental effects on the life, well-being, and safety of patients. The authors provide guidance on responsible and sustainable deployment practices applicable across industries. The systematic dissection of this case can be applied to systems in other domains such as employment, credit scoring, housing, education, and criminal justice. Every business automating and streamlining core functions through digital transformation needs to implement new accountability and governance mechanisms and invest in the culture change this inevitably requires.

Keywords: algorithmic decision, artificial intelligence, automated decision, bias, discrimination, ethics, accountability, algorithm, governance
