As part of ONC’s efforts to embrace the concept of health equity by design, the Health Information Technology Advisory Committee (HITAC) held a hearing in March to explore equity considerations in health IT, featuring expert panelists from the fields of health equity, health IT, and health data exchange.
Panelists drew from their experience advancing health equity in distinct sectors, including practicing clinical medicine, developing technology for healthcare, and fostering data sharing between healthcare and community-based organizations. Synthesized themes from the hearing included challenges with consistent, equitable data collection and interoperability; the digital divide, including disparities in broadband access and digital literacy; and artificial intelligence considerations, including transparency and the importance of identifying bias and curtailing its impacts…
Panelists called for closed-loop referral processes enabling clinicians to place referrals to social service providers, track progress, and follow up on patient outcomes. Dr. Denise Hines of the Georgia Health Information Network and Dr. Dominic Mack, director of the National Center for Primary Care at Morehouse School of Medicine, described the need for data standards to facilitate information sharing among EHRs, community-based entities such as pharmacies and food banks, and the statewide health information exchange, while preserving patient privacy. Dr. Angela Thomas, vice president of healthcare delivery research at MedStar Health, pointed to a similar need for “seamless communication” between the disparate organizations providing prenatal and delivery care to mothers and infants…
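The panelists described a workflow rather than a particular technical design, but the closed-loop idea can be pictured as a shared referral record whose status both the referring clinician and the community organization can update. The sketch below is purely illustrative; the Referral class, its status values, and the example organizations are assumptions for clarity, not anything specified at the hearing or drawn from a particular standard.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum


class ReferralStatus(Enum):
    """Illustrative lifecycle states for a closed-loop referral."""
    SENT = "sent"                          # clinician placed the referral
    ACCEPTED = "accepted"                  # community organization agreed to serve the patient
    SERVICE_DELIVERED = "service_delivered"
    CLOSED = "closed"                      # outcome reported back to the referring clinician


@dataclass
class Referral:
    """A shared referral record visible to both the clinician and the community partner (hypothetical)."""
    patient_id: str
    referring_provider: str
    receiving_organization: str            # e.g., a pharmacy or food bank
    service_requested: str
    status: ReferralStatus = ReferralStatus.SENT
    history: list = field(default_factory=list)

    def update_status(self, new_status: ReferralStatus, note: str = "") -> None:
        """Record each status change so both parties can track progress and follow up."""
        self.history.append((datetime.now(), self.status, new_status, note))
        self.status = new_status


# Example flow: a clinician refers a patient to a food bank, which later closes the loop.
referral = Referral(
    patient_id="patient-123",
    referring_provider="Dr. Example",
    receiving_organization="Community Food Bank",
    service_requested="food assistance",
)
referral.update_status(ReferralStatus.ACCEPTED, "Intake scheduled")
referral.update_status(ReferralStatus.SERVICE_DELIVERED, "Groceries provided weekly")
referral.update_status(ReferralStatus.CLOSED, "Outcome reported to referring clinician")
```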
Themes relating to artificial intelligence (AI) included the importance of transparent and diverse data, as well as diverse representation among those developing, testing, and evaluating AI, to help minimize and mitigate potential bias. As Dr. Nicol Turner Lee, director of the Brookings Institution’s Center for Technology Innovation, stated, “Computers do not discriminate. We do. We come with our values, norms, judgments, and assumptions about the world, and explicitly and implicitly they find themselves in our models.” Dr. Ziad Obermeyer of the University of California, Berkeley, discussed how evaluating potential bias in an algorithm requires defining bias, establishing the algorithm’s goals, and assessing its performance against those goals…
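Dr. Obermeyer’s remarks describe a process rather than code, but the general idea of assessing an algorithm’s performance against its stated goal, stratified by group, can be sketched as below. The data fields, group labels, and choice of false negative rate as the metric are hypothetical illustrations, not anything prescribed at the hearing.

```python
from collections import defaultdict


def false_negative_rate_by_group(records):
    """Compute a per-group false negative rate for a hypothetical screening algorithm.

    Each record is a dict with:
      - "group": a demographic group label (hypothetical field)
      - "needs_service": ground truth, True if the patient actually needed the service
      - "flagged": True if the algorithm flagged the patient for the service
    """
    misses = defaultdict(int)   # needed the service but was not flagged
    needs = defaultdict(int)    # total patients who needed the service
    for r in records:
        if r["needs_service"]:
            needs[r["group"]] += 1
            if not r["flagged"]:
                misses[r["group"]] += 1
    return {g: misses[g] / needs[g] for g in needs}


# Toy data: if the stated goal is to reach everyone who needs the service,
# a large gap in false negative rates across groups signals potential bias.
records = [
    {"group": "A", "needs_service": True, "flagged": True},
    {"group": "A", "needs_service": True, "flagged": True},
    {"group": "B", "needs_service": True, "flagged": False},
    {"group": "B", "needs_service": True, "flagged": True},
]
print(false_negative_rate_by_group(records))  # {'A': 0.0, 'B': 0.5}
```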