
Abstract

Predicting clinical risk is an important part of healthcare and can inform decisions about treatments, preventive interventions, and provision of extra services. The field of predictive modeling has been revolutionized over the past two decades by electronic health record data; the ability to link such data with other demographic, socioeconomic, and geographic information; the availability of high-capacity computing; and new machine learning and artificial intelligence methods for extracting insights from complex datasets. These advances have produced a new generation of computerized predictive models, but debate continues about their development, reporting, validation, evaluation, and implementation. In this review we reflect on more than 10 years of experience at the Veterans Health Administration, the largest integrated healthcare system in the United States, in developing, testing, and implementing such models at scale. We report lessons from the implementation of national risk prediction models and suggest an agenda for research.
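
To make the develop-then-validate workflow discussed in the review concrete, the following is a minimal illustrative sketch in Python. It uses entirely synthetic data and hypothetical feature names (age, comorbidity count, prior admissions, a high-risk medication flag); it is not the VHA's actual model, data, or code. It fits a simple logistic regression risk model on a "development" sample and then reports discrimination (C-statistic) and calibration on held-out patients, the two properties that reporting guidelines such as TRIPOD ask model developers to document.

```python
# Illustrative sketch only: synthetic EHR-style data, hypothetical variable names.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)
n = 5000

# Synthetic predictors loosely mimicking EHR-derived variables.
age = rng.normal(65, 10, n)                  # years
n_conditions = rng.poisson(3, n)             # comorbidity count
prior_admits = rng.poisson(0.5, n)           # hospitalizations in past year
high_risk_med = rng.binomial(1, 0.2, n)      # e.g., a medication-safety flag

# Simulated one-year event risk (logistic in the predictors).
logit = -6 + 0.04 * age + 0.3 * n_conditions + 0.6 * prior_admits + 0.8 * high_risk_med
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, n_conditions, prior_admits, high_risk_med])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# "Development": fit a simple regression-based risk model.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# "Validation": discrimination and calibration on held-out patients.
p = model.predict_proba(X_test)[:, 1]
print(f"C-statistic (AUC): {roc_auc_score(y_test, p):.3f}")
print(f"Brier score:       {brier_score_loss(y_test, p):.3f}")
observed, predicted = calibration_curve(y_test, p, n_bins=5)
for obs, pred in zip(observed, predicted):
    print(f"mean predicted risk {pred:.2f} -> observed event rate {obs:.2f}")
```

In practice, as the review emphasizes, performance on a held-out sample is only the starting point: deployed models also require prospective evaluation, monitoring, and attention to implementation in clinical workflows.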

