Deep learning algorithms to identify documentation of serious illness conversations during intensive care unit admissions.
- Department of Psychosocial Oncology and Palliative Care, Dana-Farber Cancer Institute, Boston, MA, USA.
- Harvard T. H. Chan School of Public Health, Boston, MA, USA.
- Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA, USA.
- College of Science and Mathematics, University of Massachusetts Boston, Boston, MA, USA.
- Division of Pulmonary and Critical Care Medicine, Department of Medicine, Brigham and Women’s Hospital, Boston, MA, USA.
- Division of General Internal Medicine and Health Services Research, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, USA.
- Palliative Care, VA Greater Los Angeles Healthcare System, Los Angeles, CA, USA.
- Division of Palliative Medicine, Department of Medicine, Brigham and Women’s Hospital, Boston, MA, USA.
Background: Timely documentation of care preferences is an endorsed quality indicator for seriously ill patients admitted to intensive care units. Clinicians document their conversations about these preferences as unstructured free text in clinical notes from electronic health records.
Aim: To apply deep learning algorithms for automated identification of serious illness conversations documented in physician notes during intensive care unit admissions.
Design: Using a retrospective dataset of physician notes, clinicians annotated all text documenting patient care preferences (goals of care or code status limitations), communication with family, and full code status. The clinician-coded text was used to train and validate algorithms that identify such documentation. The validated algorithms were then deployed to assess the percentage of intensive care unit admissions of patients aged ⩾75 years with care preferences documented within the first 48 h.
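The annotate-train-validate pipeline described above can be sketched as follows. The study used deep learning algorithms; as a self-contained, hypothetical stand-in, this sketch trains a bag-of-words logistic regression to flag notes containing care preference documentation. All notes, labels, and function names here are invented for illustration, not taken from the study.

```python
import math

def tokenize(text):
    return text.lower().split()

def featurize(text, vocab):
    """Bag-of-words count vector over a fixed vocabulary."""
    vec = [0.0] * len(vocab)
    for tok in tokenize(text):
        if tok in vocab:
            vec[vocab[tok]] += 1.0
    return vec

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train(notes, labels, epochs=200, lr=0.5):
    """Fit a logistic regression classifier by stochastic gradient descent."""
    vocab = {}
    for note in notes:
        for tok in tokenize(note):
            vocab.setdefault(tok, len(vocab))
    w, b = [0.0] * len(vocab), 0.0
    for _ in range(epochs):
        for note, y in zip(notes, labels):
            x = featurize(note, vocab)
            g = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) - y
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return vocab, w, b

def predict(text, vocab, w, b):
    """Note-level label: 1 = care preference documentation present."""
    x = featurize(text, vocab)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0

# Toy clinician-coded training notes (invented, not study data):
notes = [
    "goals of care discussed with family dnr dni confirmed",
    "family meeting held patient confirmed full code status",
    "diuresis continued renal function improving no acute events",
    "ventilator settings adjusted overnight no events",
]
labels = [1, 1, 0, 0]
vocab, w, b = train(notes, labels)
```

In practice a deep learning model (e.g. a recurrent or convolutional network over word embeddings) replaces the linear classifier, but the train-on-clinician-codes, validate, then deploy structure is the same.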
Setting/participants: Patients admitted to one of five intensive care units.
Results: Algorithm performance was assessed by comparing machine-identified documentation with clinician-coded documentation. For detecting care preference documentation at the note level, the algorithm had an F1 score of 0.92 (95% confidence interval, 0.89 to 0.95), a sensitivity of 93.5% (95% confidence interval, 90.0% to 98.0%), and a specificity of 91.0% (95% confidence interval, 86.4% to 95.3%). When applied to 1350 admissions of patients aged ⩾75 years, the algorithms found that 64.7% of admissions had care preferences documented within the first 48 h.
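The note-level evaluation reduces to comparing machine labels against clinician-coded labels and computing F1, sensitivity, and specificity from the confusion counts. A minimal sketch, with invented labels; the percentile bootstrap shown for the confidence intervals is an assumed method, since the abstract does not state how the intervals were derived:

```python
import random

def confusion(y_true, y_pred):
    """Confusion counts for binary labels (1 = documentation present)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion(y_true, y_pred)
    sensitivity = tp / (tp + fn)          # recall on positive notes
    specificity = tn / (tn + fp)          # recall on negative notes
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return f1, sensitivity, specificity

def bootstrap_ci(y_true, y_pred, stat, n_boot=1000, seed=0):
    """Percentile bootstrap 95% CI (assumed method, for illustration)."""
    rng = random.Random(seed)
    n = len(y_true)
    vals = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        try:
            vals.append(stat([y_true[i] for i in idx],
                             [y_pred[i] for i in idx]))
        except ZeroDivisionError:
            continue  # resample had no positives (or no negatives); skip
    vals.sort()
    return vals[int(0.025 * len(vals))], vals[int(0.975 * len(vals))]
```

For example, `metrics([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])` yields a sensitivity of 2/3 and a specificity of 1/2 from two true positives, one false negative, one true negative, and one false positive.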
Conclusions: Deep learning algorithms identified patient care preference documentation with sensitivity and specificity approaching those of clinicians, in a small fraction of the time. Future research should determine the generalizability of these methods across multiple healthcare systems.
Keywords: Quality indicators (healthcare); advance care planning; end-of-life care; intensive care units; machine learning
- PMID: 30427267
- DOI: 10.1177/0269216318810421