May 06, 2020
Right-Matched Patient Cohort Improves Clinical Decision-Making
Radiologists must make a large number of clinical decisions daily. They analyze many thousands of images on up to 100 scans per day, working 10- to 12-hour shifts. Many of these decisions are relatively routine and well-supported by past experience. But, inevitably, some are not. Basic supporting clinical information is often unavailable, and every shift has a few cases where findings are unexpected or appear in unusual combinations. In those cases, finding relevant clinical information can be extraordinarily difficult, requiring digging deep through EMR screens, reports, records, other images in the radiology information system (RIS) and picture archiving and communication system (PACS), and written notes—all under significant time pressure. In-person consultations with referring providers, once the source of much supporting information, have grown increasingly rare.
This lack of access to supporting information becomes most significant for those disease states that vary widely in presentation or whose significance depends strongly on other risk factors, as in lymphoma, osteomyelitis, TB, or white matter abnormalities in the brain. This has resulted in wide diagnostic variation and unsettled standards of care, increasing the challenges for radiologists.
Addressing this widespread challenge is why Baystate Health and its innovation arm, TechSpring, are partnering with Life Image to create a machine learning tool called Patient Compare.
The Support Radiologists Need
Radiologists could gain significant benefit from a tool that presents essential imaging data and clinical information in a single, easily manipulated view; analyzes and uncovers large-scale patterns among patients and disease states; and then provides easily interpreted diagnostic possibilities and the highest-value potential courses of action. This is where the rising demands on radiologists can be met by support from carefully devised and implemented machine-learning algorithms.
Data provision risks being either overwhelming or too prescriptive. TechSpring has determined that the best way to support radiologists without adversely affecting workflow is to provide a cohort of patients from this health system that match the patient's demographics, family history, lab work, genetics, pathology, and a range of other parameters, and then show how this cohort responded to a range of possible treatments. This patient matching goes on in the background so that the information is instantly accessible when needed, but remains strongly responsive to radiologist requests and requirements.
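The matching idea above can be illustrated with a minimal sketch. Everything here is hypothetical: the field names, weights, and similarity function are illustrative stand-ins, not Patient Compare's actual schema or algorithm. The core pattern is a weighted similarity score over clinical features, computed in the background so the top-k matched cohort is ready when the radiologist asks for it.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # Illustrative feature set; a real system would cover many more parameters
    age: int
    sex: str
    wbc_count: float          # white blood cell count, 10^9/L
    has_family_history: bool

def similarity(a: Patient, b: Patient) -> float:
    """Crude weighted similarity in [0, 1]; higher means a closer match."""
    age_score = max(0.0, 1.0 - abs(a.age - b.age) / 50.0)
    sex_score = 1.0 if a.sex == b.sex else 0.0
    lab_score = max(0.0, 1.0 - abs(a.wbc_count - b.wbc_count) / 10.0)
    hist_score = 1.0 if a.has_family_history == b.has_family_history else 0.0
    # Weights are arbitrary here; in practice they would be learned or tuned
    return 0.3 * age_score + 0.2 * sex_score + 0.3 * lab_score + 0.2 * hist_score

def matched_cohort(index: Patient, population: list, k: int = 3) -> list:
    """Return the k most similar patients from the population."""
    return sorted(population, key=lambda p: similarity(index, p), reverse=True)[:k]
```

In a production setting the scoring would run over the full heterogeneous record set and be recomputed as new data arrives, so the cohort is "instantly accessible" rather than computed on demand.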
Baystate Health's radiology leadership and team have been intimately involved from the beginning, defining the nature of the problem to be solved, identifying the highest-value information to incorporate into workflow, and enabling projection of the real value to clinical practice. Life Image has worked closely with Baystate Health to create and make available the large amounts of high-quality, heterogeneous clinical data needed to train and validate the machine learning algorithms that will support radiologists by identifying and presenting appropriate matching patient cohorts.
The Process of Developing Patient Compare
The current user interface (UI) radiologists depend on comprises three full-screen monitors showing radiology images, EMR data, and RIS worklists from three separate systems. Patient Compare will pull data from these separate systems to provide a single unified view. As radiologists across the system use and respond to the tool, their feedback will reveal whether value is being added, use case by use case. Development will be incremental and heavily responsive to clinical workflow requirements.
By Richard Hicks, MD, FACR, chair of the Department of Radiology at Baystate Health in Springfield, Mass.
Published in Axis Imaging News on May 6, 2020.