How a Robot Could Be Reading Your Next Mammogram – AI Tools in Development at Google Health and iCAD Poised to Improve Care, Reduce Costs, and Perhaps Ultimately Replace Human Interpretation
New artificial intelligence (AI)-based technologies coming to the forefront have the potential to dramatically improve accuracy, leading to higher rates of cancer detection and fewer false positives. Finding cancer earlier and reducing unnecessary follow-on care could also dramatically lower costs for the healthcare system. Physician assistance tools such as iCAD’s ProFound AI® have been available for some time and have demonstrated the ability to reduce read times, improve sensitivity, and lower patient recall rates. However, new AI tools are beginning to demonstrate the ability to read medical imagery better than radiologists, and without the need for a follow-on radiologist interpretation. There is currently a shortage of radiologists in the United States, and these shortages are especially acute in Europe, where current mammography guidelines require double reading by two radiologists. With an aging population driving increased demand for medical imaging, it is estimated that by 2034 the United States could be short more than 35,000 radiologists, so improved tools that reduce the interpretation burden are generally welcomed by physicians as well.
Mammography remains the standard of care for breast cancer screening in the United States, and 2023 changes to U.S. Preventive Services Task Force (USPSTF) guidelines lowered the recommended age to begin screening mammography from 50 to 40. Women at average risk are currently recommended to receive biennial screening from ages 40 to 74. However, while mammography is the standard of care, it is not perfectly accurate, due in part to the human interpretation element. A 2013 study published in the International Journal of Cancer, which followed over 93,000 patients at 118 centers, found U.S. sensitivity for diagnostic mammography was 85%, meaning roughly one in seven cancers is missed. Specificity in the United States was 93%, meaning approximately 7% of women will have a false positive indication following mammography, and in the United States only about one in five women referred for breast biopsy actually have cancer. This leads to a large number of unnecessary, invasive, and expensive follow-on procedures, with the average cost of an ultrasound-guided breast biopsy in the United States around $2,500. False positive readings also add patient stress and morbidity.
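The arithmetic behind these false-positive figures is easy to sketch. Using the sensitivity (85%) and specificity (93%) quoted above, together with an assumed screening-round cancer prevalence of roughly 5 per 1,000 women (an illustrative figure, not from the cited study), a quick calculation shows how many false alarms accompany each cancer detected:

```python
# Illustrative screening arithmetic. Sensitivity and specificity are the
# article's figures; prevalence is an ASSUMED round number for illustration.
sensitivity = 0.85   # share of cancers correctly flagged (from article)
specificity = 0.93   # share of cancer-free women correctly cleared (from article)
prevalence = 0.005   # assumed ~5 cancers per 1,000 women screened

n = 1000                                        # one cohort of 1,000 screens
cancers = n * prevalence                        # expected cancers: 5.0
true_pos = cancers * sensitivity                # cancers detected: 4.25
false_pos = (n - cancers) * (1 - specificity)   # false alarms: ~69.7
ppv = true_pos / (true_pos + false_pos)         # chance a positive screen is cancer

print(f"Per 1,000 screens: ~{true_pos:.1f} cancers found, "
      f"~{false_pos:.0f} false positives (PPV ~{ppv:.1%})")
```

At this assumed prevalence, even 93% specificity produces roughly 70 false positives for every four cancers found, which is why a positive screen triggers further diagnostic workup before biopsy, and why even modest specificity gains from AI translate into large reductions in unnecessary procedures.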
In late 2022, iCAD announced a strategic development and commercialization agreement with Google Health to integrate Google’s artificial intelligence (AI) technology into iCAD’s breast imaging software solutions for 2-D and 3-D mammography (breast tomosynthesis). Under the terms of the agreement, iCAD will pay royalties to Google Health for the use of its AI technology. The technology has already demonstrated the ability to interpret images more accurately than trained radiologists. In a study utilizing a curated U.S. dataset, the technology showed a reduction of 9.4% in false negatives and 5.7% in false positives when compared to six expert radiologists. The system outperformed all radiologists in the study, with an area under the curve (AUC) 11.5% higher than the group’s average radiologist. Radiologists are typically reimbursed approximately $30 to $50 to interpret a mammogram, and they are paid more when utilizing computer-aided detection (CAD) technologies like ProFound AI®, the market-leading CAD tool (CPT codes 77063, 77065, 77066, and 77067). Based upon data from the National Ambulatory Medical Care Survey, around 30 million screening mammograms are performed in the U.S. every year, and in addition to lower interpretation costs, the ability to lower false positives and catch cancers earlier could have a dramatic impact on costs.
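The interpretation spend at stake follows directly from the figures above. Taking the article's 30 million annual U.S. screening mammograms and the $30 to $50 per-read fee range, a back-of-the-envelope calculation (illustrative only, not a market estimate) puts the annual reading cost at roughly $1 billion:

```python
# Back-of-the-envelope U.S. mammogram interpretation spend.
# Both inputs come from the article; the multiplication is only illustrative.
screens_per_year = 30_000_000   # annual U.S. screening mammograms
fee_low, fee_high = 30, 50      # typical per-read interpretation fee, USD

annual_low = screens_per_year * fee_low     # $900 million
annual_high = screens_per_year * fee_high   # $1.5 billion

print(f"Annual U.S. interpretation spend: "
      f"${annual_low/1e9:.1f}B to ${annual_high/1e9:.1f}B")
```

Even before counting the avoided follow-on biopsies, this suggests why automating or streamlining interpretation is attractive to payers.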
Mammography is not the only area where new AI tools are being applied to cancer imaging. Brain tumors on MRI are notoriously difficult to detect, and a single interpretation can take a trained radiologist 40 minutes. A study presented at the Radiological Society of North America (RSNA) meeting showed AI tools could accurately detect 91% of brain tumors, compared to 86% for academic neuroradiologists and a much lower percentage for less experienced radiologists. More recent AI datasets on brain tumors show even higher sensitivity and specificity.
New AI tools such as Google Health’s DermAssist combine the power of a smartphone with computer algorithms to identify 288 skin, hair, and nail conditions, covering approximately 90% of the most commonly searched skin conditions. Users take a well-lit photo of their skin and answer a few questions, and within minutes the tool provides a list of potentially matching conditions. The product received a CE Mark in 2021. In a study published in Nature, the phone-based tool was found to be equivalent to dermatologists in detecting 26 skin conditions that make up 80% of primary care visits, and superior to primary care physicians and nurse practitioners. Google Health has stated that despite these promising results, it is not currently pursuing U.S. FDA approval for the device, given there is not a clear pathway to approval. Google Health is also working on AI-based tools for conditions such as diabetic retinopathy, a leading cause of blindness in the U.S.
Many other companies, such as Merative (formerly IBM Watson Health), Siemens, GE Healthcare, NVIDIA, Zebra Medical Vision, and Arterys, are developing new AI technologies for medical imaging interpretation. AI-based tools are poised to take a much more prominent role in medical imaging and could have a dramatic impact on patient care, reducing the burden on the healthcare system, improving diagnostic accuracy, and ultimately lowering healthcare costs. In some cases these tools appear positioned not only to aid, but ultimately to replace, human interpretation of imagery in the near term.