
Biomedical Optics & Medical Imaging

SPIE Professional, October 2018

Artificial Intelligence: Best for Breast

Sophisticated artificial intelligence algorithms are revolutionizing breast cancer screening.

22 October 2018, SPIE Newsroom. DOI:

Image courtesy of Staff Sgt. Liliana Moreno, U.S. Air Force.

In 2017, the chief medical officer of the American Cancer Society, Dr. Otis Brawley, wrote to the Annals of Internal Medicine about the "phenomenon of overdiagnosis" that exists in breast cancer screening.

Citing Danish research that estimates screen-detected breast tumors have an overdiagnosis rate of up to 38.6%, Brawley's comments renewed debate on the validity of such programs. Only months later, UK-based medics echoed his concerns when they wrote to The Times national newspaper, claiming "[the UK] breast screening program mostly causes more unintended harm than good."

These views will strike a chord with women worldwide. Regular breast screening, typically via a mammogram, helps to find cancer, but has its limits.

Mammograms depict differences in breast composition, with dense tumors reducing the intensity of the x-ray beam and appearing opaque in a scan. However, the technology can generate false positive results, leading many women to be recalled for unnecessary and stressful biopsies or ultrasound scans. What's more, potentially cancerous masses can be more difficult to detect in dense breast tissue, increasing the risk that cancer will be missed.

However, hope is at hand with the development of artificial intelligence (AI) to augment mammographic and other types of medical imaging. The first implementation of AI in this space, computer-aided detection, has long been used to better detect abnormal masses, calcifications, and clusters of microcalcifications in mammograms.

As Professor Maryellen Giger of the Department of Radiology, The University of Chicago (USA), and 2018 SPIE President, says, "Computer-aided detection was FDA-approved for screening programs in 1998 to detect breast tumors and is now extensively used clinically on screening mammograms."

Indeed, since the mid-1990s, the field of computerized medical image interpretation has grown exponentially in size, evolving to "radiomics," which includes computer-aided detection, computer-aided diagnosis, prognosis, future cancer risk assessment, and prediction of response to therapy.

Radiomics can be based on traditional machine-learning methods in which quantitative human-designed features are extracted from medical images. However, it can also use deep-learning methods in which neural networks perform advanced pattern recognition from image data.
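The "human-designed features" side of that distinction can be illustrated with a toy sketch. The feature names and thresholds below are purely illustrative, not the features used in any clinical radiomics system:

```python
import numpy as np

def handcrafted_features(roi: np.ndarray) -> dict:
    """Extract a few simple human-designed, radiomic-style features
    from a 2D region of interest (grayscale intensities in [0, 1])."""
    return {
        "mean_intensity": float(roi.mean()),
        "intensity_sd": float(roi.std()),
        # crude "contrast": intensity range within the region
        "contrast": float(roi.max() - roi.min()),
        # fraction of bright pixels, a rough density proxy
        "bright_fraction": float((roi > 0.5).mean()),
    }

# toy 4x4 "lesion" patch
patch = np.array([
    [0.1, 0.2, 0.8, 0.9],
    [0.1, 0.7, 0.9, 0.8],
    [0.2, 0.6, 0.7, 0.3],
    [0.1, 0.2, 0.3, 0.2],
])
feats = handcrafted_features(patch)
```

A deep-learning pipeline would skip this step entirely and let a neural network learn its own feature representations directly from the pixel data.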

Giger and her colleagues have pioneered AI for medical image analysis. They developed the methods used in the first FDA-cleared machine-learning computer-aided diagnosis system for aiding breast cancer diagnosis from magnetic resonance imaging in 2017.

As Giger's colleague, Professor Karen Drukker, points out, such recent breakthroughs hinge on decades of imaging and computing progress and the arrival of graphical processing units (GPUs). "Today we have better image quality and faster computers, and, for example, MRI data can now be analyzed on the fly, in real-time... GPUs have had a huge impact on deep learning," she says.

As a result, more and more AI is being applied to ultrasound, mammography, and MRI, as well as state-of-the-art 3D mammography and synthetic mammography, which calculates a 2D image from the 3D dataset. Also, AI is increasingly probing additional patient information, including clinical data, molecular subtypes, and genomics, to predict patient prognosis and more.

Many researchers believe this marriage of imaging and additional patient data will be crucial to future cancer diagnosis and treatment, and Giger, Drukker, and colleagues have already related radiomic features extracted from MR images of breast lesions to clinical, molecular, and genomics biomarkers.

In recent studies, they linked radiomic features from pre-treatment MR images of breast cancer patients to pathologic cancer stage and lymph node status, post-surgery. Tumor size was found to be a powerful indicator of cancer stage, but radiomic features relating to cancer biology and genomics also showed promise in predicting cancer stage and lymph node status, which could not be predicted by tumor size alone.

The researchers have also explored whether radiomics can predict the risk of breast cancer recurrence and be used to understand the genetic mechanisms of tumor development.

Reducing recalls

Such AI developments arrive not a moment too soon. The American Cancer Society estimates that of the 12.1 million mammograms performed annually, 50% yield false positive results. Throw in the rising number of mammograms performed every year, and radiologists worldwide are struggling to keep pace.

With this in mind, Professor Stephen Wong, founding chairman for the Department of Systems Medicine and Bioengineering, Houston Methodist Research Institute (USA), and his colleagues have developed AI to evaluate mammograms and pathology reports, and assist physicians with rapid and accurate prediction of breast cancer risk.

Their AI-based natural language processing software algorithms automatically extract mammographic and pathologic findings from free-text reports and are said to translate patient charts into diagnostic information at 30 times human speed and with 99% accuracy.
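To give a flavor of what extracting structured findings from free-text reports involves, here is a much-simplified rule-based sketch. The patterns and sample report below are invented for illustration and bear no relation to the Houston Methodist software:

```python
import re

# Illustrative patterns only -- real clinical NLP systems are far richer.
BIRADS_RE = re.compile(r"BI-RADS(?:\s+category)?\s*:?\s*([0-6][a-c]?)",
                       re.IGNORECASE)
FINDING_RE = re.compile(r"\b(mass|calcifications?|asymmetry|distortion)\b",
                        re.IGNORECASE)

def extract_findings(report: str) -> dict:
    """Pull a BI-RADS assessment and mentioned finding types
    out of free-text mammography report text."""
    birads = BIRADS_RE.search(report)
    findings = sorted({m.group(1).lower() for m in FINDING_RE.finditer(report)})
    return {
        "birads": birads.group(1) if birads else None,
        "findings": findings,
    }

report = ("FINDINGS: Scattered benign calcifications. No suspicious mass "
          "or architectural distortion. IMPRESSION: BI-RADS 2.")
result = extract_findings(report)
```

Production systems add negation detection (note that "No suspicious mass" above still triggers the naive `mass` pattern), section parsing, and learned models on top of rules like these.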

As Wong highlights, "This has the potential to decrease unnecessary biopsies... it is so important to combine all the information we have to create a better risk assessment model."

Professor Regina Barzilay and colleagues from MIT's Computer Science and Artificial Intelligence Laboratory (USA) are also painfully aware of the issues surrounding unnecessary biopsies and later surgeries. As PhD candidate Adam Yala puts it, "We work with Massachusetts General Hospital and there are so many cases of overtreatment. For example, everyone with a high-risk lesion gets surgery, but only 10% of these patients actually have cancer."

Like Wong, Yala has built an information-extraction tool to automatically read free text from breast pathology reports that is being used at Massachusetts General Hospital. The tool extracts information on the characteristics of, say, atypical cells and tumors, providing more detail on a potential cancer while reducing the time taken for physicians to understand patient data.

His colleagues in the MIT laboratory have also developed an AI system that uses machine learning to predict if a high-risk lesion identified on a needle biopsy will upgrade to cancer at surgery. Trained on information from more than 600 existing high-risk lesions, the model identifies patterns amongst data elements, such as past biopsies and pathology reports.
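A minimal sketch of the general approach, training a classifier on structured data elements to predict a binary outcome, might look like the logistic regression below. The features and data are synthetic stand-ins; this is not the MIT model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for structured data elements
# (e.g. age, number of past biopsies, a pathology flag).
n = 200
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, -2.0, 0.5])
# Binary outcome ("upgrades to cancer at surgery") drawn from a
# logistic model of the features.
y = (1 / (1 + np.exp(-(X @ true_w))) > rng.random(n)).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w)))  # predicted probabilities
        w -= lr * (X.T @ (p - y)) / len(y)  # gradient of log-loss
    return w

w = fit_logistic(X, y)
probs = 1 / (1 + np.exp(-(X @ w)))
accuracy = ((probs > 0.5) == (y == 1)).mean()
```

The clinical value lies in the probabilities themselves: a lesion scored as very unlikely to upgrade could be a candidate for surveillance rather than surgery.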

"When there is so much uncertainty in data, machine learning is exactly the tool that we need to improve selection and prevent overtreatment," highlights Barzilay.

Professor Reyer Zwiggelaar from the Department of Computer Science, Aberystwyth University (UK) has also been developing methods to reduce patient biopsies and ease radiologist workload. In a series of papers presented at this year's SPIE-sponsored 14th International Workshop on Breast Imaging (IWBI), he has looked at how deep convolutional neural networks can classify tumors and the effectiveness of machine vision models to classify benign and malignant mammogram masses.

He and colleagues have also been focusing on microcalcification clusters in mammograms using a range of methods, including computer-aided detection and diagnosis, and state-of-the-art detection algorithms, to better detect these often elusive abnormalities. As he points out, past approaches have either focused on the morphology of individual microcalcifications or overall cluster features. But their model examines the structure of clusters at different scales, assessing, for example, connectivity between individual microcalcifications, which can indicate malignancy.
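The multi-scale connectivity idea can be sketched with a toy computation: count how many connected groups a set of microcalcifications forms as the connection radius grows. The coordinates below are hypothetical, and this union-find sketch is an illustration of the concept, not Zwiggelaar's actual algorithm:

```python
from itertools import combinations
import math

def connected_components(points, radius):
    """Count groups of 2D points, where points within `radius`
    of each other are considered connected (union-find)."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j in combinations(range(len(points)), 2):
        if math.dist(points[i], points[j]) <= radius:
            parent[find(i)] = find(j)  # union the two groups
    return len({find(i) for i in range(len(points))})

# Hypothetical microcalcification coordinates (mm) within one cluster.
calcs = [(0, 0), (1, 0), (1, 1), (5, 5), (6, 5)]

# Number of connected groups at increasing spatial scales.
profile = [connected_components(calcs, r) for r in [0.5, 1.5, 8.0]]
```

How quickly the points merge into one group as the scale increases is exactly the kind of structural signature that could feed a benign-versus-malignant classifier.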

"We're not interested in finding an abnormality, but instead we're asking is this benign or malignant, once it's been found," he says. "This hasn't yet been used in a practical setting but we think it will bring more certainty [to mammogram interpretation] and could reduce biopsies."

Images courtesy of Reyer Zwiggelaar
Left: Brighter blobs show a microcalcification cluster. Right: AI can be used to see which microcalcifications are close to each other and thus connected. Such information, gathered at multiple scales, can be used to classify clusters as benign or malignant.

Zwiggelaar is also excited about using deep learning in both mammography and breast histology to link mammographic information to the smaller-scale detail in tissue microstructure. His team is developing deep-learning networks to map the features and phenotypes between mammographic abnormalities and histology imagery.

They believe such a "linking map" could vastly reduce the need for further biopsy and surgery if, say, an identified mammogram mass is deemed benign from corresponding histology data. "We could use mammography data to predict what the histology will look like," he says. "It is the early days for this research, but initial results look promising."

Still, Zwiggelaar is keen to highlight the fact that researchers using deep learning do not yet fully understand how it works. "Some of our deep learning is a black box for us," he points out. "Somewhere internally it makes the right decision, propagates, and gets a correct final answer, but we would rather have a slightly better understanding of what is happening inside and be able to track its processes."

Given this, Zwiggelaar and researchers worldwide are investigating how deep learning contributes to breast screening applications. "Such an understanding will mean more clinical staff will view this as a valid second opinion," he says. "We are getting there, but a full understanding is probably a number of years down the line."

So what can we expect from AI and breast screening? Houston Methodist's Wong is confident that AI will become intrinsically embedded into healthcare infrastructure, and then, as he puts it, "we will never mention it again."

"AI will remove the mundane work of radiologists and improve their efficiencies so we see very large volume screening," he adds.

Meanwhile, The University of Chicago's Giger hopes to see AI embraced by clinicians, using it as a screening aid. "As long as we present artificial intelligence as something to augment interpretation and not to replace clinicians, then people will learn how to incorporate and relate it to other medical tests," she says. "We really are going to see this used across the board."