A preliminary study on computerized lesion localization in MR mammography using 3D nMITR maps, multilayer cellular neural networks, and fuzzy c-partitioning


Ertas G., Gülçür H. Ö., Tunaci M., Osman O., Ucan O. N.

MEDICAL PHYSICS, vol. 35, no. 1, pp. 195-205, 2008 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 35 Issue: 1
  • Publication Date: 2008
  • DOI: 10.1118/1.2805477
  • Journal Name: MEDICAL PHYSICS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp. 195-205
  • İstanbul University Affiliated: Yes

Abstract

Cellular neural networks (CNNs) are massively parallel cellular structures with learning abilities that can be used to realize complex image processing applications efficiently and in almost real time. In this preliminary study, we propose a novel, robust, and fully automated system based on CNNs to facilitate lesion localization in contrast-enhanced MR mammography, a difficult task that requires processing a large number of images with attention to minute details. The data set consists of 1170 slices containing one precontrast and five postcontrast bilateral axial MR mammograms from 39 patients with 37 malignant and 39 benign mass lesions, acquired using a 1.5 Tesla MR scanner with the following parameters: 3D FLASH sequence, TR/TE 9.80/4.76 ms, flip angle 25°, slice thickness 2.5 mm, and 0.625 × 0.625 mm² in-plane resolution. Six hundred slices of this set, with 21 benign and 25 malignant lesions, are used for training the CNNs; the remaining data are used for test purposes.

The breast region of interest is first segmented from the precontrast images using four 2D CNNs connected in cascade, specially designed to minimize false detections due to muscles, heart, lungs, and the thoracic cavity. To identify deceptively enhancing regions, a 3D nMITR map of the segmented breast is computed and converted into binary form; during this process, tissues with low degrees of enhancement are discarded. To boost lesions, this binary image is processed by a 3D CNN with a control template consisting of three layers of 11 × 11 cells and a fuzzy c-partitioning output function. A set of decision rules, extracted empirically from the training data set based on volume and 3D eccentricity features, is used to make final decisions and localize lesions.

The segmentation algorithm performs well, with high average precision, high true positive volume fraction, and low false positive volume fraction: overall 0.93 ± 0.05, 0.96 ± 0.04, and 0.03 ± 0.05, respectively (training: 0.93 ± 0.04, 0.94 ± 0.04, and 0.02 ± 0.03; test: 0.93 ± 0.05, 0.97 ± 0.03, and 0.05 ± 0.06). The lesion detection performance of the system is quite satisfactory: for the training data set, the maximum detection sensitivity is 100% with false-positive detections of 0.28/lesion, 0.09/slice, and 0.65/case; for the test data set, the maximum detection sensitivity is 97% with false-positive detections of 0.43/lesion, 0.11/slice, and 0.68/case. On average, for a detection sensitivity of 99%, the overall performance of the system is 0.34/lesion, 0.10/slice, and 0.67/case.

The system introduced does not require prior information concerning breast anatomy; it is robust and exceptionally effective for detecting breast lesions. The use of CNNs, fuzzy c-partitioning, and volume and 3D eccentricity criteria reduces false-positive detections due to artifacts caused by highly enhanced blood vessels, nipples, and normal parenchyma, as well as artifacts from vascularized tissues in the chest wall due to oversegmentation. We hope that this system will facilitate breast examinations, improve the localization of lesions, and reduce unnecessary mastectomies, especially those due to missed multicentric lesions, and that the almost real-time processing speeds achievable by direct hardware implementations will open up new clinical applications, such as quasi-automated MR-guided biopsies and the acquisition of additional postcontrast lesion images to improve morphological characterization.
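The abstract does not spell out the form of the fuzzy c-partitioning output function. As a rough, illustrative sketch only (not the authors' implementation), standard fuzzy c-means clustering can partition voxel enhancement values into low- and high-enhancement classes; the function name, the two-class choice, and all parameter values below are assumptions introduced for illustration.

import numpy as np

def fuzzy_c_partition(x, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means partitioning of 1D enhancement values.

    Returns (memberships, centers): memberships has shape (len(x), c)
    with each row summing to 1; centers has shape (c,).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).reshape(-1, 1)       # (N, 1) data column
    u = rng.random((x.shape[0], c))
    u /= u.sum(axis=1, keepdims=True)                   # initial fuzzy partition matrix

    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x).ravel() / um.sum(axis=0)   # membership-weighted cluster centers
        d = np.abs(x - centers[None, :]) + 1e-12        # distances of each value to each center
        inv = d ** (-2.0 / (m - 1.0))
        u_new = inv / inv.sum(axis=1, keepdims=True)    # fuzzy c-means membership update
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return u, centers

# Toy usage: partition synthetic normalized enhancement values into
# "low enhancement" vs "high enhancement" fuzzy classes.
if __name__ == "__main__":
    values = np.concatenate([np.random.normal(0.1, 0.05, 500),
                             np.random.normal(0.7, 0.10, 100)])
    memberships, centers = fuzzy_c_partition(values, c=2)
    lesion_like = memberships[:, np.argmax(centers)] > 0.5
    print("cluster centers:", np.round(centers, 3))
    print("values assigned to high-enhancement class:", lesion_like.sum())

In the system described in the abstract, the fuzzy c-partitioning is embedded in the output function of the 3D CNN rather than applied as a stand-alone clustering step as above.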