Article

Diagnosis and Localization of Prostate Cancer via Automated Multiparametric MRI Equipped with Artificial Intelligence

Yuichiro Oishi, Takeya Kitta, Takahiro Osawa, Takashige Abe, Nobuo Shinohara, Hirokazu Nosato, Hidenori Sakanashi and Masahiro Murakawa
1 Department of Renal and Genitourinary Surgery, Graduate School of Medicine, Hokkaido University, Sapporo 060-8638, Japan
2 AI Research Center, National Institute of Advanced Industrial Science and Technology, Tsukuba 305-8560, Japan
* Author to whom correspondence should be addressed.
Uro 2022, 2(1), 21-29; https://doi.org/10.3390/uro2010004
Submission received: 12 December 2021 / Revised: 4 January 2022 / Accepted: 11 January 2022 / Published: 14 January 2022

Abstract

Prostate MRI for patients before biopsy is important. However, the number of radiologists available for MRI diagnosis, which requires multi-sequential interpretation of multi-slice images, is limited. To reduce this burden, artificial intelligence (AI)-based computer-aided diagnosis is expected to be a critical technology. We present an AI-based method for pinpointing prostate cancer location and determining tumor morphology using multiparametric MRI. The study enrolled 15 patients who underwent radical prostatectomy between April 2008 and August 2017 at our institution. We labeled the cancer areas in the peripheral zone on MR images by comparing the MRI with histopathological mapping of the radical prostatectomy specimens. Likelihood maps were drawn, and tumors were divided into morphologically distinct regions using the superpixel method. Likelihood maps consisted of pixels whose values were the cancer likelihood computed from T2-weighted, apparent diffusion coefficient, and diffusion-weighted MRI-based texture features. Cancer location was determined from the likelihood maps. We evaluated diagnostic performance using the area under the receiver operating characteristic (ROC) curve, with sensitivity and specificity assessed by the Chi-square test. The area under the ROC curve was 0.985. Sensitivity and specificity of our approach were 0.875 and 0.961 (p < 0.01), respectively. Our AI-based procedure was successfully applied to automated prostate cancer localization and shape estimation using multiparametric MRI.

1. Introduction

Technological progress in medical imaging has enabled more sophisticated diagnosis using clinical images through various modalities and protocols with thinner slices at multiple time points. When fused with transrectal ultrasonography (TRUS)-guided prostate biopsy, multiparametric magnetic resonance imaging (mpMRI) is a critical modality for the detection of prostate cancer [1,2], especially when the cancer is limited to the ventral region or transitional zone of the prostate, where cancer is rarely detected by systematic biopsy [3]. Moreover, a targeted biopsy can avoid overdetection of clinically insignificant cancer [4]. Real-time MRI-TRUS fusion has been put into practical use for targeted prostate biopsy, and the findings have led the European Association of Urology and National Comprehensive Cancer Network guidelines to recognize the superiority of diagnosis via targeted biopsy compared with systematic biopsy [5,6]. A recent review supported the fundamental role of targeted biopsy, complementary to systematic biopsy, in enhancing detection rates and reducing the risk of missing clinically significant cancer [7]. Unnecessary biopsies could be avoided by using screening algorithms with up-front liquid biopsy followed by mpMRI and biopsy [8].
On the other hand, the increase in the total number of radiograms that must be analyzed has imposed a severe burden on clinicians, who are responsible for the interpretation of all radiograms [9]. Moreover, interpretation of prostate mpMRI for cancer detection requires a well-trained reader to make a diagnosis from the combined findings of multiple MR sequences, such as T2-weighted images (T2WI), diffusion-weighted images (DWI), apparent diffusion coefficient (ADC) maps, and dynamic contrast-enhanced images [10,11].
To reduce such an interpretational burden and improve productivity in clinical practice, artificial intelligence (AI)-based computer-aided diagnosis (CAD) is expected to be a critical technology. Furthermore, localizing a cancer is clinically more important than determining cancer presence or aggressiveness because, at present, histopathological diagnosis is essential for a definitive diagnosis of prostate cancer. A number of studies have reported that detecting cancer location by mpMRI enhances the accuracy of targeted prostate biopsy [1,2], whereas studies of AI-based automated cancer diagnosis from prostate mpMRI have mainly focused on cancer presence or aggressiveness [12,13,14]. Furthermore, previous reports on AI-based cancer localization proposed individual-pixel diagnosis [15,16], but those methods had shape-estimation problems that generated vermiculate overdetections and omissions owing to statistical outliers [15].
To resolve such problems, we propose a method of superpixel segmentation of likelihood maps, which can pinpoint cancer distribution. Likelihood maps consist of the cancer likelihood value for each pixel computed from texture information of MR images using a support vector machine (SVM). The superpixel method divides images into non-linearly shaped regions, aggregating neighboring pixels with similar pixel values. Collaboration between the two machine-learning techniques enables more accurate cancer localization and shape estimation and clearly delineates the physiological boundary and anatomical continuity.
We present an AI-based diagnostic method of prostate cancer localization and shape determination using mpMRI.

2. Materials and Methods

This study was approved by the Institutional Review Board for clinical research of Hokkaido University Hospital.

2.1. Study Population

This study enrolled 15 prostate cancer patients who underwent radical prostatectomy (RP) and prostate mpMRI at Hokkaido University Hospital between April 2008 and August 2017. The patients met the following inclusion criteria: (i) biopsy-naïve MR images were available for all three sequences, i.e., T2WI, DWI, and ADC maps; (ii) whole-mount histopathological tumor maps were available; and (iii) patients had no prior treatment for prostate cancer or surgery for a benign prostate tumor. Patients whose cancer sites were not visible on MRI or were too small, and patients without peripheral zone cancer, were excluded from the study. Baseline characteristics of the PCa patients are shown in Table 1 and Table 2.

2.2. Overview of the AI-Assisted Diagnostic Method

Our automated diagnostic procedure consisted of three steps (Figure 1). The first step was to extract texture features of tumors and draw likelihood maps. For each pixel, texture features were calculated as numeric values from neighboring and inter-sequential pixel values on the MR images, and the SVM converted these texture features into a cancer probability, which became the corresponding pixel value of the likelihood map [17]. In this way, we generated “likelihood maps” that defined the cancer distribution.
In the second step, the superpixel method divided a likelihood map into ~600 non-linear superpixel regions to bring cancerous pixels together [18]. A cancer diagnosis for each superpixel constituted the final step in our procedure. Each superpixel was assigned as “cancer” or “normal” based on its mean likelihood. We implemented the automated cancer detection program using MATLAB®.
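The implementation itself was written in MATLAB®; the following Python sketch is only an illustration of the three steps under stated assumptions (NumPy, scikit-learn, and scikit-image ≥ 0.19 available, per-pixel texture features already computed), not the authors' code. The SVM posterior probability corresponds to Platt scaling [17], and the SLIC hyperparameters mirror those reported in Section 2.5.

```python
# Illustrative sketch of the three-step pipeline, not the authors' MATLAB implementation.
# Assumes NumPy, scikit-learn, and scikit-image >= 0.19; texture features are precomputed.
import numpy as np
from sklearn.svm import SVC
from skimage.segmentation import slic

def train_likelihood_model(train_features, train_labels):
    """Step 1a: fit an RBF-kernel SVM whose Platt-scaled output is a cancer probability."""
    svm = SVC(kernel="rbf", probability=True)
    svm.fit(train_features, train_labels)
    return svm

def likelihood_map(svm, pixel_features, slice_shape):
    """Step 1b: convert per-pixel texture features into a cancer likelihood map."""
    prob = svm.predict_proba(pixel_features)[:, 1]      # P(cancer) for every pixel
    return prob.reshape(slice_shape)

def diagnose_superpixels(like_map, threshold=0.5):
    """Steps 2-3: SLIC segmentation of the likelihood map, then per-region diagnosis."""
    labels = slic(like_map, n_segments=600, compactness=60,
                  max_num_iter=15, channel_axis=None)   # single-channel likelihood map
    diagnosis = np.zeros_like(like_map, dtype=bool)
    for region in np.unique(labels):
        mask = labels == region
        if like_map[mask].mean() > threshold:           # mean likelihood decides the region
            diagnosis[mask] = True
    return labels, diagnosis
```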

2.3. MR Images and Histopathological Images

All MR images were acquired with a 3.0 T scanner (Achieva 3.0-T TX series R3.21; Philips Medical Systems, Best, The Netherlands) with a pelvic phased-array coil (32-channel SENSE Torso/Cardiac Coil). No endorectal coil was used. The slice thickness was 3 mm for all sequences. The following MR sequences were obtained: axial T2WI and axial DWI. The ADC values were calculated from two DWI scans acquired with b = 0 and 2000 s/mm², and ADC maps were then rebuilt by calculating the ADC values for each pixel of each slice.
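For reference, a per-pixel ADC value can be recovered from the two acquired b-values with the standard monoexponential diffusion model; this is the relation assumed here, although the scanner software may apply an equivalent built-in fitting routine.

```latex
% Monoexponential model with b_0 = 0 and b_1 = 2000 s/mm^2
S(b) = S_0 \, e^{-b \cdot \mathrm{ADC}}
\quad\Longrightarrow\quad
\mathrm{ADC} = \frac{1}{b_1 - b_0} \ln\!\frac{S(b_0)}{S(b_1)}
             = \frac{1}{2000\ \mathrm{s/mm^2}} \ln\!\frac{S_0}{S_{2000}}
```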
RP specimens were sectioned perpendicular to the prostatic urethra from the apex to the base according to Japanese General Rules for Prostatic Cancer. Pathologists in our institution examined the specimens and mapped the cancer regions that were apparent in all cross-sections.
We labeled the cancer regions on the MR images by comparing the histopathological maps with the MR images. If both the histopathological maps and MR findings indicated the presence of prostate cancer, it was judged as a cancer region.

2.4. Texture Features and Likelihood Maps

For the first step, we designed a new texture feature named higher-order local texture information (HLTI), which is a suitable customization of higher-order local auto-correlation (HLAC) for pixel-based computation [19]. HLTI includes intra-sequential and inter-sequential HLAC, contrast, and homogeneity.
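HLTI itself is the authors' custom feature and is not reproduced here. As a rough, generic stand-in, the sketch below (assuming NumPy and SciPy; function and variable names are hypothetical) only illustrates the general idea of combining per-sequence neighbourhood statistics with cross-sequence terms for every pixel.

```python
# Generic per-pixel texture-feature sketch, NOT the HLTI definition.
# Assumes t2w, adc, and dwi are co-registered float arrays of identical shape.
import numpy as np
from scipy.ndimage import uniform_filter

def per_pixel_features(t2w, adc, dwi, size=3):
    feats = []
    for img in (t2w, adc, dwi):
        mean = uniform_filter(img, size)                  # local mean in a size x size window
        sq_mean = uniform_filter(img ** 2, size)
        feats += [mean, sq_mean - mean ** 2]              # local variance as a crude "contrast"
    # crude inter-sequence terms: products of neighbourhood means across sequences
    m_t2, m_adc, m_dwi = feats[0], feats[2], feats[4]
    feats += [m_t2 * m_adc, m_t2 * m_dwi, m_adc * m_dwi]
    return np.stack([f.ravel() for f in feats], axis=1)   # shape: (n_pixels, n_features)
```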
SVM posterior probability computation generated four types of primary likelihood maps by changing the combination of MR sequences used for feature extraction: T2WI and ADC maps; T2WI and DWI; ADC maps and DWI; and T2WI, ADC maps, and DWI. We then generated secondary likelihood maps by assigning to each pixel the lowest cancer likelihood among the four primary likelihood maps.
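A minimal sketch of this secondary-map construction, assuming the four primary maps are NumPy arrays of identical shape (variable names are illustrative):

```python
# Pixel-wise minimum across the four primary likelihood maps.
import numpy as np

def secondary_likelihood(map_t2_adc, map_t2_dwi, map_adc_dwi, map_t2_adc_dwi):
    primary = np.stack([map_t2_adc, map_t2_dwi, map_adc_dwi, map_t2_adc_dwi])
    return primary.min(axis=0)   # each pixel keeps its lowest cancer likelihood
```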

2.5. Superpixel Segmentation and Cancer Diagnosis

In this study, we applied the superpixel method to non-linear segmentation of the likelihood maps to describe the cancer distribution. The superpixel method is based on pixel-wise k-means clustering. For our purposes, we used the simple linear iterative clustering (SLIC) algorithm [18] with the following hyperparameters: number of superpixels = 600, compactness = 60, and number of iterations = 15. The SLIC algorithm groups neighboring pixels with similar pixel values into superpixel regions. In this way, the peripheral zone was divided into cancerous and benign superpixels.

We calculated the mean cancer likelihood for each superpixel and set 0.5 as the diagnostic threshold. Superpixels whose mean likelihood exceeded 0.5 were diagnosed as cancer, and superpixels containing more than 50% labeled cancer pixels were defined as true cancer. We evaluated the diagnostic accuracy of our AI-based CAD via area-weighted sensitivity, specificity, and the area under the area-weighted receiver operating characteristic (ROC) curve; “area-weighted” means that each superpixel was counted according to its number of pixels. Pearson’s Chi-square test was used to compare categorical data, with p < 0.01 considered statistically significant.
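The area-weighted evaluation can be sketched as follows, assuming a superpixel label image, the secondary likelihood map, and a binary cancer-label mask as NumPy arrays; the function name and structure are illustrative rather than the authors' implementation.

```python
# Area-weighted sensitivity/specificity: each superpixel contributes its pixel count,
# and "true cancer" superpixels are those containing >50% labelled cancer pixels.
import numpy as np

def area_weighted_metrics(labels, like_map, cancer_mask, threshold=0.5):
    tp = fp = tn = fn = 0
    for region in np.unique(labels):
        m = labels == region
        n_pixels = int(m.sum())                      # area weight of this superpixel
        predicted = like_map[m].mean() > threshold   # mean likelihood > 0.5 -> cancer
        truth = cancer_mask[m].mean() > 0.5          # >50% labelled pixels -> true cancer
        if predicted and truth:
            tp += n_pixels
        elif predicted and not truth:
            fp += n_pixels
        elif truth:
            fn += n_pixels
        else:
            tn += n_pixels
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```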

2.6. Cross Validation

SVM classifiers were evaluated independently through leave-one-patient-out cross validation, in which the original sample is partitioned so that data from 14 patients form the training set and data from the remaining patient form the test set. The SVM classifier used a radial basis function kernel, and the hyperparameters were the kernel width parameter γ and the misclassification penalty C.
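A leave-one-patient-out loop of this kind might look as follows with scikit-learn; the γ and C values below are placeholders standing in for the tuned hyperparameters.

```python
# Leave-one-patient-out cross-validation sketch with an RBF-kernel SVM (scikit-learn).
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

def leave_one_patient_out(features, labels, patient_ids, gamma=0.1, C=1.0):
    fold_probs = []
    logo = LeaveOneGroupOut()
    for train_idx, test_idx in logo.split(features, labels, groups=patient_ids):
        svm = SVC(kernel="rbf", gamma=gamma, C=C, probability=True)
        svm.fit(features[train_idx], labels[train_idx])
        # pixel-wise cancer probabilities for the single held-out patient
        fold_probs.append(svm.predict_proba(features[test_idx])[:, 1])
    return fold_probs
```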

3. Results

The area under the area-weighted ROC curve was 0.985 as shown in Figure 2. Examples of AI diagnosis are shown in Figure 3. In these examples, area-weighted sensitivity, specificity, positive predictive value, and negative predictive value were 87.5%, 96.1%, 73.7%, and 98.4% (p < 0.0001), respectively.

4. Discussion

Our AI-based CAD accurately identified prostate cancer location and shape on mpMRI despite the small patient population. The most important advantage of our procedure is that it requires only a little more than a dozen training cases to provide adequate performance, whereas training a neural network requires hundreds or thousands of datasets or data augmentation [14]. This is because we chose pixel-based feature extraction; in other words, every peripheral zone pixel served as a training sample for the SVM. Although the patient population was very small, the number of data samples was very large in this study.
Little has been reported about cancer localization through automated diagnosis of prostate MR images, whereas existing studies have mainly focused on cancer presence or aggressiveness [12,13,14]. In 2012, the first report, to our knowledge, of AI-based CAD of prostate cancer on MRI proposed that cancer probability maps could be computed by an SVM [16]. Strictly speaking, however, this CAD did not make a final diagnosis but rather provided only a “distribution of cancer probability” to facilitate accurate diagnosis by physicians. In 2017, Sun et al. reported automated cancer localization that included a final diagnosis [15]. By classifying individual pixels with an SVM, they succeeded in extracting cancer regions of typical morphology. However, their SVM diagnosed individual pixels using extremely local information without subregional aggregate computation, and it therefore generated numerous minute overdetections and omissions that reduced diagnostic performance. Fehr et al. indicated that, given manually segmented cancer regions, an SVM with Haralick features [20] could not only classify cancer regions but also estimate the Gleason score, especially with SMOTE data augmentation [12]. The present study demonstrates that the combination of the likelihood map and pixel aggregation by the superpixel method enhances cancer localization and shape estimation. This combination filtered out minute misclassifications and automatically extracted cancer regions, thereby accurately portraying tumor size and shape. Correct localization could reduce the biopsy core length needed to achieve Gleason score agreement between biopsy and RP specimens and provide accurate information for the decision between active surveillance and treatment [21].
The localization of cancer in the prostate requires detailed knowledge of the distribution of local populations of cancer cells. We labeled the cancer regions in each MR slice according to the histopathological prostate cancer mappings, and our SVM computation estimated the cancer probability for individual pixels. These pixel-based procedures for cancer labeling and probability estimation achieved accurate localization of cancer and approximation of tumor shape. In this report, we manually labeled cancer regions on each T2WI by comparing the histopathological maps with the MR images, whereas Sun et al. directly compared their AI-based diagnosis with raw histopathological maps. We chose this approach because we assumed that the cancer distribution in raw histopathological maps would not fit most MR images and could even cause incorrect labeling; in fact, the slice thickness and angle differed between our histopathological maps and MR images, and formalin fixation or the surgical procedure could deform the RP specimen. In addition, computing texture features from a single pixel produces statistical outliers because no statistical processing such as averaging is applied, and such outliers are major causes of vermiculate overdetections and omissions. Superpixel segmentation contributed to resolving this outlier problem.
Moreover, we generated four types of primary likelihood maps by changing the combination of MR sequences from which the texture features were extracted, and we synthesized secondary likelihood maps from the four primary maps. Secondary maps are needed because an overdetection that appears in one type of likelihood map is often suppressed in the other types, whereas true cancer tends to remain prominent in every type; taking the pixel-wise minimum therefore filters out spurious detections while preserving true cancer.

4.1. HLTI

Here we propose the use of the newly designed texture feature named HLTI, which extends HLAC [19] to multi-sequential MR images. HLTI is computed from the pixel values of a central pixel and its neighbors, and it differs from the existing HLAC and Haralick features [20] in that the texture information is extracted not only from each MR sequence separately but also from two or three sequences jointly. As far as we know, little has been reported about such inter-sequential feature extraction from prostate mpMRI. Inter-sequential feature extraction is expected to reduce the loss of inter-sequence information caused by per-sequence standardization.

4.2. Diagnostic Partition Using the Superpixel Method

Accurate cancer localization for targeted biopsy or for estimating capsular invasion requires detailed shape estimation. In an unpublished preliminary experiment, we divided MR images into small rectangular patches and diagnosed each patch. This procedure, which we call ‘partial diagnosis’, can hardly estimate the shape of cancer regions because the patches ignore the true physiological boundary.
Diagnosing aggregated cancerous pixels, by contrast, is a reasonable method for detailed shape estimation. We call this procedure ‘diagnostic partition’. We used the SLIC superpixel method for segmentation [18], which aggregates neighboring pixels with similar values. The superpixel method segments raw MR images in a manner that is blind to texture information; therefore, the raw MR images must first be converted into cancer distribution images whose pixel values express cancer likelihood. In other words, the superpixel method partitions the likelihood maps. This enables non-linear segmentation that preserves the physiological boundary. Furthermore, the superpixel method has the benefit of absorbing outlier-induced misclassified pixels into their neighboring regions. This automatic cancer segmentation could also contribute to extracting cancer-site-based features for more elaborate diagnoses [22].

4.3. Limitations

There are some limitations to this study. First, histopathological maps cannot be directly applied to MR images for cancer labelling because the MR slices do not exactly match the histopathological sections, or even the other MR sequences, in slice angle, thickness, and scale owing to body motion, rectal peristalsis, formalin fixation, the surgical procedure, or MRI device settings. Such mismatches are difficult to correct. However, combining automated organ segmentation with image fusion techniques such as elastic fusion or a shrinkage factor might enable automated and accurate cancer labeling [23,24,25].
Second, no cancer-free, whole-mount prostate specimens were available. Although we demonstrated favorable performance of our CAD, it remains unknown how accurately it classifies an entirely benign prostate. Prostate MRI acquired before radical cystectomy could address this limitation.
Third, our CAD does not support diagnosis of the Gleason score, local tumor invasion, or rare histological variants, because such tasks would require at least hundreds of training cases. In this study, pixel-based training, which provides a large number of training samples per patient, enabled our CAD to achieve the reported benchmarks for localization. We must also mention that only 15 patients were enrolled in this single-center study; a future study with a larger population would resolve this problem.

5. Conclusions

In conclusion, diagnostic partition using the superpixel method and SVM-computed likelihood maps enables automated diagnosis of prostate cancer location and shape in mpMRI.

Author Contributions

Conceptualization, Y.O., T.K. and M.M.; methodology, Y.O., H.N. and H.S.; software, T.O.; validation, T.O. and T.A.; formal analysis, Y.O.; investigation, Y.O., T.K. and H.N.; resources, T.A.; data curation, T.O.; writing—original draft preparation, Y.O.; writing—review and editing, T.K., T.O., T.A., N.S., H.N., H.S. and M.M.; visualization, Y.O.; supervision, T.K., N.S. and M.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of Hokkaido University Hospital (protocol code 016-0105; date of approval: 30 March 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to a particular file format.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Garcia Bennett, J.; Vilanova, J.C.; Guma Padro, J.; Parada, D.; Conejero, A. Evaluation of MR imaging-targeted biopsies of the prostate in biopsy-naive patients. A single centre study. Diagn. Interv. Imaging 2017, 98, 677–684.
2. Kasivisvanathan, V.; Rannikko, A.S.; Borghi, M.; Panebianco, V.; Mynderse, L.A.; Vaarala, M.H.; Briganti, A.; Budaus, L.; Hellawell, G.; Hindley, R.G.; et al. MRI-Targeted or Standard Biopsy for Prostate-Cancer Diagnosis. N. Engl. J. Med. 2018, 378, 1767–1777.
3. Ouzzane, A.; Puech, P.; Lemaitre, L.; Leroy, X.; Nevoux, P.; Betrouni, N.; Haber, G.P.; Villers, A. Combined multiparametric MRI and targeted biopsies improve anterior prostate cancer detection, staging, and grading. Urology 2011, 78, 1356–1362.
4. Moore, C.M.; Robertson, N.L.; Arsanious, N.; Middleton, T.; Villers, A.; Klotz, L.; Taneja, S.S.; Emberton, M. Image-guided prostate biopsy using magnetic resonance imaging-derived targets: A systematic review. Eur. Urol. 2013, 63, 125–140.
5. Carroll, P.R.; Parsons, J.K.; Andriole, G.; Bahnson, R.R.; Castle, E.P.; Catalona, W.J.; Dahl, D.M.; Davis, J.W.; Epstein, J.I.; Etzioni, R.B.; et al. NCCN Guidelines Insights: Prostate Cancer Early Detection, Version 2.2016. J. Natl. Compr. Cancer Netw. JNCCN 2016, 14, 509–519.
6. Mottet, N.; Bellmunt, J.; Bolla, M.; Briers, E.; Cumberbatch, M.G.; De Santis, M.; Fossati, N.; Gross, T.; Henry, A.M.; Joniau, S.; et al. EAU-ESTRO-SIOG Guidelines on Prostate Cancer. Part 1: Screening, Diagnosis, and Local Treatment with Curative Intent. Eur. Urol. 2017, 71, 618–629.
7. Rapisarda, S.; Bada, M.; Crocetto, F.; Barone, B.; Arcaniolo, D.; Polara, A.; Imbimbo, C.; Grosso, G. The role of multiparametric resonance and biopsy in prostate cancer detection: Comparison with definitive histological report after laparoscopic/robotic radical prostatectomy. Abdom. Radiol. 2020, 45, 4178–4184.
8. de la Calle, C.M.; Fasulo, V.; Cowan, J.E.; Lonergan, P.E.; Maggi, M.; Gadzinski, A.J.; Yeung, R.A.; Saita, A.; Cooperberg, M.R.; Shinohara, K.; et al. Clinical Utility of 4Kscore®, ExosomeDx and Magnetic Resonance Imaging for the Early Detection of High Grade Prostate Cancer. J. Urol. 2021, 205, 452–460.
9. Nakajima, Y.; Yamada, K.; Imamura, K.; Kobayashi, K. Radiologist supply and workload: International comparison--Working Group of Japanese College of Radiology. Radiat. Med. 2008, 26, 455–465.
10. Shin, T.; Smyth, T.B.; Ukimura, O.; Ahmadi, N.; de Castro Abreu, A.L.; Ohe, C.; Oishi, M.; Mimata, H.; Gill, I.S. Diagnostic accuracy of a five-point Likert scoring system for magnetic resonance imaging (MRI) evaluated according to results of MRI/ultrasonography image-fusion targeted biopsy of the prostate. BJU Int. 2018, 121, 77–83.
11. Kirkham, A.P.; Haslam, P.; Keanie, J.Y.; McCafferty, I.; Padhani, A.R.; Punwani, S.; Richenberg, J.; Rottenberg, G.; Sohaib, A.; Thompson, P.; et al. Prostate MRI: Who, when, and how? Report from a UK consensus meeting. Clin. Radiol. 2013, 68, 1016–1023.
12. Fehr, D.; Veeraraghavan, H.; Wibmer, A.; Gondo, T.; Matsumoto, K.; Vargas, H.A.; Sala, E.; Hricak, H.; Deasy, J.O. Automatic classification of prostate cancer Gleason scores from multiparametric magnetic resonance images. Proc. Natl. Acad. Sci. USA 2015, 112, E6265–E6273.
13. Wibmer, A.; Hricak, H.; Gondo, T.; Matsumoto, K.; Veeraraghavan, H.; Fehr, D.; Zheng, J.; Goldman, D.; Moskowitz, C.; Fine, S.W.; et al. Haralick texture analysis of prostate MRI: Utility for differentiating non-cancerous prostate from prostate cancer and differentiating prostate cancers with different Gleason scores. Eur. Radiol. 2015, 25, 2840–2850.
14. Song, Y.; Zhang, Y.D.; Yan, X.; Liu, H.; Zhou, M.; Hu, B.; Yang, G. Computer-Aided diagnosis of prostate cancer using a deep convolutional neural network from multiparametric MRI. J. Magn. Reson. Imaging JMRI 2018, 48, 1570–1577.
15. Sun, Y.; Reynolds, H.; Wraith, D.; Williams, S.; Finnegan, M.E.; Mitchell, C.; Murphy, D.; Ebert, M.A.; Haworth, A. Predicting prostate tumour location from multiparametric MRI using Gaussian kernel support vector machines: A preliminary study. Australas. Phys. Eng. Sci. Med. 2017, 40, 39–49.
16. Shah, V.; Turkbey, B.; Mani, H.; Pang, Y.; Pohida, T.; Merino, M.J.; Pinto, P.A.; Choyke, P.L.; Bernardo, M. Decision support system for localizing prostate cancer based on multiparametric magnetic resonance imaging. Med. Phys. 2012, 39, 4093–4103.
17. Platt, J.C. Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods. Adv. Large Margin Classif. 1999, 10, 61–74.
18. Achanta, R.; Shaji, A.; Smith, K.; Lucchi, A.; Fua, P.; Susstrunk, S. SLIC superpixels compared to state-of-the-art superpixel methods. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 2274–2282.
19. Otsu, N.; Kurita, T. A New Scheme for Practical Flexible and Intelligent Vision Systems; IAPR Workshop CV: Tokyo, Japan, 1988; pp. 431–435.
20. Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural Features for Image Classification. IEEE Trans. Syst. Man. Cybern. 1973, 3, 610–621.
21. Fiorentino, V.; Martini, M.; Dell’Aquila, M.; Musarra, T.; Orticelli, E.; Larocca, L.M.; Rossi, E.; Totaro, A.; Pinto, F.; Lenci, N.; et al. Histopathological Ratios to Predict Gleason Score Agreement between Biopsy and Radical Prostatectomy. Diagnostics 2020, 11, 10.
22. Cuocolo, R.; Stanzione, A.; Ponsiglione, A.; Romeo, V.; Verde, F.; Creta, M.; La Rocca, R.; Longo, N.; Pace, L.; Imbriaco, M. Clinically significant prostate cancer detection on MRI: A radiomic shape features study. Eur. J. Radiol. 2019, 116, 144–149.
23. Ukimura, O.; Desai, M.M.; Palmer, S.; Valencerina, S.; Gross, M.; Abreu, A.L.; Aron, M.; Gill, I.S. 3-Dimensional elastic registration system of prostate biopsy location by real-time 3-dimensional transrectal ultrasound guidance with magnetic resonance/transrectal ultrasound image fusion. J. Urol. 2012, 187, 1080–1086.
24. Radtke, J.P.; Schwab, C.; Wolf, M.B.; Freitag, M.T.; Alt, C.D.; Kesch, C.; Popeneciu, I.V.; Huettenbrink, C.; Gasch, C.; Klein, T.; et al. Multiparametric Magnetic Resonance Imaging (MRI) and MRI-Transrectal Ultrasound Fusion Biopsy for Index Tumor Detection: Correlation with Radical Prostatectomy Specimen. Eur. Urol. 2016, 70, 846–853.
25. Cecchini, S.; Castellani, D.; Fabbietti, P.; Mazzucchelli, R.; Montironi, R.; Cecarini, M.; Carnevali, F.; Pierangeli, T.; Dellabella, M.; Ravasi, E. Combination of Multiparametric Magnetic Resonance Imaging With Elastic-fusion Biopsy Has a High Sensitivity in Detecting Clinically Significant Prostate Cancer in Daily Practice. Clin. Genitourin. Cancer 2020, 18, e501–e509.
Figure 1. Overview of our AI-based CAD. This procedure included training the SVM, generating the likelihood maps, and superpixel segmentation. The final diagnosis was fully automated.
Figure 2. ROC curve for area-weighted diagnostic performance.
Figure 3. Examples of AI diagnosis (case 13). AI-diagnosed regions correspond approximately to true cancer. Cancer regions are shown in red. AI-diagnosed regions and peripheral zones are outlined in blue and cyan, respectively.
Table 1. Baseline characteristics of 15 PCa patients.
(Values are mean ± SD unless otherwise noted.)
Age (years): 69.3 ± 4.4
PSA (ng/mL): 15.2 ± 15.4
Prostate volume by TRUS (mL): 27.6 ± 14.0
Prostate weight of RP specimen (g): 52.1 ± 16.4
PSA density by TRUS (ng/mL²): 0.63 ± 0.69
PSA density of RP specimen (ng/mL/g): 0.32 ± 0.35
Gleason’s score (biopsy): 3 + 3: 1; 3 + 4: 1; 4 + 3: 6; 4 + 4: 3; 4 + 5: 3; 5 + 4: 1
Gleason’s score (RP specimen): 3 + 3: 0; 3 + 4: 2; 4 + 3: 5; 4 + 4: 0; 4 + 5: 8; 5 + 4: 0
Table 2. Baseline characteristics of each patient.
Case No | Age | PSA (ng/mL) | PV * (mL) | PW ** (g) | PSA Density by PV | PSA Density by PW | GS *** (Biopsy) | GS (RP Specimen)
1 | 65 | 4.31 | 30.0 | 40 | 0.14 | 0.11 | 4 + 3 | 4 + 3
2 | 69 | 4.66 | 18.1 | 64 | 0.26 | 0.07 | 4 + 4 | 4 + 3
3 | 65 | 11.05 | 23.6 | 60 | 0.47 | 0.18 | 4 + 5 | 4 + 5
4 | 60 | 5.17 | 10.3 | 54 | 0.50 | 0.10 | 4 + 3 | 4 + 5
5 | 76 | 7.11 | 64.5 | 48 | 0.11 | 0.15 | 4 + 5 | 4 + 5
6 | 70 | 15.13 | 19.9 | 32 | 0.76 | 0.47 | 4 + 3 | 4 + 3
7 | 68 | 44.80 | 30.2 | 60 | 1.48 | 0.75 | 4 + 4 | 4 + 5
8 | 67 | 7.54 | 27.1 | 46 | 0.27 | 0.16 | 4 + 4 | 4 + 5
9 | 72 | 7.54 | 20.3 | 40 | 2.72 | 1.39 | 4 + 5 | 4 + 5
10 | 64 | 55.40 | 21.3 | 46 | 0.29 | 0.13 | 3 + 3 | 3 + 4
11 | 72 | 6.10 | 21.4 | 46 | 1.18 | 0.55 | 5 + 4 | 4 + 5
12 | 71 | 25.18 | 40.0 | 62 | 0.37 | 0.24 | 4 + 3 | 4 + 5
13 | 75 | 14.97 | 49.3 | 100 | 0.30 | 0.15 | 4 + 3 | 3 + 4
14 | 73 | 4.74 | 17.0 | 36 | 0.28 | 0.13 | 3 + 4 | 4 + 3
15 | 72 | 7.60 | 20.4 | 48 | 0.37 | 0.16 | 4 + 3 | 4 + 3
* PV: prostate volume estimated by TRUS, ** PW: prostate weight of RP specimen, *** GS: Gleason’s score.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

