Publication

An innovative Faster R-CNN-based framework for breast cancer detection in MRI

dc.contributor.author: Raimundo, João Nuno
dc.contributor.author: Fontes, João Pedro
dc.contributor.author: Magalhães, Luís
dc.contributor.author: Guevara Lopez, Miguel Angel
dc.date.accessioned: 2023-08-24T09:43:25Z
dc.date.available: 2023-08-24T09:43:25Z
dc.date.issued: 2023-08
dc.description.abstract: Replacing lung cancer as the most commonly diagnosed cancer globally, breast cancer (BC) today accounts for 1 in 8 cancer diagnoses and a total of 2.3 million new cases in both sexes combined. An estimated 685,000 women died from BC in 2020, corresponding to 16% or 1 in every 6 cancer deaths in women. BC represented a quarter of all cancer cases in females and was by far the most commonly diagnosed cancer in women in 2020. However, when the disease is detected in its early stages, treatment has proven to be very effective in increasing life expectancy and, in many cases, patients fully recover. Several medical imaging modalities, such as X-ray Mammography (MG), Ultrasound (US), Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and Digital Tomosynthesis (DT), have been explored to support radiologists/physicians in clinical decision-making workflows for the detection and diagnosis of BC. In this work, we propose a novel Faster R-CNN-based framework to automate the detection of BC pathological lesions in MRI. As a main contribution, we have developed and experimentally (statistically) validated an innovative method that improves the "breast MRI preprocessing phase" to select the patient's slices (images) and the associated bounding boxes representing pathological lesions. In this way, it is possible to create a more robust training (benchmarking) dataset to feed Deep Learning (DL) models, reducing the computation time and the size of the dataset and, more importantly, identifying with high accuracy the specific regions (bounding boxes) in each of the patient's images in which a possible pathological lesion (tumor) has been identified. As a result, in an experimental setting using a fully annotated dataset (released to the public domain) comprising a total of 922 MRI-based BC patient cases, the most accurate trained model achieved an accuracy rate of 97.83%; subsequently, applying a ten-fold cross-validation method, the trained models reached a mean accuracy of 94.46% with an associated standard deviation of 2.43%.
dc.description.version: info:eu-repo/semantics/publishedVersion
dc.identifier.citation: Raimundo, J. N. C., Fontes, J. P. P., Gonzaga Mendes Magalhães, L., & Guevara Lopez, M. A. (2023). An Innovative Faster R-CNN-Based Framework for Breast Cancer Detection in MRI. Journal of Imaging, 9(9), 169. http://dx.doi.org/10.3390/jimaging9090169
dc.identifier.doi: https://doi.org/10.3390/jimaging9090169
dc.identifier.issn: 2313-433X
dc.identifier.uri: http://hdl.handle.net/10400.26/46186
dc.language.iso: eng
dc.peerreviewed: yes
dc.subject: Breast Cancer Detection
dc.subject: Magnetic Resonance Imaging
dc.subject: Computer Vision
dc.subject: Machine Learning
dc.subject: Deep Learning
dc.subject: Convolutional Neural Networks
dc.title: An innovative Faster R-CNN-based framework for breast cancer detection in MRI
dc.type: journal article
dspace.entity.type: Publication
person.familyName: GUEVARA LÓPEZ
person.givenName: MIGUEL ANGEL
person.identifier: A-3126-2011
person.identifier.ciencia-id: 8910-E298-D967
person.identifier.orcid: 0000-0001-7814-1653
person.identifier.scopus-author-id: 36999281000
rcaap.rights: openAccess
rcaap.type: article
relation.isAuthorOfPublication: 38c91a9b-1db6-4515-9462-b0a031edc325
relation.isAuthorOfPublication.latestForDiscovery: 38c91a9b-1db6-4515-9462-b0a031edc325
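
The abstract above describes two technical components: a Faster R-CNN detector applied to a two-class problem (lesion vs. background) and a ten-fold cross-validation protocol summarised by a mean accuracy and standard deviation. The sketch below is a minimal, hypothetical illustration of both steps using torchvision's off-the-shelf Faster R-CNN implementation; it is not the authors' code, and the dataset loading, slice-selection preprocessing, and training loop described in the paper are assumed and omitted here.

```python
# Minimal sketch (not the authors' implementation), assuming torchvision >= 0.13:
# instantiate a two-class Faster R-CNN detector and summarise per-fold accuracies.
import statistics

import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor


def build_lesion_detector(num_classes: int = 2):
    """Faster R-CNN with a ResNet-50 FPN backbone, re-headed for
    `num_classes` outputs (here: background + lesion)."""
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model


def summarise_cross_validation(fold_accuracies):
    """Mean and standard deviation of per-fold accuracies, the two figures
    reported in the abstract (94.46% mean, 2.43% standard deviation)."""
    return statistics.mean(fold_accuracies), statistics.stdev(fold_accuracies)


if __name__ == "__main__":
    model = build_lesion_detector()
    # Hypothetical per-fold accuracies for illustration only; the real values
    # come from training on the annotated breast MRI dataset used in the paper.
    folds = [0.95, 0.93, 0.97, 0.92, 0.96, 0.94, 0.95, 0.91, 0.96, 0.94]
    mean_acc, std_acc = summarise_cross_validation(folds)
    print(f"mean accuracy: {mean_acc:.4f}, std: {std_acc:.4f}")
```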

Files

Original bundle
Name: jimaging-09-00169-with-cover.pdf
Size: 841.21 KB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.85 KB
Format: Item-specific license agreed upon to submission