Melanoma segmentation: A framework of improved DenseNet77 and UNET convolutional neural network

Marriam Nawaz, Tahira Nazir, Momina Masood, Farooq Ali, Muhammad Attique Khan, Usman Tariq, Naveera Sahar, Robertas Damaševičius

Research output: Contribution to journal › Article › peer-review

72 Scopus citations

Abstract

Melanoma is the most fatal type of skin cancer and can cause the death of patients at an advanced stage. Extensive work on computer vision for skin lesion localization has been presented by researchers. However, accurate and effective melanoma segmentation remains a challenging task because of the extensive variations in the shape, color, and size of skin moles. Moreover, the presence of light and brightness variations further complicates the segmentation task. We have presented an improved deep learning (DL)-based approach, namely, the DenseNet77-based UNET model. More specifically, we have introduced the DenseNet77 network at the encoder unit of the UNET approach to compute a more representative set of image features. The computed keypoints are then segmented by the decoder of the UNET model. We have used two standard datasets, namely ISIC-2017 and ISIC-2018, to evaluate the performance of the proposed approach and obtained segmentation accuracies of 99.21% and 99.51%, respectively. Both quantitative and qualitative results confirm that the proposed improved UNET approach is robust for skin lesion segmentation and can accurately recognize moles of varying colors and sizes.
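The abstract describes replacing the UNET encoder with a DenseNet-style backbone while keeping the UNET decoder with skip connections. The sketch below is only a minimal illustration of that wiring, assuming TensorFlow/Keras; "DenseNet77" is the authors' custom variant whose exact block configuration is not given in the abstract, so the block sizes and growth rate used here are hypothetical placeholders, not the published architecture.

```python
# Minimal sketch: UNET with a DenseNet-style encoder (hypothetical configuration).
import tensorflow as tf
from tensorflow.keras import layers, Model

def dense_block(x, num_layers, growth_rate=16):
    # DenseNet connectivity: each layer's output is concatenated with its input.
    for _ in range(num_layers):
        y = layers.BatchNormalization()(x)
        y = layers.ReLU()(y)
        y = layers.Conv2D(growth_rate, 3, padding="same")(y)
        x = layers.Concatenate()([x, y])
    return x

def transition_down(x):
    # 1x1 conv to compress channels, then pooling, as in DenseNet transition layers.
    x = layers.Conv2D(x.shape[-1] // 2, 1, padding="same")(x)
    return layers.MaxPooling2D(2)(x)

def build_dense_unet(input_shape=(256, 256, 3), blocks=(4, 4, 8, 8)):
    inputs = layers.Input(input_shape)
    x = layers.Conv2D(32, 3, padding="same")(inputs)

    # Encoder: dense blocks; keep each block's output as a skip connection.
    skips = []
    for n in blocks:
        x = dense_block(x, n)
        skips.append(x)
        x = transition_down(x)

    x = dense_block(x, blocks[-1])  # bottleneck

    # UNET decoder: upsample, concatenate the matching skip, refine with a conv.
    for skip in reversed(skips):
        x = layers.Conv2DTranspose(skip.shape[-1], 2, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = layers.Conv2D(skip.shape[-1], 3, padding="same", activation="relu")(x)

    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)  # binary lesion mask
    return Model(inputs, outputs)

model = build_dense_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

The sigmoid output and binary cross-entropy loss reflect the binary (lesion vs. background) nature of the segmentation task; the paper's actual training setup and loss function may differ.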

Original language: English
Pages (from-to): 2137-2153
Number of pages: 17
Journal: International Journal of Imaging Systems and Technology
Volume: 32
Issue number: 6
DOIs
State: Published - Nov 2022

Keywords

  • deep learning
  • DenseNet
  • dermoscopy
  • melanoma
  • skin moles
  • UNET
