Author
Listed:
- Muhammad Amir khan
(Universiti Teknologi MARA)
- Muhammad Danish Ali
(COMSATS University)
- Tehseen Mazhar
(National College of Business Administration and Economics
Government of Punjab)
- Tariq Shahzad
(COMSATS University Islamabad)
- Waheed Ur Rehman
(IIC University of Technology)
- Mohammad Shahid
(IIC University of Technology)
- Habib Hamam
(Uni de Moncton
University of Johannesburg
International Institute of Technology and Management (IITG)
Bridges for Academic Excellence - Spectrum)
Abstract
Skin cancer is one of the most prevalent cancers in humans and is typically identified by visual inspection, so early detection is essential. Consequently, devising an automated method for classifying skin lesions is one of the most challenging aspects of designing and implementing digital medical systems. Convolutional Neural Network (CNN) models trained on dermoscopic images are increasingly used to automatically differentiate benign from malignant skin tumors. Classifying skin cancer with deep learning and machine learning techniques can have a significant positive impact on patient diagnosis and care; however, the substantial computational cost of these approaches limits their capacity to extract highly nonlinear features. This work aims to classify early-stage skin cancer with fewer learnable parameters, thereby improving model convergence and speeding up training. Combining the VGG19 and network-in-network (NIN) architectures, the proposed VGG-NIN model is a strong, scale-invariant deep model. Its pronounced nonlinearity makes capturing complex patterns and features easier: incorporating NIN introduces additional nonlinearity into the model, improving classification performance. On benign and malignant skin cancer samples, the model achieves an outstanding 90% accuracy with the fewest possible trainable parameters. As part of our research, we benchmarked the proposed model on a publicly accessible Kaggle dataset consisting of processed images from the ISIC Archive, notably the HAM10000 skin cancer dataset; the ISIC Archive is a well-known source of dermatological images. The proposed model uses computing resources efficiently and performs more accurately than state-of-the-art techniques.
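The key NIN ingredient the abstract refers to is the 1×1 convolution, which acts as a small per-pixel multilayer perceptron across the channel dimension of a feature map (such as a VGG19 convolutional output). The NumPy sketch below illustrates only this mechanism, not the authors' actual implementation; the layer sizes (a 7×7×512 feature map reduced to 256 channels, then pooled to a benign/malignant score) are illustrative assumptions.

```python
import numpy as np

def conv1x1_relu(x, w, b):
    # x: (H, W, C_in) feature map; w: (C_in, C_out); b: (C_out,).
    # A 1x1 convolution is a fully connected layer applied independently
    # at every spatial position, mixing channels without changing H or W.
    return np.maximum(x @ w + b, 0.0)  # ReLU nonlinearity

rng = np.random.default_rng(0)
x = rng.standard_normal((7, 7, 512))          # e.g. a VGG19-style conv output
w1 = rng.standard_normal((512, 256)) * 0.01   # 1x1 "micro-network" weights
b1 = np.zeros(256)

y = conv1x1_relu(x, w1, b1)                   # shape (7, 7, 256)

# Global average pooling, then a linear head for a benign/malignant score.
pooled = y.mean(axis=(0, 1))                  # shape (256,)
w2 = rng.standard_normal(256) * 0.01
logit = pooled @ w2
prob_malignant = 1.0 / (1.0 + np.exp(-logit))  # sigmoid for binary output
print(y.shape, round(float(prob_malignant), 3))
```

Stacking such 1×1 layers between standard convolutions is what adds the extra nonlinearity per spatial location while keeping the parameter count low, since a 1×1 kernel has only C_in × C_out weights.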
Suggested Citation
Muhammad Amir khan & Muhammad Danish Ali & Tehseen Mazhar & Tariq Shahzad & Waheed Ur Rehman & Mohammad Shahid & Habib Hamam, 2025.
"An Advanced Deep Learning Framework for Skin Cancer Classification,"
The Review of Socionetwork Strategies, Springer, vol. 19(1), pages 111-130, April.
Handle:
RePEc:spr:trosos:v:19:y:2025:i:1:d:10.1007_s12626-025-00181-x
DOI: 10.1007/s12626-025-00181-x
As access to this document is restricted, you may want to search for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:trosos:v:19:y:2025:i:1:d:10.1007_s12626-025-00181-x. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .
Please note that corrections may take a couple of weeks to filter through
the various RePEc services.