
An Enhanced Lung Cancer Identification and Classification Based on Advanced Deep Learning and Convolutional Neural Network

Author

Listed:
  • Ammar Hassan

    (4dots Solutions, 534 Block G1, Johar Town Lahore, 54000, Pakistan)

  • Hamayun Khan

    (Department of Computer Science, Faculty of Computer Science & IT Superior University Lahore, 54000, Pakistan)

  • Arshad Ali

    (Faculty of Computer and Information Systems, Islamic University of Madinah, Al Madinah Al Munawarah, 42351, Saudi Arabia)

  • Irfan Ud din

    (Department of Computer Science, Faculty of Computer Science & IT Superior University Lahore, 54000, Pakistan)

  • Abdullah Sajid

    (4dots Solutions, 534 Block G1, Johar Town Lahore, 54000, Pakistan)

  • Mohammad Husain

    (Faculty of Computer and Information Systems, Islamic University of Madinah, Al Madinah Al Munawarah, 42351, Saudi Arabia)

  • Muddassar Ali

    (Department of Computer Science & Information Technology Superior University Lahore, 54000, Pakistan)

  • Amna Naz

    (Department of Computer Science & Information Technology Superior University Lahore, 54000, Pakistan)

  • Hanfia Fakhar

    (Department of Computer Science & Information Technology Superior University Lahore, 54000, Pakistan)

Abstract

In this research, a fast, accurate, and stable lung cancer detection system based on novel deep learning techniques is proposed. Lung cancer remains one of the most significant global health concerns, creating an urgent need for low-cost, non-invasive screening. The most common diagnostic methods, such as CT scans and X-rays, rely on human interpretation, which varies between readers and is prone to error. In response to this challenge, we outline an automated approach based on deep learning models that classifies lung images with high accuracy. The research uses a large dataset of lung scans categorised as normal, malignant, and benign. Initial exploration of the data revealed correlations with image size and apparent differences between categories. Each image underwent grayscale conversion and dimensionality reduction. To address the class imbalance discovered in the dataset, the Synthetic Minority Oversampling Technique (SMOTE) was applied. Three new architectures were introduced, Model 1, Model 2, and Model 3, along with an ensemble architecture developed to merge the predictions of all three. Of the models created, Model 1 performed best, with an accuracy of approximately 84.7%, while the ensemble strategy, intended to combine the strengths of the individual models, achieved 82.5% accuracy. Although less accurate than Model 1, Models 2 and 3 exhibit distinct advantages and misclassification behaviours that are under evaluation for future ensemble improvements. The proposed deep learning technique offers a faster, more efficient, and contactless approach to lung cancer analysis.
Its capacity to operate in tandem with other diagnostic instruments may help reduce diagnostic errors and enhance patient care. We have written this paper so that practitioners across disciplines can read it, as a step towards the next generation of diagnostic technologies.
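The pipeline described in the abstract (grayscale conversion, SMOTE oversampling of minority classes, and soft-voting across the three models) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the function names are hypothetical, and the oversampler interpolates between random pairs of minority samples rather than k-nearest neighbours as full SMOTE does.

```python
import numpy as np

def to_grayscale(images):
    """Collapse RGB images of shape (N, H, W, 3) to grayscale (N, H, W)
    by averaging over the channel axis (a simple luminance proxy)."""
    return images.mean(axis=-1)

def smote_like_oversample(minority, n_new, rng):
    """Generate n_new synthetic minority samples by linear interpolation
    between random pairs of existing minority samples (simplified SMOTE;
    real SMOTE interpolates towards k-nearest neighbours)."""
    idx_a = rng.integers(0, len(minority), size=n_new)
    idx_b = rng.integers(0, len(minority), size=n_new)
    t = rng.random((n_new, 1))  # interpolation factor in [0, 1)
    return minority[idx_a] + t * (minority[idx_b] - minority[idx_a])

def soft_vote(prob_list):
    """Ensemble by averaging per-class probabilities from several models
    (each array of shape (N, C)) and taking the argmax class per sample."""
    return np.mean(prob_list, axis=0).argmax(axis=1)

# Example: three models' class probabilities for one sample, merged by soft voting.
p1 = np.array([[0.6, 0.4]])
p2 = np.array([[0.2, 0.8]])
p3 = np.array([[0.3, 0.7]])
print(soft_vote([p1, p2, p3]))  # the averaged probabilities favour class 1
```

In practice a production pipeline would use a dedicated implementation such as `imblearn.over_sampling.SMOTE`, but the interpolation idea above captures why oversampling helps: synthetic minority samples densify the underrepresented class without simply duplicating images.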

Suggested Citation

  • Ammar Hassan & Hamayun Khan & Arshad Ali & Irfan Ud din & Abdullah Sajid & Mohammad Husain & Muddassar Ali & Amna Naz & Hanfia Fakhar, 2024. "An Enhanced Lung Cancer Identification and Classification Based on Advanced Deep Learning and Convolutional Neural Network," Bulletin of Business and Economics (BBE), Research Foundation for Humanity (RFH), vol. 13(2), pages 136-141.
  • Handle: RePEc:rfh:bbejor:v:13:y:2024:i:2:p:136-141
    DOI: https://doi.org/10.61506/01.00308

    Download full text from publisher

    File URL: https://bbejournal.com/BBE/article/view/814/787
    Download Restriction: no

    File URL: https://bbejournal.com/BBE/article/view/814
    Download Restriction: no

    File URL: https://libkey.io/https://doi.org/10.61506/01.00308?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:rfh:bbejor:v:13:y:2024:i:2:p:136-141. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Dr. Muhammad Irfan Chani (email available below). General contact details of provider: https://edirc.repec.org/data/rffhlpk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.