
Training of neural network for pattern classification using fireworks algorithm

Author

Listed:
  • Asaju La’aro Bolaji (Federal University Wukari)
  • Aminu Ali Ahmad (Gombe State University)
  • Peter Bamidele Shola (University of Ilorin)

Abstract

The challenge of training artificial neural networks (ANNs), which are frequently used for classification, has grown consistently over the last few years, largely because of the high-dimensional and multi-modal nature of the search space. Nature-inspired metaheuristic algorithms have been successfully employed for weight training in such complex continuous optimization problems. In this paper, the recently proposed fireworks algorithm (FWA) is applied to training the parameters of ANNs. FWA is a population-based search method that imitates the explosion process of real fireworks at night. To investigate the performance of the proposed method, experiments were conducted on seven benchmark problem instances from the UCI machine learning repository, and the results were compared with those obtained by the krill herd algorithm, the harmony search algorithm and the genetic algorithm. The evaluation showed that the proposed algorithm was superior in both sum of squared errors (SSE) and training classification accuracy (CA) and comparable in testing CA; it can therefore be concluded that FWA could be adopted as a new template algorithm for the training of ANNs.
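
The abstract describes FWA as a population-based search in which candidate weight vectors ("fireworks") generate sparks around themselves, with better solutions receiving more sparks within smaller explosion amplitudes, and the best candidates surviving to the next iteration. The Python sketch below is a minimal, hypothetical illustration of that idea only, not the authors' implementation: the one-hidden-layer network, the spark and amplitude settings, the elitist selection rule and the toy XOR data are all assumptions made to keep the example self-contained and runnable; the paper itself trains networks on seven UCI benchmark data sets and reports SSE and classification accuracy.

import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (XOR); the paper instead uses UCI benchmark data sets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1                        # assumed tiny architecture
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights and biases

def unpack(w):
    """Split a flat weight vector into the network's weights and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def sse(w):
    """Sum of squared errors of the network encoded by w on the training data."""
    W1, b1, W2, b2 = unpack(w)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output layer
    return float(np.sum((y - out) ** 2))

def fireworks_train(n_fireworks=5, total_sparks=30, max_amp=2.0, iters=300):
    """Basic fireworks-style loop: fitter fireworks get more sparks with smaller
    explosion amplitudes; the best candidates survive to the next iteration."""
    pop = rng.uniform(-1.0, 1.0, size=(n_fireworks, DIM))
    for _ in range(iters):
        fit = np.array([sse(w) for w in pop])
        best, worst = fit.min(), fit.max()
        sparks = []
        for w, f in zip(pop, fit):
            # Spark count grows and amplitude shrinks as SSE approaches the best value.
            n_s = max(1, int(total_sparks * (worst - f + 1e-12) / (np.sum(worst - fit) + 1e-12)))
            amp = max_amp * (f - best + 1e-12) / (np.sum(fit - best) + 1e-12)
            for _ in range(n_s):
                s = w.copy()
                mask = rng.random(DIM) < 0.5   # perturb a random subset of dimensions
                s[mask] += amp * rng.uniform(-1.0, 1.0, size=int(mask.sum()))
                sparks.append(s)
        candidates = np.vstack([pop, np.array(sparks)])
        order = np.argsort([sse(w) for w in candidates])
        pop = candidates[order[:n_fireworks]]  # elitist selection
    return pop[0], sse(pop[0])

best_w, best_sse = fireworks_train()
print(f"best SSE on the toy data: {best_sse:.4f}")

In this simplified form the selection keeps only the lowest-SSE candidates; the FWA described by Tan et al. (reference 3 below) additionally uses Gaussian mutation sparks and a distance-based selection step to preserve diversity.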

Suggested Citation

  • Asaju La’aro Bolaji & Aminu Ali Ahmad & Peter Bamidele Shola, 2018. "Training of neural network for pattern classification using fireworks algorithm," International Journal of System Assurance Engineering and Management, Springer;The Society for Reliability, Engineering Quality and Operations Management (SREQOM),India, and Division of Operation and Maintenance, Lulea University of Technology, Sweden, vol. 9(1), pages 208-215, February.
  • Handle: RePEc:spr:ijsaem:v:9:y:2018:i:1:d:10.1007_s13198-016-0526-z
    DOI: 10.1007/s13198-016-0526-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s13198-016-0526-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s13198-016-0526-z?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sexton, Randall S. & Dorsey, Robert E. & Johnson, John D., 1999. "Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing," European Journal of Operational Research, Elsevier, vol. 114(3), pages 589-601, May.
    2. Sexton, Randall S. & Alidaee, Bahram & Dorsey, Robert E. & Johnson, John D., 1998. "Global optimization for artificial neural networks: A tabu search application," European Journal of Operational Research, Elsevier, vol. 106(2-3), pages 570-584, April.
    3. Ying Tan & Chao Yu & Shaoqiu Zheng & Ke Ding, 2013. "Introduction to Fireworks Algorithm," International Journal of Swarm Intelligence Research (IJSIR), IGI Global, vol. 4(4), pages 39-70, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pendharkar, Parag C., 2001. "An empirical study of design and testing of hybrid evolutionary-neural approach for classification," Omega, Elsevier, vol. 29(4), pages 361-374, August.
    2. B Dengiz & C Alabas-Uslu & O Dengiz, 2009. "A tabu search algorithm for the training of neural networks," Journal of the Operational Research Society, Palgrave Macmillan;The OR Society, vol. 60(2), pages 282-291, February.
    3. Pendharkar, Parag C., 2002. "A computational study on the performance of artificial neural networks under changing structural design and data distribution," European Journal of Operational Research, Elsevier, vol. 138(1), pages 155-177, April.
    4. Wen, Ue-Pyng & Lan, Kuen-Ming & Shih, Hsu-Shih, 2009. "A review of Hopfield neural networks for solving mathematical programming problems," European Journal of Operational Research, Elsevier, vol. 198(3), pages 675-687, November.
    5. Ilkyeong Moon & Sanghyup Lee & Moonsoo Shin & Kwangyeol Ryu, 2016. "Evolutionary resource assignment for workload-based production scheduling," Journal of Intelligent Manufacturing, Springer, vol. 27(2), pages 375-388, April.
    6. Geraint Johnes, 2000. "Up Around the Bend: Linear and nonlinear models of the UK economy compared," International Review of Applied Economics, Taylor & Francis Journals, vol. 14(4), pages 485-493.
    7. Joo, Rocío & Bertrand, Sophie & Chaigneau, Alexis & Ñiquen, Miguel, 2011. "Optimization of an artificial neural network for identifying fishing set positions from VMS data: An example from the Peruvian anchovy purse seine fishery," Ecological Modelling, Elsevier, vol. 222(4), pages 1048-1059.
    8. Yimeng Shi & Hongyuan Zhang & Zheng Chen & Yueyue Sun & Xuecheng Liu & Jin Gu, 2023. "A Study on the Deployment of Mesoscale Chemical Hazard Area Monitoring Points by Combining Weighting and Fireworks Algorithms," Sustainability, MDPI, vol. 15(7), pages 1-19, March.
    9. Emir Malikov & Shunan Zhao & Subal C. Kumbhakar, 2020. "Estimation of firm‐level productivity in the presence of exports: Evidence from China's manufacturing," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 35(4), pages 457-480, June.
    10. Mak, Brenda & Blanning, Robert & Ho, Susanna, 2006. "Genetic algorithms in logic tree decision modeling," European Journal of Operational Research, Elsevier, vol. 170(2), pages 597-612, April.
    11. Laura Palagi, 2019. "Global optimization issues in deep network regression: an overview," Journal of Global Optimization, Springer, vol. 73(2), pages 239-277, February.
    12. Ashwini Pradhan & Debahuti Mishra & Kaberi Das & Ganapati Panda & Sachin Kumar & Mikhail Zymbler, 2021. "On the Classification of MR Images Using “ELM-SSA” Coated Hybrid Model," Mathematics, MDPI, vol. 9(17), pages 1-21, August.
    13. M. Milenković & N. Milosavljevic & N. Bojović & S. Val, 2021. "Container flow forecasting through neural networks based on metaheuristics," Operational Research, Springer, vol. 21(2), pages 965-997, June.
    14. Sexton, Randall S. & Dorsey, Robert E. & Johnson, John D., 1999. "Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing," European Journal of Operational Research, Elsevier, vol. 114(3), pages 589-601, May.
    15. Laura Palagi, 2017. "Global Optimization issues in Supervised Learning. An overview," DIAG Technical Reports 2017-11, Department of Computer, Control and Management Engineering, Universita' degli Studi di Roma "La Sapienza".
    16. Miriyala, Srinivas Soumitri & Subramanian, Venkat & Mitra, Kishalay, 2018. "TRANSFORM-ANN for online optimization of complex industrial processes: Casting process as case study," European Journal of Operational Research, Elsevier, vol. 264(1), pages 294-309.
    17. Gupta, Jatinder N. D. & Sexton, Randall S., 1999. "Comparing backpropagation with a genetic algorithm for neural network training," Omega, Elsevier, vol. 27(6), pages 679-684, December.
    18. El-Fallahi, Abdellah & Marti, Rafael & Lasdon, Leon, 2006. "Path relinking and GRG for artificial neural networks," European Journal of Operational Research, Elsevier, vol. 169(2), pages 508-519, March.
    19. Ruslan Abdulkadirov & Pavel Lyakhov & Nikolay Nagornov, 2023. "Survey of Optimization Algorithms in Modern Neural Networks," Mathematics, MDPI, vol. 11(11), pages 1-37, May.
    20. Răzvan Popa, 2020. "Improving Earnings Predictions With Neural Network Models," Review of Economic and Business Studies, Alexandru Ioan Cuza University, Faculty of Economics and Business Administration, issue 26, pages 77-96, December.
