
Effect of data standardization on neural network training

Author

Listed:
  • Shanker, M.
  • Hu, M. Y.
  • Hung, M. S.

Abstract

Data transformation is a popular option in training neural networks. This study evaluates the effectiveness of two well-known transformation methods: linear transformation and statistical standardization. These two are referred to collectively as data standardization. A carefully designed experiment is used in which feedforward networks were trained on data from two-group classification problems. Different kinds of classification problems, from relatively simple to hard, were generated. Other experimental factors include network architecture, sample size, and the sample proportion of group 1 members. Three performance measures are employed to assess the effect of data standardization. The results suggest that networks trained on standardized data yield better results in general, but the advantage diminishes as network and sample sizes become large; in other words, neural networks exhibit a self-scaling capability. In addition, the impact of data standardization on the performance of the training algorithm, in terms of computation time and number of iterations, is evaluated. The results indicate that, overall, data standardization slows down training. Finally, these results are illustrated with a data set obtained from the American Telephone and Telegraph Company.
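For readers who want a concrete picture of the two transformation methods the abstract names, the sketch below implements them with NumPy. The function names (linear_transform, zscore_standardize) and the synthetic two-group sample are illustrative assumptions, not taken from the paper; the paper's exact scaling ranges and experimental design are described in the article itself.

```python
import numpy as np

def linear_transform(x, lower=0.0, upper=1.0):
    # Linear (min-max) transformation: rescale each input variable
    # to the interval [lower, upper], column by column.
    x_min = x.min(axis=0)
    x_max = x.max(axis=0)
    return lower + (x - x_min) * (upper - lower) / (x_max - x_min)

def zscore_standardize(x):
    # Statistical standardization (z-score): subtract the column mean
    # and divide by the column standard deviation.
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Hypothetical two-group classification sample with variables on very
# different scales, so the effect of rescaling is visible.
rng = np.random.default_rng(seed=1996)
group1 = rng.normal(loc=[200.0, 0.02], scale=[25.0, 0.005], size=(50, 2))
group2 = rng.normal(loc=[260.0, 0.03], scale=[25.0, 0.005], size=(50, 2))
X = np.vstack([group1, group2])

# After linear transformation, each column spans roughly [0, 1];
# after z-score standardization, each column has mean ~0 and std ~1.
print(linear_transform(X).min(axis=0), linear_transform(X).max(axis=0))
print(zscore_standardize(X).mean(axis=0), zscore_standardize(X).std(axis=0))
```

Either transformation would be applied to the network inputs before training; the study's finding is that the benefit of doing so shrinks as the network and the training sample grow.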

Suggested Citation

  • Shanker, M. & Hu, M. Y. & Hung, M. S., 1996. "Effect of data standardization on neural network training," Omega, Elsevier, vol. 24(4), pages 385-397, August.
  • Handle: RePEc:eee:jomega:v:24:y:1996:i:4:p:385-397

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/0305-0483(96)00010-2
    Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Venkat Subramanian & Ming S. Hung, 1993. "A GRG2-Based System for Training Neural Networks: Design and Computational Experience," INFORMS Journal on Computing, INFORMS, vol. 5(4), pages 386-394, November.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Miguel Núñez-Peiró & Anna Mavrogianni & Phil Symonds & Carmen Sánchez-Guevara Sánchez & F. Javier Neila González, 2021. "Modelling Long-Term Urban Temperatures with Less Training Data: A Comparative Study Using Neural Networks in the City of Madrid," Sustainability, MDPI, vol. 13(15), pages 1-23, July.
    2. James R. Coakley & Carol E. Brown, 2000. "Artificial neural networks in accounting and finance: modeling issues," Intelligent Systems in Accounting, Finance and Management, John Wiley & Sons, Ltd., vol. 9(2), pages 119-144, June.
    3. Shigeyuki Hamori & Takahiro Kume, 2018. "Artificial Intelligence And Economic Growth," Advances in Decision Sciences, Asia University, Taiwan, vol. 22(1), pages 256-278, December.
    4. Matteo Picozzi & Antonio Giovanni Iaccarino, 2021. "Forecasting the Preparatory Phase of Induced Earthquakes by Recurrent Neural Network," Forecasting, MDPI, vol. 3(1), pages 1-20, January.
    5. Zhang, Guoqiang & Y. Hu, Michael & Eddy Patuwo, B. & C. Indro, Daniel, 1999. "Artificial neural networks in bankruptcy prediction: General framework and cross-validation analysis," European Journal of Operational Research, Elsevier, vol. 116(1), pages 16-32, July.
    6. Luis Alberto Geraldo-Campos & Juan J. Soria & Tamara Pando-Ezcurra, 2022. "Machine Learning for Credit Risk in the Reactive Peru Program: A Comparison of the Lasso and Ridge Regression Models," Economies, MDPI, vol. 10(8), pages 1-21, July.
    7. Qian, Cheng & Xu, Binghui & Chang, Liang & Sun, Bo & Feng, Qiang & Yang, Dezhen & Ren, Yi & Wang, Zili, 2021. "Convolutional neural network based capacity estimation using random segments of the charging curves for lithium-ion batteries," Energy, Elsevier, vol. 227(C).
    8. Luke T. Woods & Zeeshan A. Rana, 2023. "Modelling Sign Language with Encoder-Only Transformers and Human Pose Estimation Keypoint Data," Mathematics, MDPI, vol. 11(9), pages 1-28, May.
    9. Semenoglou, Artemios-Anargyros & Spiliotis, Evangelos & Makridakis, Spyros & Assimakopoulos, Vassilios, 2021. "Investigating the accuracy of cross-learning time series forecasting methods," International Journal of Forecasting, Elsevier, vol. 37(3), pages 1072-1084.
    10. Zhang, Guoqiang & Hu, Michael Y., 1998. "Neural network forecasting of the British Pound/US Dollar exchange rate," Omega, Elsevier, vol. 26(4), pages 495-506, August.
    11. Apostolos Ampountolas, 2023. "Comparative Analysis of Machine Learning, Hybrid, and Deep Learning Forecasting Models: Evidence from European Financial Markets and Bitcoins," Forecasting, MDPI, vol. 5(2), pages 1-15, June.
    12. Joana Dias & Humberto Rocha & Brígida Ferreira & Maria Lopes, 2014. "A genetic algorithm with neural network fitness function evaluation for IMRT beam angle optimization," Central European Journal of Operations Research, Springer;Slovak Society for Operations Research;Hungarian Operational Research Society;Czech Society for Operations Research;Österr. Gesellschaft für Operations Research (ÖGOR);Slovenian Society Informatika - Section for Operational Research;Croatian Operational Research Society, vol. 22(3), pages 431-455, September.
    13. Apostolos Ampountolas, 2023. "Comparative Analysis of Machine Learning, Hybrid, and Deep Learning Forecasting Models Evidence from European Financial Markets and Bitcoins," Papers 2307.08853, arXiv.org.
    14. Samuka Mohanty & Rajashree Dash, 2023. "A New Dual Normalization for Enhancing the Bitcoin Pricing Capability of an Optimized Low Complexity Neural Net with TOPSIS Evaluation," Mathematics, MDPI, vol. 11(5), pages 1-28, February.
    15. Zhang, Guoqiang & Eddy Patuwo, B. & Y. Hu, Michael, 1998. "Forecasting with artificial neural networks: The state of the art," International Journal of Forecasting, Elsevier, vol. 14(1), pages 35-62, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hu, Michael Y. & Zhang, G. Peter & Chen, Haiyang, 2004. "Modeling foreign equity control in Sino-foreign joint ventures with neural networks," European Journal of Operational Research, Elsevier, vol. 159(3), pages 729-740, December.
    2. Indro, D. C. & Jiang, C. X. & Patuwo, B. E. & Zhang, G. P., 1999. "Predicting mutual fund performance using artificial neural networks," Omega, Elsevier, vol. 27(3), pages 373-380, June.
    3. Denton, James W. & Hung, Ming S., 1996. "A comparison of nonlinear optimization methods for supervised learning in multilayer feedforward neural networks," European Journal of Operational Research, Elsevier, vol. 93(2), pages 358-368, September.
    4. Zhang, Guoqiang & Eddy Patuwo, B. & Y. Hu, Michael, 1998. "Forecasting with artificial neural networks: The state of the art," International Journal of Forecasting, Elsevier, vol. 14(1), pages 35-62, March.
    5. Zhang, G. Peter & Keil, Mark & Rai, Arun & Mann, Joan, 2003. "Predicting information technology project escalation: A neural network approach," European Journal of Operational Research, Elsevier, vol. 146(1), pages 115-129, April.

    More about this item

    Keywords

    neural networks modelling;


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jomega:v:24:y:1996:i:4:p:385-397. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/375/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.
