
Adaptive Inventory Control for Nonstationary Demand and Partial Information

Authors

Listed:
  • James T. Treharne

    (United States Army TRADOC Analysis Center, Fort Leavenworth, Kansas 66027)

  • Charles R. Sox

    (Department of Industrial and Systems Engineering, Auburn University, Auburn, Alabama 36849)

Abstract

This paper examines several different policies for an inventory control problem in which the demand process is nonstationary and partially observed. The probability distribution of demand in each period is determined by the state of a Markov chain, the core process. The state of this core process is not directly observed, however; the decision maker observes only the actual demand. Given this demand process, the inventory control problem is a composite-state, partially observed Markov decision process (POMDP), which is an appropriate model for a number of dynamic demand problems. In practice, managers often use certainty equivalent control (CEC) policies to solve such a problem. The results presented here demonstrate that other practical control policies almost always provide much better solutions for this problem than the CEC policies commonly used in practice. The computational results also indicate how specific problem characteristics influence the performance of each of the alternative policies.
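To make the model in the abstract concrete, the following Python sketch (not the authors' code) illustrates the two ingredients it describes: a Bayesian belief update over the hidden core-process states after each demand observation, and a simple certainty-equivalent ordering rule applied to the belief-weighted demand distribution. The transition matrix P, the per-state demand distributions, the critical ratio, and the function names are all illustrative assumptions, not values from the paper.

    import numpy as np

    # Hypothetical core process: 2 hidden states, demand takes values 0, 1, 2.
    P = np.array([[0.9, 0.1],           # illustrative core-process transition matrix
                  [0.2, 0.8]])
    demand_pmf = np.array([[0.6, 0.3, 0.1],   # P(demand = 0,1,2 | core state 0)
                           [0.1, 0.3, 0.6]])  # P(demand = 0,1,2 | core state 1)

    def update_belief(belief, observed_demand):
        """Bayes update of the belief over core states after observing demand."""
        prior = belief @ P                           # propagate through the Markov chain
        likelihood = demand_pmf[:, observed_demand]  # P(observation | core state)
        posterior = prior * likelihood
        return posterior / posterior.sum()

    def cec_order_quantity(belief, inventory, critical_ratio=0.8):
        """Certainty equivalent control: collapse the belief into a single demand
        distribution and order up to a newsvendor-style base-stock level."""
        mixed_pmf = belief @ demand_pmf              # belief-weighted demand distribution
        base_stock = int(np.searchsorted(np.cumsum(mixed_pmf), critical_ratio))
        return max(base_stock - inventory, 0)

    # Usage: start from a uniform belief, observe a demand of 2, then place an order.
    belief = np.array([0.5, 0.5])
    belief = update_belief(belief, observed_demand=2)
    print(belief, cec_order_quantity(belief, inventory=1))

The paper's point of comparison is that policies of the CEC type above ignore the uncertainty remaining in the belief vector when choosing the order quantity, whereas the alternative policies it studies account for that uncertainty.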

Suggested Citation

  • James T. Treharne & Charles R. Sox, 2002. "Adaptive Inventory Control for Nonstationary Demand and Partial Information," Management Science, INFORMS, vol. 48(5), pages 607-624, May.
  • Handle: RePEc:inm:ormnsc:v:48:y:2002:i:5:p:607-624
    DOI: 10.1287/mnsc.48.5.607.7807

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/mnsc.48.5.607.7807
    Download Restriction: no


    References listed on IDEAS

    1. Samuel Karlin, 1960. "Dynamic Inventory Policy with Varying Stochastic Demands," Management Science, INFORMS, vol. 6(3), pages 231-258, April.
    2. William S. Lovejoy, 1990. "Myopic Policies for Some Inventory Models with Uncertain Demand Distributions," Management Science, INFORMS, vol. 36(6), pages 724-738, June.
    3. Katy S. Azoury, 1985. "Bayes Solution to Dynamic Inventory Models Under Unknown Demand Distribution," Management Science, INFORMS, vol. 31(9), pages 1150-1160, September.
    4. James C. Bean & Robert L. Smith, 1984. "Conditions for the Existence of Planning Horizons," Mathematics of Operations Research, INFORMS, vol. 9(3), pages 391-401, August.
    5. George E. Monahan, 1982. "State of the Art---A Survey of Partially Observable Markov Decision Processes: Theory, Models, and Algorithms," Management Science, INFORMS, vol. 28(1), pages 1-16, January.
    6. William S. Lovejoy, 1987. "Some Monotonicity Results for Partially Observed Markov Decision Processes," Operations Research, INFORMS, vol. 35(5), pages 736-743, October.
    7. Jing-Sheng Song & Paul H. Zipkin, 1996. "Managing Inventory with the Prospect of Obsolescence," Operations Research, INFORMS, vol. 44(1), pages 215-222, February.
    8. Thomas E. Morton & William E. Wecker, 1977. "Discounting, Ergodicity and Convergence for Markov Decision Processes," Management Science, INFORMS, vol. 23(8), pages 890-900, April.
    9. White, Chelsea C. & White, Douglas J., 1989. "Markov decision processes," European Journal of Operational Research, Elsevier, vol. 39(1), pages 1-16, March.
    10. Chuanpu Hu & William S. Lovejoy & Steven L. Shafer, 1996. "Comparison of Some Suboptimal Control Policies in Medical Drug Therapy," Operations Research, INFORMS, vol. 44(5), pages 696-709, October.
    11. Ravi Anupindi & Thomas E. Morton & David Pentico, 1996. "The Nonstationary Stochastic Lead-Time Inventory Problem: Near-Myopic Bounds, Heuristics, and Testing," Management Science, INFORMS, vol. 42(1), pages 124-129, January.
    12. William S. Lovejoy, 1993. "Suboptimal Policies, with Bounds, for Parameter Adaptive Decision Processes," Operations Research, INFORMS, vol. 41(3), pages 583-599, June.
    13. Richard D. Smallwood & Edward J. Sondik, 1973. "The Optimal Control of Partially Observable Markov Processes over a Finite Horizon," Operations Research, INFORMS, vol. 21(5), pages 1071-1088, October.
    14. Chelsea C. White & William T. Scherer, 1989. "Solution Procedures for Partially Observed Markov Decision Processes," Operations Research, INFORMS, vol. 37(5), pages 791-797, October.
    15. Graves, Stephen C., 1997. "A single-item inventory model for a non-stationary demand process," Working Paper WP 3944-97, Massachusetts Institute of Technology (MIT), Sloan School of Management.
    16. Narendra Agrawal & Stephen A. Smith, 1996. "Estimating negative binomial demand for retail inventory management with unobservable lost sales," Naval Research Logistics (NRL), John Wiley & Sons, vol. 43(6), pages 839-861, September.
    17. Jing-Sheng Song & Paul Zipkin, 1993. "Inventory Control in a Fluctuating Demand Environment," Operations Research, INFORMS, vol. 41(2), pages 351-370, April.
    18. G. Hadley & T. M. Whitin, 1961. "An Optimal Final Inventory Model," Management Science, INFORMS, vol. 7(2), pages 179-183, January.
    19. Abbas A. Kurawarwala & Hirofumi Matsuo, 1996. "Forecasting and Inventory Management of Short Life-Cycle Products," Operations Research, INFORMS, vol. 44(1), pages 131-150, February.
    20. Chelsea C. White & William T. Scherer, 1994. "Finite-Memory Suboptimal Design for Partially Observed Markov Decision Processes," Operations Research, INFORMS, vol. 42(3), pages 439-455, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yossi Aviv & Amit Pazgal, 2005. "A Partially Observed Markov Decision Process for Dynamic Pricing," Management Science, INFORMS, vol. 51(9), pages 1400-1416, September.
    2. Hao Zhang, 2010. "Partially Observable Markov Decision Processes: A Geometric Technique and Analysis," Operations Research, INFORMS, vol. 58(1), pages 214-228, February.
    3. Yanling Chang & Alan Erera & Chelsea White, 2015. "Value of information for a leader–follower partially observed Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 129-153, December.
    4. Yossi Aviv, 2003. "A Time-Series Framework for Supply-Chain Inventory Management," Operations Research, INFORMS, vol. 51(2), pages 210-227, April.
    5. Yanling Chang & Alan Erera & Chelsea White, 2015. "A leader–follower partially observed, multiobjective Markov game," Annals of Operations Research, Springer, vol. 235(1), pages 103-128, December.
    6. Chernonog, Tatyana & Avinadav, Tal & Ben-Zvi, Tal, 2016. "A two-state partially observable Markov decision process with three actions," European Journal of Operational Research, Elsevier, vol. 254(3), pages 957-967.
    7. Xiaomei Ding & Martin L. Puterman & Arnab Bisi, 2002. "The Censored Newsvendor and the Optimal Acquisition of Information," Operations Research, INFORMS, vol. 50(3), pages 517-527, June.
    8. Gen Sakoda & Hideki Takayasu & Misako Takayasu, 2019. "Data Science Solutions for Retail Strategy to Reduce Waste Keeping High Profit," Sustainability, MDPI, vol. 11(13), pages 1-30, June.
    9. Arnab Bisi & Maqbool Dada, 2007. "Dynamic learning, pricing, and ordering by a censored newsvendor," Naval Research Logistics (NRL), John Wiley & Sons, vol. 54(4), pages 448-461, June.
    10. Iida, Tetsuo, 1999. "The infinite horizon non-stationary stochastic inventory problem: Near myopic policies and weak ergodicity," European Journal of Operational Research, Elsevier, vol. 116(2), pages 405-422, July.
    11. Larson, C. Erik & Olson, Lars J. & Sharma, Sunil, 2001. "Optimal Inventory Policies when the Demand Distribution Is Not Known," Journal of Economic Theory, Elsevier, vol. 101(1), pages 281-300, November.
    12. Yossi Aviv & Awi Federgruen, 2001. "Design for Postponement: A Comprehensive Characterization of Its Benefits Under Unknown Demand Distributions," Operations Research, INFORMS, vol. 49(4), pages 578-598, August.
    13. Serin, Yasemin, 1995. "A nonlinear programming model for partially observable Markov decision processes: Finite horizon case," European Journal of Operational Research, Elsevier, vol. 86(3), pages 549-564, November.
    14. Xiangwen Lu & Jing-Sheng Song & Amelia Regan, 2006. "Inventory Planning with Forecast Updates: Approximate Solutions and Cost Error Bounds," Operations Research, INFORMS, vol. 54(6), pages 1079-1097, December.
    15. Glenn, David & Bisi, Arnab & Puterman, Martin L., 2004. "The Bayesian Newsvendors in Supply Chains with Unobserved Lost Sales," Working Papers 04-0110, University of Illinois at Urbana-Champaign, College of Business.
    16. Katy S. Azoury & Julia Miyaoka, 2009. "Optimal Policies and Approximations for a Bayesian Linear Regression Inventory Model," Management Science, INFORMS, vol. 55(5), pages 813-826, May.
    17. Satya S. Malladi & Alan L. Erera & Chelsea C. White, 2023. "Inventory control with modulated demand and a partially observed modulation process," Annals of Operations Research, Springer, vol. 321(1), pages 343-369, February.
    18. Nicholas C. Petruzzi & Maqbool Dada, 2002. "Dynamic pricing and inventory control with learning," Naval Research Logistics (NRL), John Wiley & Sons, vol. 49(3), pages 303-325, April.
    19. Abhijit Gosavi, 2009. "Reinforcement Learning: A Tutorial Survey and Recent Advances," INFORMS Journal on Computing, INFORMS, vol. 21(2), pages 178-192, May.
    20. Woonghee Tim Huh & Paat Rusmevichientong, 2009. "A Nonparametric Asymptotic Analysis of Inventory Planning with Censored Demand," Mathematics of Operations Research, INFORMS, vol. 34(1), pages 103-123, February.
