Adaptive stochastic resource control: A machine learning approach
Author: Csaji, BC; Monostori, L
Source Title: JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH
Publisher: AI ACCESS FOUNDATION
University of Melbourne Author/s: CSAJI, BALAZS
Affiliation: Electrical and Electronic Engineering
Document Type: Journal Article
Citation: Csaji, B. C. & Monostori, L. (2008). Adaptive stochastic resource control: A machine learning approach. JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 32, pp. 453-486. https://doi.org/10.1613/jair.2548.
Access Status: This item is currently not available from this repository.
<jats:p>The paper investigates stochastic resource allocation problems with scarce, reusable resources and non-preemptive, time-dependent, interconnected tasks. This approach is a natural generalization of several standard resource management problems, such as scheduling and transportation problems. First, reactive solutions are considered and defined as control policies of suitably reformulated Markov decision processes (MDPs). We argue that this reformulation has several favorable properties: it has finite state and action spaces, it is aperiodic (hence all policies are proper), and the space of control policies can be safely restricted. Next, approximate dynamic programming (ADP) methods, such as fitted Q-learning, are suggested for computing an efficient control policy. In order to maintain the cost-to-go function compactly, two representations are studied: hash tables and support vector regression (SVR), in particular nu-SVRs. Several additional improvements are also investigated, such as the application of limited-lookahead rollout algorithms in the initial phases, action space decomposition, task clustering, and distributed sampling. Finally, experimental results on both benchmark and industry-related data are presented.</jats:p>
Keywords: Artificial Intelligence and Image Processing