Title: Adaptive maintenance policies for aging devices using a Markov decision process
Keywords: condition monitoring (CM); Markov decision processes
Citation: Abeygunawardane, S.K., Jirutitijaroen, P., Xu, H. (2013). Adaptive maintenance policies for aging devices using a Markov decision process. IEEE Transactions on Power Systems 28 (3): 3194-3203. ScholarBank@NUS Repository. https://doi.org/10.1109/TPWRS.2012.2237042
Abstract: In competitive environments, most equipment is operated close to or at its limits, and as a result, maintenance schedules may be affected by system conditions. In this paper, we propose a Markov decision process (MDP) that allows greater flexibility in conducting maintenance. The proposed MDP model is based on a state transition diagram in which inspection and maintenance (I&M) delay times are explicitly incorporated. The model can be solved efficiently to determine adaptive maintenance policies. This formulation successfully combines the long-term aging process with more frequently observed short-term changes in equipment condition. We demonstrate the applicability of the proposed model using I&M data from local transformers. © 1969-2012 IEEE.
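The abstract describes determining maintenance policies by solving an MDP over deterioration states. The sketch below is a hypothetical illustration of that general approach using value iteration on a toy four-state aging model; the states, costs, and transition probabilities are invented for demonstration, and the paper's actual model additionally encodes inspection and maintenance delay times in the state space.

```python
import numpy as np

# Toy deterioration states: 0 = new, 1 = minor wear, 2 = major wear, 3 = failed
# Actions: 0 = do nothing, 1 = maintain (assumed to renew the device to state 0)
P = np.zeros((2, 4, 4))
P[0] = [[0.90, 0.10, 0.00, 0.00],    # aging transitions under "do nothing"
        [0.00, 0.85, 0.15, 0.00],
        [0.00, 0.00, 0.80, 0.20],
        [0.00, 0.00, 0.00, 1.00]]    # failed state is absorbing
P[1] = [[1.0, 0.0, 0.0, 0.0]] * 4    # maintenance restores state 0

# Invented per-step costs: operating/failure cost vs. maintenance cost
cost = np.array([[0.0, 0.0, 5.0, 100.0],
                 [10.0, 10.0, 10.0, 50.0]])

def value_iteration(P, cost, gamma=0.95, tol=1e-8):
    """Minimize expected discounted cost; return (value, policy)."""
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        Q = cost + gamma * P @ V          # Q[a, s] for every action/state pair
        V_new = Q.min(axis=0)             # act greedily with respect to cost
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmin(axis=0)
        V = V_new

V, policy = value_iteration(P, cost)
print(policy)  # a threshold policy: maintain once wear becomes severe
```

With these illustrative numbers the optimal policy is a threshold rule (do nothing in good states, maintain in degraded ones), which is the typical structure of such maintenance MDPs.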
Source Title: IEEE Transactions on Power Systems
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.