Please use this identifier to cite or link to this item: https://doi.org/10.1109/TPWRS.2012.2237042
Title: Adaptive maintenance policies for aging devices using a Markov decision process
Authors: Abeygunawardane, S.K.
Jirutitijaroen, P. 
Xu, H. 
Keywords: Asset management
backward induction
condition monitoring (CM)
maintenance
Markov decision processes
transformers
Issue Date: 2013
Citation: Abeygunawardane, S.K., Jirutitijaroen, P., Xu, H. (2013). Adaptive maintenance policies for aging devices using a Markov decision process. IEEE Transactions on Power Systems 28(3): 3194-3203. ScholarBank@NUS Repository. https://doi.org/10.1109/TPWRS.2012.2237042
Abstract: In competitive environments, most equipment is operated closer to or at its limits, and as a result maintenance schedules may be affected by system conditions. In this paper, we propose a Markov decision process (MDP) that allows greater flexibility in conducting maintenance. The proposed MDP model is based on a state transition diagram in which inspection and maintenance (I&M) delay times are explicitly incorporated. The model can be solved efficiently to determine adaptive maintenance policies. This formulation successfully combines the long-term aging process with more frequently observed short-term changes in equipment condition. We demonstrate the applicability of the proposed model using I&M data of local transformers. © 1969-2012 IEEE.
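Note: The abstract and keywords refer to a finite-horizon MDP solved by backward induction to obtain maintenance policies. As a rough illustration only (not the authors' model), the sketch below shows generic backward induction over a small maintenance MDP; the states, actions, transition probabilities, costs, and horizon are hypothetical placeholders and are not the paper's data.

```python
# Minimal backward-induction sketch for a finite-horizon maintenance MDP.
# All states, actions, transition probabilities, and costs are assumed
# placeholder values, not the model or transformer data from the paper.
import numpy as np

states = ["good", "degraded", "failed"]          # hypothetical condition states
actions = ["do_nothing", "maintain", "replace"]  # hypothetical decisions
T = 12                                           # planning horizon (e.g., months)

# P[a][s, s'] = probability of moving from state s to s' under action a (assumed)
P = {
    "do_nothing": np.array([[0.90, 0.08, 0.02],
                            [0.00, 0.85, 0.15],
                            [0.00, 0.00, 1.00]]),
    "maintain":   np.array([[0.98, 0.02, 0.00],
                            [0.70, 0.28, 0.02],
                            [0.00, 0.00, 1.00]]),
    "replace":    np.array([[1.00, 0.00, 0.00],
                            [1.00, 0.00, 0.00],
                            [1.00, 0.00, 0.00]]),
}

# Immediate cost C[a][s]: action cost plus a penalty while failed (assumed)
C = {
    "do_nothing": np.array([0.0, 0.0, 100.0]),
    "maintain":   np.array([5.0, 5.0, 100.0]),
    "replace":    np.array([40.0, 40.0, 60.0]),
}

# Backward induction: V[t, s] = min_a { C[a][s] + sum_s' P[a][s, s'] * V[t+1, s'] }
V = np.zeros((T + 1, len(states)))               # terminal values V[T, :] = 0
policy = np.empty((T, len(states)), dtype=object)
for t in range(T - 1, -1, -1):
    for s in range(len(states)):
        q = {a: C[a][s] + P[a][s] @ V[t + 1] for a in actions}
        best = min(q, key=q.get)
        policy[t, s] = best
        V[t, s] = q[best]

print("Decision at t=0 for a 'degraded' unit:", policy[0, states.index("degraded")])
```

With such assumed numbers, the recursion returns a stage-dependent policy table, i.e., the kind of adaptive maintenance policy the abstract describes.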
Source Title: IEEE Transactions on Power Systems
URI: http://scholarbank.nus.edu.sg/handle/10635/54903
ISSN: 0885-8950
DOI: 10.1109/TPWRS.2012.2237042
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.

SCOPUS™ Citations: 9 (checked on Dec 10, 2018)

Web of Science™ Citations: 7 (checked on Dec 10, 2018)

Page view(s): 43 (checked on Oct 20, 2018)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.