Implementing condition-based maintenance: optimizing maintenance decisions in multi-component systems using Markov Decision Processes
Abstract
Maintenance scheduling plays a pivotal role in many industrial areas, since unexpected failures result in costly actions to bring the system back to an operating state. An advanced maintenance policy is condition-based maintenance (CBM), which schedules maintenance actions according to data collected from system inspections. In this study, we present a realistic discretization method for a maintainable multi-component system that is subject to periodic inspection. We consider a CBM policy for the critical components of the system and an age-based maintenance policy for the non-critical components. We define a general cost structure that includes an operating cost given as a function of system reliability, and we explain how this operating cost must be assigned in discrete and continuous state spaces. We use a Markov Decision Process (MDP) to find the optimal maintenance policy for the discrete control problem. Using the MDP model, we prove that the threshold policy, the most well-known policy in the CBM literature, is not always optimal. Finally, we propose two policies, RL-KIT and RI-MIT, to implement the policy found by the MDP in the continuous environment. Using simulation, we show that either of these policies can be optimal, depending on the system of interest.
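To illustrate the kind of MDP formulation referred to in the abstract, the sketch below solves a toy single-component maintenance MDP by value iteration. The states, transition probabilities, costs, and discount factor are all assumptions made for illustration; they are not the thesis model, and the resulting toy policy says nothing about the non-threshold-optimality result mentioned above.

```python
# Minimal value-iteration sketch for a hypothetical maintenance MDP.
# All numbers below are illustrative assumptions, not the thesis model.
import numpy as np

# Discretized degradation states: 0 = as good as new, ..., 3 = failed
n_states = 4
actions = ["do_nothing", "replace"]

# Assumed transition matrices P[a][s, s'] and one-step costs c[a][s]
P = {
    "do_nothing": np.array([
        [0.7, 0.2, 0.1, 0.0],
        [0.0, 0.6, 0.3, 0.1],
        [0.0, 0.0, 0.5, 0.5],
        [0.0, 0.0, 0.0, 1.0],   # failed state stays failed until replaced
    ]),
    "replace": np.tile(np.array([1.0, 0.0, 0.0, 0.0]), (n_states, 1)),
}
c = {
    "do_nothing": np.array([0.0, 1.0, 3.0, 20.0]),  # operating cost grows with degradation
    "replace": np.full(n_states, 10.0),             # replacement cost
}

gamma = 0.95  # discount factor (assumed)

def value_iteration(tol=1e-8):
    """Return the optimal value function and policy for the toy MDP."""
    V = np.zeros(n_states)
    while True:
        # Q[a, s] = immediate cost + discounted expected future cost
        Q = np.stack([c[a] + gamma * P[a] @ V for a in actions])
        V_new = Q.min(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, [actions[i] for i in Q.argmin(axis=0)]
        V = V_new

V_opt, policy = value_iteration()
print("Optimal action by degradation state:", policy)
```

In a multi-component setting like the one studied in the thesis, the state would instead encode the condition of every critical component jointly, which is what makes the structure of the optimal policy less obvious than a simple per-component threshold.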