Robust inference for an interval-monitored step-stress experiment under proportional hazards
Abstract
Accelerated life tests (ALTs) play a crucial role in reliability analyses,
providing lifetime estimates of highly reliable products. Among ALTs, the
step-stress design increases the stress level at predefined times while
maintaining a constant stress level between successive changes. This approach
accelerates the occurrence of failures, reducing experimental duration and
cost. While many studies assume a specific parametric form for the lifetime
distribution, in certain applications a general form satisfying mild
assumptions is preferable. The proportional hazards model assumes that the
applied stresses act multiplicatively on the hazard rate, so the hazard
function factors into two components: one capturing the effect of the stress
and the other representing the baseline hazard. In this work we examine two
particular forms of the baseline hazard, namely, linear and quadratic. Moreover, certain
experiments may face practical constraints that make continuous monitoring of
devices infeasible. Instead, the devices under test are inspected at
predetermined intervals, leading to interval-censored data. Meanwhile, recent works
have shown an appealing trade-off between the efficiency and robustness of
divergence-based estimators. This paper introduces the step-stress ALT model
under proportional hazards and presents a robust family of minimum density
power divergence estimators (MDPDEs) for estimating device reliability and
related lifetime characteristics such as mean lifetime and distributional
quantiles. The asymptotic distributions of these estimators are derived,
yielding approximate confidence intervals. Empirical evaluations through Monte
Carlo simulations demonstrate their performance in terms of robustness and
efficiency. Finally, an illustrative example shows the usefulness of the
model and the associated methods developed.
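As a sketch of the factorization described in the abstract, the proportional hazards structure can be written as follows. The notation here is generic and illustrative (the exponential stress-effect term is a common choice in proportional hazards modeling, not necessarily the paper's exact parametrization):

```latex
% Illustrative proportional hazards factorization under stress level x:
% the hazard splits into a stress factor and a baseline hazard.
% (Symbols are generic; the paper's own notation may differ.)
\[
  \lambda(t \mid x) \;=\; \lambda_0(t)\, e^{\beta x},
\]
% where \lambda_0(t) is the baseline hazard. The two baseline forms
% examined (linear and quadratic) could then take the shape
\[
  \lambda_0(t) = a + b\,t
  \quad \text{(linear)}, \qquad
  \lambda_0(t) = a + b\,t + c\,t^{2}
  \quad \text{(quadratic)},
\]
% with a, b, c baseline parameters to be estimated.
```

Under this sketch, a step-stress design changes x at the predefined stress-change times, so the hazard is multiplied by a different factor on each stress interval while the baseline term is shared across intervals.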