Journal article

A Conditioned Kullback-Leibler Divergence Measure through Compensator Processes and its Relationship to Cumulative Residual Inaccuracy Measure with Applications

Abstract

The Kullback-Leibler divergence measure between two random variables is useful in many contexts and has received considerable attention in numerous fields, including statistics, physics, probability, and reliability theory. A cumulative Kullback-Leibler divergence measure has recently been proposed as a suitable extension of this measure, obtained by replacing density functions with cumulative distribution functions. In this paper, we study a dynamic version of it by using a point process martingale approach conditioned on an observed past. Interestingly, this concept is identical to the cumulative residual inaccuracy measure introduced by Bueno and Balakrishnan (Probab Eng Sci 36:294-319, 2022). We also extend the concept of the relative cumulative residual information generating measure to a conditional one and obtain the Kullback-Leibler divergence measure through it. We further extend the new versions to non-explosive univariate point processes. In particular, we apply the conditioned Kullback-Leibler divergence to compare the measures of two non-explosive point processes. Several applications of the established results are presented, including applications to a general repair process, the minimal repair point process, coherent systems, Markov-modulated Poisson processes, and Markov chains.
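For orientation, the two divergences referenced above take the following standard forms in the literature (a sketch of one common convention; the paper's own notation and variant may differ). For densities f and g, the classical Kullback-Leibler divergence is

\[
\mathrm{KL}(f \,\|\, g) = \int f(x) \log \frac{f(x)}{g(x)}\, dx ,
\]

and, for non-negative random variables X and Y with survival functions \(\bar F = 1 - F\) and \(\bar G = 1 - G\), one cumulative (residual) version replaces the densities with survival functions:

\[
\mathrm{CKL}(F : G) = \int_0^{\infty} \bar F(x) \log \frac{\bar F(x)}{\bar G(x)}\, dx \;-\; \bigl[ E(X) - E(Y) \bigr].
\]

Non-negativity of \(\mathrm{CKL}\) follows pointwise from the elementary inequality \(x \log (x/y) \ge x - y\), since \(\int_0^\infty \bar F(x)\,dx = E(X)\) and \(\int_0^\infty \bar G(x)\,dx = E(Y)\).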

Authors

Costa Bueno VD; Balakrishnan N

Journal

Methodology and Computing in Applied Probability, Vol. 27, No. 2

Publisher

Springer Nature

Publication Date

June 1, 2025

DOI

10.1007/s11009-025-10153-x

ISSN

1387-5841
