Journal article

Bayesian estimation under Kullback-Leibler divergence measure based on exponential data

Abstract

In information theory, the Kullback-Leibler divergence is a commonly used measure of the distance between two probability distributions. In this paper, we apply the Kullback-Leibler divergence between the actual and an approximate distribution to derive a loss function. We then apply the derived loss function to the exponential distribution to find the Bayes estimate of the parameter θ, and compare it with the Bayes estimate obtained under the squared error loss function. Our comparisons between these two estimates are based on complete, type II censored, and type I censored data.
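The two ingredients named in the abstract can be sketched briefly. The snippet below is a minimal illustration, not the paper's method: it uses the closed-form Kullback-Leibler divergence between two exponential densities f(x; θ) = θe^(−θx), and the Bayes estimate of θ under squared error loss assuming a conjugate Gamma(a, b) prior (the prior choice and the complete-sample setting are my assumptions; the paper's KL-derived loss function is not reproduced here).

```python
import math
import random

def kl_exponential(t1, t2):
    # Closed-form KL divergence between Exp(t1) and Exp(t2),
    # with density f(x; t) = t * exp(-t * x):
    #   KL = ln(t1 / t2) + t2 / t1 - 1
    return math.log(t1 / t2) + t2 / t1 - 1.0

def bayes_estimate_sel(data, a=1.0, b=1.0):
    # Bayes estimate of theta under squared error loss with a
    # conjugate Gamma(a, b) prior (an assumed choice): the posterior
    # is Gamma(a + n, b + sum(x)), and the squared-error-loss Bayes
    # estimator is the posterior mean.
    n = len(data)
    return (a + n) / (b + sum(data))

if __name__ == "__main__":
    random.seed(0)
    theta = 2.0
    sample = [random.expovariate(theta) for _ in range(500)]
    print(kl_exponential(2.0, 2.0))   # 0.0 when the two distributions coincide
    print(bayes_estimate_sel(sample))  # close to the true theta = 2.0
```

The divergence vanishes exactly when the two rate parameters agree, and the posterior-mean estimator shrinks the maximum-likelihood estimate n / Σx toward the prior mean a / b; under type I or type II censoring the likelihood, and hence the posterior, would take a different form.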

Authors

Abufoudeh GK; Abu Awwad RR; Bdair OM

Journal

Investigacion Operacional, Vol. 40, No. 1, pp. 61–72

Publication Date

January 1, 2019

ISSN

0257-4306