Abstract:
In information theory, information is conceptualized as that which reduces uncertainty, with Shannon entropy serving as a fundamental tool for its quantification. In this paper, we study Bayesian estimation of the Shannon entropy of the generalized log-Moyal distribution under non-informative priors. First, we derive several important non-informative priors, including the Jeffreys prior, reference priors, and probability matching priors. Next, we verify the propriety of the resulting posterior distributions and establish the existence of the corresponding posterior expectations under these three priors. Through a comprehensive simulation study, we show that the proposed Bayesian method outperforms the maximum likelihood approach. Finally, the proposed approach is applied to a real dataset.
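For a continuous model with density $f(x;\theta)$ supported on $(0,\infty)$, the Shannon (differential) entropy estimated in this work takes the standard form
\[
H(f_{\theta}) \;=\; -\int_{0}^{\infty} f(x;\theta)\,\log f(x;\theta)\,\mathrm{d}x,
\]
where $f(x;\theta)$ here is a sketch placeholder for the generalized log-Moyal density with parameter vector $\theta$ as specified later in the paper.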