SHI Huijun, LI Qiang, HE Daojiang. Objective Bayesian Inference for the Shannon Entropy of the Generalized Log–Moyal Distribution[J]. Chinese Journal of Applied Probability and Statistics.

Objective Bayesian Inference for the Shannon Entropy of the Generalized Log–Moyal Distribution

In information theory, information is conceptualized as a factor that reduces uncertainty, and Shannon entropy serves as a fundamental tool for its quantification. In this paper, we study the estimation of the Shannon entropy of the generalized log–Moyal distribution using non-informative priors within a Bayesian framework. First, we derive several important non-informative priors, including the Jeffreys prior, reference priors, and probability matching priors. Next, we verify the propriety of the posterior distributions and prove the existence of the posterior expectations under these three priors. Through a comprehensive simulation study, we demonstrate that the proposed Bayesian method outperforms the maximum likelihood approach. Finally, the proposed approach is applied to analyze a real dataset.
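The quantity estimated throughout the paper is the Shannon entropy $H(X) = -\mathbb{E}[\log f(X)]$. As a minimal illustration of this definition only (not of the paper's objective Bayesian method), the sketch below estimates the entropy by Monte Carlo and checks it against a closed form. Since SciPy does not implement the generalized log–Moyal distribution, the standard Moyal distribution (`scipy.stats.moyal`) is used here as a stand-in; the same $-\mathbb{E}[\log f(X)]$ estimator applies to the generalized log–Moyal density given in the paper.

```python
# Hedged sketch: Monte Carlo estimate of Shannon entropy H(X) = -E[log f(X)].
# The standard Moyal distribution stands in for the generalized log-Moyal,
# which is not available in scipy.
import numpy as np
from scipy.stats import moyal

rng = np.random.default_rng(0)
n = 200_000

# Draw samples from the distribution and average -log f(X).
samples = moyal.rvs(size=n, random_state=rng)
H_mc = -np.mean(moyal.logpdf(samples))

# SciPy's closed-form differential entropy, for comparison.
H_exact = float(moyal.entropy())

print(f"Monte Carlo estimate: {H_mc:.4f}, exact: {H_exact:.4f}")
```

With this sample size the Monte Carlo estimate typically agrees with the closed form to about two decimal places; the paper's contribution is replacing such plug-in or likelihood-based estimates with posterior inference under non-informative priors.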
