Objective Bayesian Inference for the Shannon Entropy of the Generalized Log–Moyal Distribution

• Abstract: In information theory, information is conceptualized as a factor that reduces uncertainty, with Shannon entropy serving as a fundamental tool for its quantification. In this paper, we study the estimation of the Shannon entropy of the generalized log–Moyal distribution using non-informative priors within an objective Bayesian framework. First, we derive several important non-informative priors, including the Jeffreys prior, reference priors, and probability matching priors. Next, we prove that the posterior distributions under these three classes of priors are proper and that the corresponding posterior expectations exist. A comprehensive simulation study demonstrates that the proposed Bayesian method outperforms the maximum likelihood approach in terms of frequentist performance. Finally, the proposed approach is applied to a real dataset.
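As a rough illustration of the quantity studied in the paper, the following Python sketch estimates the Shannon entropy H = −E[log f(Y)] of the generalized log–Moyal distribution by Monte Carlo. It assumes the two-parameter density f(y; θ, σ) of Bhati and Ravi (2018) and uses the fact that the distribution can be sampled as Y = θ·W^(−σ) with W ~ χ²(1); the parameter values θ = 1, σ = 0.5 are illustrative, not taken from this paper.

```python
import numpy as np

# Generalized log-Moyal density (Bhati & Ravi 2018 parameterization, an
# assumption here):
#   f(y; theta, sigma) = (1 / (sqrt(2*pi) * sigma * y))
#                        * (theta / y)**(1 / (2*sigma))
#                        * exp(-0.5 * (theta / y)**(1 / sigma))

def glmd_logpdf(y, theta, sigma):
    z = (theta / y) ** (1.0 / sigma)          # z = (theta/y)^(1/sigma)
    return (-0.5 * np.log(2.0 * np.pi) - np.log(sigma) - np.log(y)
            + 0.5 * np.log(z) - 0.5 * z)

def glmd_sample(n, theta, sigma, rng):
    # If X follows a standard Moyal law, exp(-X) ~ chi-square(1), so
    # Y = theta * W**(-sigma) with W ~ chi2(1) follows the GLMD.
    w = rng.chisquare(df=1, size=n)
    return theta * w ** (-sigma)

def glmd_entropy_mc(theta, sigma, n=200_000, seed=0):
    # Plain Monte Carlo estimate of H = -E[log f(Y)].
    rng = np.random.default_rng(seed)
    y = glmd_sample(n, theta, sigma, rng)
    return -np.mean(glmd_logpdf(y, theta, sigma))

if __name__ == "__main__":
    print(glmd_entropy_mc(theta=1.0, sigma=0.5))
```

This plain Monte Carlo estimate is only a frequentist-style point evaluation at fixed (θ, σ); the paper's Bayesian estimators would instead average the entropy over the posterior of (θ, σ) under the derived non-informative priors.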

     
