Entropy Measures, Maximum Entropy Principle and Emerging Applications
Karmeshu
Springer, Oct 1, 2012 - 297 pages

The last two decades have witnessed an enormous growth in applications of the information-theoretic framework in the physical, biological, engineering and even social sciences. In particular, growth has been spectacular in the fields of information technology, soft computing, nonlinear systems and molecular biology. Claude Shannon laid the foundation of the field of information theory in 1948 in the context of communication theory. It is indeed remarkable that his framework is as relevant today as it was when he proposed it. Shannon died on Feb 24, 2001. Arun Netravali observes: "As if assuming that inexpensive, high-speed processing would come to pass, Shannon figured out the upper limits on communication rates. First in telephone channels, then in optical communications, and now in wireless, Shannon has had the utmost value in defining the engineering limits we face." Shannon introduced the concept of entropy. The notable feature of the entropy framework is that it enables quantification of the uncertainty present in a system. In many realistic situations one is confronted with only partial or incomplete information, in the form of moments or bounds on their values, and it is then required to construct a probabilistic model from this partial information. In such situations, the principle of maximum entropy provides a rational basis for constructing a probabilistic model. It is thus necessary and important to keep track of advances in the applications of the maximum entropy principle to ever-expanding areas of knowledge.
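To make the last point concrete, here is a minimal sketch of the maximum entropy principle applied to Jaynes's classic dice example: given only a mean value as partial information, the maximum-entropy distribution over the states has the exponential form p_i ∝ exp(λ·i), and λ is fixed by the moment constraint. The function name and the bisection solver below are illustrative choices, not from the book.

```python
import math

def maxent_die(target_mean, states=range(1, 7), tol=1e-12):
    """Maximum-entropy distribution over `states` subject to a mean constraint.

    The Lagrangian solution is p_i proportional to exp(lam * s_i); we find the
    multiplier lam by bisection, since the constrained mean is increasing in lam.
    """
    def mean_for(lam):
        weights = [math.exp(lam * s) for s in states]
        z = sum(weights)  # partition function
        return sum(s * w for s, w in zip(states, weights)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)

    weights = [math.exp(lam * s) for s in states]
    z = sum(weights)
    return [w / z for w in weights]

# A die whose long-run average is 4.5 instead of 3.5: the maxent
# probabilities increase monotonically toward the high faces.
p = maxent_die(4.5)
print([round(x, 4) for x in p])
```

With a mean of exactly 3.5 the same code recovers the uniform distribution (λ = 0), which illustrates why the principle is a rational default: it adds no structure beyond what the constraints force.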
Contents

References | 47
Facets of Generalized Uncertainty-based Information | 55
Application of the Maximum Information Entropy Principle | 79
VIII | 90
Geometric Ideas in Minimum Cross-Entropy | 103
References | 113
References | 133
Minimum Mean Deviation from the Steady-State Condi | 163
On the Utility of Different Entropy Measures in Image | 176
References | 197
References | 208
References | 225
References | 250
References | 258
Other editions - View all

Entropy Measures, Maximum Entropy Principle and Emerging Applications, Karmeshu. Limited preview - 2003
Entropy Measures, Maximum Entropy Principle and Emerging Applications, Karmeshu. No preview available - 2012
Entropy Measures, Maximum Entropy Principle and Emerging Applications, Karmeshu. No preview available - 2010