4 October 2019 17:00:59

 
Entropy, Vol. 21, Pages 969: Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information (Entropy)
 


Rényi-type generalizations of entropy, relative entropy and mutual information have found numerous applications throughout information theory and beyond. While there is consensus that the ways A. Rényi generalized entropy and relative entropy in 1961 are the “right” ones, several candidates have been put forth as possible mutual informations of order α. In this paper we lend further evidence to the notion that a Bayesian measure of statistical distinctness introduced by R. Sibson in 1969 (closely related to Gallager’s E0 function) is the most natural generalization, lending itself to explicit computation and maximization, as well as closed-form formulas. This paper considers general (not necessarily discrete) alphabets and extends the major analytical results on the saddle-point and saddle-level of the conditional relative entropy to the conditional Rényi divergence. Several examples illustrate the main application of these results, namely, the maximization of α-mutual information with and without constraints.
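For readers skimming the feed, the objects in the title have standard closed forms on discrete alphabets (the paper itself treats general alphabets). The definitions below are the standard ones from the literature, not excerpted from the paper: the Rényi divergence of order α, Sibson's α-mutual information (which is precisely a minimization of the conditional Rényi divergence of the title), and the usual correspondence with Gallager's E0 function:

    D_\alpha(P \,\|\, Q) = \frac{1}{\alpha-1} \log \sum_x P^\alpha(x)\, Q^{1-\alpha}(x), \qquad \alpha \in (0,1) \cup (1,\infty)

    I_\alpha(X;Y) = \min_{Q_Y} D_\alpha(P_{Y|X} \,\|\, Q_Y \,|\, P_X)
                  = \frac{\alpha}{\alpha-1} \log \sum_y \left( \sum_x P_X(x)\, P_{Y|X}^\alpha(y|x) \right)^{1/\alpha}

    E_0(\rho, P_X) = \rho\, I_{1/(1+\rho)}(X;Y)

As a minimal numerical sketch of the "explicit computation and maximization" the abstract mentions, here is an illustration assuming the discrete closed form above; the binary symmetric channel, the crossover probability, and the grid search are illustrative choices, not taken from the paper:

    import numpy as np

    def sibson_alpha_mi(p, W, alpha):
        """Sibson's alpha-mutual information (in nats) for a discrete channel.

        p     -- input distribution over X, shape (m,)
        W     -- channel matrix P_{Y|X}, rows indexed by x, shape (m, n)
        alpha -- order, alpha > 0 and alpha != 1
        """
        # inner[y] = sum_x p(x) * W(y|x)^alpha
        inner = p @ (W ** alpha)
        return (alpha / (alpha - 1.0)) * np.log(np.sum(inner ** (1.0 / alpha)))

    # Illustrative example: binary symmetric channel with crossover 0.1
    delta = 0.1
    W = np.array([[1 - delta, delta],
                  [delta, 1 - delta]])

    # Maximize I_alpha over Bernoulli(q) inputs by a coarse grid search
    alpha = 2.0
    grid = np.linspace(1e-3, 1 - 1e-3, 999)
    vals = [sibson_alpha_mi(np.array([q, 1 - q]), W, alpha) for q in grid]
    best = grid[int(np.argmax(vals))]
    print(f"best q = {best:.3f}, max I_alpha = {max(vals):.4f} nats")

For the BSC the uniform input is the maximizer by symmetry, so the grid search merely shows that the Sibson form is cheap to evaluate and optimize directly.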


Category: Informatics, Physics
 
Related Entropy articles:
Entropy, Vol. 21, Pages 970: Kolmogorov Complexity of Coronary Sinus Atrial Electrograms before Ablation Predicts Termination of Atrial Fibrillation after Pulmonary Vein Isolation (Entropy)
Entropy, Vol. 21, Pages 975: Information Theoretic Causal Effect Quantification (Entropy)
 
 

