Maximizing Entropy with an Expectation Constraint and One-Parameter Exponential Families of Distributions: A Reexamination


Our Price:   $261.36





Overview

The usual answer to the question “What probability distribution maximizes the entropy or differential entropy of a random variable X subject to the constraint that the expected value of a real-valued function g applied to X has a specified value µ?” is an exponential distribution (probability mass or probability density function), with g(x) in the exponent multiplied by a parameter λ, and with the parameter chosen so that the expected value of g(X) under the exponential distribution equals µ. The latter is called moment matching.

While it is well known that, when there are multiple expected value constraints, there are functions and expected value specifications for which moment matching is not possible, it is not well known that this can happen even with a single expected value constraint and a single parameter. This motivates the present monograph, whose goal is to reexamine the question posed above and to derive its answer in an accessible, self-contained and complete fashion.

The monograph also derives the maximum entropy/differential entropy when there is a constraint on the support of the probability distributions, when there is only a bound on the expected value, and when there is a variance constraint. Properties of the resulting maximum entropy/differential entropy as a function of µ are derived, such as its convexity and its monotonicities. Example functions are presented, including many for which moment matching is possible for all relevant values of µ, and some for which it is not. Indeed, there can be only subtle differences between the two kinds of functions.

As one-parameter exponential probability distributions play a dominant role, one section provides a self-contained discussion and derivation of their properties, such as the finiteness and continuity of the exponential normalizing constant (sometimes called the partition function) as λ varies; the finiteness, continuity, monotonicity and limits of the expected value of g(X) under the exponential distribution as λ varies; and similar issues for entropy and differential entropy. Most of these are needed in deriving the maximum entropy/differential entropy or the properties of the resulting function of µ.

Aside from addressing the question posed initially, this monograph can be viewed as a warmup for discussions of maximizing entropy/differential entropy with multiple expected value constraints and of multiparameter exponential families. It also provides a small taste of information geometry.
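To make the exponential form and the moment-matching condition concrete (the notation below is illustrative, not necessarily that of the monograph), for a distribution supported on a set S the maximizing candidate takes the form

\[
p_{\lambda}(x) \;=\; \frac{e^{\lambda g(x)}}{Z(\lambda)}, \qquad
Z(\lambda) \;=\; \sum_{x \in S} e^{\lambda g(x)} \quad \text{(an integral in the continuous case),}
\]

and moment matching asks for a value of λ satisfying

\[
\mathrm{E}_{\lambda}[g(X)] \;=\; \frac{d}{d\lambda} \ln Z(\lambda) \;=\; \mu .
\]

The cases the monograph highlights are, roughly, those relevant values of µ for which no λ satisfies this equation, i.e., for which moment matching is not possible.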

Full Product Details

Author:   David L. Neuhoff
Publisher:   now publishers Inc
Imprint:   now publishers Inc
Weight:   0.384kg
ISBN 13:   9781638284802
ISBN 10:   1638284806
Pages:   272
Publication Date:   09 December 2024
Audience:   Professional and scholarly, Professional & Vocational
Format:   Paperback
Publisher's Status:   Active
Availability:   In Print
This item will be ordered in for you from one of our suppliers. Upon receipt, we will promptly dispatch it to you. For in-store availability, please contact us.

Table of Contents

1. Introduction
2. Maximum Entropy with an Expected Value Constraint
3. Maximum Differential Entropy with an Expected Value Constraint
4. Working Simultaneously with Discrete and Continuous Cases
5. Extensions to Multiple Variables and Multiple Expected Value Constraints
6. Properties of One-Parameter Exponential-Form Probability Distributions
7. Properties of Hmax(AX, g, μ) and Hdmax(S, g, μ)
Appendix
References

