Overview

One of the central problems in communications theory is quantifying the ultimate data compression achievable, a question framed in terms of entropy. While differential entropy may appear to be a simple extension of discrete entropy to continuous random variables, it is in fact a more subtle measure that requires careful treatment. Handbook of Differential Entropy provides a comprehensive introduction to the subject for researchers and students in information theory. Unlike related books, it brings together the background material, derivations, and applications of differential entropy in one place.

The handbook first reviews probability theory, which underlies the definition of entropy. The authors then carefully develop the concept of entropy, introducing both discrete and differential entropy. They present detailed derivations of the differential entropy for numerous probability models, discuss the challenges in interpreting and deriving it, and show how differential entropy varies as a function of the model variance (see the short numerical sketch below). Turning to applications, the book describes common parametric and nonparametric estimators of differential entropy and their properties. It then uses the estimated differential entropy to estimate radar pulse delays when the corrupting noise is non-Gaussian and to develop measures of coupling between components of dynamical systems.

Full Product Details

Author: Joseph Victor Michalowicz, Jonathan M. Nichols, Frank Bucholtz
Publisher: CRC Press
Imprint: CRC Press
ISBN: 9781306124300
ISBN 10: 1306124301
Pages: 241
Publication Date: 01 January 2013
Audience: General/trade, General
Format: Undefined
Publisher's Status: Active
Availability: Available To Order. We have confirmation that this item is in stock with the supplier. It will be ordered in for you and dispatched immediately.
Countries Available: All regions
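As a rough illustration of the variance dependence and of nonparametric estimation mentioned in the overview, the sketch below compares the closed-form differential entropy of a Gaussian, h(X) = ½ ln(2πeσ²), against a simple histogram plug-in estimate computed from samples. This is a minimal example assuming NumPy; the function names and the histogram plug-in approach are illustrative choices, not the estimators developed in the handbook.

```python
import numpy as np

# Minimal sketch (not the handbook's estimators): closed-form differential
# entropy of a Gaussian versus a simple histogram plug-in estimate.

def gaussian_differential_entropy(sigma):
    """h(X) = 0.5 * ln(2 * pi * e * sigma^2) in nats, for X ~ N(mu, sigma^2)."""
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma ** 2)

def histogram_entropy_estimate(samples, bins=50):
    """Plug-in estimate: discrete entropy of the bin probabilities plus the
    average log bin width, approximating the differential entropy."""
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask])) + np.sum(p[mask] * np.log(widths[mask]))

rng = np.random.default_rng(0)
for sigma in (0.5, 1.0, 2.0):
    x = rng.normal(0.0, sigma, size=100_000)
    print(f"sigma={sigma}: exact={gaussian_differential_entropy(sigma):.4f}, "
          f"estimate={histogram_entropy_estimate(x):.4f}")
```

For σ = 1 the closed-form value is about 1.42 nats; the entropy grows with the log of the variance, and the plug-in estimate approaches the exact value as the sample size grows and the bin width shrinks.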