
Monday, January 22, 2007

Maximum entropy, the only reasonable distribution 

Information entropy can be seen as a numerical measure of how uninformative a probability distribution is, ranging from zero (completely informative: all probability on one outcome) to log m (completely uninformative: uniform over the m possible outcomes).
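The two extremes can be checked directly. A minimal sketch, using the natural logarithm (so the maximum is ln m); the distributions here are just illustrative examples:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution p."""
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

m = 4
point_mass = [1.0, 0.0, 0.0, 0.0]  # all probability on one outcome
uniform = [1.0 / m] * m            # no outcome favored over another

print(entropy(point_mass))  # 0.0        (completely informative)
print(entropy(uniform))     # log 4 ≈ 1.386  (completely uninformative)
```

Any other distribution on m outcomes falls strictly between these two values.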

By choosing to use the distribution with the maximum entropy allowed by our information, the argument goes, we are choosing the most uninformative distribution possible.

To choose a distribution with lower entropy would be to assume information we do not possess; to choose one with a higher entropy would violate the constraints of the information we do possess.

Thus the maximum entropy distribution is the only reasonable distribution.
