Entropy and rare events

Reblogged from the blog What's new.

Let $latex {X}&fg=000000$ and $latex {Y}&fg=000000$ be two random variables taking values in the same (discrete) range $latex {R}&fg=000000$, and let $latex {E}&fg=000000$ be some subset of $latex {R}&fg=000000$, which we think of as the set of “bad” outcomes for either $latex {X}&fg=000000$ or $latex {Y}&fg=000000$. If $latex {X}&fg=000000$ and $latex {Y}&fg=000000$ have the same probability distribution, then clearly

$latex \displaystyle {\bf P}( X \in E ) = {\bf P}( Y \in E ).&fg=000000$

In particular, if it is rare for $latex {Y}&fg=000000$ to lie in $latex {E}&fg=000000$, then it is also rare for $latex {X}&fg=000000$ to lie in $latex {E}&fg=000000$.
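Spelled out, the equality is a one-line computation: since $latex {R}&fg=000000$ is discrete, one can sum the pointwise agreement of the two distributions over the bad set:

$latex \displaystyle {\bf P}( X \in E ) = \sum_{x \in E} {\bf P}( X = x ) = \sum_{x \in E} {\bf P}( Y = x ) = {\bf P}( Y \in E ).&fg=000000$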

If $latex {X}&fg=000000$ and $latex {Y}&fg=000000$ do not have exactly the same probability distribution, but their probability distributions are close to each other in some sense, then we can expect an approximate version of the above statement to hold. For instance, from the definition of the total variation distance

$latex \displaystyle \delta(X,Y) := \sup_{E \subset R} \left| {\bf P}( X \in E ) - {\bf P}( Y \in E ) \right|&fg=000000$

between (the distributions of) $latex {X}&fg=000000$ and $latex {Y}&fg=000000$, we have

$latex \displaystyle {\bf P}( X \in E ) \leq {\bf P}( Y \in E ) + \delta(X,Y)&fg=000000$

for every $latex {E \subset R}&fg=000000$: if $latex {Y}&fg=000000$ rarely lies in $latex {E}&fg=000000$ and $latex {\delta(X,Y)}&fg=000000$ is small, then $latex {X}&fg=000000$ also rarely lies in $latex {E}&fg=000000$.
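As a numerical illustration, here is a minimal Python sketch (the distributions `p` and `q` and the helper names `tv_distance` and `prob` are made up for this example, not taken from the post) that checks the total variation bound over every event $latex {E \subset R}&fg=000000$ on a small finite range:

```python
from itertools import combinations

# Two made-up probability distributions on a common finite range R.
R = ["a", "b", "c", "d"]
p = {"a": 0.40, "b": 0.30, "c": 0.20, "d": 0.10}  # distribution of X
q = {"a": 0.38, "b": 0.32, "c": 0.19, "d": 0.11}  # distribution of Y

def tv_distance(p, q):
    """delta(X, Y) = (1/2) * sum over x in R of |p(x) - q(x)|."""
    return 0.5 * sum(abs(p[x] - q[x]) for x in R)

def prob(dist, event):
    """P(Z in E) for a discrete random variable Z with distribution dist."""
    return sum(dist[x] for x in event)

delta = tv_distance(p, q)

# delta is the supremum of |P(X in E) - P(Y in E)| over all events E,
# so the bound should hold for every subset of R; check all 2^|R| of them.
for k in range(len(R) + 1):
    for E in combinations(R, k):
        assert prob(p, E) <= prob(q, E) + delta + 1e-12
        assert prob(q, E) <= prob(p, E) + delta + 1e-12

print("total variation distance:", delta)  # 0.03 for these distributions
```

When $latex {\delta(X,Y) = 0}&fg=000000$ the two distributions coincide and the sketch reduces to the exact equality discussed above.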
