Engineering Analysis/Expectation and Entropy


Expectation

The expectation operator of a random variable is defined as:

E[x] = \int_{-\infty}^{\infty} x f_X(x) \, dx

This operator is very useful, and we can use it to derive the moments of the random variable.

Moments

A moment is a value that contains some information about the random variable. The n-th moment of a random variable is defined as:

E[x^n] = \int_{-\infty}^{\infty} x^n f_X(x) \, dx
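As an illustration, the n-th moment can be approximated by numerical integration (a minimal NumPy sketch; the standard exponential density f_X(x) = e^{-x} is an illustrative choice, for which the n-th moment is exactly n!):

```python
import numpy as np

# Approximate the n-th moment E[x^n] by midpoint-rule integration.
# The density f_X(x) = exp(-x), x >= 0, is an illustrative choice;
# its n-th moment is n! exactly.
def nth_moment(n, x_max=50.0, num=200_000):
    dx = x_max / num
    x = (np.arange(num) + 0.5) * dx    # midpoints of each subinterval
    f = np.exp(-x)                     # density f_X(x)
    return np.sum(x**n * f) * dx       # approximates the integral

print(nth_moment(1))  # ≈ 1.0 (the mean)
print(nth_moment(2))  # ≈ 2.0 (the second moment)
```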

Mean

The mean value, or "average value", of a random variable is defined as the first moment of the random variable:

E[x] = \mu_X = \int_{-\infty}^{\infty} x f_X(x) \, dx

We will use the Greek letter μ to denote the mean of a random variable.

Central Moments

A central moment is similar to a moment, but it is also dependent on the mean of the random variable:

E[(x - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x) \, dx

The first central moment is always zero, since E[x - \mu_X] = E[x] - \mu_X = 0.
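This can be checked numerically: subtracting the mean from a set of samples and averaging gives zero, up to floating-point rounding (a minimal NumPy sketch; the normal distribution here is an arbitrary illustrative choice):

```python
import numpy as np

# Monte Carlo check that the first central moment E[x - mu_X] is zero.
rng = np.random.default_rng(0)
samples = rng.normal(loc=3.0, scale=2.0, size=1_000_000)  # arbitrary choice

mu = samples.mean()
first_central = np.mean(samples - mu)
print(first_central)  # ≈ 0, up to floating-point rounding
```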

Variance

The variance of a random variable is defined as the second central moment:

E[(x - \mu_X)^2] = \sigma^2

The square root of the variance, \sigma, is known as the standard deviation of the random variable.

Mean and Variance

The mean and variance of a random variable are related by:

\sigma^2 = E[x^2] - \mu_X^2

This is an important relationship, and we will use it later.
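The relation \sigma^2 = E[x^2] - \mu_X^2 can be verified numerically against the direct definition of variance (a minimal NumPy sketch; the exponential samples are an illustrative choice):

```python
import numpy as np

# Monte Carlo check of sigma^2 = E[x^2] - mu_X^2 on sampled data.
rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)  # illustrative choice

mu = x.mean()
second_moment = np.mean(x**2)
var_from_relation = second_moment - mu**2       # E[x^2] - mu^2
var_direct = np.mean((x - mu)**2)               # second central moment

print(var_from_relation, var_direct)  # the two values agree
```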

Entropy

The entropy of a random variable X is defined as:

H[X] = E\left[\log \frac{1}{p(X)}\right]
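As a concrete illustration, the entropy of a discrete random variable can be computed directly from its probability mass function (a minimal NumPy sketch; using base-2 logarithms, an assumption here, gives the result in bits):

```python
import numpy as np

# Entropy H[X] = E[log2(1/p(X))] for a discrete distribution,
# computed directly from the probability mass function.
def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # treat 0 * log(0) as 0
    return np.sum(p * np.log2(1.0 / p))

# A fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))   # ≈ 0.469
```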
