# Variance

*Figure: density of a two-dimensional normally distributed random variable with different variances in the individual dimensions*

In statistics, the variance is a measure of dispersion, i.e. a measure of the deviation of a random variable $X$ from its expected value $\operatorname{E}(X)$. It generalizes the concept of the sum of squared deviations from the mean in a series of observations. The variance of the random variable $X$ is usually denoted $\operatorname{V}(X)$ or $\operatorname{Var}(X)$. Its disadvantage in practice is that it has a different unit than the data; this can be remedied by taking the square root of the variance, which yields the standard deviation.

## Definition

If $\mu = \operatorname{E}(X)$ is the expected value of the square-integrable random variable $X$, then the variance, for both discrete and continuous random variables, is

$$\operatorname{Var}(X) := \operatorname{V}(X) := \operatorname{E}\bigl((X-\mu)^2\bigr)$$
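As a sketch in plain Python, the definition can be evaluated directly for a discrete random variable. The distribution below is a hypothetical example, not taken from the article:

```python
# Variance as E((X - mu)^2) for a discrete random variable.
# The distribution below is a hypothetical example.
xs = [0, 1, 2]          # values of X
ps = [0.25, 0.5, 0.25]  # P(X = x)

mu = sum(x * p for x, p in zip(xs, ps))               # expected value E(X)
var = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))  # E((X - mu)^2)

print(mu, var)  # mu = 1.0, var = 0.5
```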

The variance is thus the second central moment of the random variable.

The variance is the average of the squared deviations of a statistical characteristic from its mean.

In statistics there is also the sample variance: the variance of observed values drawn as a sample from a population. In descriptive statistics it is used as a measure of the spread of the data; in inferential statistics it serves to estimate the unknown variance of the population. The sample variance is described in more detail under standard deviation, or under estimation and testing.
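The distinction between the sample variance (an estimator, dividing by $n-1$) and the population variance (dividing by $n$) can be illustrated with Python's `statistics` module; the observations below are a hypothetical example:

```python
# Sample variance vs. population variance on hypothetical observations.
import statistics

sample = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
s2 = statistics.variance(sample)   # sample variance (divides by n - 1)
p2 = statistics.pvariance(sample)  # population variance (divides by n)
print(s2, p2)  # s2 = 32/7 ≈ 4.571, p2 = 4.0
```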

## Arithmetic rules

### Shift theorem (Verschiebungssatz)

$$\operatorname{V}(X) = \operatorname{E}\left(\left(X - \operatorname{E}(X)\right)^2\right) = \operatorname{E}(X^2) - \left(\operatorname{E}(X)\right)^2$$
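The shift theorem can be verified numerically; as a sketch, both sides are computed for a hypothetical discrete distribution:

```python
# Verify the shift theorem V(X) = E(X^2) - (E(X))^2
# on a hypothetical discrete distribution.
xs = [-1, 0, 3]
ps = [0.2, 0.5, 0.3]

mu = sum(x * p for x, p in zip(xs, ps))                   # E(X)
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))  # definition
ex2 = sum(x ** 2 * p for x, p in zip(xs, ps))             # E(X^2)
var_shift = ex2 - mu ** 2                                 # shift theorem

assert abs(var_def - var_shift) < 1e-12
```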

### Linear transformation

$$\operatorname{V}(aX+b) = a^2 \operatorname{V}(X)$$
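The rule says that an additive shift $b$ does not affect the variance, while a scale factor $a$ enters squared. A quick check, using the article's discrete distribution and hypothetical constants $a$ and $b$:

```python
# Check Var(aX + b) = a^2 Var(X): the shift b drops out.
xs = [-1, 1, 2]
ps = [0.5, 0.3, 0.2]   # distribution from the article's example
a, b = 3.0, 5.0        # hypothetical constants

def var(values, probs):
    mu = sum(v * p for v, p in zip(values, probs))
    return sum((v - mu) ** 2 * p for v, p in zip(values, probs))

lhs = var([a * x + b for x in xs], ps)  # Var(aX + b), computed directly
rhs = a ** 2 * var(xs, ps)              # a^2 Var(X)
assert abs(lhs - rhs) < 1e-9
```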

### Variance of sums of random variables

$$\operatorname{V}\left(\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n a_i^2 \operatorname{V}(X_i) + 2\sum_{i=1}^n \sum_{j=i+1}^n a_i a_j \operatorname{Cov}(X_i, X_j)$$
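For $n = 2$ the formula reduces to $\operatorname{V}(a_1 X_1 + a_2 X_2) = a_1^2 \operatorname{V}(X_1) + a_2^2 \operatorname{V}(X_2) + 2 a_1 a_2 \operatorname{Cov}(X_1, X_2)$, which can be sketched on a small hypothetical joint distribution:

```python
# Verify the sum formula for n = 2 on a hypothetical joint distribution.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}  # P(X1=i, X2=j)
a1, a2 = 2.0, -1.0  # hypothetical coefficients

e = lambda f: sum(f(x1, x2) * p for (x1, x2), p in joint.items())
mu1, mu2 = e(lambda x1, x2: x1), e(lambda x1, x2: x2)
var1 = e(lambda x1, x2: (x1 - mu1) ** 2)                 # V(X1)
var2 = e(lambda x1, x2: (x2 - mu2) ** 2)                 # V(X2)
cov = e(lambda x1, x2: (x1 - mu1) * (x2 - mu2))          # Cov(X1, X2)

mu_s = e(lambda x1, x2: a1 * x1 + a2 * x2)
lhs = e(lambda x1, x2: (a1 * x1 + a2 * x2 - mu_s) ** 2)  # direct
rhs = a1**2 * var1 + a2**2 * var2 + 2 * a1 * a2 * cov    # formula
assert abs(lhs - rhs) < 1e-12
```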

## Examples

### Discrete random variable

Given is a discrete random variable $X$ with the probabilities

| $i$      | 1    | 2   | 3   |
|----------|------|-----|-----|
| $x_i$    | −1   | 1   | 2   |
| $f(x_i)$ | 0.5  | 0.3 | 0.2 |

The variance is then computed as

$$\operatorname{V}(X) = (-1-0.2)^2 \cdot 0.5 + (1-0.2)^2 \cdot 0.3 + (2-0.2)^2 \cdot 0.2 = 1.56$$

where the expected value

$$\operatorname{E}(X) = -1 \cdot 0.5 + 1 \cdot 0.3 + 2 \cdot 0.2 = 0.2$$

has been used. With the shift theorem (Verschiebungssatz) one obtains accordingly

$$\operatorname{V}(X) = (-1)^2 \cdot 0.5 + 1^2 \cdot 0.3 + 2^2 \cdot 0.2 - 0.2^2 = 1.56\,.$$
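The worked example can be replicated in a few lines of Python, computing the variance both from the definition and via the shift theorem:

```python
# Recompute the article's discrete example both ways.
xs = [-1, 1, 2]
ps = [0.5, 0.3, 0.2]

mu = sum(x * p for x, p in zip(xs, ps))                     # E(X) = 0.2
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))    # definition
var_shift = sum(x**2 * p for x, p in zip(xs, ps)) - mu**2   # shift theorem
assert abs(mu - 0.2) < 1e-9
assert abs(var_def - 1.56) < 1e-9
assert abs(var_shift - 1.56) < 1e-9
```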

### Continuous random variable

A continuous random variable has the density function

$$f(x) = \begin{cases} \frac{1}{x} & \text{if } 1 \le x \le e \\ 0 & \text{otherwise} \end{cases}$$

With the expected value

$$\operatorname{E}(X) = \int_1^e x \cdot \frac{1}{x}\,dx = e - 1$$

the variance is computed with the help of the shift theorem (Verschiebungssatz) as

$$\operatorname{V}(X) = \int_{-\infty}^{\infty} x^2 \cdot f(x)\,dx - \bigl(\operatorname{E}(X)\bigr)^2 = \int_1^e x^2 \cdot \frac{1}{x}\,dx - (e-1)^2$$

$$\qquad = \left[\frac{x^2}{2}\right]_1^e - (e-1)^2 = \frac{e^2}{2} - \frac{1}{2} - (e-1)^2 \approx 0.242$$
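As a numerical cross-check of this continuous example, the two integrals can be approximated with a simple midpoint rule (no external libraries; the integration helper below is a sketch, not a library routine):

```python
# Numerically check the continuous example: f(x) = 1/x on [1, e].
import math

def integrate(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

e = math.e
ex = integrate(lambda x: x * (1 / x), 1, e)      # E(X)   = e - 1
ex2 = integrate(lambda x: x**2 * (1 / x), 1, e)  # E(X^2) = (e^2 - 1)/2
var = ex2 - ex**2                                # shift theorem

assert abs(ex - (e - 1)) < 1e-6
assert abs(var - (e**2 / 2 - 0.5 - (e - 1)**2)) < 1e-6  # ≈ 0.242
```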