Parameters and Families of Distributions

Recall the definition of Bernoulli random variables. If the only possible outcomes are 0 and 1, then $p(0)=.8$ forces $p(1)=1-p(0)=.2$. In general, $p(1)=\alpha$ and $p(0)=1-\alpha$, where $0<\alpha<1$. Since the PMF $p(x)$ of the discrete random variable $X$ depends on $\alpha$, we write it as $p(x;\alpha)$:

$$p(x;\alpha)=\begin{cases}1-\alpha & x=0\\ \alpha & x=1\end{cases}$$
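As a minimal sketch of the piecewise PMF above (the function name `bernoulli_pmf` and the sample value $\alpha=.2$ are illustrative, not from the text):

```python
def bernoulli_pmf(x, alpha):
    """PMF of a Bernoulli random variable with parameter alpha.

    p(1) = alpha, p(0) = 1 - alpha, for 0 < alpha < 1.
    """
    if x == 1:
        return alpha
    if x == 0:
        return 1 - alpha
    return 0.0  # any other outcome has probability 0

# Each choice of alpha picks out a different member of the family.
print(bernoulli_pmf(0, 0.2))  # 0.8
print(bernoulli_pmf(1, 0.2))  # 0.2
```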
parameter, family

Suppose $p(x)$ depends on one or more quantities, each of which can be assigned any one of a number of possible values, with each different value determining a different probability distribution. Each such quantity is a parameter of the distribution. The collection of all probability distributions for different values of the parameter(s) is called a family of probability distributions.

For example, $\alpha$ above is a parameter. Each $0<\alpha<1$ determines a different member of the family of distributions, such as $p(x;.2)$ or $p(x;.08)$. Every Bernoulli PMF has this same form, so the collection is called the family of Bernoulli distributions.

Example

Let $A$ be an event and define $\theta=P(A)$. Suppose we run independent trials until $A$ occurs, and let $X$ be the discrete random variable counting the number of trials up to and including the first occurrence of $A$. Then $\theta$ is a parameter of the probability distribution (PMF) of $X$. The experimental outcomes are $A$, $A'A$, $A'A'A$, $\ldots$, with corresponding $X$ values $1, 2, 3, \ldots$ Then:

$$\begin{aligned}
p(1;\theta)&=P(X=1)=P(A)=\theta\\
p(2;\theta)&=P(X=2)=P(A'A)=P(A')P(A)=(1-\theta)\theta\\
p(3;\theta)&=P(X=3)=P(A'A'A)=P(A')^2P(A)=(1-\theta)^2\theta
\end{aligned}$$

In general:

$$p(x;\theta)=(1-\theta)^{x-1}\theta$$

for $x=1,2,3,\ldots$
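A quick sketch of this PMF, with a numerical check that the probabilities over $x=1,2,3,\ldots$ sum to 1 (the function name `geometric_pmf`, the value $\theta=0.3$, and the truncation point are illustrative assumptions):

```python
def geometric_pmf(x, theta):
    """P(X = x): first occurrence of A on trial x, where theta = P(A)."""
    return (1 - theta) ** (x - 1) * theta

# The tail (1 - theta)^x vanishes geometrically, so a truncated sum
# over x = 1, ..., 199 should already be essentially 1.
theta = 0.3
total = sum(geometric_pmf(x, theta) for x in range(1, 200))
print(round(total, 6))  # approximately 1.0
```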

What about the expected value in this case? We compute:

$$E(X)=\sum_{x=1}^{\infty}x\,p(x;\theta)=\sum_{x=1}^{\infty}x(1-\theta)^{x-1}\theta=\theta\sum_{x=1}^{\infty}\left[-\frac{d}{d\theta}(1-\theta)^{x}\right]=-\theta\frac{d}{d\theta}\sum_{x=1}^{\infty}(1-\theta)^{x}=-\theta\frac{d}{d\theta}\frac{1-\theta}{\theta}=-\theta\left(-\frac{1}{\theta^{2}}\right)=\frac{1}{\theta}$$
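As a sanity check on the derivation, the truncated series for $E(X)$ can be compared numerically with $1/\theta$ (the function name `geometric_mean_numeric`, the truncation length, and the sample values of $\theta$ are illustrative assumptions):

```python
def geometric_mean_numeric(theta, n_terms=10_000):
    """Approximate E(X) = sum of x * (1-theta)^(x-1) * theta by truncation."""
    return sum(x * (1 - theta) ** (x - 1) * theta
               for x in range(1, n_terms + 1))

# The truncated sum should agree with the closed form 1/theta.
for theta in (0.2, 0.5, 0.9):
    print(theta, geometric_mean_numeric(theta), 1 / theta)
```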

This family of distributions is called the family of geometric distributions, because computations with it, like the expected value above, reduce to summing a geometric series.