Recall the definition of Bernoulli random variables. If we only have events $A$ and $A'$, then $X = 1$ if $A$ occurs and $X = 0$ if $A'$ occurs. In general, $P(X = 1) = \alpha$ and $P(X = 0) = 1 - \alpha$, where $0 \le \alpha \le 1$. Since $p(x)$ for some discrete random variable depends on $\alpha$, we write it as $p(x; \alpha)$:

$$p(x; \alpha) = \begin{cases} 1 - \alpha & \text{if } x = 0 \\ \alpha & \text{if } x = 1 \\ 0 & \text{otherwise} \end{cases}$$
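As a quick sanity check, here is a minimal simulation sketch; the helper name `bernoulli_pmf` and the choice $\alpha = 0.3$ are illustrative assumptions, not from the original notes:

```python
import random

def bernoulli_pmf(x, alpha):
    """pmf p(x; alpha) of a Bernoulli random variable (illustrative helper)."""
    if x == 1:
        return alpha
    if x == 0:
        return 1 - alpha
    return 0.0

# Empirical frequency of X = 1 should approach alpha.
alpha = 0.3
n = 100_000
samples = [1 if random.random() < alpha else 0 for _ in range(n)]
print(sum(samples) / n)          # roughly 0.3
print(bernoulli_pmf(1, alpha))   # exactly 0.3
```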
parameter, family
Suppose $p(x)$ depends on one or more quantities, each of which can be assigned any one of a number of possible values, with each different value determining a different probability distribution. Each such quantity is a parameter of the distribution. The collection of all probability distributions for different values of the parameter(s) is called a family of probability distributions.
For example, $\alpha$ above is a parameter. Each value of $\alpha$ determines a different member of the family, such as $p(x; 0.3)$ or $p(x; 0.7)$. Every Bernoulli pmf has the same form, so this collection is the family of Bernoulli distributions.
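The sketch below makes the family idea concrete by evaluating two members of the family; the parameter values $0.3$ and $0.7$ are arbitrary illustrations:

```python
def bernoulli_pmf(x, alpha):
    """p(x; alpha): the same functional form for every member of the family."""
    return alpha if x == 1 else (1 - alpha if x == 0 else 0.0)

# Different alpha values pick out different members of the Bernoulli family.
for alpha in (0.3, 0.7):
    print(f"alpha={alpha}:", {x: bernoulli_pmf(x, alpha) for x in (0, 1)})
```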
Example
Let $A$ be an event and define $p = P(A)$. Say $X$ is some discrete random variable (like a number of occurrences). Then $p$ is a parameter of the probability distribution, the pmf of $X$. Experimental outcomes are repeated independent trials, with $X$ counting the trials before (and including) the one on which we first hit $A$. Then:

$$p(1; p) = P(X = 1) = p, \qquad p(2; p) = (1 - p)\,p, \qquad p(3; p) = (1 - p)^2\,p, \;\dots$$
In general:

$$p(x; p) = (1 - p)^{x - 1}\,p$$

for $x = 1, 2, 3, \dots$ (and $p(x; p) = 0$ otherwise).
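A small simulation can check this pmf against empirical frequencies. This is a minimal sketch, assuming $p = 0.25$ and a simple trial loop; the names `geometric_sample` and `geometric_pmf` are illustrative, not from the notes:

```python
import random
from collections import Counter

def geometric_sample(p):
    """Number of independent trials up to and including the first hit of A."""
    x = 1
    while random.random() >= p:  # each trial misses A with probability 1 - p
        x += 1
    return x

def geometric_pmf(x, p):
    """p(x; p) = (1 - p)^(x - 1) * p for x = 1, 2, 3, ..."""
    return (1 - p) ** (x - 1) * p if x >= 1 else 0.0

p, n = 0.25, 100_000
counts = Counter(geometric_sample(p) for _ in range(n))
for x in range(1, 6):
    print(x, counts[x] / n, geometric_pmf(x, p))  # empirical vs. theoretical
```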
What about the expected value in this case? See:

$$E(X) = \sum_{x=1}^{\infty} x\,(1 - p)^{x - 1}\,p = p \sum_{x=1}^{\infty} x\,(1 - p)^{x - 1} = p \cdot \frac{1}{\bigl(1 - (1 - p)\bigr)^2} = \frac{1}{p},$$

where the inner sum comes from differentiating the geometric series $\sum_{x=0}^{\infty} q^x = \frac{1}{1 - q}$ term by term and setting $q = 1 - p$.
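As a quick numerical check of $E(X) = 1/p$, again as a hedged sketch with an assumed $p = 0.25$:

```python
import random

p, n = 0.25, 200_000

def sample():
    # One geometric draw: count trials until the first hit of A.
    x = 1
    while random.random() >= p:
        x += 1
    return x

# The average of many draws should approach 1/p.
print(sum(sample() for _ in range(n)) / n)  # roughly 4.0
print(1 / p)                                # exact: 4.0
```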
These families of distributions are called the geometric distributions because working with the pmf comes down to solving a geometric series; for instance, the probabilities sum to 1 via

$$\sum_{x=1}^{\infty} (1 - p)^{x - 1}\,p = p \cdot \frac{1}{1 - (1 - p)} = 1.$$