Random Variable:

A probability distribution is a table or an equation that links each outcome of a statistical experiment with its probability of occurrence. Consider a simple experiment in which we flip a coin two times. An outcome of the experiment might be the number of heads that we see in two coin flips. The table below associates each possible outcome with its probability.

Number of heads    Probability
      0               0.25
      1               0.50
      2               0.25

Suppose the random variable X is defined as the number of heads that result from two coin flips. Then, the above table represents the probability distribution of the random variable X.
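As a quick check, the table above can be reproduced by enumerating the four equally likely outcomes of two fair coin flips; this is a minimal sketch, and the names `outcomes` and `distribution` are just illustrative:

```python
from itertools import product
from collections import Counter

# Enumerate the four equally likely outcomes of two fair coin flips
# and tally the probability of each possible head count.
outcomes = list(product("HT", repeat=2))              # ('H','H'), ('H','T'), ...
counts = Counter(flips.count("H") for flips in outcomes)
distribution = {heads: c / len(outcomes) for heads, c in sorted(counts.items())}
print(distribution)   # {0: 0.25, 1: 0.5, 2: 0.25}
```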


Types of Random Variables:

  1. Bernoulli Random Variable.

  2. Binomial Random Variable.

  3. Geometric Random Variable.

  4. Negative Binomial Random Variable.

  5. Hypergeometric Random Variable.

  6. Poisson Random Variable.



Bernoulli Random Variable :

               A Bernoulli random variable takes the value 1 with success probability p and the value 0 with failure probability q = 1 − p. It can be used, for example, to represent the toss of a coin, where "1" is defined to mean "heads" and "0" is defined to mean "tails" (or vice versa).
The probability mass function f of this distribution is
 f(k;p) = \begin{cases} p & \text{if }k=1, \\[6pt]
1-p & \text {if }k=0.\end{cases}

The expected value of a Bernoulli random variable X is \operatorname{E}(X)=p, and its variance is \operatorname{Var}(X)=p(1-p).
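The PMF and the two moments above can be written down directly; this is a minimal sketch, and the function names `bernoulli_pmf` and `bernoulli_moments` are just illustrative:

```python
def bernoulli_pmf(k, p):
    """PMF of a Bernoulli(p) variable: p for k = 1, 1 - p for k = 0."""
    if k == 1:
        return p
    if k == 0:
        return 1 - p
    raise ValueError("k must be 0 or 1")

def bernoulli_moments(p):
    """Return (E[X], Var(X)) = (p, p * (1 - p))."""
    return p, p * (1 - p)
```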


Binomial Random Variable :

               A binomial random variable with parameters n and p is the number of successes in a sequence of n independent yes/no experiments, each of which yields success with probability p. A success/failure experiment is also called a Bernoulli experiment or Bernoulli trial; when n = 1, the binomial distribution is a Bernoulli distribution. The binomial distribution is the basis for the popular binomial test of statistical significance.


In general, if the random variable X follows the binomial distribution with parameters n and p, we write X ~ B(n, p). The probability of getting exactly k successes in n trials is given by the probability mass function:
 f(k;n,p) = \Pr(X = k) = {n\choose k}p^k(1-p)^{n-k}
for k = 0, 1, 2, ..., n, where
{n\choose k}=\frac{n!}{k!(n-k)!}
is the binomial coefficient, hence the name of the distribution. The formula can be understood as follows: we want exactly k successes (probability p^k) and n − k failures (probability (1-p)^{n-k}). However, the k successes can occur anywhere among the n trials, and there are {n\choose k} different ways of distributing k successes in a sequence of n trials.

If X ~ B(n, p), that is, X is a binomially distributed random variable, n being the total number of experiments and p the probability of each experiment yielding a successful result, then the expected value of X is
 \operatorname{E}[X] = np ,
(For example, if n=100, and p=1/4, then the average number of successful results will be 25)
and the variance
 \operatorname{Var}[X] = np(1 - p).
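The PMF above translates directly into code using Python's built-in binomial coefficient; a minimal sketch, with `binomial_pmf` as an illustrative name. Note that the n = 2, p = 0.5 case reproduces the coin-flip table at the top of the article:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr(X = k) = C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The two-coin-flip table is the n = 2, p = 0.5 case:
print([binomial_pmf(k, 2, 0.5) for k in range(3)])   # [0.25, 0.5, 0.25]
```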

Geometric Random Variable :

         
              A geometric random variable X is the number of Bernoulli trials needed to get the first success.
Its distribution gives the probability that the first success requires exactly k independent trials, each with success probability p. If the probability of success on each trial is p, then the probability that the kth trial is the first success is
\Pr(X = k) = (1-p)^{k-1}\,p\,
for k = 1, 2, 3, ....
The expected value of a geometrically distributed random variable X is 1/p and the variance is (1 − p)/p^2:
\mathrm{E}(X) = \frac{1}{p},
 \qquad\mathrm{var}(X) = \frac{1-p}{p^2}.
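The PMF above is a one-liner in code; a minimal sketch, with `geometric_pmf` as an illustrative name. Summing k · Pr(X = k) numerically recovers the mean 1/p:

```python
def geometric_pmf(k, p):
    """Pr(X = k) = (1 - p)**(k - 1) * p for k = 1, 2, 3, ..."""
    if k < 1:
        raise ValueError("k starts at 1 (trial index of the first success)")
    return (1 - p)**(k - 1) * p

# For p = 0.25 the mean should be 1/p = 4; the tail beyond k = 2000 is negligible.
mean = sum(k * geometric_pmf(k, 0.25) for k in range(1, 2000))
```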


Negative Binomial Random Variable :

            The negative binomial distribution is a discrete probability distribution of the number of successes in a sequence of independent and identically distributed Bernoulli trials before a specified (non-random) number of failures (denoted r) occurs. For example, if we define a "1" as failure and all non-"1"s as successes, and we throw a die repeatedly until "1" appears for the third time (r = 3 failures), then the number of non-"1"s that appeared follows a negative binomial distribution.

The probability mass function of the negative binomial distribution is

    f(k; r, p) \equiv \Pr(X = k) = {k+r-1 \choose k} p^k(1-p)^r \quad\text{for }k = 0, 1, 2, \dots
Here the quantity in parentheses is the binomial coefficient, and is equal to

    {k+r-1 \choose k} = \frac{(k+r-1)!}{k!\,(r-1)!} = \frac{(k+r-1)(k+r-2)\cdots(r)}{k!}.
 The expected value of a negative binomial random variable X is \operatorname{E}[X] = \frac{pr}{1-p}.
The variance of a negative binomial random variable is \operatorname{Var}[X] = \frac{pr}{(1-p)^2}.
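The PMF above (counting successes k before the r-th failure) can be sketched as follows; `negative_binomial_pmf` is an illustrative name. Summing k · Pr(X = k) numerically recovers the mean pr/(1 − p):

```python
from math import comb

def negative_binomial_pmf(k, r, p):
    """Pr(X = k) = C(k + r - 1, k) * p**k * (1 - p)**r
    (number of successes k before the r-th failure)."""
    return comb(k + r - 1, k) * p**k * (1 - p)**r

# For r = 3, p = 0.4 the mean should be p*r/(1-p) = 2.0.
mean = sum(k * negative_binomial_pmf(k, 3, 0.4) for k in range(300))
```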

Hypergeometric Random Variable :

            Hypergeometric distribution is a discrete probability distribution that describes the probability of k successes in n draws, without replacement, from a finite population of size N containing exactly K successes, wherein each draw is either a success or a failure. This is in contrast to the binomial distribution, which describes the probability of k successes in n draws with replacement.

A random variable X follows the hypergeometric distribution if its probability mass function (pmf) is given by:
 P(X=k) = {{{K \choose k} {{N-K} \choose {n-k}}}\over {N \choose n}}
Where:
  • N is the population size
  • K is the number of success states in the population
  • n is the number of draws
  • k is the number of successes
  • \textstyle {a \choose b} is a binomial coefficient
The pmf is positive when \max(0, n+K-N) \leq k \leq \min(K,n).

 The expected value of a hypergeometric random variable X is \operatorname{E}[X] = n\frac{K}{N}.


The variance is
 \operatorname{Var}[X] = n\,\frac{K}{N}\,\frac{N-K}{N}\,\frac{N-n}{N-1}.
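The PMF above is a ratio of binomial coefficients; a minimal sketch, with `hypergeometric_pmf` as an illustrative name. Summing k · Pr(X = k) numerically recovers the mean nK/N:

```python
from math import comb

def hypergeometric_pmf(k, N, K, n):
    """Pr(X = k) = C(K, k) * C(N - K, n - k) / C(N, n)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Drawing n = 4 from a population of N = 20 with K = 5 successes:
# the mean should be n*K/N = 1.0.
mean = sum(k * hypergeometric_pmf(k, 20, 5, 4) for k in range(5))
```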





Poisson Random Variable :

              The Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time and/or space if these events occur with a known average rate and independently of the time since the last event. The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area or volume.

A discrete random variable X  is said to have a Poisson distribution with parameter λ > 0, if, for k = 0, 1, 2, …, the probability mass function of X  is given by:
f(k; \lambda) = \Pr(X = k) = \frac{\lambda^k e^{-\lambda}}{k!},
where
  • e is Euler's number (e = 2.71828...)
  • k! is the factorial of k.
The positive real number λ is equal to the expected value of X and also to its variance
\lambda=\operatorname{E}(X)=\operatorname{Var}(X).
The Poisson distribution can be applied to systems with a large number of possible events, each of which is rare. How many such events will occur during a fixed time interval? Under the right circumstances, this is a random number with a Poisson distribution.
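The Poisson PMF is also a one-liner; a minimal sketch, with `poisson_pmf` as an illustrative name. Numerically, both the mean and the variance come out equal to λ, as stated above:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Pr(X = k) = lam**k * exp(-lam) / k!."""
    return lam**k * exp(-lam) / factorial(k)

# For lam = 3 the mean should be 3; the tail beyond k = 100 is negligible.
mean = sum(k * poisson_pmf(k, 3.0) for k in range(100))
```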

Reference : These definitions and formulas were taken from en.wikipedia.org






