Continuous Variables and Probability
If a variable is continuous, between any two possible values of the variable are an infinite number of other possible values, even though we cannot distinguish some of them from one another in practice. It is therefore not possible to count the number of possible values of a continuous variable. In this situation calculus provides the logical means of finding probabilities.
Probability from the Probability Density Function
(a) Basic Relationships
The probability that a continuous random variable will be between limits a and b is given by an integral, or the area under a curve.
Pr[a < X < b] = ∫ₐᵇ f(x) dx
where a is the lower limit and b the upper limit of integration.
The function f(x) is called a probability density function. The probability that the continuous random variable, X, is between a and b corresponds to the area under the curve representing the probability density function between the limits a and b. This is the cross-hatched area in the figure. Compare this relation with the relation for the probability that a discrete random variable is between limits a and b, which is the sum of the probability functions for all values of the variable X between a and b.
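As a sketch of this relation, the integral can be approximated numerically. The density f(x) = 3x² on the interval [0, 1] is an assumed example, not a distribution from the text; the exact probability between 0.2 and 0.5 is 0.5³ − 0.2³ = 0.117.

```python
def prob_between(f, a, b, n=100_000):
    """Approximate Pr[a < X < b] = integral of the density f from a to b,
    using the trapezoidal rule with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Assumed example density: f(x) = 3x^2 on [0, 1], zero elsewhere.
f = lambda x: 3 * x ** 2

p = prob_between(f, 0.2, 0.5)   # area under the curve between 0.2 and 0.5
```

The same function with limits 0 and 1 returns the total area under the curve, which for a valid density is 1.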
The cumulative distribution function for a continuous random variable is given by the integral of the probability density function between x = –∞ and x = x1, where x1 is a limiting value. This corresponds to the area under the curve from –∞ to x1. The cumulative distribution function is often represented by F(x1) or F(x).
Σ p(xᵢ) ∼ ∫ f(x) dx
(the sum of probability functions for a discrete random variable corresponds to the integral of the probability density function for a continuous random variable).
Since any probability must be between 0 and 1, as we have seen previously, the probability density function can never be negative:
f(x) ≥ 0
Furthermore, since the variable must take some value, the total area under the density curve must equal 1:
∫₋∞^∞ f(x) dx = 1
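These defining properties of a density, nonnegativity and total area equal to 1, can be verified numerically for a candidate function. The density f(x) = 3x² on [0, 1] is again an assumed example.

```python
def integrate(f, a, b, n=100_000):
    """Trapezoidal approximation to the integral of f from a to b."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

# Assumed example density on [0, 1].
f = lambda x: 3 * x ** 2

# Check f(x) >= 0 on a grid of points across the interval.
nonnegative = all(f(i / 1000) >= 0 for i in range(1001))

# Check that the total area under the curve is (approximately) 1.
total_area = integrate(f, 0.0, 1.0)
```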
Expected Value and Variance
The mathematical expectation or expected value of a discrete random variable is a mean result for an infinitely large number of trials, so it is a mean value that would be approximated by a large but finite number of trials. This holds also for a continuous random variable. For a discrete random variable the expected value is found by adding up the product of each possible outcome with its probability. For a continuous random variable the sum becomes an integral:
μ = E(X) = ∫₋∞^∞ x f(x) dx
The variance of a discrete random variable is the expectation of (xᵢ − μ)². This carries over to a continuous random variable and becomes:
σ² = E[(X − μ)²] = ∫₋∞^∞ (x − μ)² f(x) dx
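The mean and variance of a continuous variable can be approximated by the same numerical integration. For the assumed example density f(x) = 3x² on [0, 1], the exact values are μ = 3/4 and σ² = 3/80 = 0.0375.

```python
def integrate(f, a, b, n=100_000):
    """Trapezoidal approximation to the integral of f from a to b."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

# Assumed example density on [0, 1].
f = lambda x: 3 * x ** 2

mu = integrate(lambda x: x * f(x), 0.0, 1.0)               # E(X)
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0.0, 1.0)  # E[(X - mu)^2]
```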
Extension: Useful Continuous Distributions
Other useful continuous distributions include the uniform distribution, the exponential distribution, the Weibull distribution, the beta distribution, and the gamma distribution.
The uniform distribution is very simple. Its probability density function is a constant in a particular interval (say for a < X < b) and zero outside that interval.
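Because the uniform density is constant, probabilities reduce to ratios of interval lengths, so no numerical integration is needed. The interval (2, 10) below is an arbitrary assumed example.

```python
def uniform_pdf(x, a, b):
    """Density of the uniform distribution on (a, b): constant inside, zero outside."""
    return 1.0 / (b - a) if a < x < b else 0.0

def uniform_prob(lo, hi, a, b):
    """Pr[lo < X < hi] for X uniform on (a, b): overlap length divided by (b - a)."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0.0) / (b - a)

p = uniform_prob(3, 7, 2, 10)   # (7 - 3) / (10 - 2) = 0.5
```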
The exponential distribution has the following probability density function:
f(x) = λ e^(−λx) for x ≥ 0, and f(x) = 0 for x < 0,
where λ is a constant closely related to the mean and standard deviation: both are equal to 1/λ.
The exponential distribution is related to the Poisson distribution, although the exponential distribution is continuous whereas the Poisson distribution is discrete. The Poisson distribution gives the probabilities of various numbers of random events in a given interval of time or space when the possible number of discrete events is much larger than the average number of events in the given interval. If the variable is time, the exponential distribution gives the probability distribution of the time between successive random events for the same conditions as apply to the Poisson distribution.
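This relationship can be sketched by simulation: if the waiting times between successive events are exponential with rate λ, then the number of events in an interval of length T follows a Poisson distribution with mean λT. The values λ = 2.0 and T = 1.0 are arbitrary assumptions for illustration.

```python
import random

random.seed(42)
lam, T, trials = 2.0, 1.0, 50_000

counts = []
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)   # exponential waiting time, mean 1/lam
        if t > T:
            break
        n += 1
    counts.append(n)                   # events observed in (0, T)

# The average count over many trials should be close to lam * T = 2.0.
mean_count = sum(counts) / trials
```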
The Weibull distribution, the beta distribution, and the gamma distribution are more complicated, mainly because each has two independent parameters. Both the Weibull distribution and the gamma distribution give the exponential distribution with particular choices of one of their two parameters.
Reliability analysis is applied in many areas of engineering, including the design of mechanical devices, electronic equipment, and power transmission systems. Although failures of the electricity supply to factories, offices, and residences were once frequent, they have become much less so as engineers have devoted more attention to reliability.