MANE 3332.05
Lecture 9
Agenda
- Complete Chapter 3 lectures
- Start Chapter 4 lectures
- Binomial Quiz (assigned 9/25/2025, due 9/30/2025)
- Poisson Practice Problems (assigned 9/30/2025, due 10/2/2025)
- Schedule
Handouts
| Tuesday Date and Topic(s) | Thursday Date and Topic(s) |
|---|---|
| 9/30: Poisson Distribution, Chapter 4 | 10/2: Standard Normal Distribution |
| 10/7: Normal Distribution | 10/9: Exponential and Weibull Distributions |
| 10/14: Chapter 5 (not on midterm) | 10/16: Midterm Review |
| **10/21:** Midterm Exam | 10/23: Continue Part Two |
Poisson Process
- The number of events over an interval (such as time) is a discrete random variable that is often modeled by the Poisson distribution
- The length of the interval between events is often modeled by the (continuous) exponential distribution
- These two distributions are related
Poisson Process
Assume that the events occur at random throughout the interval. If the interval can be partitioned into subintervals of small enough length such that
- the probability of more than one count in a subinterval is zero,
- the probability of one count in a subinterval is the same for all subintervals and proportional to the length of the subinterval, and
- the count in each subinterval is independent of other subintervals,
then the random experiment is called a Poisson process
Poisson Distribution
If the mean number of counts in the interval is \(\lambda>0\), the random variable \(X\) that equals the number of counts in the interval has a Poisson distribution with parameter \(\lambda\)
- The Poisson PMF is $$f(x)=\frac{e^{-\lambda}\lambda^x}{x!},\;\; \mbox{ for }x=0,1,2,\ldots$$
- The mean of a Poisson random variable is \(\mu=E(X)=\lambda\)
- The variance of a Poisson random variable is \(\sigma^2=V(X)=\lambda\)
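As a quick numeric check (a Python sketch; the rate \(\lambda=2.3\) is an arbitrary illustration, not from the text), the PMF sums to 1 and its mean and variance both equal \(\lambda\):

```python
import math

def poisson_pmf(x, lam):
    """P(X = x) = e^(-lam) * lam^x / x! for a Poisson random variable."""
    return math.exp(-lam) * lam**x / math.factorial(x)

lam = 2.3          # arbitrary illustrative mean count per interval
xs = range(100)    # truncated sum; the tail beyond 100 is negligible here
total = sum(poisson_pmf(x, lam) for x in xs)                 # ~1
mean = sum(x * poisson_pmf(x, lam) for x in xs)              # ~lam
var = sum((x - mean)**2 * poisson_pmf(x, lam) for x in xs)   # ~lam
```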
Poisson Practice Problems
Poisson Example

Chapter 4 Content
Continuous Random Variable
- The probability distribution of a random variable \(X\) is a description of the set of probabilities associated with the possible values of \(X\)
- Density functions are commonly used in engineering to describe physical systems.
- A probability density function \(f(x)\) can be used to describe the probability distribution of a continuous random variable
Probability Density Function
- Notice the difference from a discrete random variable
- The formal definition of a probability density function is a function such that
  - \(f(x)\geq 0\)
  - \(\int_{-\infty}^\infty f(x)\,dx=1\)
  - \(P(a\leq X\leq b)=\int_a^bf(x)\,dx\)
Probability Density Function
- An interesting property of continuous random variables is that \(P(X=x)=0\) for any single value \(x\)
- Does not apply to discrete random variables
- Explanation: \(P(X=x)=\int_x^x f(u)\,du=0\), since an integral over an interval of zero width is zero
Cumulative Distribution Function
The cumulative distribution function for a continuous random variable \(X\) is $$F(x)=P(X\leq x)=\int_{-\infty}^x f(u)\,du,\;\; \mbox{ for }-\infty<x<\infty$$
Mean and Variance of a Continuous Random Variable
- The mean value of a continuous random variable is defined to be $$\mu=E(X)=\int_{-\infty}^\infty x\,f(x)\,dx$$
- The variance of a continuous random variable is defined to be $$\sigma^2=V(X)=\int_{-\infty}^\infty (x-\mu)^2 f(x)\,dx$$
- The standard deviation of \(X\) is \(\sigma=\sqrt{\sigma^2}\)
Continuous Uniform Distribution
The continuous uniform distribution is the analog of the discrete uniform distribution in that all outcomes are equally likely to occur
- A continuous uniform distribution for the random variable \(X\) has a probability density function $$f(x)=\frac{1}{b-a},\;\; \mbox{ for }a\leq x\leq b$$
- The mean of the uniform distribution is \(\mu=E(X)=\frac{a+b}{2}\)
- The variance of \(X\) is \(\sigma^2=V(X)=\frac{(b-a)^2}{12}\)
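A small numeric illustration (Python sketch; the endpoints \(a=2\), \(b=10\) are arbitrary choices):

```python
# Continuous uniform on [a, b]: f(x) = 1/(b - a) for a <= x <= b.
a, b = 2.0, 10.0  # arbitrary illustrative endpoints

def uniform_pdf(x):
    return 1.0 / (b - a) if a <= x <= b else 0.0

mean = (a + b) / 2        # (2 + 10)/2 = 6
var = (b - a) ** 2 / 12   # 64/12
# Probabilities are interval lengths times the constant density:
p = (5.0 - 3.0) * uniform_pdf(4.0)  # P(3 <= X <= 5) = 2/8 = 0.25
```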
Uniform Problem 4.1.6
- See page P-25
The Normal Distribution
- The normal distribution is the most widely used and important distribution in statistics.
- You must master this!
- A random variable \(X\) with probability density function $$f(x)=\frac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}},\;\; \mbox{ for }-\infty<x<\infty$$ has a normal distribution with parameters \(\mu\) and \(\sigma\), where \(-\infty<\mu<\infty\) and \(\sigma>0\)
- The normal distribution with parameters \(\mu\) and \(\sigma\) is denoted \(N(\mu,\sigma^2)\)
- An interesting website is http://www.seeingstatistics.com/seeingTour/normal/shape3.html
Mean and Variance of the Normal Distribution
- The mean of the normal distribution with parameters \(\mu\) and \(\sigma\) is \(E(X)=\mu\)
- The variance of the normal distribution with parameters \(\mu\) and \(\sigma\) is \(V(X)=\sigma^2\)
Central Limit Theorem
- Brief introduction
- States that the distribution of the average of independent random variables will tend towards a normal distribution as \(n\) gets large
- More details later
Calculating Normal Probabilities
- Is somewhat complicated
- The difficulty is that \(\int_a^b f(x)\,dx\) does not have a closed-form solution
- Probabilities must be found by numerical techniques (tabled values)
- It is very helpful to draw a sketch of the desired probabilities (I require this)
The Standard Normal Distribution
- A normal random variable with \(\mu=0\) and \(\sigma=1\) is called a standard normal random variable
- A standard normal random variable is denoted as \(Z\)
- The cumulative distribution function for a standard normal is defined to be the function $$\Phi(z)=P(Z\leq z)$$
- These probabilities are contained in Appendix Table III on pages A-8 and A-9
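Since the standard normal CDF has no closed form, software evaluates it numerically. A sketch using the error function from Python's standard library reproduces the tabled values:

```python
import math

def phi(z):
    """Cumulative standard normal, Phi(z) = P(Z <= z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p1 = phi(0.0)    # 0.5 by symmetry
p2 = phi(1.96)   # about 0.9750, matching the table
p3 = phi(-1.0)   # about 0.1587
```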
Cumulative Standard Normal Distribution

Cumulative Standard Normal Distribution

Standard Normal Problem

Standard Normal Practice Problems
Standardizing (the \(z\)-transform)
- Suppose \(X\) is a normal random variable with mean \(\mu\) and variance \(\sigma^2\)
- The \(z\)-value is \(z=(x-\mu)/\sigma\)
- This result allows the standard normal tables to be used to calculate probabilities for any normal distribution, since \(P(X\leq x)=P\left(Z\leq \frac{x-\mu}{\sigma}\right)\)
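A Python sketch of the transform (the values \(\mu=10\), \(\sigma=2\), \(x=13\) are arbitrary illustrations):

```python
import math

def phi(z):
    """Cumulative standard normal via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2), by standardizing z = (x - mu)/sigma."""
    return phi((x - mu) / sigma)

# Illustrative: X ~ N(10, 2^2), so P(X <= 13) = Phi(1.5), about 0.9332
p = normal_cdf(13.0, 10.0, 2.0)
```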
Normal Probability Problem

Normal Practice Problems
Normal Approximation to the Binomial Distribution
- If \(X\) is a binomial random variable, $$Z=\frac{X-np}{\sqrt{np(1-p)}}$$ is approximately a standard normal random variable. Consequently, probabilities computed from \(Z\) can be used to approximate probabilities for \(X\)
- Usually holds when \(np>5\) and \(n(1-p)>5\)
Problem

- How good are the approximations?
Continuity Correction Factor
- Is a method to improve the accuracy of the normal approximation to the binomial
- Examine Figure 6.22 from Walpole, Myers, Myers & Ye. Note that each rectangle is centered at \(x\) and extends from \(x-0.5\) to \(x+0.5\)
- This table should help formulate problems
| Binomial Probability | With Correction Factor | Normal Approximation |
|---|---|---|
| \(P(X\geq x)\) | \(P(X\geq x-0.5)\) | \(P\left(Z>\frac{x-0.5-np}{\sqrt{np(1-p)}}\right)\) |
| \(P(X\leq x)\) | \(P(X\leq x+0.5)\) | \(P\left(Z<\frac{x+0.5-np}{\sqrt{np(1-p)}}\right)\) |
| \(P(X=x)\) | \(P(x-0.5\leq X\leq x+0.5)\) | \(P\left(\frac{x-0.5-np}{\sqrt{np(1-p)}}<Z<\frac{x+0.5-np}{\sqrt{np(1-p)}}\right)\) |
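A quick check of the table's middle row (Python sketch; \(n=50\), \(p=0.4\) are arbitrary values satisfying \(np>5\) and \(n(1-p)>5\)):

```python
import math

def phi(z):
    """Cumulative standard normal via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def binom_cdf(x, n, p):
    """Exact P(X <= x) for a binomial(n, p) random variable."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

def normal_approx_le(x, n, p):
    """P(X <= x) via the normal approximation with continuity correction."""
    mu, sd = n * p, math.sqrt(n * p * (1 - p))
    return phi((x + 0.5 - mu) / sd)

n, p = 50, 0.4  # illustrative; np = 20 > 5 and n(1 - p) = 30 > 5
exact = binom_cdf(20, n, p)
approx = normal_approx_le(20, n, p)  # agrees with exact to about two decimals
```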
Normal Approximation - Figure

Rework Problem using Continuity Correction Factor
- Are the approximations improved?
Normal Approximation to the Poisson Distribution
- If \(X\) is a Poisson random variable with \(E(X)=\lambda\) and \(V(X)=\lambda\), $$Z=\frac{X-\lambda}{\sqrt{\lambda}}$$ is approximately a standard normal random variable.
Exponential Distribution
- The exponential distribution is widely used in the area of reliability and life-test data.
- Ostle et al. (1996) list the following applications of the exponential distribution:
  - the number of feet between two consecutive erroneous records on a computer tape,
  - the lifetime of a component of a particular device,
  - the lifetime of a radioactive material, and
  - the time to the next customer service call at a service desk
Exponential Distribution
- The PDF for an exponential distribution with parameter \(\lambda>0\) is $$f(x)=\lambda e^{-\lambda x},\;\; \mbox{ for }x\geq 0$$
- The mean of \(X\) is \(\mu=E(X)=1/\lambda\)
- The variance of \(X\) is \(\sigma^2=V(X)=1/\lambda^2\)

Note that other authors define \(f(x)=\frac{1}{\theta}e^{-x/\theta}\), where \(\theta=1/\lambda\). Either definition is acceptable; however, one must be aware of which definition is being used.
The Exponential CDF
The CDF for the exponential distribution is easy to derive: $$F(x)=P(X\leq x)=\int_0^x \lambda e^{-\lambda u}\,du=1-e^{-\lambda x},\;\; \mbox{ for }x\geq 0$$
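For example (Python sketch; the rate \(\lambda=0.5\) is an arbitrary illustration):

```python
import math

lam = 0.5  # arbitrary illustrative rate

def exp_cdf(x):
    """Exponential CDF: F(x) = 1 - exp(-lam*x) for x >= 0, else 0."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

p = exp_cdf(2.0)           # P(X <= 2) = 1 - e^(-1), about 0.6321
tail = 1.0 - exp_cdf(4.0)  # P(X > 4) = e^(-2), about 0.1353
```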
Problem 4-79

Lack of Memory Property
- The mathematical definition is $$P(X<t_1+t_2\mid X>t_1)=P(X<t_2)$$
- That is, "the probability of a failure time that is less than \(t_1+t_2\), given the failure time is greater than \(t_1\), is the probability that the item's failure time is less than \(t_2\)"
- This property is unique to the exponential distribution
- Often used to model the reliability of electronic components.
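The property can be verified numerically (Python sketch; the rate and times are arbitrary illustrations):

```python
import math

lam = 0.25         # arbitrary illustrative failure rate
t1, t2 = 3.0, 5.0  # arbitrary illustrative times

def surv(t):
    """Survivor function P(X > t) = exp(-lam*t) for the exponential."""
    return math.exp(-lam * t)

# P(X < t1 + t2 | X > t1) = [P(X > t1) - P(X > t1 + t2)] / P(X > t1)
lhs = (surv(t1) - surv(t1 + t2)) / surv(t1)
rhs = 1.0 - surv(t2)  # P(X < t2)
# lhs equals rhs: surviving to t1 does not change the remaining life distribution
```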
Problem 4-80

Relationship to the Poisson Distribution
- Let \(Y\) be a Poisson random variable with parameter \(\lambda\). Note: \(Y\) represents the number of occurrences per unit
- Let \(X\) be a random variable that records the time between occurrences for the same process as \(Y\)
- \(X\) has an exponential distribution with parameter \(\lambda\)
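The connection can be seen in a small simulation (Python sketch; the rate and horizon are arbitrary): generating exponential gaps between events yields counts per unit interval that average about \(\lambda\).

```python
import random

random.seed(42)     # reproducibility
lam = 4.0           # illustrative rate: mean events per unit time
horizon = 10_000    # number of unit intervals to simulate

# Build arrival times from exponential(lam) inter-arrival gaps
t, arrivals = 0.0, []
while t < horizon:
    t += random.expovariate(lam)
    if t < horizon:
        arrivals.append(t)

# Count events falling in each unit interval
counts = [0] * horizon
for a in arrivals:
    counts[int(a)] += 1

mean_count = sum(counts) / horizon  # close to lam
var_count = sum((c - mean_count) ** 2 for c in counts) / horizon  # also close to lam
```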
Lognormal Distribution
- Let \(W\) have a normal distribution with mean \(\theta\) and variance \(\omega^2\); then \(X=\exp(W)\) is a lognormal random variable with pdf $$f(x)=\frac{1}{x\omega\sqrt{2\pi}}\exp\left[-\frac{(\ln x-\theta)^2}{2\omega^2}\right],\;\; \mbox{ for }x>0$$
- The mean of \(X\) is \(E(X)=e^{\theta+\omega^2/2}\)
- The variance of \(X\) is \(V(X)=e^{2\theta+\omega^2}\left(e^{\omega^2}-1\right)\)
Example Problem

Gamma Distribution
- The random variable \(X\) with pdf $$f(x)=\frac{\lambda^r x^{r-1}e^{-\lambda x}}{\Gamma(r)},\;\; \mbox{ for }x>0$$ is a gamma random variable with parameters \(\lambda>0\) and \(r>0\).
- The gamma function is $$\Gamma(r)=\int_0^\infty x^{r-1}e^{-x}\,dx,\;\; \mbox{ for }r>0$$ with special properties:
  - \(\Gamma(r)\) is finite
  - \(\Gamma(r)=(r-1)\Gamma(r-1)\)
  - For any positive integer \(r\), \(\Gamma(r)=(r-1)!\)
  - \(\Gamma(1/2)=\pi^{1/2}\)
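Python's standard library exposes the gamma function, so the listed properties can be checked directly (a sketch; \(r=3.7\) is an arbitrary test point):

```python
import math

# Gamma(r) = (r - 1)! for positive integers r
g5 = math.gamma(5)           # 4! = 24

# Recursion Gamma(r) = (r - 1) * Gamma(r - 1), checked at r = 3.7
lhs = math.gamma(3.7)
rhs = 2.7 * math.gamma(2.7)  # equal up to floating-point error

# Gamma(1/2) = sqrt(pi)
g_half = math.gamma(0.5)
```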
Gamma Distribution
- The mean and variance are \(\mu=E(X)=r/\lambda\) and \(\sigma^2=V(X)=r/\lambda^2\)
- We will not work any probability problems using the gamma distribution
Gamma Tables

Weibull Distribution
- The random variable \(X\) with pdf
$$ f(x)=\frac{\beta}{\delta}\left(\frac{x}{\delta}\right)^{\beta-1}\exp\left[-\left(\frac{x}{\delta}\right)^\beta\right],\;\; \mbox{ for }x>0 $$ is a Weibull random variable with scale parameter \(\delta>0\) and shape parameter \(\beta>0\)
- The CDF for the Weibull distribution is $$F(x)=1-\exp\left[-\left(\frac{x}{\delta}\right)^\beta\right],\;\; \mbox{ for }x>0$$
- The mean of the Weibull distribution is \(\mu=E(X)=\delta\,\Gamma\left(1+\frac{1}{\beta}\right)\)
- The variance of the Weibull distribution is \(\sigma^2=V(X)=\delta^2\,\Gamma\left(1+\frac{2}{\beta}\right)-\delta^2\left[\Gamma\left(1+\frac{1}{\beta}\right)\right]^2\)
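A numeric illustration (Python sketch; shape \(\beta=2\) and scale \(\delta=100\) are arbitrary choices):

```python
import math

beta, delta = 2.0, 100.0  # illustrative shape and scale parameters

def weibull_cdf(x):
    """F(x) = 1 - exp(-(x/delta)^beta) for x > 0."""
    return 1.0 - math.exp(-((x / delta) ** beta))

mean = delta * math.gamma(1 + 1 / beta)
var = delta**2 * math.gamma(1 + 2 / beta) - delta**2 * math.gamma(1 + 1 / beta) ** 2

# At x = delta the exponent is -1 regardless of beta, so F(delta) = 1 - e^(-1)
p = weibull_cdf(delta)
```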
Weibull Problem
