Term
The probability of the interval [a, b] for a continuous random variable, viewed as an area, is |
|
Definition
the integral from a to b of f(x)dx where the area is above [a, b] and under the graph of f, where f is the "probability density" |
|
|
Term
The probability that a certain random variable X lies in the interval [a, b] = |
|
Definition
P(X ∈ [a, b]) = the integral from a to b of f(x)dx |
|
|
Term
Continuous random variable |
|
Definition
A random variable whose probability is not located at just a certain number of points, but is spread out continuously over an interval.
P(X ∈ [a, b]) = the integral from a to b of f(x)dx |
|
|
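The cards above describe P(X ∈ [a, b]) as an area under the density. A minimal Python sketch of that idea, assuming scipy is available; the density f(x) = e^(-x) and the interval [0.5, 2] are illustrative choices, not taken from these cards:

```python
import numpy as np
from scipy.integrate import quad

# Illustrative density: f(x) = e^(-x) for x >= 0, and 0 otherwise.
def f(x):
    return np.exp(-x) if x >= 0 else 0.0

a, b = 0.5, 2.0

# P(X in [a, b]) = integral from a to b of f(x) dx
prob, _ = quad(f, a, b)
print(prob)                     # ~0.471
print(np.exp(-a) - np.exp(-b))  # closed form for this particular density, same value
```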
Term
Probability assigned to any particular point is |
|
Definition
0. This is because there's no area that lies above a single point. The equation would be: Probability of {a} = P([a, a]) = the integral from a to a of f(x)dx = 0 |
|
|
Term
If f is continuous at the point a, and delta is a small number, the probability of the interval [a, a+delta] is |
|
Definition
P(X ∈ [a, a+delta]) = the integral from a to a+delta of f(x)dx, which is approximately f(a)*delta |
|
|
Term
In a small interval around a, f(a) times the length of the interval is |
|
Definition
approximately the probability that X is in the interval |
|
|
Term
If X is a "continuous random variable whose probability density is f, then E(X) = |
|
Definition
the integral from negative infinity to infinity of x*f(x)dx |
|
|
Term
If X is a "continuous random variable whose probability density is f, then E(X^2) = |
|
Definition
the integral from negative infinity to infinity of x^2*f(x)dx |
|
|
Term
If X is a "continuous random variable whose probability density is f, then Var(X) = |
|
Definition
E(X^2) - [E(X)]^2 |
|
|
Term
If X is a "continuous random variable whose probability density is f, then SD(X) = |
|
Definition
the square root of Var(X), which equals the square root of E(X^2) - [E(X)]^2 |
|
|
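The moment formulas on these cards can be checked numerically. A sketch, again using the illustrative density f(x) = e^(-x) (an assumption, not one of the card's examples) and scipy's quad integrator:

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)   # illustrative density on [0, infinity)

# E(X) = integral of x f(x) dx, E(X^2) = integral of x^2 f(x) dx
EX,  _ = quad(lambda x: x * f(x), 0, np.inf)
EX2, _ = quad(lambda x: x**2 * f(x), 0, np.inf)

var = EX2 - EX**2          # Var(X) = E(X^2) - [E(X)]^2
sd = np.sqrt(var)          # SD(X) = sqrt(Var(X))
print(EX, EX2, var, sd)    # 1.0, 2.0, 1.0, 1.0 for this density
```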
Term
If X and Y are independent, then P(X ∈ A, Y ∈ B) =, E(XY) =, and Var(X+Y) = |
|
Definition
P(X ∈ A)P(Y ∈ B); E(X)E(Y); Var(X) + Var(Y) |
|
|
Term
Uniform distribution on (0, 1) |
|
Definition
The probability that U is in [c, d] is d - c. This is because the density of a uniform distribution on (0, 1) is f(u) = 1/(1 - 0) = 1, so the integral from c to d of f(u)du = d - c.
E(U) = the integral from 0 to 1 of u du = u^2/2 evaluated from 0 to 1 = 1/2
E(U^2) = the integral from 0 to 1 of u^2 du = u^3/3 evaluated from 0 to 1 = 1/3
Var(U) = E(U^2) - [E(U)]^2 = 1/3 - 1/4 = 1/12 |
|
|
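A quick check of the Uniform(0, 1) card, assuming scipy is available; the endpoints c = 0.2 and d = 0.7 are arbitrary illustrative values:

```python
from scipy.stats import uniform

U = uniform(loc=0, scale=1)   # Uniform(0, 1)

c, d = 0.2, 0.7
print(U.cdf(d) - U.cdf(c))    # P(U in [c, d]) = d - c = 0.5
print(U.mean())               # E(U) = 1/2
print(U.var())                # Var(U) = 1/12, about 0.0833
```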
Term
The standard normal probability density function phi(z) is |
|
Definition
phi(z) = (1/square root of 2*pi) * e^(-z^2/2) |
|
|
Term
E(Z) for the standard normal distribution |
|
Definition
0 |
|
|
Term
Var(Z) for the standard normal distribution |
|
Definition
1 |
|
|
Term
SD(Z) for the standard normal distribution |
|
Definition
1 |
|
|
Term
Cumulative distribution function for the normal distribution is |
|
Definition
Phi(x) = the integral from negative infinity to x of phi(z)dz |
|
|
Term
For the cumulative distribution function, P(a<= Z <= b) = |
|
Definition
Phi(b) - Phi(a) |
|
|
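The standard normal cards above (density phi, E(Z) = 0, Var(Z) = 1, CDF Phi, P(a <= Z <= b) = Phi(b) - Phi(a)) can be sanity-checked with scipy.stats.norm; this is only an illustrative sketch, and the values z, a, b below are arbitrary:

```python
import numpy as np
from scipy.stats import norm

Z = norm(0, 1)                     # standard normal

z = 1.3
print(Z.pdf(z))                                    # phi(z)
print(np.exp(-z**2 / 2) / np.sqrt(2 * np.pi))      # same value from the formula
print(Z.mean(), Z.var(), Z.std())  # E(Z) = 0, Var(Z) = 1, SD(Z) = 1

a, b = -1.0, 2.0
print(Z.cdf(b) - Z.cdf(a))         # P(a <= Z <= b) = Phi(b) - Phi(a), about 0.8186
```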
Term
Normal distribution with mean u and standard deviation sigma: key formulas |
|
Definition
X = u + sigma*Z, Z = (X - u)/sigma, E(X) = u, Var(X) = sigma^2, SD(X) = sigma |
|
|
Term
Normal distribution with mean u and standard deviation sigma, P(c < X < d) = |
|
Definition
since X = u + sigma*Z, the event c < X < d is the same as (c - u)/sigma < Z < (d - u)/sigma,
so the probability is Phi((d - u)/sigma) - Phi((c - u)/sigma) |
|
|
Term
If I tell you that a certain random variable X has a normal distribution with mean u and standard deviation sigma, then the probability that c < X < d is |
|
Definition
Phi((d - u)/sigma) - Phi((c - u)/sigma) |
|
|
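A sketch of the standardization cards: converting a N(u, sigma) probability into a standard normal one. The numbers u = 10, sigma = 2, c = 9, d = 13 are made-up illustrative values, and scipy is assumed to be available:

```python
from scipy.stats import norm

u, sigma = 10.0, 2.0   # illustrative mean and standard deviation
c, d = 9.0, 13.0

# Directly, using X ~ N(u, sigma)
direct = norm(u, sigma).cdf(d) - norm(u, sigma).cdf(c)

# Via standardization: Z = (X - u)/sigma, so
# P(c < X < d) = Phi((d - u)/sigma) - Phi((c - u)/sigma)
standardized = norm.cdf((d - u) / sigma) - norm.cdf((c - u) / sigma)

print(direct, standardized)   # both about 0.6247
```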
Term
If a normal distribution is being used to approximate the number of successes S in a set of Bernoulli trials, then P(a <= S <= b) is approximately |
|
Definition
Phi((b + 1/2 - u)/sigma) - Phi((a - 1/2 - u)/sigma), where u = np, sigma = square root of npq, and the 1/2 is the continuity correction |
|
|
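A sketch of the continuity-corrected normal approximation for the number of successes S in n Bernoulli(p) trials; the values n = 100, p = 0.3 and the interval [25, 35] are illustrative assumptions, not from the cards:

```python
import numpy as np
from scipy.stats import binom, norm

n, p = 100, 0.3                          # illustrative Bernoulli-trial setup
u, sigma = n * p, np.sqrt(n * p * (1 - p))

a, b = 25, 35
exact = binom.cdf(b, n, p) - binom.cdf(a - 1, n, p)   # exact P(a <= S <= b)

# Normal approximation with the 1/2 continuity correction
approx = norm.cdf((b + 0.5 - u) / sigma) - norm.cdf((a - 0.5 - u) / sigma)

print(exact, approx)   # both about 0.77
```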
Term
Exponential Distribution explanation |
|
Definition
associated with waiting for events to happen
It is similar to Bernoulli trials with probability p, but we stop when the event happens.
So, for instance, if T is the (discrete) time at which we find the atom has decayed, the probability distribution is: P(T=1) = p, P(T=2) = qp, P(T=3) = q^2*p, etc.
Now ask for the probability that T is greater than a certain number: P(T > 1) = q, P(T > 2) = q^2, P(T > 3) = q^3 |
|
|
Term
P(T > n) = (in the discrete case) |
|
Definition
q^n |
|
|
Term
Exponential Distribution definition |
|
Definition
The random time T when a certain event occurs has an exponential distribution with rate lambda if T has probability density:
f(t) = lambda*e^(-lambda*t) if t >= 0, f(t) = 0 if t < 0 |
|
|
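A sketch checking the exponential-density card: integrating f(t) = lambda*e^(-lambda*t) from t to infinity gives the survival probability e^(-lambda*t) used on the next few cards. The rate lambda = 0.5 and the time t = 3 are arbitrary illustrative choices:

```python
import numpy as np
from scipy.integrate import quad

lam = 0.5                               # illustrative rate
f = lambda t: lam * np.exp(-lam * t)    # exponential density for t >= 0

t = 3.0
tail, _ = quad(f, t, np.inf)            # P(T > t) = integral from t to infinity of f
print(tail, np.exp(-lam * t))           # both about 0.2231
```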
Term
The integral from t to infinity of f(t)dt for an exponential distribution = |
|
Definition
e^(-lambda*t), which is P(T > t) |
|
|
Term
P(a<= T <= b) for exponential distribution is |
|
Definition
e^(-lambda*a) - e^(-lambda*b) |
|
|
Term
P(T >= t) for an exponential distribution (the probability that the atom survives at least t seconds) |
|
Definition
e^(-lambda*t)
This is the probability that the decay takes at least t seconds, because P(T >= t) = e^(-lambda*t) |
|
|
Term
the probability that the atom decays before t seconds have passed |
|
Definition
1 - e^(-lambda*t) |
|
|
Term
E(T) for exponential distribution |
|
Definition
1/lambda |
|
|
Term
E(T^2) for exponential distribution |
|
Definition
2/lambda^2 |
|
|
Term
Var(T) for exponential distribution |
|
Definition
1/lambda^2 |
|
|
Term
SD(T) for exponential distribution |
|
Definition
1/lambda |
|
|
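The exponential moment cards (E(T) = 1/lambda, E(T^2) = 2/lambda^2, Var(T) = 1/lambda^2, SD(T) = 1/lambda) can be checked with scipy.stats.expon, where scale = 1/lambda; a sketch with an arbitrary rate:

```python
from scipy.stats import expon

lam = 0.5
T = expon(scale=1 / lam)         # exponential with rate lambda

print(T.mean(), 1 / lam)         # E(T) = 1/lambda = 2.0
print(T.var(), 1 / lam**2)       # Var(T) = 1/lambda^2 = 4.0
print(T.std(), 1 / lam)          # SD(T) = 1/lambda = 2.0
print(T.moment(2), 2 / lam**2)   # E(T^2) = 2/lambda^2 = 8.0
```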
Term
Memoryless property of the Exponential Distribution
given that the atom survives to time t, what is the chance that it survives to time t + s? |
|
Definition
this is the same as the chance that it would survive to time s in the first place. |
|
|
Term
The half-life h of a decaying atom (the time by which the atom has decayed with probability 1/2) satisfies |
|
Definition
P(T > h) = 1/2, by definition of the half-life.
So e^(-lambda*h) = 1/2, which gives h = ln(2)/lambda |
|
|
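A small simulation sketch of the memoryless and half-life cards: for an exponential T, P(T > t + s | T > t) = P(T > s), and the half-life is h = ln(2)/lambda. The rate and the times below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5
samples = rng.exponential(scale=1 / lam, size=1_000_000)

t, s = 2.0, 1.5
survivors = samples[samples > t]
print((survivors > t + s).mean())   # P(T > t+s | T > t)
print((samples > s).mean())         # P(T > s), about the same (memoryless)

h = np.log(2) / lam                 # half-life: e^(-lambda*h) = 1/2
print((samples > h).mean())         # about 0.5
```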
Term
Poisson Distribution revisited |
|
Definition
P(N = k) = e^(-u) * u^k / k!  The sum of P(N = k) for k from 0 to infinity = 1  E(N) = u  E(N^2) = u^2 + u  Var(N) = u  SD(N) = square root of u
Used for random scatter problems
Random scatter problems have independent events! |
|
|
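A sketch checking the Poisson card with scipy.stats.poisson; the mean u = 3 is an arbitrary illustrative value. Note that E(N^2) = u^2 + u, consistent with Var(N) = u:

```python
from scipy.stats import poisson

u = 3.0
N = poisson(mu=u)

print(sum(N.pmf(k) for k in range(200)))   # pmf sums to 1 (about 1.0)
print(N.mean(), N.var())                   # E(N) = u, Var(N) = u
print(N.moment(2), u**2 + u)               # E(N^2) = u^2 + u = 12.0
```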
Term
ln(1+y) is approx. equal to |
|
Definition
y, when y is small |
|
|
Term
If X1 and X2 are independent Poisson random variables with means u1 and u2, then X1 + X2 has a Poisson distribution with mean |
|
Definition
(u1+u2). This is the addition rule! |
|
|
Term
the probability distribution for the number of drops in one second is a Poisson distribution with mean |
|
Definition
u = delta*A, the expected number of drops to hit the board with area A
think of delta as the expected number of drops per unit area |
|
|
Term
How many phone calls arrive in a given period of time |
|
Definition
in a given time interval, the expected number of calls would be delta * the length of the interval
delta represents the expected number of calls per unit time |
|
|
Term
The number of calls in two minutes is the sum of |
|
Definition
N1 + N2, where N1 is the number of calls in the first minute and N2 is the number of calls in the second minute. Since N1 and N2 are independent, the addition rule applies to the total |
|
|
Term
what would be the Poisson parameter for the distribution of arriving phone calls in an interval (0,t)? |
|
Definition
the expected number of calls per minute * the length of the interval = delta*(t - 0) = delta*t |
|
|
Term
N(I) for arriving phone calls |
|
Definition
N(I) = number of calls received in the interval I
Expected number of calls in the interval = delta*length of I so E(N(I)) = delta*length of I |
|
|
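A sketch of the phone-call cards: the number of calls in an interval I is Poisson with mean delta * length(I). The rate delta = 2 calls per minute and the interval length are illustrative assumptions:

```python
from scipy.stats import poisson

delta = 2.0                  # illustrative: expected calls per minute
t = 3.0                      # interval (0, t), length t minutes

N = poisson(mu=delta * t)    # number of calls in (0, t)
print(N.mean())              # E(N(I)) = delta * t = 6.0
print(N.pmf(0))              # P(no calls in (0, t)) = e^(-delta*t), about 0.0025
```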
Term
Waiting times between phone calls |
|
Definition
W1 = waiting time until first call arrives W2 = waiting time between first and second calls Waiting times are independent of each other with the same distributions |
|
|
Term
P(W1 > t), the probability that no call arrives in [0, t] |
|
Definition
P(0 phone calls in [0,t]) which equals e^-delta*t once you write the equation out
this is also true for any waiting time |
|
|
Term
distribution of the 5th time of arrival? |
|
Definition
The random variable T5 = W1+W2+W3+W4+W5 |
|
|
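A simulation sketch of the arrival-time card: the 5th arrival time T5 is the sum of five independent exponential waiting times, so its mean is 5/delta. The rate below is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
delta = 2.0                                   # illustrative rate (calls per minute)

# T5 = W1 + W2 + W3 + W4 + W5, each Wi exponential with rate delta
W = rng.exponential(scale=1 / delta, size=(1_000_000, 5))
T5 = W.sum(axis=1)

print(T5.mean(), 5 / delta)                   # both about 2.5
```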
Term
CDF F(x) definition for any random variable X |
|
Definition
F(x) = P(X<= x) F(b)-F(a) = P(a is the CDF always less than or equal to a valuable x |
|
|
Term
If the distribution determined by F(x) is continuous, then P(c) for any single point c = |
|
Definition
0 |
|
|
Term
For a discrete random variable, the CDF is |
|
Definition
F(x) = the sum for y <= x of P(y) |
|
|
Term
For a continuous random variable, the CDF is |
|
Definition
F(x) = the integral from negative infinity to x of f(z)dz |
|
|
Term
If you know the probability density you can find the CDF by |
|
Definition
integrating! F(x) = the integral from negative infinity to x of f(z)dz |
|
|
Term
If you know the CDF you can find the probability density by |
|
Definition
differentiating! F'(x) = f(x) |
|
|
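A sketch of the "integrate the density / differentiate the CDF" cards, using sympy for symbolic work on an illustrative exponential density (the density choice is an assumption, not from the cards):

```python
import sympy as sp

x, z, lam = sp.symbols('x z lam', positive=True)

f = lam * sp.exp(-lam * z)         # illustrative density, z >= 0

# CDF by integrating the density from 0 up to x
F = sp.integrate(f, (z, 0, x))
print(sp.simplify(F))              # 1 - exp(-lam*x)

# Recover the density by differentiating the CDF
print(sp.simplify(sp.diff(F, x)))  # lam*exp(-lam*x)
```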
Term
Continuous joint distributions |
|
Definition
The probability that (X, Y) equals any specific point is 0, so we assign a probability to each subset B of the PLANE: P(B) = P((X, Y) ∈ B) |
|
|
Term
Probability of a subset B in a continuous joint distribution |
|
Definition
the double integral over B of f(x, y)dxdy, where f(x, y) is the joint probability density |
|
|
Term
The random point (X, Y) has a uniform distribution on a subset D of the plane with finite area if |
|
Definition
(X, Y) is certain to lie in D, and for each subset C of D, P((X, Y) ∈ C) = area of C / area of D |
|
|
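A Monte Carlo sketch of the uniform-distribution-on-a-region card: for (X, Y) uniform on an illustrative region D (here the unit square) and a subset C of D (a quarter disk), P((X, Y) ∈ C) = area of C / area of D. Both region choices are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# D = unit square (area 1); (X, Y) uniform on D
xy = rng.uniform(0, 1, size=(1_000_000, 2))

# C = quarter of the unit disk inside D, area pi/4
in_C = (xy ** 2).sum(axis=1) <= 1.0
print(in_C.mean(), np.pi / 4)   # both about 0.785
```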
Term
Probability density function of a random variable X |
|
Definition
a nonnegative function f(x) such that for any interval [a, b], P(a < X <= b) = the integral from a to b of f(x)dx |
|
|
Term
Cumulative distribution function F(x) |
|
Definition
F(x) = P(X<=x)
then P(a F(a)<= F(b)
P(c) for a fixed point c = 0 |
|
|
Term
Discrete random variable, the CDF is |
|
Definition
F(x) = the sum over all y <= x of P(y) |
|
|