Term
probability mass function (p.m.f.) |
|
Definition
The probability mass function (p.m.f.) f(x) of a discrete random variable X is a function that satisfies the following properties:
(a) f(x) > 0, if x ∈ S
(b) Σf(x) = 1, for x ∈ S
(c) P(X ∈ A) = Σf(x), for x ∈ A, where A is a subset of S
|
|
|
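A quick numerical illustration (mine, not from Hogg and Tanis; the loaded-die p.m.f. below is made up) checking the three properties in Python:

```python
# Hypothetical p.m.f. of a loaded four-sided die: S = {1, 2, 3, 4}
f = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

# (a) f(x) > 0 for every x in S
assert all(p > 0 for p in f.values())

# (b) the f(x) sum to 1 over S
assert abs(sum(f.values()) - 1.0) < 1e-12

# (c) P(X in A) = sum of f(x) over x in A, for A a subset of S
A = {2, 4}
print(sum(f[x] for x in A))  # P(X in A) = 0.6
```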
Term
random variable |
Definition
Given a random experiment with an outcome space S, a function X that assigns one and only one real number X(s)=x to each element in S is called a random variable.
The space of X is the set of real numbers {x: X(s) = x, s ∈ S}, where s ∈ S means that s belongs to the set S. |
|
|
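A concrete sketch (illustrative, not from the text): take S to be the eight outcomes of tossing a coin three times, and let X(s) be the number of heads, so each outcome maps to exactly one real number.

```python
from itertools import product

# Outcome space S: every sequence of three tosses
S = list(product("HT", repeat=3))

# X assigns one and only one real number to each outcome s
def X(s):
    return s.count("H")

# The space of X: {x : X(s) = x, s in S}
print(sorted({X(s) for s in S}))  # [0, 1, 2, 3]
```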
Term
mathematical expectation
(expected value) |
|
Definition
If f(x) is the p.m.f. of the random variable X of the discrete type with space S, and if the summation Σu(x)f(x) for x ∈ S exists, then the sum is called the mathematical expectation or expected value of the function u(X), and it is denoted by E[u(X)]. That is,
E[u(X)] = Σu(x)f(x) for x ∈ S |
|
|
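A small sketch of the definition in Python (the fair-die p.m.f. and the choice u(x) = x² are illustrative):

```python
# Illustrative p.m.f.: a fair six-sided die
f = {x: 1/6 for x in range(1, 7)}

# E[u(X)] = sum of u(x) f(x) over x in S
def expectation(u, f):
    return sum(u(x) * p for x, p in f.items())

print(expectation(lambda x: x, f))     # E(X) = 3.5
print(expectation(lambda x: x**2, f))  # E(X^2) = 91/6 ≈ 15.17
```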
Term
properties of mathematical expectation |
|
Definition
When it exists, the mathematical expectation E satisfies the following properties:
(a) If c is a constant, then E(c)=c.
(b) If c is a constant and u is a function, then
E[cu(X)] = cE[u(X)].
(c) If c₁ and c₂ are constants and u₁ and u₂ are functions, then E[c₁u₁(X) + c₂u₂(X)] = c₁E[u₁(X)] + c₂E[u₂(X)] |
|
|
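The three properties are easy to spot-check numerically; a sketch with the same illustrative die p.m.f.:

```python
f = {x: 1/6 for x in range(1, 7)}
E = lambda u: sum(u(x) * p for x, p in f.items())

# (a) E(c) = c
assert abs(E(lambda x: 7) - 7) < 1e-12

# (b) E[c u(X)] = c E[u(X)]
assert abs(E(lambda x: 3 * x**2) - 3 * E(lambda x: x**2)) < 1e-12

# (c) E[c1 u1(X) + c2 u2(X)] = c1 E[u1(X)] + c2 E[u2(X)]
lhs = E(lambda x: 2 * x + 5 * x**2)
rhs = 2 * E(lambda x: x) + 5 * E(lambda x: x**2)
assert abs(lhs - rhs) < 1e-12
```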
Term
De Morgan's laws for set algebra |
Definition
(A ∪ B)' = A' ∩ B'
(A ∩ B)' = A' ∪ B' |
|
|
Term
commutative laws for set algebra |
|
Definition
A ∪ B = B ∪ A
A ∩ B = B ∩ A |
|
|
Term
associative laws for set algebra |
|
Definition
(A ∪ B) ∪ C = A ∪ (B ∪ C)
(A ∩ B) ∩ C = A ∩ (B ∩ C) |
|
|
Term
distributive laws for set algebra |
|
Definition
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C) |
|
|
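All of these set identities can be spot-checked with Python's built-in set operations; a sketch over arbitrary example sets, covering De Morgan's laws and the distributive laws (the commutative and associative laws check the same way):

```python
# Arbitrary illustrative universe and events
S = set(range(10))
A, B, C = {1, 2, 3}, {3, 4, 5}, {5, 6, 7}
comp = lambda E: S - E  # complement relative to S

# De Morgan's laws
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

# Distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)
```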
Term
properties of a binomial experiment |
|
Definition
1. A Bernoulli (success-failure) experiment is performed n times.
2. The trials are independent.
3. The probability of success on each trial is a constant p; the probability of failure is q = 1 - p.
4. The random variable X equals the number of successes in the n trials. |
|
|
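A simulation sketch of one binomial experiment (n and p below are made-up values): perform n independent Bernoulli trials with constant success probability p, and let X count the successes.

```python
import random

def binomial_draw(n, p):
    # n independent Bernoulli trials; X = number of successes
    return sum(random.random() < p for _ in range(n))

print(binomial_draw(n=10, p=0.3))  # one observed value of X
```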
Term
mean, variance and standard deviation of a random variable with a binomial distribution |
|
Definition
μ = np
σ² = np(1-p) or σ² = npq
σ = √(np(1-p)) or σ = √(npq) |
|
|
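As a sanity check (simulation sketch with illustrative n and p), the sample mean and variance of many binomial draws should land near np and npq:

```python
import random

n, p = 10, 0.3
q = 1 - p
draws = [sum(random.random() < p for _ in range(n)) for _ in range(100_000)]

mean = sum(draws) / len(draws)
var = sum((x - mean) ** 2 for x in draws) / len(draws)

print(mean, n * p)     # sample mean vs mu = np = 3.0
print(var, n * p * q)  # sample variance vs sigma^2 = npq = 2.1
```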
Term
mean of a random variable X |
|
Definition
μ = E(X) = Σxf(x), for x ∈ S |
|
Term
variance of a random variable X |
|
Definition
σ² = Σ(x - μ)²f(x), for x ∈ S
σ² = E(X²) - [E(X)]² |
|
|
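A sketch verifying that the defining sum and the shortcut E(X²) - [E(X)]² agree (p.m.f. values are illustrative):

```python
# Illustrative p.m.f. (the loaded four-sided die from earlier)
f = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

mu = sum(x * p for x, p in f.items())                   # E(X)
var_def = sum((x - mu) ** 2 * p for x, p in f.items())  # sum of (x - mu)^2 f(x)
var_alt = sum(x**2 * p for x, p in f.items()) - mu**2   # E(X^2) - [E(X)]^2

assert abs(var_def - var_alt) < 1e-12
print(mu, var_def)  # 3.0 1.0
```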
Term
conditional probability |
Definition
The conditional probability of an event A, given that event B has occurred, is defined by
P(A|B) = P(A∩B)/P(B)
provided that P(B) > 0. |
|
|
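A worked sketch with equally likely outcomes (the events are my illustrative choices): roll one fair die, with A = {even roll} and B = {roll greater than 3}.

```python
from fractions import Fraction

S = set(range(1, 7))  # one fair die: equally likely outcomes
P = lambda E: Fraction(len(E), len(S))

A = {2, 4, 6}  # even roll
B = {4, 5, 6}  # roll greater than 3

# P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0
print(P(A & B) / P(B))  # 2/3
```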
Term
multiplication rule |
Definition
The probability that two events, A and B, both occur is given by the multiplication rule:
P(A ∩ B) = P(A)P(B|A)
or by
P(B ∩ A) = P(B)P(A|B). |
|
|
Term
addition rule for two events
(Hogg and Tanis theorem 1.2-5) |
|
Definition
The probability that either (or both) of two events, A and B, occur is given by the addition rule:
P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
Proof hint 1: Rewrite A ∪ B as the union of mutually exclusive events: A ∪ (A' ∩ B).
Proof hint 2: Do that trick again: B = (A ∩ B) ∪ (A' ∩ B) |
|
|
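A numeric check of the rule on the same illustrative die events:

```python
from fractions import Fraction

S = set(range(1, 7))  # one fair die
P = lambda E: Fraction(len(E), len(S))

A, B = {2, 4, 6}, {4, 5, 6}
# P(A ∪ B) = P(A) + P(B) - P(A ∩ B): 2/3 = 1/2 + 1/2 - 1/3
assert P(A | B) == P(A) + P(B) - P(A & B)
```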
Term
probability (axiomatic definition) |
Definition
Probability is a real-valued set function P that assigns, to each event A in the sample space S, a number P(A), called the probability of the event A, such that the following properties are satisfied:
(a) P(A) ≥ 0,
(b) P(S) = 1,
(c) If A₁, A₂, A₃, ... are events and Aᵢ ∩ Aⱼ = Ø, i ≠ j, then
P(A₁ ∪ A₂ ∪ ... ∪ Aₖ) = P(A₁) + P(A₂) + ... + P(Aₖ)
for each positive integer k, and
P(A₁ ∪ A₂ ∪ A₃ ∪ ...) = P(A₁) + P(A₂) + P(A₃) + ...
for an infinite, but countable, number of events. |
|
|
Term
relationship between P(A) and P(A')
(Hogg and Tanis theorem 1.2-1) |
|
Definition
For each event A, P(A) = 1 - P(A').
Proof hint: S = A ∪ A', where A and A' are mutually exclusive. |
|
|
Term
probability of Ø
(Hogg and Tanis theorem 1.2-2) |
|
Definition
P(Ø) = 0
Proof hint: Let A = Ø; then A' = S, and apply the complement theorem (1.2-1). |
|
|
Term
relationship between P(A) and P(B) if A is a subset of B
(Hogg and Tanis theorem 1.2-3) |
|
Definition
If events A and B are such that A is a subset of B, then P(A) ≤ P(B)
Proof hints: Start with B = A ∪ (B ∩ A'). Remember that for any event A, P(A) ≥ 0. |
|
|
Term
upper limit of probability of event A
(Hogg and Tanis theorem 1.2-4) |
|
Definition
P(A) ≤ 1
Proof hint: A is a subset of S |
|
|
Term
addition rule for three events
(Hogg and Tanis theorem 1.2-6) |
|
Definition
The probability that at least one of three events, A, B, or C, occurs is given by the addition rule:
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - P(A ∩ B) - P(A ∩ C) - P(B ∩ C) + P(A ∩ B ∩ C)
Proof hint: A ∪ B ∪ C = A ∪ (B ∪ C), then apply the two-event addition rule twice. |
|
|
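The three-event rule checks the same way (the events below are arbitrary illustrative sets over twelve equally likely outcomes):

```python
from fractions import Fraction

S = set(range(1, 13))
P = lambda E: Fraction(len(E), len(S))

A, B, C = {1, 2, 3, 4}, {3, 4, 5, 6}, {4, 6, 8, 10}
lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
assert lhs == rhs  # both equal 8/12
```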
Term
independent events |
Definition
Events A and B are independent if and only if P(A ∩ B) = P(A)P(B). Otherwise A and B are called dependent events.
(Note that when P(A) > 0, this is equivalent to saying P(B|A) = P(B).) |
|
|
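A sketch contrasting an independent pair with a dependent one (two fair dice; the events are my illustrative choices):

```python
from itertools import product
from fractions import Fraction

S = set(product(range(1, 7), repeat=2))  # two fair dice: 36 outcomes
P = lambda E: Fraction(len(E), len(S))

A = {s for s in S if s[0] % 2 == 0}      # first die even
B = {s for s in S if s[1] % 2 == 0}      # second die even
D = {s for s in S if s[0] + s[1] >= 10}  # sum at least 10

print(P(A & B) == P(A) * P(B))  # True: A and B are independent
print(P(A & D) == P(A) * P(D))  # False: 1/9 != 1/12, so dependent
```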
Term
independence of events and complements
(Hogg and Tanis theorem 1.5-1) |
|
Definition
If A and B are independent events, then the following pairs of events are also independent:
(a) A and B',
(b) A' and B,
(c) A' and B'
Proof hint for (a): P(B'|A) = 1 - P(B|A) |
|
|