Term
Random Phenomenon |
|
Definition
A phenomenon is random if individual outcomes are uncertain, but there is nonetheless a regular distribution of outcomes in a large number of repetitions |
|
|
Term
Probability |
|
Definition
*the proportion of times the outcome would occur in a very long series of repetitions
*defined for each possible event in the sample space S |
|
|
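The long-run-proportion idea on this card can be seen directly by simulation. A minimal sketch in Python; the coin, seed, and flip counts are illustrative choices, not part of the card:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Flip a fair coin n times and report the proportion of heads.
for n in (10, 100, 1_000, 10_000, 100_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>7} flips: proportion of heads = {heads / n:.4f}")

# The proportions settle near 0.5 as n grows: the "regular distribution of
# outcomes in a large number of repetitions" that defines the probability.
```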
Term
Independent Events |
|
Definition
if the probability that one event occurs on any given trial of an experiment is not affected or changed by the occurrence of the other event |
|
|
Term
Sampling with Replacement |
|
Definition
trials are independent only when the item drawn is put back each time |
|
|
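A small simulation can make the replacement distinction concrete. This is only a sketch; the box of 3 red and 7 blue chips and the trial count are invented for illustration:

```python
import random

random.seed(2)
box = ["red"] * 3 + ["blue"] * 7   # hypothetical box of 10 chips
trials = 100_000

def draw_two(with_replacement):
    """Draw two chips from the box; return (first, second)."""
    first = random.choice(box)
    if with_replacement:
        second = random.choice(box)     # chip is put back: trials independent
    else:
        remaining = box.copy()
        remaining.remove(first)         # chip is kept out: second draw depends on first
        second = random.choice(remaining)
    return first, second

for replace in (True, False):
    hits = sum(draw_two(replace) == ("red", "red") for _ in range(trials))
    print(f"with_replacement={replace}: P(red, red) ≈ {hits / trials:.4f}")

# With replacement the estimate is near (3/10)*(3/10) = 0.09 (independent trials);
# without replacement it is near (3/10)*(2/9) ≈ 0.067.
```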
Term
Event |
|
Definition
a subset of the sample space |
|
|
Term
Sample Space |
|
Definition
a set, or list, of all possible outcomes of a random process |
|
|
Term
Probability Rules |
|
Definition
Rule 1. 0 ≤ P(A) ≤ 1 for any event A
Rule 2. P(S) = 1
Rule 3. Addition rule: If A and B are disjoint events, then P(A or B) = P(A) + P(B)
Rule 4. Complement rule: For any event A, P(Ac) = 1 − P(A)
Rule 5. Multiplication rule: If A and B are independent events, then P(A and B) = P(A)P(B) |
|
|
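The five rules can be checked exactly on a small example. A sketch using one fair six-sided die; the die and the events A and B are my own example, not from the card:

```python
from fractions import Fraction

# Exact probabilities for one fair six-sided die.
S = {face: Fraction(1, 6) for face in range(1, 7)}

def P(event):
    """Probability of an event = sum of the probabilities of its outcomes."""
    return sum(S[o] for o in event)

A = {1, 2}   # "roll a 1 or a 2"
B = {5, 6}   # "roll a 5 or a 6" (disjoint from A)

assert 0 <= P(A) <= 1                       # Rule 1: 0 <= P(A) <= 1
assert P(S.keys()) == 1                     # Rule 2: P(S) = 1
assert P(A | B) == P(A) + P(B)              # Rule 3: addition rule, A and B disjoint
assert P(S.keys() - A) == 1 - P(A)          # Rule 4: complement rule
assert P({6}) * P({6}) == Fraction(1, 36)   # Rule 5: two independent rolls both show 6
print("All five rules hold for the die example.")
```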
Term
Disjoint (Mutually Exclusive) Events |
|
Definition
*two events are disjoint (mutually exclusive) if they have no outcomes in common and can never happen together
*the probability that A or B occurs is then the sum of their individual probabilities
*P(A or B) = P(A ∪ B) = P(A) + P(B)
This is the addition rule for disjoint events |
|
|
Term
Complement of an Event |
|
Definition
*the complement of any event A is the event that A does not occur, written as Ac
*the complement rule states that the probability of an event not occurring is 1 minus the probability that it does occur
*P(not A) = P(Ac) = 1 − P(A) |
|
|
Term
Discrete Probability Models |
|
Definition
*deal with discrete data, that is, data that can take on only a limited number of values
*these values are often integers or whole numbers |
|
|
Term
We can assign probabilities either: |
|
Definition
*Empirically
*Theoretically |
|
|
Term
Empirically |
|
Definition
*from our knowledge of numerous similar past events
*Mendel, the founder of the new science of genetics, discovered the probabilities of inheritance of a given trait from experiments on peas without knowing about genes or DNA |
|
|
Term
Theoretically |
|
Definition
*from our understanding of the phenomenon and symmetries in the problem
*A 6-sided fair die: each side has the same chance of turning up
*Genetic laws of inheritance based on meiosis process |
|
|
Term
Multiplication Rule for Independent Events |
|
Definition
If A and B are independent, P(A and B) = P(A)P(B) |
|
|
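A quick numerical check of the rule. The coin-and-die pairing, seed, and trial count below are arbitrary illustration choices:

```python
import random

random.seed(3)
trials = 200_000
hits = 0
for _ in range(trials):
    heads = random.random() < 0.5       # fair coin
    six = random.randint(1, 6) == 6     # fair die, unaffected by the coin
    hits += heads and six

print(f"Estimated P(heads and six) = {hits / trials:.4f}")
print(f"Multiplication rule gives (1/2)*(1/6) = {(1/2) * (1/6):.4f}")
```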
Term
General Addition Rule for any two events A and B |
|
Definition
P(A or B) = P(A) + P(B) – P(A and B) |
|
|
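A worked check of the general addition rule on a standard 52-card deck (my own example; the card above does not specify one): A = "the card is a heart", B = "the card is a face card", and P(A and B) = 3/52 is subtracted so the three face-card hearts are not counted twice.

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))          # 52 equally likely cards

hearts = {c for c in deck if c[1] == "hearts"}          # 13 cards
faces = {c for c in deck if c[0] in {"J", "Q", "K"}}    # 12 cards

P = lambda event: Fraction(len(event), len(deck))

direct = P(hearts | faces)                              # count "heart or face" directly
by_rule = P(hearts) + P(faces) - P(hearts & faces)      # general addition rule
print(direct, by_rule, direct == by_rule)               # 11/26 11/26 True
```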
Term
Conditional Probabilities |
|
Definition
reflect how the probability of an event can change if we know that some other event has occurred/is occurring |
|
|
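The card above does not give the formula, but the standard definition is P(B | A) = P(A and B) / P(A) when P(A) > 0. A minimal sketch with one fair die; the events A and B are my own illustration:

```python
from fractions import Fraction

S = set(range(1, 7))                            # one fair die, equally likely faces
P = lambda event: Fraction(len(event), len(S))

A = {2, 4, 6}    # "the roll is even"
B = {4, 5, 6}    # "the roll is greater than 3"

p_B = P(B)                        # unconditional: 1/2
p_B_given_A = P(A & B) / P(A)     # knowing the roll is even raises it to 2/3
print(p_B, p_B_given_A)
```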
Term
Bayes' Theorem |
|
Definition
*important application of conditional probabilities
*foundation of many modern statistical applications beyond the scope of this course |
|
|
Term
Random Variable |
|
Definition
a variable whose value is a numerical outcome of a random phenomenon |
|
|
Term
Discrete Random Variable |
|
Definition
X has a finite number of possible values |
|
|
Term
Continuous Random Variable |
|
Definition
X takes all values in an interval
Example: there are infinitely many numbers between 0 and 1 (e.g., 0.001, 0.4, 0.0063876) |
|
|
Term
The 68-95-99.7% Rule for Normal Distributions |
|
Definition
*About 68% of all observations are within 1 standard deviation (σ) of the mean (μ)
*About 95% of all observations are within 2 σ of the mean μ
*Almost all (99.7%) observations are within 3 σ of the mean |
|
|
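The three percentages can be recomputed from the standard normal CDF. A sketch using only the Python standard library; the helper name phi is mine:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Probability of falling within k standard deviations of the mean,
# the same for every normal distribution after standardizing.
for k in (1, 2, 3):
    prob = phi(k) - phi(-k)
    print(f"within {k} sigma: {prob:.4f}")

# Prints roughly 0.6827, 0.9545, 0.9973 — the 68-95-99.7 rule.
```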
Term
Law of Large Numbers |
|
Definition
*As the number of randomly drawn observations (n) in a sample increases, the mean of the sample (x bar) gets closer and closer to the population mean μ
*It is valid for any population |
|
|
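A simulation sketch of the law of large numbers for a fair die, whose population mean is μ = 3.5; the seed and sample sizes are arbitrary:

```python
import random

random.seed(4)

# Sample means of fair-die rolls for increasing sample sizes n.
for n in (10, 100, 10_000, 1_000_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(f"n = {n:>9}: sample mean = {sum(rolls) / n:.4f}")

# The sample mean drifts toward the population mean 3.5 as n increases.
```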
Term
Variance of a Random Variable |
|
Definition
*A weighted average of the squared deviations (X − μ_X)² of the variable X from its mean μ_X
*Each outcome is weighted by its probability in order to take into account outcomes that are not equally likely
*The larger the variance of X, the more scattered the values of X are on average
*The positive square root of the variance gives the standard deviation σ of X |
|
|
Term
Variance of a Discrete Random Variable |
|
Definition
The variance σ² of X is found by multiplying each squared deviation of X by its probability and then adding all the products |
|
|
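A short worked example of the recipe on this card. X is the number of heads in two fair coin tosses (my own choice, picked because its outcomes are not equally likely), so μ = 1 and σ² = 1/2:

```python
from fractions import Fraction
from math import sqrt

# Distribution of X = number of heads in two fair coin tosses.
dist = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Mean: each value weighted by its probability.
mu = sum(x * p for x, p in dist.items())

# Variance: each squared deviation weighted by its probability, then summed.
var = sum((x - mu) ** 2 * p for x, p in dist.items())

sigma = sqrt(var)   # standard deviation = positive square root of the variance
print(f"mu = {mu}, sigma^2 = {var}, sigma = {sigma:.4f}")
```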