Term
Correlation |
|
Definition
A relationship between two variables. The data can be represented by the ordered pairs (x, y), where 'x' is the independent/explanatory variable and 'y' is the dependent/response variable. |
|
|
Term
Independent/explanatory variable 'x' |
|
Definition
It is measured on the horizontal axis. |
|
|
Term
Dependent/response variable 'y' |
|
Definition
It is measured on the vertical axis. |
|
|
Term
Correlation coefficient (r) |
|
Definition
A measure of the strength and the direction of a linear relationship between two variables. |
|
|
Term
Range of the correlation coefficient |
|
Definition
The range is from -1 to 1. If 'x' and 'y' have a strong positive linear correlation, r is close to 1. If they have a strong negative linear correlation, r is close to -1. If there is no linear correlation, r is close to 0. |
|
|
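Note: for sample data with n ordered pairs, the correlation coefficient is usually computed with the standard formula
r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}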
Term
t-test for the correlation coefficient |
|
Definition
Can be used to determine whether the correlation between two variables is significant. |
|
|
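Note: assuming the test referred to here is the t-test for r, the usual test statistic is
t = \frac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}}, with d.f. = n - 2.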
Term
Other correlation possibilities: |
|
Definition
1) Is there a direct cause-and-effect relationship between the variables (does x cause y)? 2) Is there a reverse cause-and-effect relationship (does y cause x)? 3) Is it possible that the relationship is caused by a third variable? 4) Is it possible that the relationship is a coincidence? |
|
|
Term
Regression Line (line of best fit) |
|
Definition
The line for which the sum of the squares of the residuals is a minimum. |
|
|
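Note: for the least-squares regression line \hat{y} = mx + b, the slope and y-intercept are commonly written as
m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2}, \qquad b = \bar{y} - m\bar{x}.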
Term
Residual |
|
Definition
The difference, for each data point, between the observed y-value and the predicted y-value for a given x-value on the line. A residual is positive if the point is above the line, negative if the point is below the line, and zero if the point is on the line. |
|
|
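Note: for the i-th data point, the residual is
d_i = y_i - \hat{y}_i (observed y-value minus predicted y-value).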
Term
Total variation |
|
Definition
The sum of the squares of the differences between the y-value of each ordered pair and the mean of y. |
|
|
Term
Explained variation |
|
Definition
The sum of the squares of the differences between each predicted y-value and the mean of y. |
|
|
Term
Unexplained variation |
|
Definition
The sum of the squares of the differences between the y-value of each ordered pair and each corresponding predicted y-value. |
|
|
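Note: with \bar{y} the mean of the y-values and \hat{y}_i the predicted y-values, the three variations above are
Total variation = \sum (y_i - \bar{y})^2, \quad Explained variation = \sum (\hat{y}_i - \bar{y})^2, \quad Unexplained variation = \sum (y_i - \hat{y}_i)^2,
and total variation = explained variation + unexplained variation.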
Term
Coefficient of determination |
|
Definition
The ratio of the explained variation to the total variation. |
|
|
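Note: as a formula, the coefficient of determination is
r^2 = \frac{\text{explained variation}}{\text{total variation}}.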
Term
Standard error of estimate |
|
Definition
The standard deviation of the observed y_i-values about the predicted y-value for a given x_i-value. |
|
|
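Note: a standard formula for the standard error of estimate is
s_e = \sqrt{\frac{\sum (y_i - \hat{y}_i)^2}{n - 2}}.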
Term
Bivariate normal distribution |
|
Definition
For any fixed values of 'x', the corresponding values of 'y' are normally distributed and for any fixed values of 'y', the corresponding values of 'x' are normally distributed. |
|
|
Term
Contingency table |
|
Definition
Shows the observed frequencies for two variables. The observed frequencies are arranged in 'r' rows and 'c' columns. |
|
|
Term
Cell |
|
Definition
The intersection of a row and a column in a contingency table. |
|
|
Term
Expected frequency E_(r,c) |
|
Definition
Represents the expected frequency for the cell in row 'r' and column 'c'. |
|
|
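Note: the expected frequency for the cell in row r and column c is usually computed as
E_{r,c} = \frac{(\text{sum of row } r)(\text{sum of column } c)}{\text{sample size}}.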
Term
Marginal frequency |
|
Definition
The frequency that an entire category of one of the variables occurs. |
|
|
Term
Joint frequencies |
|
Definition
The observed frequencies in the interior of a contingency table. |
|
|
Term
Chi-square independence test |
|
Definition
Used to test the independence of two variables. You can determine whether the occurrence of one variable affects the probability of the occurrence of the other variable. |
|
|
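Note: the chi-square independence test uses the test statistic
\chi^2 = \sum \frac{(O - E)^2}{E}, with d.f. = (r - 1)(c - 1), where O and E are the observed and expected frequencies.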
Term
Chi-square independence test properties: |
|
Definition
1) The observed frequencies must be obtained using a random sample. 2) Each expected frequency must be greater than or equal to 5. |
|
|
Term
One-way analysis of variance (ANOVA) |
|
Definition
A hypothesis-testing technique that is used to compare means from three or more populations. |
|
|
Term
One-way ANOVA conditions: |
|
Definition
1) Each sample must be randomly selected from a normal population. 2) The samples must be independent. 3) Each population must have the same variance. |
|
|
Term
Variance between samples |
|
Definition
Measures the differences related to the treatment given to each sample. |
|
|
Term
Variance within samples |
|
Definition
Measures the differences related to entries within the same sample. |
|
|
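Note: the one-way ANOVA test statistic is the ratio of the two variances defined above (often denoted MS_B and MS_W),
F = \frac{\text{variance between samples}}{\text{variance within samples}},
where a large F value suggests that at least one population mean differs from the others.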