Term
Effect size |
|
Definition
-term used to measure the magnitude of a treatment or relationship effect
-common currency of meta-analysis studies |
|
|
Term
Correlation |
|
Definition
-usually expressed as r
-ranges -1 to 1 |
|
|
Term
What is the correlation coefficient? |
|
Definition
-expresses quantitatively the magnitude and direction of a relationship between two variables
-correlation does not imply causation
|
|
|
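A minimal Python sketch of computing Pearson's r with SciPy; the paired values below are hypothetical example data:

import numpy as np
from scipy import stats

# Hypothetical paired observations for two variables
hours_studied = np.array([2, 4, 5, 7, 8, 10])
exam_score = np.array([58, 65, 70, 74, 82, 90])

# Pearson's r gives magnitude and direction; p tests whether r differs from 0
r, p = stats.pearsonr(hours_studied, exam_score)
print(f"r = {r:.3f}, p = {p:.4f}")  # r near +1 here: strong positive linear relationship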
Term
Properties of the correlation coefficient (Pearson's r) |
|
Definition
-the value of r does not depend on the unit of measurement for either variable, nor does it depend on which variable is labeled x or y
-the value of r must be between -1 and 1
-a positive value of r indicates a positive linear relationship between the variables; therefore, as x increases so does y
-Pearson's r should only be used when the data are normally or near-normally distributed, not heavily skewed or kurtotic |
|
|
Term
Partial Correlations (path analysis) |
|
|
Definition
-a correlation coefficient between two variables while holding another variable constant |
|
|
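A minimal sketch of a first-order partial correlation, assuming linear relationships: residualize both variables on the control variable, then correlate the residuals (all data below are hypothetical):

import numpy as np

def partial_corr(x, y, z):
    # Regress x on z and y on z, then correlate what is left over;
    # this is the correlation between x and y holding z constant
    res_x = x - np.polyval(np.polyfit(z, x, 1), z)
    res_y = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(res_x, res_y)[0, 1]

x = np.array([2.0, 4.5, 5.9, 7.8, 10.5, 12.0])
y = np.array([1.2, 1.9, 3.1, 4.0, 4.8, 6.1])
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
print(partial_corr(x, y, z))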
Term
Interpreting the Correlation Coefficient
(r high and low) |
|
Definition
-r ranging from about .20 to .40 may be regarded as indicating a low degree of correlation
-r ranging from about .40 to .60 may be regarded as indicating a moderate degree of correlation
-r ranging from about .80 to 1.00 may be regarded as indicating high correlation
-if p > .05, r is not significantly different from 0
-if p < .05, r is significantly different from 0 |
|
|
Term
Simple Linear Regression
Equation |
|
Definition
-equation: y = a + bx
-a is the intercept (where the line crosses the y-axis)
-b is the slope of the line; the slope indicates how much y increases for every one-unit increase in x
|
|
|
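A minimal sketch of fitting y = a + bx by least squares (hypothetical data):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# np.polyfit returns the highest-order coefficient first: slope b, then intercept a
b, a = np.polyfit(x, y, 1)
print(f"y = {a:.2f} + {b:.2f}x")  # b: change in y per one-unit increase in x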
Term
R2 (coefficient of determination or correlation coefficient squared) |
|
Definition
-used to determine how well a regression line or model fits the data
-represents the proportion of variability in y that can be explained by the variability in x
|
|
|
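A minimal sketch of computing R2 from a fitted line (same kind of hypothetical data as above):

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

b, a = np.polyfit(x, y, 1)
y_hat = a + b * x                        # predicted values

ss_res = np.sum((y - y_hat) ** 2)        # variability left unexplained by the line
ss_tot = np.sum((y - y.mean()) ** 2)     # total variability in y
r_squared = 1 - ss_res / ss_tot          # proportion of variability in y explained by x
print(f"R2 = {r_squared:.3f}")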
Term
Interpreting R2 (goodness-of-fit scale) |
|
Definition
->75%=very good
-50-75%=good
-25-49%=fair
-25% and less=poor or meaningless |
|
|
Term
Purpose of regression analysis |
|
Definition
-develop an equation for prediction purposes, so you are able to predict the value of the DV for any individual in the population based on the IV(s)
-explain the relationship between the DV and IV(s) |
|
|
Term
Assumptions of regression analysis |
|
Definition
-the IVs are fixed
-IVs are measured w/o error
-the relationship between the IV(s) and the DV is assumed to be linear
-the distribution of all variables should be roughly normal
-the DV should be measured on an interval or ratio scale
-errors are not correlated with either the DV or IV(s)
-the variance of the residuals across all values of the IV(s) is consistent (homoscedasticity)
-errors are normally distributed |
|
|
Term
Multiple regression equation (residuals) |
|
Definition
-ei = yi - ŷi
-yi = actual (observed) value
-ŷi = predicted or expected value
-e represents the errors, called residuals (the vertical distance between each data point and the line) |
|
|
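A minimal sketch of a multiple regression fit and its residuals using statsmodels; the two-IV data set below is hypothetical:

import numpy as np
import statsmodels.api as sm

# Hypothetical data: two IVs predicting one DV
X = np.column_stack([
    [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
    [3.0, 1.0, 4.0, 2.0, 6.0, 5.0],
])
y = np.array([4.0, 5.1, 8.2, 8.9, 13.0, 13.8])

model = sm.OLS(y, sm.add_constant(X)).fit()
residuals = y - model.fittedvalues   # ei = yi - ŷi for each case
print(model.params)                  # intercept and one slope per IV
print(residuals)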
Term
Multiple correlation coefficient (R) |
|
Definition
-similar to the R in simple regression and is comparable to the Pearson correlation coefficient
-measures the correlation between the actual value of the DV and the predicted value of the DV |
|
|
Term
Coefficient of determination (adjusted R2) |
|
Definition
-interpreted as the % of variation in the DV that can be explained by the combo of IVs
-interpreted on the same goodness-of-fit scale as above |
|
|
Term
Multicollinearity |
|
Definition
-two or more IVs are moderately high to highly correlated, meaning they contribute basically the same information to the model |
|
|
Term
Problems with multicollinearity |
|
Definition
-restricts the size of the correlation coefficient R
-makes it difficult to determine the importance of individual predictors
-tends to inflate the standard errors of the Betas so that the resulting prediction equation is not very precise |
|
|
Term
How to determine multicollinearity |
|
Definition
-examine the correlations among the IVs that you have measured
-run diagnostic statistics that assess collinearity:
-tolerance: a measure of collinearity that varies from 0 to 1; small values indicate collinearity, and a good rule of thumb is that a value of .2 or less signals a problem
-variance inflation factor (VIF): another measure of collinearity; you don't want it to be too large |
|
|
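A minimal sketch of the tolerance/VIF diagnostics using statsmodels; the predictors are simulated so that x3 is nearly a copy of x1 (deliberate collinearity):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
x3 = x1 + rng.normal(scale=0.1, size=100)   # nearly redundant with x1
X = sm.add_constant(np.column_stack([x1, x2, x3]))

for i in range(1, X.shape[1]):              # skip the constant column
    vif = variance_inflation_factor(X, i)
    tolerance = 1 / vif                     # tolerance = 1 - R^2 of this IV on the others
    print(f"x{i}: VIF = {vif:.2f}, tolerance = {tolerance:.3f}")

Here x1 and x3 should show tolerance well below the .2 rule of thumb and large VIFs, while x2 should not.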
Term
Selecting predictors for multiple regression |
|
Definition
-want to select IVs that give us an efficient regression equation w/o including everything under the sun
-have at least 15 participants for every one predictor
-forward, backward, and stepwise |
|
|
Term
Forward selection |
|
Definition
-starts with nothing in the model
-the computer enters the predictor with the greatest relationship with y first, then adds predictors one at a time |
|
|
Term
Backward elimination |
|
Definition
-starts with all predictors in the model
-removes the ones that aren't significant
-running backward and forward selection will usually give the same result |
|
|
Term
Stepwise selection |
|
Definition
-a variation of forward selection, generally considered a better method (use with caution)
-enters predictors one at a time, then re-examines the predictors already in the model and can remove any that are no longer significant |
|
|
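A simplified sketch of forward selection. Real statistics packages enter and remove predictors using F-tests or p-values; this version uses raw R2 improvement with an arbitrary min_gain cutoff purely for illustration, and the data are simulated:

import numpy as np

def fit_r2(cols, y):
    # R^2 of an ordinary least-squares fit of y on the given columns plus an intercept
    A = np.column_stack([np.ones(len(y))] + cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid @ resid / np.sum((y - y.mean()) ** 2)

def forward_select(X, y, min_gain=0.01):
    # Start with nothing in the model; repeatedly add the predictor that most
    # improves R^2, stopping when no candidate improves it by at least min_gain
    chosen, remaining, best = [], list(range(X.shape[1])), 0.0
    while remaining:
        r2, j = max((fit_r2([X[:, c] for c in chosen + [j]], y), j) for j in remaining)
        if r2 - best < min_gain:
            break
        chosen.append(j)
        remaining.remove(j)
        best = r2
    return chosen, best

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
y = 2 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=60)
print(forward_select(X, y))   # typically selects columns 0 and 2

Backward elimination is the mirror image: start with every column in the model and repeatedly drop the weakest predictor.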
Term
Nonparametric tests
Use when... |
|
Definition
-when a continuous DV doesn't meet the assumptions required in a parametric test
-when data are ordinal or ranked |
|
|
Term
Spearman's rank correlation (rho) |
|
Definition
-another version of the Pearson correlation coefficient
-used to calculate correlation coefficients when one or both of the continuous variables is ordinal or there are extreme values in your data |
|
|
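A minimal SciPy sketch; the ordinal ratings and the income values with extreme scores below are hypothetical:

import numpy as np
from scipy import stats

rating = np.array([1, 2, 2, 3, 4, 5, 5])             # ordinal variable
income = np.array([18, 22, 25, 31, 44, 120, 95])     # contains extreme values

rho, p = stats.spearmanr(rating, income)  # correlation computed on ranks
print(f"rho = {rho:.3f}, p = {p:.4f}")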
Term
Wilcoxon Signed-Ranks test |
|
Definition
-alternative to the paired t-test
-use when your DV is nonparametric and you are using one measure on two different occasions (e.g., pre/post tests) |
|
|
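A minimal SciPy sketch for hypothetical pre/post scores from the same participants:

import numpy as np
from scipy import stats

pre = np.array([10, 12, 9, 15, 11, 13, 8, 14])
post = np.array([12, 14, 10, 18, 13, 15, 10, 17])

# Paired, rank-based alternative to the paired t-test
stat, p = stats.wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")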
Term
Mann-Whitney U test |
|
Definition
-used in place of independent t-test or one-way ANOVA
-use when DV is nonparametric and the group variable (IV) has only TWO levels |
|
|
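A minimal SciPy sketch for two hypothetical independent groups:

import numpy as np
from scipy import stats

group_a = np.array([3, 5, 6, 8, 9, 12])
group_b = np.array([7, 10, 11, 13, 14, 16])

# Rank-based alternative to the independent t-test (IV has two levels)
u, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u}, p = {p:.4f}")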
Term
Kruskal-Wallis test |
|
Definition
-use in place of a one-way ANOVA when your DV is nonparametric and the group variable (IV) has THREE OR MORE levels |
|
|
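A minimal SciPy sketch for three hypothetical independent groups:

import numpy as np
from scipy import stats

g1 = np.array([3, 5, 6, 8])
g2 = np.array([7, 9, 10, 12])
g3 = np.array([11, 13, 15, 16])

# Rank-based alternative to the one-way ANOVA (IV has three or more levels)
h, p = stats.kruskal(g1, g2, g3)
print(f"H = {h:.3f}, p = {p:.4f}")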
Term
Logistic regression |
|
Definition
-like linear regression except the DV (y) is nominal
-relationship between the DV (y) and an IV (x) is expressed by the odds ratio
-odds ratio is a measure of association
-odds that the event occurred = P/(1 - P)
-assign the code 1 to the adverse event to make the odds ratio easier to explain |
|
|
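A minimal sketch of a logistic regression with one binary IV using statsmodels; the exposure/event data below are hypothetical, with the adverse event coded 1:

import numpy as np
import statsmodels.api as sm

x = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 1, 1, 0], dtype=float)  # IV (exposure)
y = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0])               # DV (event coded 1)

model = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
# The model works on log-odds, where odds = P / (1 - P);
# exponentiating the slope gives the odds ratio for the IV
odds_ratio = np.exp(model.params[1])
print(f"odds ratio = {odds_ratio:.2f}")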
Term
Dummy coding |
|
Definition
-where you assign a 0 or a 1 to a variable with two levels |
|
|
Term
What are the standardized regression coefficients? |
|
Definition
-SPSS refers to the standardized regression coefficients as beta
|
|
|
Term
Chi-squared test
Assumptions |
|
Definition
-data should be a representative sample (randomly drawn if possible)
-data must be categorical (nominal data)
-the individual observations must be independent of each other |
|
|
Term
Chi-squared test: minimum cell counts |
|
Definition
-in a 2-row by 2-column analysis, each cell (the frequency of observations in that category) should contain at LEAST 5 observations
-Fisher's exact test may replace the chi-squared test when this requirement is not met |
|
|
Term
Chi-squared goodness-of-fit test |
|
Definition
-a chi-squared test on different levels of a single categorical variable
-test statistic: chi-squared = Σ (O - E)^2 / E, where O = observed frequency and E = expected frequency |
|
|
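A minimal SciPy sketch testing whether four hypothetical category counts fit an equal-frequency expectation:

import numpy as np
from scipy import stats

observed = np.array([30, 22, 28, 20])                 # counts in each category
expected = np.full(4, observed.sum() / 4)             # equal counts expected under H0

chi2, p = stats.chisquare(observed, f_exp=expected)   # computes Σ (O - E)^2 / E
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")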
Term
Chi-squared test of independence |
|
Definition
-a chi-squared test on different levels of two categorical variables
-estimating whether two categorical variables are independent or not |
|
|
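A minimal SciPy sketch for a hypothetical 2x2 table of two categorical variables (e.g., group by outcome):

import numpy as np
from scipy import stats

table = np.array([[20, 30],
                  [35, 15]])   # rows: groups; columns: outcomes

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
print(expected)                # counts expected if the variables were independent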