Correlation and Correlational Research (Chapter 5)


  1. Correlation and Correlational Research Chapter 5

  2. The Two Disciplines of Scientific Psychology • Lee Cronbach • his 1957 APA Presidential Address

  3. Fundamentals of Correlation • correlations reveal the degree of statistical association between two variables • used in both experimental and non-experimental research designs • Correlational/Non-Experimental research • establishes whether naturally occurring variables are statistically related

  4. Correlational Research • in correlational research, variables are measured rather than manipulated • manipulation is the hallmark of experimentation which enables researchers to draw causal inferences • distinction between measurement and manipulation drives the oft-cited mantra “correlation does not equal causation”

  5. Direction of Relationship Positive • two variables tend to increase or decrease together • higher scores on one variable on average are associated with higher scores on the other variable • lower scores on one variable on average are associated with lower scores on the other variable • e.g., relationship between job satisfaction and income

  6. Direction of Relationship Negative • two variables tend to move in opposite directions • higher scores on one variable are on average associated with lower scores on the other variable • lower scores on one variable are on average associated with higher scores on the other variable • e.g., relationship between hours spent playing video games and hours spent reading

  7. Hypothetical Data

     Participant   Weekly Hours of    Perceived Crime   Trust in Other
                   TV Watched (X)     Risk % (Y1)       People (Y2)
     Wilma                2                 10                22
     Jacob                2                 40                11
     Carlos               4                 20                18
     Shonda               4                 30                14
     Alex                 5                 30                10
     Rita                 6                 50                12
     Mike                 9                 70                 7
     Kyoko               11                 60                 9
     Robert              11                 80                10
     Deborah             19                 70                 6

  8. Graphing Bivariate Relationships • Scatterplot/Scattergram: a two-dimensional graph • values of one variable are plotted on the horizontal axis (labelled as X and known as the abscissa) • values of the other variable are plotted on the vertical axis (often labelled as Y and known as the ordinate)
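As a quick illustration, a minimal sketch (assuming Python with matplotlib, which is not part of the original slides) that plots the hypothetical data from slide 7, with weekly TV hours on the abscissa (X) and perceived crime risk on the ordinate (Y1):

```python
import matplotlib.pyplot as plt

# Hypothetical data from slide 7
tv_hours = [2, 2, 4, 4, 5, 6, 9, 11, 11, 19]            # X (abscissa)
crime_risk = [10, 40, 20, 30, 30, 50, 70, 60, 80, 70]   # Y1 (ordinate)

plt.scatter(tv_hours, crime_risk)
plt.xlabel("Weekly hours of TV watched (X)")
plt.ylabel("Perceived crime risk, % (Y1)")
plt.title("Scatterplot of the hypothetical data")
plt.show()
```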

  9. [Scatterplots illustrating a Positive (Direct) Relationship and a Negative (Inverse) Relationship]

  10. Calculating Correlations: depends on scale of measurement • Pearson product-moment correlation coefficient • Pearson's r • variables measured on an interval or ratio scale • Spearman's rank-order correlation coefficient • Spearman's rho • one or both variables measured on an ordinal scale

  11. Pearson's r: Interval or Ratio Scales • based on a ratio that involves the covariance and standard deviations of the two variables (X and Y) • the covariance is a number that reflects the degree to which two variables vary together • as with variance, the covariance calculation differs for populations and samples • here we deal with population calculations

  12. Pearson's r
      • Covariance (definitional formula): σ_XY = Σ(X − μ_X)(Y − μ_Y) / N
      • Standard deviations (definitional formulas): σ_X = √( Σ(X − μ_X)² / N ),  σ_Y = √( Σ(Y − μ_Y)² / N )
      • Pearson's r: r_XY = σ_XY / (σ_X σ_Y)
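A minimal sketch, assuming Python with NumPy, that applies these population formulas to the hypothetical data from slide 7 (weekly TV hours as X, perceived crime risk as Y1):

```python
import numpy as np

# Hypothetical data from slide 7 (population formulas, dividing by N)
tv_hours = np.array([2, 2, 4, 4, 5, 6, 9, 11, 11, 19])          # X
crime_risk = np.array([10, 40, 20, 30, 30, 50, 70, 60, 80, 70])  # Y1

N = len(tv_hours)
cov_xy = np.sum((tv_hours - tv_hours.mean()) * (crime_risk - crime_risk.mean())) / N
sd_x = np.sqrt(np.sum((tv_hours - tv_hours.mean()) ** 2) / N)
sd_y = np.sqrt(np.sum((crime_risk - crime_risk.mean()) ** 2) / N)

r = cov_xy / (sd_x * sd_y)
print(f"Pearson's r = {r:.2f}")   # about .79 for these data: a strong positive correlation
```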

  13. Spearman Rank-Order Correlation
      • based on ranks for each of the two variables: r_Spearman = σ_XY / (σ_X σ_Y), computed on the ranks
      • if there are no tied ranks, a simplified formula can be used: r_Spearman = 1 − 6ΣD² / (N(N² − 1)), where D is the difference between the two ranks for a given participant
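A minimal sketch, assuming Python with NumPy and SciPy, that computes Spearman's rho for the slide 7 data (TV hours and trust in other people); because these data contain tied ranks, it uses the general rank-then-correlate approach rather than the simplified formula:

```python
import numpy as np
from scipy.stats import rankdata  # assigns average ranks to ties

# Hypothetical data from slide 7: TV hours (X) and trust in other people (Y2)
tv_hours = np.array([2, 2, 4, 4, 5, 6, 9, 11, 11, 19])
trust = np.array([22, 11, 18, 14, 10, 12, 7, 9, 10, 6])

rank_x = rankdata(tv_hours)
rank_y = rankdata(trust)

# Spearman's rho is simply Pearson's r computed on the ranks
rho = np.corrcoef(rank_x, rank_y)[0, 1]
print(f"Spearman's rho = {rho:.2f}")   # negative: more TV, less trust

# With no tied ranks, the shortcut 1 - 6*sum(D**2) / (N*(N**2 - 1)) gives the same value;
# these data contain ties (e.g., two participants watch 2 hours), so the ranks are correlated directly.
```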

  14. Interpreting Magnitude of Correlations • in addition to considering the direction of the relationship (i.e., positive or negative), we need to attend to the strength of the relationship • a correlation can only take on a limited range of values: −1.00 ≤ r ≤ +1.00 • the absolute value reflects the strength/degree of relationship between the two variables

  15. Interpreting Magnitude of Correlations • square of the correlation coefficient, r² • aka the coefficient of determination • proportion of variability in one variable that can be accounted for through the linear relationship with the other variable • thus r² = (.80)² = .64, as does r² = (−.80)² = .64

  16. Interpreting Magnitude of Correlations: Cohen's Guidelines
      • Is the relationship between two variables weak? Moderate? Strong? Guidelines from Cohen (1988), based on the absolute value of r:

        Strength     Absolute value of r
        Weak         .10 - .29
        Moderate     .30 - .49
        Strong       .50 or higher

  17. Interpreting Magnitude of Correlation: Coefficient of Determination • If a psychological researcher reports a correlation of .33 between integrity and job performance, can one say that the two variables are 33% related? • No • r² (the coefficient of determination) reveals how much of the differences in Y scores are attributable to differences in X scores • .33² = .1089 • so only about 11% of the variability is accounted for

  18. Nonlinear Relationships • the magnitude of the correlation coefficient is influenced by the degree of non-linearity • [Figure: an inverted-U scatterplot of test performance (ordinate) against alertness (abscissa), running from sleepy through alert to panic; despite the strong curvilinear relationship, r = 0] • can assess the strength of non-linear relationships with alternative statistical procedures such as ε²
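A small illustration, using made-up inverted-U data of the kind sketched on the slide (the numbers are arbitrary, not from the slide), showing that Pearson's r comes out at zero even though the curvilinear relationship is perfect:

```python
import numpy as np

# Hypothetical inverted-U data: performance rises from "sleepy" to "alert",
# then falls again at "panic" levels of arousal
alertness = np.arange(1, 11)                  # low -> high arousal
performance = -(alertness - 5.5) ** 2 + 30    # perfect inverted-U, no noise

r = np.corrcoef(alertness, performance)[0, 1]
print(f"Pearson's r = {r:.2f}")   # ~0: the linear correlation misses the strong curvilinear relationship
```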

  19. Range Restriction
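The slide itself is a figure; as a rough illustration with simulated (hypothetical) data, the sketch below shows how restricting the range of X shrinks the observed correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: Y depends linearly on X plus noise
x = rng.uniform(0, 100, size=1000)
y = 0.5 * x + rng.normal(0, 10, size=1000)

r_full = np.corrcoef(x, y)[0, 1]

# Restrict the sample to a narrow slice of X (e.g., only high scorers)
mask = x > 80
r_restricted = np.corrcoef(x[mask], y[mask])[0, 1]

print(f"r with full range:       {r_full:.2f}")        # strong
print(f"r with restricted range: {r_restricted:.2f}")  # noticeably weaker
```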

  20. Correlation And Causation • Bidirectionality Issue • Third Variable Problem

  21. Bidirectionality Problem
      • a correlation between Religiosity and GPA is consistent with either causal direction:
      • Religiosity → GPA (religiosity causes GPA)
      • GPA → Religiosity (GPA causes religiosity)

  22. Third-Variable Problem
      • a correlation between Religiosity and GPA may arise because a third variable (here, Parenting Style) influences both Religiosity and GPA
      • spurious relationship

  23. Strategies to Reduce Causal Ambiguity in Correlational Research
      Statistical approaches
      • measure and statistically control for a third variable
      • partial correlation analysis
      • e.g., relationship between right-hand palm size (X) and verbal ability (Y): r_XY = 0.70
      • perhaps a spurious relationship caused by a common third variable, age (Z): r_XZ = 0.90, r_YZ = 0.80
      • controlling for age: r_XY·Z = −0.076
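A minimal sketch that plugs the slide's three correlations into the standard first-order partial-correlation formula (the formula itself is not printed on the slide; it is the usual textbook expression), reproducing the reported value of about −0.076:

```python
import math

# Values from the slide's palm-size (X), verbal-ability (Y), age (Z) example
r_xy = 0.70   # palm size and verbal ability
r_xz = 0.90   # palm size and age
r_yz = 0.80   # verbal ability and age

# First-order partial correlation: relationship between X and Y with Z held constant
r_xy_z = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))
print(f"r_XY.Z = {r_xy_z:.3f}")   # about -0.076: the zero-order r of .70 essentially vanishes
```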

  24. Research Designs • Cross-Sectional Designs • bidirectionality potential problem • Prospective Longitudinal design • X measured at Time 1 • Y measured at Time 2 • Rules out bidirectionality problem • Cross-lagged panel design • Measure X and Y at Time 1 • Repeat X and Y measurement at Time 2 • Examine pattern of relationships (i.e., cross-lagged correlations) across variables and time
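A rough illustration of the cross-lagged logic with simulated (hypothetical) data in which earlier X drives later Y; the variable names X1, Y1, X2, Y2 are placeholders for the Time 1 and Time 2 measurements:

```python
import numpy as np

# Hypothetical cross-lagged panel data (not from the slides)
rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
y1 = 0.2 * x1 + rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)              # X is fairly stable over time
y2 = 0.5 * x1 + 0.3 * y1 + rng.normal(scale=0.8, size=n)   # built so that earlier X drives later Y

def r(a, b):
    return np.corrcoef(a, b)[0, 1]

# The two cross-lagged correlations
print(f"r(X1, Y2) = {r(x1, y2):.2f}")   # should come out clearly larger ...
print(f"r(Y1, X2) = {r(y1, x2):.2f}")   # ... than this one, a pattern consistent with X -> Y
```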

  25. Cross-Lagged Panel Design Eron et al., 1972

  26. Drawing Causal Conclusions • How do we rule out all plausible third variables (confounds) using correlational research designs? • We can't – only the control afforded by rigorous experimentation provides strong tests of causation • as noted by some recent researchers employing such designs: "longitudinal correlational research can be used to compare the relative plausibility of alternative causal perspectives" but they "do not provide a strong test of causation"

  27. Correlation/Regression and Prediction • A goal of science is to forecast future events • In simple linear regression, scores on X can be used to predict scores on Y, assuming a meaningful relationship (r) has been established between X and Y in past research

  28. Linear Regression • interest in predicting scores on one variable (Y) based upon linear relationship with another variable (X) • X is the predictor; Y is the criterion

  29. Regression Equation
      • based on the formula for a straight line: Ŷ = a + bX
        where Ŷ is the predicted value of Y for a given value of X, a is the Y-intercept (i.e., Ŷ when X = 0), and b is the slope of the regression line
      • the regression line can be plotted on a scatterplot

  30. Regression Equation: Calculation
      • need to calculate values for
      • a, the Y-intercept, and
      • b, the slope
        b = Covariance_XY / Variance_X = r × (SD_Y / SD_X)
        a = Ȳ − b·X̄, where Ȳ and X̄ are the means of Y and X
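A minimal sketch, assuming Python with NumPy, that computes a and b from these formulas for the slide 7 data (TV hours predicting perceived crime risk) and then uses the resulting equation for a prediction:

```python
import numpy as np

# Slide 7 data again: TV hours (X) predicting perceived crime risk (Y1)
x = np.array([2, 2, 4, 4, 5, 6, 9, 11, 11, 19], dtype=float)
y = np.array([10, 40, 20, 30, 30, 50, 70, 60, 80, 70], dtype=float)

# Slope and intercept from the definitional formulas on the slide
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
b = cov_xy / np.var(x)          # slope = Covariance_XY / Variance_X
a = y.mean() - b * x.mean()     # intercept = mean(Y) - b * mean(X)

print(f"regression equation: Y-hat = {a:.2f} + {b:.2f}X")
print(f"predicted crime risk for 10 hours of TV: {a + b * 10:.1f}%")
```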

  31. Interpreting the Regression Equation • For example, assume we're looking at the relationship between how many children a couple has (Y) and the number of years they've been married (X). From a sample we calculate the following: Ŷ = −0.84 + 1.21X • thus, if a couple is married for 0 years we would predict that they would have −0.84 of a child • for each year they're married we'd expect the couple to have an additional 1.21 children
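As an additional worked prediction (not shown on the slide): for a couple married 5 years, the equation gives Ŷ = −0.84 + 1.21(5) = 5.21, i.e., a predicted 5.21 children.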

  32. Multiple (Linear) Regression • multiple predictors are used to predict a criterion measure • ideally want as little overlap as possible between predictors (X's) • i.e., want each predictor to account for unique variance in the criterion (Y) • Ŷ = a + b₁X₁ + b₂X₂ + … + b_k X_k

  33. Multiple Regression Example: One Criterion (Y) and Three Predictors (Xs)
      • [Figure: two diagrams of three predictors (Structured Interview, General CAT, Work Sample) overlapping a Criterion; in one the predictors are correlated with each other, in the other the predictors are uncorrelated]
      • ideally want to avoid multicollinearity in order to maximize prediction
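A minimal sketch of multiple regression with simulated (hypothetical) selection data; the predictor names mirror the slide's figure, the predictors are generated as uncorrelated (the ideal case noted above), and ordinary least squares via NumPy stands in for whatever software the chapter assumes:

```python
import numpy as np

# Hypothetical selection data (not from the slide): three predictors and one criterion
rng = np.random.default_rng(2)
n = 100
interview = rng.normal(size=n)       # structured interview score
cat = rng.normal(size=n)             # general cognitive ability test score
work_sample = rng.normal(size=n)     # work sample score
performance = 0.4 * interview + 0.3 * cat + 0.2 * work_sample + rng.normal(scale=0.8, size=n)

# Ordinary least squares: Y-hat = a + b1*X1 + b2*X2 + b3*X3
X = np.column_stack([np.ones(n), interview, cat, work_sample])
coefs, *_ = np.linalg.lstsq(X, performance, rcond=None)
a, b1, b2, b3 = coefs
print(f"Y-hat = {a:.2f} + {b1:.2f}*interview + {b2:.2f}*CAT + {b3:.2f}*work_sample")
```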

  34. Benefits of Correlational Research • prediction in everyday life • test validation • broad range of applications • establishing relationships • convergence with experiments
