

  1. 26:010:557 / 26:620:557 Social Science Research Methods
     Dr. Peter R. Gillett
     Associate Professor
     Department of Accounting & Information Systems
     Rutgers Business School – Newark & New Brunswick
     February 24, 2006

  2. Overview
     - Measurement Theory
     - Research Design
     - Internal Validity
     - External Validity
     - Research Design Principles
     - Experimental Validity
       - Internal, External, Construct, Statistical Conclusion

  3. Research Design
     - Research Design is the plan and structure of investigation
       - Framework
       - Organization
       - Configuration of elements
     - Research Design has two purposes
       - To answer research questions
       - To control variance
         - Experimental
         - Extraneous
         - Error

  4. Research Design
     - Research Design tells us
       - What observations to make
       - How to make them
       - How to analyze their quantitative representations
     - Recall that power = 1 – Beta risk = the probability of correctly rejecting a false null hypothesis (see the simulation sketch below)
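The power definition can be made concrete with a short simulation: repeatedly draw samples under a specific alternative, run the test, and count how often the (false) null is rejected. This is a minimal sketch; the effect size, group size, and alpha level are illustrative assumptions, not values from the lecture.

```python
# Monte Carlo estimate of power (1 - beta) for a two-sample t-test.
# The effect size, group size, and alpha below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_per_group, effect_size, alpha, n_sims = 30, 0.5, 0.05, 5000

rejections = 0
for _ in range(n_sims):
    control = rng.normal(loc=0.0, scale=1.0, size=n_per_group)
    treatment = rng.normal(loc=effect_size, scale=1.0, size=n_per_group)
    _, p_value = stats.ttest_ind(treatment, control)
    if p_value < alpha:                  # correctly rejecting the false null
        rejections += 1

power = rejections / n_sims              # estimate of 1 - beta
print(f"Estimated power: {power:.2f}")
```

With these assumed values the estimate comes out near 0.5; larger samples or larger effects push it toward 1.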

  5. Research Design
     - Example: detecting admissions discrimination
       - A simple design randomly assigns males and females to colleges and compares admission rates
       - A factorial design crosses Gender with three levels of Ability (in this case both active variables) and tests the interaction (see the sketch below)
       - Note that parallel tests at different levels of ability would not provide as clear evidence
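As a concrete illustration of the factorial approach, the sketch below simulates a 2 x 3 Gender x Ability layout and tests the interaction with a two-way ANOVA in statsmodels. The data, the outcome variable, and the built-in interaction effect are hypothetical; the point is only to show where the interaction test appears.

```python
# Hypothetical 2 x 3 factorial (Gender x Ability) with a two-way ANOVA;
# the simulated data and variable names are assumptions, not course data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
gender = np.repeat(["male", "female"], 90)
ability = np.tile(np.repeat(["low", "medium", "high"], 30), 2)

# Outcome with a small gender-by-ability effect built in for illustration.
score = (rng.normal(50, 10, 180)
         + np.where((gender == "female") & (ability == "high"), -5.0, 0.0))

df = pd.DataFrame({"gender": gender, "ability": ability, "score": score})
model = smf.ols("score ~ C(gender) * C(ability)", data=df).fit()

# The C(gender):C(ability) row is the interaction test.
print(anova_lm(model, typ=2))
```

The interaction row in the ANOVA table is what distinguishes the factorial design from running separate comparisons at each ability level.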

  6. Research Design
     - Maxmincon
       - Maximize systematic variance
       - Minimize error variance
       - Control extraneous variance
     - N.B. Here we are considering the variance of the dependent variable

  7. Research Design
     - Experimental variance
       - Design, plan, and conduct research so that the experimental conditions are as different as possible
     - Extraneous variance
       - Choose participants who are as homogeneous as possible on extraneous independent variables
       - Whenever possible, assign subjects to experimental groups and conditions randomly, and assign conditions and other factors to experimental groups randomly
       - Control extraneous variables by building them into the design
       - Match participants and assign them to experimental groups at random
     - Error variance
       - Reduce errors
       - Increase the reliability of measures

  8. Research Design
     - Experiments
       - In an experiment, the researcher manipulates or controls one or more of the independent variables
     - Nonexperiments
       - In nonexperimental research, the nature of the variables precludes manipulation (e.g., sex, intelligence, occupation)
     - "The ideal of science is the controlled experiment" (K&L, p. 467)

  9. Research Design
     - Four "faulty" designs
     - Notation:
       - X: X is manipulated
       - ~X: X is not manipulated (i.e., subjects are not given X)
       - (X): X is not manipulated but is measured or imagined

  10. Research Design
      - Design 19.1: One Group
        - (a) X   Y (Experimental)
        - (b) (X) Y (Nonexperimental)
      - The "One-Shot Case Study"
      - Scientifically worthless

  11. Research Design
      - Design 19.2: One Group, Before–After (Pretest–Posttest)
        - (a) Yb X   Ya (Experimental)
        - (b) Yb (X) Ya (Nonexperimental)
      - The group is compared to itself
      - Relevant threats: measurement, history, maturation, regression

  12. Research Design
      - Design 19.3: Simulated Before–After
        - X Y
            Yb
      - Here we cannot tell whether the two groups were equivalent before X

  13. Research Design
      - Design 19.4: Two Groups, No Control
        - (a) X  Y (Experimental)
              ~X ~Y
        - (b) (X)  Y (Nonexperimental)
              (~X) ~Y
      - The groups are assumed to be equal on all other variables

  14. Research Design
      - Criteria
        - Does the design adequately test the hypothesis?
        - Does the design adequately control independent variables?
          - Randomize whenever possible (a sketch of these steps follows this slide)
            - Select participants at random
            - Assign participants to groups at random
            - Assign experimental treatments to groups at random
          - Control independent variables so that extraneous, unwanted sources of systematic variance have minimal opportunity to operate
        - Generalizability
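A minimal sketch of the three randomization steps listed above. The sampling frame, sample size, and treatment labels are assumptions made for this illustration.

```python
# Illustrative randomization steps; the sampling frame, sample size,
# and treatment labels are assumptions made for this sketch.
import numpy as np

rng = np.random.default_rng(42)

# 1. Select participants at random from a sampling frame.
frame = [f"person_{i}" for i in range(500)]           # hypothetical frame
sample = rng.choice(frame, size=60, replace=False)

# 2. Assign participants to groups at random (three equal groups).
shuffled = rng.permutation(sample)
groups = np.array_split(shuffled, 3)

# 3. Assign experimental treatments to groups at random.
treatments = rng.permutation(["treatment_A", "treatment_B", "control"])
assignment = {str(t): list(g) for t, g in zip(treatments, groups)}

for treatment, members in assignment.items():
    print(treatment, "->", len(members), "participants")
```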

  15. Research Design
      - Internal and External Validity
        - Campbell 1957; Campbell and Stanley 1963
        - The primary yardsticks by which the quality of research contributions is judged
        - These goals can and do conflict with each other
      - Internal Validity
        - Did the experimental manipulation really make a significant difference?

  16. Internal Validity
      - Threats / alternative explanations
        - Measurement: measuring participants changes them
        - History: events occurring in the specific experimental situation may have influenced the outcome
        - Maturation: subjects generally may have changed or grown over time
        - Statistical regression: regression towards the mean
        - Instrumentation: changes in the measurement device, instrument, or process
        - Selection: characteristics of the subjects selected could have influenced the outcome
        - Attrition / experimental mortality: loss of subjects in some treatments or with certain characteristics
        - Interaction

  17. Internal Validity
      - In a longitudinal study, we take repeated measurements of subjects at different points in time
        - What are the strengths and weaknesses of such studies as regards internal validity?

  18. Internal & External Validity I “Campbell and Stanley (1963) say that internal validity is the sine qua non of research design, but that the ideal design should be strong in both internal and external validity, even though they are frequently contradictory.” (K&L, p. 477) Dr. Peter R Gillett February 24, 2006 18

  19. External Validity
      - To what populations can the conclusions from an experiment be generalized?
        - Representativeness
          - Ecological representativeness
          - Variable representativeness
      - Threats
        - Reactive / interaction effects of testing
        - Interaction effects of selection biases
        - Reactive effects of experimental arrangements
        - Multiple-treatment interference
      - Any of these could have influenced outcomes and therefore compromise generalizability beyond the subjects actually studied

  20. Research Design Principles
      - Design is data discipline
      - A design is formally a subset of the Cartesian product of the independent variable(s) and the dependent variable
      - A complete design is based on a cross-partition of the independent variables (see the sketch below)
      - We will not discuss incomplete designs
      - Analysis of variance is a statistical technique appropriate for experimental designs; it is not appropriate if participants cannot be assigned at random and there are unequal numbers of cases in the cells of the factorial design
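The cross-partition idea can be made concrete: the cells of a complete design are the Cartesian product of the levels of the independent variables. The factors and levels below are assumed for illustration.

```python
# Enumerate the cells of a complete (fully crossed) factorial design.
# The factors and their levels are assumed for illustration.
from itertools import product

factors = {
    "gender": ["male", "female"],
    "ability": ["low", "medium", "high"],
}

cells = list(product(*factors.values()))   # Cartesian product of the levels
for cell in cells:
    print(dict(zip(factors.keys(), cell)))
print(f"{len(cells)} cells in the complete 2 x 3 design")
```

Every combination of levels appears exactly once, which is what makes the design complete (fully crossed).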

  21. Research Design Principles
      - Design 20.1: Experimental Group – Control Group, Randomized Participants
        - [R] X  Y (Experimental)
              ~X Y (Control)
      - "Best" for many purposes

  22. Research Design Principles
      - Control group
        - Formerly meant exclusively the group that did not receive a treatment
        - This is less obvious when there are multiple levels of treatment
        - More generally, it now means the particular group against which comparisons are made

  23. Research Design Principles
      - Design 20.2: Experimental Group – Control Group, Matched Participants
        - [Mr] X  Y (Experimental)
               ~X Y (Control)
      - Participants are matched on one or more attributes and randomly assigned to the two groups

  24. Research Design Principles
      - Matching versus Randomization (a sketch of matched assignment follows this slide)
        - Randomization is preferred
        - In practice, matching may be necessary
          - Matching by equating participants
          - The frequency-distribution matching method
            - Can be tricky with multiple variables
          - Holding variables constant
          - Incorporating nuisance variables into the research design
          - Participants acting as their own controls
        - Matching is only relevant when the matching variables are correlated with the dependent variable
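A sketch of the "match, then randomize" idea behind Design 20.2: participants are ranked on a single matching variable, paired, and one member of each pair is assigned to each group at random. The matching variable ("pretest_score") and the data are assumptions for illustration; real matching may use several variables or the other methods listed above.

```python
# Match participants on a covariate, then randomize within each matched pair.
# The covariate ("pretest_score") and the data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
participants = [{"id": i, "pretest_score": float(s)}
                for i, s in enumerate(rng.normal(100, 15, 40).round(1))]

# Rank on the matching variable and pair adjacent participants.
ranked = sorted(participants, key=lambda p: p["pretest_score"])
pairs = [ranked[i:i + 2] for i in range(0, len(ranked), 2)]

experimental, control = [], []
for first, second in pairs:
    if rng.random() < 0.5:               # random assignment within the pair
        first, second = second, first
    experimental.append(first["id"])
    control.append(second["id"])

print("Experimental group:", sorted(experimental))
print("Control group:     ", sorted(control))
```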

  25. Research Design Principles
      - Design 20.3: Before and After Control Group (Pretest–Posttest)
        - (a) [R]  Yb X  Ya (Experimental)
                   Yb ~X Ya (Control)
        - (b) [Mr] Yb X  Ya (Experimental)
                   Yb ~X Ya (Control)

  26. Research Design Principles
      - Design 20.3 supplies a control group against which the difference Ya – Yb can be checked
      - However, difference scores are problematic unless the experimental effect is strong (two analysis options are sketched below)
      - In addition, the pretest can have a sensitizing effect on participants, which decreases both internal and external validity
        - Pretests should be avoided when the testing procedures are unusual
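For Design 20.3, two common analyses of the Yb/Ya data are sketched below: a comparison of gain scores (Ya – Yb) between groups, and an analysis of covariance with the pretest as covariate, which is often preferred when difference scores are unreliable. The ANCOVA option goes beyond the slide itself, and the data, group sizes, and effect size are simulated assumptions.

```python
# Two ways to analyze a pretest-posttest control-group design (Design 20.3).
# The simulated data, group sizes, and effect size are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(3)
n = 40
group = np.repeat(["experimental", "control"], n)
pre = rng.normal(50, 10, 2 * n)
post = pre + rng.normal(0, 5, 2 * n) + np.where(group == "experimental", 4.0, 0.0)
df = pd.DataFrame({"group": group, "pre": pre, "post": post})

# (1) Compare gain (difference) scores, Ya - Yb, between groups.
gains = df["post"] - df["pre"]
t_stat, p_value = stats.ttest_ind(gains[df["group"] == "experimental"],
                                  gains[df["group"] == "control"])
print(f"Gain-score t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# (2) ANCOVA: regress the posttest on group, adjusting for the pretest.
ancova = smf.ols("post ~ C(group) + pre", data=df).fit()
print(ancova.params)
```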
