Survey Design


  1. Survey Design
     Points to include:
     a) how you will recruit the participants, including the number of participants, and any inclusion or exclusion criteria
     b) how you will collect data from the participants: interviews? questionnaire? focus groups?
     c) method of recruitment and data collection: telephone, face to face, mail, email, web-based
     d) what are the outcome and predictor variables you'll measure, and how will you measure them?
     e) what the data analysis would look like: qualitative? quantitative? looking at differences (t-tests) or relationships (e.g., correlational)? (see the t-test/correlation sketch after the slide list)

  2. Survey Design – Sampling: Census vs Sample
     A census involves every (or virtually every) member of the population of interest (extremely rare).
     A sample is a subset of the whole population of interest, which is hopefully representative of the whole population, so you can infer from the sample to the population.

  3. Response Rate
     • What is it?
     • How is it calculated? Usually:
       response rate = all usable responses / (all people invited to participate - those with really good excuses, i.e., legitimate exclusions)
       (a minimal calculation sketch follows the slide list)
     • Is it important? Why/why not?
     • What affects it?

  4. Response Rate (continued)
     • What affects it?
       - questionnaire/interview length
       - attractiveness/incentives
       - importance/authority
       - follow-up phone calls
       - ease of returning
       - competition (heavily researched or particularly busy groups)
     • What can you do about them?

  5. Survey Design – Reliability & Validity
     Validity refers to the degree to which a study accurately reflects or assesses the specific concept that the researcher is attempting to measure.
     (http://www.niu.edu/assessment/Resources/Assessment_Glossary.htm)

  6. Validity – Exercise
     Imagine we are trying to ascertain high school students' usual level of alcohol consumption (to measure its impact on their performance/classroom experience). How valid would it be to… (rank the options):
     1. Ask their parents how much the student drinks?
     2. Use a breathalyser to measure their blood alcohol content in class?
     3. Go through the family's rubbish and count the number of empty alcohol bottles/cans?
     4. Use a pen-and-paper questionnaire to ask students how much they're drinking?
        a. asking them in number of pints of beer / bottles or cans of pre-mixed spirits, etc.
        b. asking them in grams of ethanol
        c. asking them in standard drinks of alcohol

  7. Reliability
     Reliability refers to the reproducibility of results with any criterion or method.
     (http://www.washington.edu/admin/hr/ocpsp/prostaff/researchsciengr/glossary.html)
     Reliability is the measure of consistency for an assessment instrument. The instrument should yield similar results over time with similar populations in similar circumstances.
     (http://www.k12.hi.us/~atr/evaluation/glossary.htm)

  8. Reliability
     • Think about a summative essay in a History class.
       1. What is this assessment designed to measure?
     • Think about a teacher/lecturer marking 100 students' History essays.
       2. What factors might influence the grade the teacher/lecturer gives to the student?

  9. Reliability (continued)
     3. What does this suggest about the reliability and/or validity of teacher/lecturer grading?
     4. What could you do to increase reliability?
     Validity and Reliability
     • Reliability is necessary for validity: something can't be valid if it isn't reliable (e.g., a clock that gains 5 minutes a day).
     • But something can be reliable without being valid (e.g., a clock that's always 30 minutes fast).

  10. What is a confound?
      • A factor that is related in some way to the outcome variable and to one or more of your predictor variables
      • It may be an alternative explanation for a finding

  11. What is a confound? – Examples
      • A survey finds that the fewer teeth you have, the slower your reaction time; the researchers conclude that oral health is vital for good drivers. Confound?
      • A survey finds that moderate alcohol consumers are in better health than abstainers; the researchers conclude that moderate alcohol consumption is good for your health. Confound(s)?

  12. What can you do about confounds?
      • Measure them!
      • Statistically control for them in your data analysis (e.g., partial correlation, multiple regression analysis); see the regression sketch after the slide list.

  13. Survey Design – Methods
      Methods of recruitment/data collection: e.g., postal questionnaire … what else?

  14. Methods of Recruitment/Data Collection
      • postal questionnaire
      • email questionnaire
      • web-based questionnaire (e.g., Survey Monkey; email recruitment?)
      • face-to-face interview
      • focus group
      • on-the-spot questionnaire
      • anything else?
      Postal Questionnaire
      • what do you think are the considerations in terms of the guiding principles?
      • has anyone had any good/bad experiences with this method (conducting or participating)?
      • advantages/disadvantages: when would you use it, and when would you not?
      • how would you boost the response rate for this method?

  15. Email / Web-based Questionnaire; Face-to-Face Interview
      • For each method, consider the same questions as for the postal questionnaire: the guiding principles, good/bad experiences (conducting or participating), advantages/disadvantages (when would you use it, and when would you not?), and how you would boost the response rate.

  16. Focus Group; On-the-Spot Questionnaire
      • Again, for each method consider the guiding principles, good/bad experiences, advantages/disadvantages (when to use it or not), and how to boost the response rate.
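
The analysis options mentioned in slide 1(e) can be illustrated with a short sketch. The data and variable names below are made up; it simply contrasts testing for a group difference (independent-samples t-test) with testing for a relationship (Pearson correlation) using SciPy.

```python
# Hypothetical illustration only: a t-test (group differences) versus
# a correlation (relationships), as mentioned in slide 1(e).
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# Simulated survey data: hours of sleep for two student groups
group_a = rng.normal(loc=7.0, scale=1.0, size=50)
group_b = rng.normal(loc=6.5, scale=1.0, size=50)

# Looking at differences: independent-samples t-test
t_stat, t_p = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")

# Looking at relationships: Pearson correlation between two measures
study_hours = rng.normal(loc=15, scale=5, size=100)
exam_score = 40 + 2.0 * study_hours + rng.normal(scale=10, size=100)
r, r_p = stats.pearsonr(study_hours, exam_score)
print(f"r = {r:.2f}, p = {r_p:.3f}")
```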
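A minimal sketch of the response-rate calculation from slide 3. The counts are hypothetical, and "legitimate exclusions" stands in for the slide's "those with really good excuses" (e.g., questionnaires returned undelivered).

```python
# Hypothetical sketch of the response-rate formula from slide 3.
def response_rate(usable_responses, invited, legitimate_exclusions=0):
    """All usable responses divided by all people invited, minus legitimate exclusions."""
    eligible = invited - legitimate_exclusions
    return usable_responses / eligible

# Example: 312 usable questionnaires from 500 invitations, 20 returned undelivered
print(f"{response_rate(312, 500, 20):.1%}")  # 312 / 480 = 65.0%
```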
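A minimal sketch (with simulated data and hypothetical variable names) of statistically controlling for a confound via multiple regression, as suggested in slide 12. It uses slide 11's first example: age confounds the apparent link between number of teeth and reaction time, and adding age to the regression removes most of the spurious teeth effect.

```python
# Hypothetical sketch: controlling for a confound (age) with multiple regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=1)
n = 500

age = rng.uniform(20, 80, size=n)
teeth = 32 - 0.15 * age + rng.normal(scale=2, size=n)         # older people tend to have fewer teeth
reaction_ms = 200 + 2.0 * age + rng.normal(scale=20, size=n)  # older people tend to react more slowly

# Naive model: reaction time predicted by number of teeth alone (confounded by age)
naive = sm.OLS(reaction_ms, sm.add_constant(teeth)).fit()

# Adjusted model: include age as a covariate
adjusted = sm.OLS(reaction_ms, sm.add_constant(np.column_stack([teeth, age]))).fit()

print("Naive teeth coefficient:   ", round(naive.params[1], 2))    # noticeably negative
print("Adjusted teeth coefficient:", round(adjusted.params[1], 2))  # close to zero
```

A partial correlation between teeth and reaction time controlling for age would give the same qualitative answer as the adjusted coefficient here.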
