Collecting information from people | Kristian Wiklund | PowerPoint PPT Presentation



SLIDE 1

Kristian Wiklund | 2014-03-05 | Page 1

Collecting information from people

Photo CC-BY-2.0 Richard Riley http://www.flickr.com/photos/rileyroxx/169900126/

Kristian Wiklund

SLIDE 2

Outline

› What is a survey?
› Different ways of collecting data from people
  – Questionnaires, (qualitative) interviews, focus groups
› Break with discussion
› Design: survey design and how not to ask questions
  – Total survey quality
  – Sampling and frames
  – Non-sampling validity threats (valid for all designs above)
    › Self-reporting, scales, order, …

SLIDE 3

What is a survey?

› A survey concerns a set of objects comprising a population
  – The goal of the survey is to describe the population in terms of one or more measurable properties
  – To do this, a sampling frame is needed, from which a sample of the population is selected to be included in the survey
› The survey makes observations on the sample
  – The observed measurements are used to make inferences on the population

(Diagram: Population → Sampling Frame → Sample → Describe the population)

[Stat100]

SLIDE 4

Why Perform a Survey?

› Create a profile of a group as a whole
  – The design of the survey makes it possible to generalize the answers of a selected few to the group
› Types of research
  – Exploratory: "What is going on here?"
  – Descriptive: "What are the characteristics of the population?"
  – Explanatory: "Why is something happening?"
  – Predictive: "What is the likelihood of something happening?"

SLIDE 5

Data Collection Modes

A spectrum of data collection modes, ordered by flexibility and by how much room there is for interpretation:

› Structured: following a questionnaire
  – All interviews conducted in exactly the same way, no explanations, no clarifications
  – Self-administered questionnaires
› Semi-structured: a base set of questions and clarification as needed ("What does it mean to you?")
› Free form: having a nice chat
  – More options for interpretation: what are the reactions of the respondent? ("Discourse Analysis")

(Examples along the spectrum range from questionnaires to focus groups.)

SLIDE 6

Questionnaires

CC-BY-NC Ikhlasul Amal http://www.flickr.com/photos/21372148@N00/2443194039

SLIDE 7

The Problem with Questionnaires…

“Let’s put together a questionnaire and send to some people and check…”

› Ideal situation:
  – Researcher writes a number of questions, distributes the survey on Twitter, drinks some coffee and writes a couple of journal papers while waiting for the data to drop in
  – Hordes of willing respondents answer the questions
  – Researcher analyses the data with Excel (it basically analyses itself), writes a paper with some nice bar charts, sends the paper to ICSE
  – Paper accepted, nice trip to Florence, wine, best paper award

SLIDE 8

The Problem with Questionnaires…

› In reality:
  – Researcher intent is encoded into questions
    › Was that done right?
  – Researcher finds subjects and convinces them to answer
    › Was the selection done right?
  – Questions transmit researcher intent to subject
    › Were they interpreted correctly?
  – Subject decides to answer and formulates an answer
    › Is the answer reliable?
  – Researcher encodes the answers into something that can be analyzed
    › Was that done right?
  – Data is analyzed
    › Was that done right?

SLIDE 9

Conducting Interviews

SLIDE 10

Main challenges

› Access to subjects
› Performing the interviews without introducing bias
› Reliable collection of data

CC-BY-NC-SA Gabe McIntyre http://www.flickr.com/photos/38366783@N00/2617316249

SLIDE 11

Validity

› "The key to successful interviewing is learning how to probe effectively... that is, to stimulate an informant to produce more information... without injecting yourself so much into the interaction that you only get a reflection of yourself in the data."

[Weiss2000]

› The same validity issues as with questionnaire design apply
  – ...and will be described in detail later
› In addition to this, there is a clear risk for interviewer bias
› Useful skills
  – Active listening
    › http://www.babblingengineer.com/communication/how-to-improve-your-active-listening/
  – Coaching

SLIDE 12

http://www.faculty.londondeanery.ac.uk/e-learning/appraisal/skilful-questioning-and-active-listening

› Allow people to answer in their own terms
› Use non-leading, open questions to get started; probe for details later
› Let the subject exhaust one question before moving on
› Have enough time allocated

SLIDE 13

Recording

› I strongly suggest that the interviews are recorded
  – Informed consent
› Multiple reasons:
  – You will not capture everything in your notes
    › Not even if someone else takes the notes
  – Self-evaluation
    › Did you introduce bias?
    › Was the design followed?
    › Personal quirks ("uuuhhhmm...")
  – Capture "how" and not only "what"

http://www.flickr.com/photos/labanex/8668665270/

SLIDE 14

Recording

› Tooling:
  – An additional computer works okay; a separate microphone is recommended, but I have had success with the built-in microphone
› Transcription
  – Get software support for transcription
    › I use Express Scribe on a Mac
  – Transcription takes time, 2-10 times the interview time
    › And you will likely feel silly for most of the time listening to yourself

http://www.flickr.com/photos/strandarchives/9273941774/in/photostream/

SLIDE 15

Focus Groups

SLIDE 16

› "A focus group is a carefully planned discussion designed to obtain perceptions on a defined area of interest in a permissive, non-threatening environment" [Fisk2005]
› ...or what most people in industry would call a "workshop"
  – Which makes this a very marketable skill.

CC-BY-SA-NC 2.0 Some rights reserved by Nebraska Library Commission

SLIDE 17

When to use

› Use for:
  – Qualitative information
  – Insights into a new area
  – Preparation for a larger, more formal, study
› Don't use for:
  – Quantitative information
  – Confidential information
  – Situations that may go out of control
    › Emotionally charged discussions
    › Participants with an agenda of their own

SLIDE 18

Design Items

› Objectives and questions
  – As important here as for interviews and questionnaires
› Agenda for the focus group meeting
  – Introduction, questions, summary
› Facilitator
  – A neutral person with sufficient domain knowledge and capability to move the work forward
  – Not necessarily the researcher
› Note-taker/Assistant facilitator
› Group member selection
  – 6 to 12 people, without dependencies such as manager/employee
› Location
  – "Not the office"

SLIDE 19

How to run the meeting

› Basically the same skills and risks as for interviews
› Group bias risk:
  – A group will easily bikeshed on issues that are not really relevant for the research
    › Parkinson's Law of Triviality: "Briefly stated, it means that the time spent on any item of the agenda will be in inverse proportion to the sum involved." [Parkinson1957]

SLIDE 20

› For each question, iterate:
  – Open question
  – Deeper follow-up probes to keep the ball rolling
  – Summary of the findings by the moderator

For more information:
http://www.hse.gov.uk/stress/standards/pdfs/focusgroups.pdf
http://www.cse.lehigh.edu/~glennb/mm/FocusGroups.htm

http://www.faculty.londondeanery.ac.uk/e-learning/appraisal/skilful-questioning-and-active-listening
SLIDE 21

Coffee Break + Discussion

› Self-organize a group and discuss:
  – Is there anything in your research area that can be answered by
    › A survey
    › Interviews
    › Focus groups
  – Why? Why not?
  – What are the main challenges in conducting this type of study in your area?

SLIDE 22

Survey and Questionnaire design

SLIDE 23

The Survey Challenge

› The main problem with surveys is that their complexity commonly is under-estimated.
  – It is commonly perceived to be trivial to do a survey.
  – In reality, there are limitless ways to go wrong.
› A survey is a fixed design, which means that we have to live with our mistakes.
› Hence, it is very important to be aware of the potential problems.

SLIDE 24

What is survey research?

› Survey research is a branch of statistics
  – This is evident when the theoretical background is studied
    › Statistical concepts such as sampling theory are well developed, while psychological concepts such as how to ask questions are still being developed
  – For a very long time it was assumed that if a survey was designed correctly with respect to sampling, it would be correct.
  – Now it is known that there are a lot of factors influencing the success of the survey
    › A survey should be designed to minimize the total survey error [Biemer2011]

SLIDE 25

Total Survey Error

(After Biemer2011)

Total survey error = mean squared error = Bias² + Variance
  – Variable error contributes variance; systematic error contributes bias

› Sampling error
  – Sampling scheme, sample size, estimator choice
› Non-sampling error
  – Specification, nonresponse, frame, measurement, data processing

Many of the contributors to TSE are not possible to measure directly.
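The decomposition MSE = Bias² + Variance can be checked numerically on any set of repeated estimates of a known true value; a minimal sketch, with invented numbers purely for illustration:

```python
# Numerical check of MSE = Bias^2 + Variance.
# true_value and estimates are made-up illustrative numbers.
true_value = 10.0
estimates = [9.0, 11.0, 12.0, 10.0, 13.0]  # repeated survey estimates

n = len(estimates)
mean_est = sum(estimates) / n
bias = mean_est - true_value                                  # systematic error
variance = sum((x - mean_est) ** 2 for x in estimates) / n    # variable error
mse = sum((x - true_value) ** 2 for x in estimates) / n

print(bias, variance, mse)  # 1.0 2.0 3.0 -- and 3.0 = 1.0**2 + 2.0
```

The identity is exact when the variance is computed with the 1/n (population) form, as above.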

SLIDE 26

Total Survey Error

› The sampling design influences the TSE
  – Frame, number of samples, stratification
› The design of the survey instrument influences the TSE
  – How questions are formulated influences the result
  – The order of the questions influences the result
  – The sensitivity of the issue influences the result
  – Options and ranges given as possible answers influence the result
› The way the survey is administered influences the TSE
  – On-line, phone, face to face, paper
› The personal traits of the researcher influence the TSE
  – Language, dialect, gender, ethnicity, haircut, glasses, clothes

SLIDE 27

(Figure: the four combinations of low/high bias and low/high variance [Fortmann-Roe])

SLIDE 28

Response Rate

› It is important to remember that
  – Response rate alone cannot be used as a quality indicator!
  – The keywords in the newspaper article are "risk" and "could", meaning "further analysis is required"
    › If responses are received from all strata in the population, we get a low bias and can manage the variance with statistics

http://www.dn.se/nyheter/sverige/resultaten-kan-bli-snedvridna

SLIDE 29

SURVEY DESIGN PROCESS

SLIDE 30

Survey Design Process

› Basically, the same process as all other research:
  – Set the goals: what do you want to capture?
  – Decide on the target population and sample size: who will you ask?
  – Determine the questions: what will you ask?
  – Pre-test the survey: test the questions
  – Conduct the survey: ask the questions
  – Analyze the data collected: produce the report

[Kuter2001]

SLIDE 31

Survey Design Process (after [Biemer2003])

(Process diagram covering design and validation of: research objectives, concepts, how to collect data, sampling design, questionnaire design, planning for data collection and processing, data collection and processing, analysis)

SLIDE 32

SAMPLING, FRAMES, STRATIFICATION, RESPONSE RATE

SLIDE 33

Truman defeats Dewey

› Gallup predicted a victory for Dewey
  – Printing lead times required printing before counting was complete
› Lesson learned:
  – The sampling frame was wrong. By using telephones (in 1948) to do the survey, the group likely to vote for Truman was largely excluded.
  – A cattle feed company selling feed sacks decorated with elephants and donkeys got a better result. [Curran2002]

SLIDE 34

Dewey/Truman Sample Frames

› Gallup frame: people having phones
  – In 1948, mostly rich people, likely to vote for Republicans
› Feed company frame: people buying animal feed
  – In the 1940s, all social groups needed animal feed

SLIDE 35

Sampling

› Probability sampling
  – Each object in the frame has an equal probability of being chosen
  – Frames carefully designed to be representative of the population
  – Results can be statistically analyzed
› Non-probability sampling
  – Not representative of the population
  – Convenience sampling
    › Take what you get, such as web visitor surveys
  – Snowball sampling
  – Results can generally not be statistically analyzed

[Sommer2006]
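The simplest probability design, a simple random sample in which every frame object has the same selection probability, can be drawn in one call; the frame below is invented for illustration:

```python
import random

# Invented frame of 1000 objects; every object has the same chance of selection.
frame = [f"respondent-{i}" for i in range(1000)]

random.seed(1)  # fixed seed so the example is reproducible
sample = random.sample(frame, k=50)  # simple random sampling without replacement

print(len(sample), len(set(sample)))  # 50 distinct respondents
```

Sampling without replacement guarantees 50 distinct respondents from the frame.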

SLIDE 36

“Snowball” sampling

› Snowball sampling relies on spreading a survey via contacts or word-of-mouth in a community
  – Do we know what the frame is?
  – Do we know what the population is?
  – Is it possible to generalize the findings?
› Snowball sampling still has to be used sometimes, for example when a community is hard to reach
  – Drug users, prostitutes, black hat hackers, …

[Sommer2006]

http://www.flickr.com/photos/juniorvelo/2200372991/lightbox/

SLIDE 37

Statistics

The following slides show how to calculate sample size. All calculations for proportions can be done with "normal statistics", using Yes=1, No=0. For the details, please read [Stat100] and [Stat414].

SLIDE 38

Sample Size Estimate

› For proportions:
  – Uses the same equations as for the population mean.
  – For an error margin of 100*d%, with a large population, use:
  – Example: if d=0.05, then n0 = 400, i.e. we need 400 samples. [Biemer2003]
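The formula itself is not in this transcript; a standard large-population sample-size estimate for proportions, which reproduces the d=0.05 giving n0 = 400 example (an assumption on my part, not taken from the slide), is:

```latex
n_0 \;=\; \frac{z^{2}\,p(1-p)}{d^{2}} \;\approx\; \frac{1}{d^{2}},
\qquad \text{taking } z \approx 2 \text{ and the worst case } p(1-p) = \tfrac{1}{4}.
```

With d = 0.05 this gives n0 = 1/0.0025 = 400, matching the example.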

SLIDE 39

Small Populations

› Fewer samples are needed for small populations.
› N = the size of the population
› If (1 - n0/N) < 0.9, then we need to correct for population size
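The correction formula is likewise missing from the transcript; a minimal sketch using one common form of the finite population correction, n = n0 / (1 + n0/N) (an assumption, not taken from the slide):

```python
import math

def corrected_sample_size(n0: float, N: float) -> int:
    """Adjust the large-population sample size n0 for a finite population of size N,
    using the common correction n = n0 / (1 + n0/N), rounded up."""
    return math.ceil(n0 / (1 + n0 / N))

# Example: n0 = 400 (d = 0.05) in a population of N = 1000 people.
print(corrected_sample_size(400, 1000))  # 286 -- noticeably fewer than 400
```

For very large N the correction vanishes and the function returns n0 unchanged.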

SLIDE 40

Response Rate

› It is important to remember that
  – Response rate alone cannot be used as a quality indicator!
  – The keywords in the newspaper article are "risk" and "could", meaning "further analysis is required"
    › If responses are received from all strata in the population, we get a low bias and can manage the variance with statistics

http://www.dn.se/nyheter/sverige/resultaten-kan-bli-snedvridna

SLIDE 41

Stratified Sampling

› Assume that we want to estimate the average salary in Stockholm
  – If we perform a purely random sample, areas of high or low income may be underrepresented
  – Instead we stratify the population and create one frame per stratum
  – The strata are sampled, and we use the information to calculate the average for the whole population
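The last step can be sketched as a size-weighted average of the stratum means; the strata, sizes, and salaries below are invented purely for illustration:

```python
# Stratified estimate of a population mean: sample each stratum separately,
# then weight each stratum's sample mean by its share of the population.
# All numbers below are invented for illustration.
strata = {
    "low-income area":  {"size": 600_000, "sampled_salaries": [22_000, 25_000, 24_000]},
    "mid-income area":  {"size": 300_000, "sampled_salaries": [35_000, 33_000, 37_000]},
    "high-income area": {"size": 100_000, "sampled_salaries": [80_000, 95_000, 110_000]},
}

total = sum(s["size"] for s in strata.values())
estimate = sum(
    (s["size"] / total) * (sum(s["sampled_salaries"]) / len(s["sampled_salaries"]))
    for s in strata.values()
)
print(round(estimate))  # 34200
```

A purely random sample of the same nine people would weight the rare high-income answers far too heavily; the stratum weights restore the population proportions.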

SLIDE 42

Respondent Screening

› Not all respondents are qualified to provide an answer
  – Gender, skills, affiliation, etc.
› This means that one may need a mechanism to screen the respondents.
  – Adjusting the frame after sampling
  – In particular if uncontrolled sampling is used
› Is the respondent qualified to contribute to a survey about software engineering principle X?

SLIDE 43

INCREASING THE RESPONSE RATE

SLIDE 44

Reasons for Refusal

› Not motivated, lack of time, fear of being registered, bad timing, screening, survey too difficult, business policy, low priority, too expensive to answer, sensitive questions, boring, bad questionnaire [Biemer2003]

SLIDE 45

Motivators

› Reciprocation: induced by a gift or perceived benefits
  – Prepaid incentives work better than promised incentives
    › A prepaid incentive creates a social contract with the respondent, while promised incentives are perceived as payment for services rendered
› Consistency: complying with the survey request is consistent with the respondent's beliefs and values

[Biemer2003]

SLIDE 46

Motivators

› Social validation: respondents are more likely to participate if they believe that others participate
  – Example: Facebook shows that "X likes…" in their ads
› Authority: a higher response rate is likely if the survey comes from a "legitimate authority"
  – In our case, it is likely more efficient to use the MDH brand when creating company-external surveys, and the company brand when creating internal surveys [Biemer2003]

SLIDE 47

Motivators

› Scarcity: a response is more likely if the survey is perceived as a rare opportunity to make one's opinion heard
  – "Last day of the survey", "only 1 in 10000 are contacted", …
› Liking: subjects will be more willing to respond if they like the researcher
  – Nice person, similar values, dress code, dialect, … [Biemer2003]

SLIDE 48

QUESTION DESIGN

SLIDE 49

Design for Analysis

› Consider how the analysis is to be done when designing the questions
  – Not all numbers are numbers that can be averaged and analyzed
  – If the analysis requires ordinal measurement, make sure that ordinal data is created. [Babbie1990]
› "How often do you compile?" (seldom, often, very often)
› "How often do you attend meetings?" (seldom, often, very often)
  – vs.
› "Do you compile more often than you attend meetings?"
  – This forces the issue and eliminates the risk of using different scales for different variables.

SLIDE 50

Open vs Closed Questions

› Open question: "What are the effects of a modular software architecture?"
  – ……………………

› Closed question: "What is the primary benefit of a modular software architecture?"
  – Testability
  – Maintainability
  – Scalability
  – Other

SLIDE 51

Open Questions

› Qualitative data that needs interpretation and coding before use.
  – Coding: classifying the text into usable categories
› Useful for questions about behavioral frequency, or when it is hard to produce a good multiple-choice list
  – Requires that the respondent is able to express the answer
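At its simplest, coding maps phrases in the free text to categories; a toy sketch in which the codebook is invented, since real coding schemes are developed iteratively from the collected answers:

```python
# Toy "coding" of free-text answers into categories.
# The codebook below is invented for illustration only.
CODEBOOK = {
    "testability": ("test", "verify"),
    "maintainability": ("maintain", "change", "modify"),
}

def code_answer(text: str) -> list[str]:
    """Return every category whose keywords appear in the answer."""
    lowered = text.lower()
    return [cat for cat, keywords in CODEBOOK.items()
            if any(k in lowered for k in keywords)]

print(code_answer("Modules are easier to test and to modify in isolation."))
# ['testability', 'maintainability']
```

In practice the coding is done by human raters, and the categories themselves are part of the research output; keyword matching only illustrates the classification step.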

SLIDE 52

Closed Questions

› Multiple-choice answers should be
  – Non-overlapping
  – Unambiguous
  – Exhaustive
› Problems
  – Missing alternatives
    › If the respondent has another alternative not in the list, we force the respondent to select a wrong answer
  – Primacy, recency, satisficing

SLIDE 53

Double-barreled Questions

› A double-barreled question asks at least two things simultaneously
› "My company's sustainability and corporate responsibility efforts have increased my overall satisfaction with working here"
  – The example assumes that it is important to the respondent that the company has such a policy.
  – It also assumes that a good policy makes the respondent more satisfied.
  – A negative answer could indicate that
    › The respondent does not care (the satisfaction is unchanged)
    › The respondent cares and is dissatisfied with the policy. The company is either doing too much or too little work in the area.
  – A positive answer indicates that the respondent is happy
    › The policy could be either good or bad; who knows what motivates people?

SLIDE 54

RESPONDENT EFFECTS

SLIDE 55

Simplified Response Process

› Understand the question
  – Listen to or read the question
  – The brain translates it to something that has meaning for the respondent
› Retrieve data
  – Use the interpretation of the question to retrieve data from memory
› Formulate an answer
  – Filter the information into something the respondent is comfortable with sharing
  – Translate the data to something that fits the question

SLIDE 56

Adjacency / Context

› "Conversational norms" may influence the survey result
› In conversation, we seldom transition rapidly and without warning to an entirely new conversational context.
  – In a survey, the previous context will linger with the respondent and may introduce bias.
› The context includes the researcher, the previous questions, the instructions, the survey title and introduction, …
› Example: [Schwarz1999], page 96

SLIDE 57

Satisficing

› Satisficing is a strategy that minimizes the effort needed to respond.
  – "Providing an answer" is the priority, not providing the right answer.
› Occurs when
  – The task is difficult
  – The motivation is low
› A special case is straight-lining
  – Avoid designs that encourage this behavior

SLIDE 58

Acquiescence

› Acquiescence: the tendency to agree with a statement
› Example:
  – "Do you think that the United States should forbid public speeches against democracy?": 54% of respondents said "yes", they should be forbidden
  – "Do you think the United States should allow speeches against democracy?": 75% said "yes", suggesting that only 25% would not allow such public speeches [Biemer2003]

SLIDE 59

Primacy and Recency

› Occurs in multiple-choice questions
› Primacy is the tendency to favor the first options
  – This primarily occurs in written surveys
› Recency is the tendency to favor the last options
  – This primarily occurs in verbal surveys
› What is most important?
  – Travel
  – Money
  – Environment
  – Family
› Primacy:
  – TRAVEL, money, blahblahblah
› Recency:
  – Blahblah, environment, FAMILY

SLIDE 60

Rating Scales

› Rating scales [Schwarz1999]
  – "How successful are you in life?"
    › How do we define successful? Different people have different definitions. Lack of failures? Money? Family? Health?
  – The scales used influenced the result
    › 0 to 10: 0 is "not at all successful", 10 is "extremely successful"
      – 13% answered in the 0-5 range
    › -5 to 5: -5 is "not at all", 5 is "extremely"
      – 34% answered in the -5 to 0 range
SLIDE 61

Frequency Answers

› "Middle" alternatives are assumed to be normal. This introduces a bias.
  – Right side: 37.5% watch TV for more than 2.5 hours per day
  – Left side: 16.2% watch TV for more than 2.5 hours per day
› Do not use rating or frequency scales; use open questions.

[Schwarz1999]

SLIDE 62

Recalling events

› "How often did you buy vegetables in the first quarter of 2013?"
  – Basically impossible to answer.
  – Encourages satisficing by counting, estimating, and guessing
› "Major life events" may be recalled for up to one year, minor events during a significantly shorter time [Kitchenham2002]

SLIDE 63

Social Desirability Bias

› People are less likely to be honest in reporting sensitive issues and will conform to what they believe is "normal", or answer in a way that increases their social status
  – "How much alcohol do you consume in a week?" [Biemer2003]
  – "How much meat do you eat?" [Hebert2008]
› Mitigation
  – Use open questions and neutral wording to avoid signaling what is normal.
  – Increase anonymity
    › For example, use web-based surveys over interviews.
    › Can be done by handing a computer to the subject during the final part of an interview
  – Put demographics at the end of the survey

SLIDE 64

Cultural Effects

› Swedes seldom use the extreme values in a rating question

– [Source: Personal communication]

– Other effects are present in other cultures

› This makes pre-testing and instrument validation very important for international surveys

SLIDE 65

Summary

› Survey research is used to take measurements on a subset of a population and use the measurements to draw conclusions about the population.
› Total survey error is more important than response rate.
› Survey research is a fixed design: do a good design!
› Bias must be removed by design.
› Snowball sampling should be avoided.
› Trigger the respondent motivators to increase response rate.
› The way we design the questions and questionnaire is highly influential on the total survey error.
› Cultural effects cannot be ignored in international surveys.

SLIDE 66

References

› [Babbie1990] E. Babbie, "Survey Research Methods"
› [Biemer2003] Biemer, P., & Lyberg, L. (2003). Introduction to Survey Quality. New York: Wiley. http://onlinelibrary.wiley.com/doi/10.1002/0471458740.fmatter/summary
› [Biemer2011] Biemer, P. P. (2011). Total Survey Error: Design, Implementation, and Evaluation. Public Opinion Quarterly, 74(5), 817-848. doi:10.1093/poq/nfq058
› [Curran2002] http://www.csudh.edu/dearhabermas/sampling01.htm
› [Fisk2005] http://hccedl.cc.gatech.edu/taxonomy/docInfo.php?doc=40
› [Fortmann-Roe] http://scott.fortmann-roe.com/docs/BiasVariance.html
› [Friedman] http://academic.brooklyn.cuny.edu/economic/friedman/rateratingscales.htm
› [Hammel2013] https://onlinecourses.science.psu.edu/stat414/node/210
› [Hebert2008] J. R. Hebert et al., "Social Desirability Trait Influences on Self-Reported Dietary Measures among Diverse Participants in a Multicenter Multiple Risk Factor Trial", J. Nutr., vol. 138, no. 1, pp. 226S-234, Jan. 2008.
› [Ji2008] Ji et al., "Some lessons learned in conducting software engineering surveys in China", ESEM '08
› [Kitchenham2002] Kitchenham, Pfleeger, "Principles of survey research: part 3: constructing a survey instrument", ACM SIGSOFT Software Engineering Notes, vol. 27, issue 2, 2002
› [Kuter2001] http://lte-projects.umd.edu/charm/survey.html
› [Parkinson1957] Parkinson, C. Northcote. "Parkinson's Law, and Other Studies in Administration." (1957)
› [RSV2013] http://www.skatteverket.se/privat/folkbokforing/omfolkbokforing/folkbokforingigaridag/densvenskafolkbokforingenshistoriaundertresekler.4.18e1b10334ebe8bc80004141.html
› [Schwarz1999] Schwarz, N. (1999). Self-reports: How the questions shape the answers. American Psychologist. http://psycnet.apa.org/journals/amp/54/2/93/
› [Sommer2006] http://psychology.ucdavis.edu/sommerb/sommerdemo/sampling/
› [Stat100] https://onlinecourses.science.psu.edu/stat100/node/15
› [Stat414] https://onlinecourses.science.psu.edu/stat414/node/264
› [Stockholm] http://www.statistikomstockholm.se/index.php/statistik-pa-karta/arbetsmarknad-kartor
› [Torkar2003] A Survey on Testing and Reuse
› [Weiss2000] http://www.jhsph.edu/research/centers-and-institutes/center-for-refugee-and-disaster-response/publications_tools/publications/qualresearch.html
› [Wiki2013a] http://en.wikipedia.org/wiki/Census_of_Quirinius
› [Wiki2013b] http://en.wikipedia.org/wiki/Census_in_Egypt
› [Wiki2013c] http://en.wikipedia.org/wiki/Domesday_Book
