Quiz 1--Dillman Ch. 2

Measurement Error

The result of inaccurate responses that stem from poor question wording, poor interviewing, survey mode effects, and/or some aspect of the respondent's behavior.

7) Is survey information being collected by more than one mode?

There may be mode differences (e.g. telephone vs self-administered)

Goal of writing survey question

To develop a query that every potential respondent will *interpret* in the same way, be able to respond *accurately*, and be willing to *answer*

4) Avoid vague quantifiers when more precise estimates can be obtained

e.g. "regularly" may vary from person to person; "average" might be interpreted to refer to the social norm
-Change to numerical amounts
-Both in item stem and response set
-The more specific, the closer to the true score

11) Develop response categories that are mutually exclusive

e.g. 35-49; 50-64; 65-80 (not 35-50; 50-65; 65-80, where a 50- or 65-year-old falls into two categories)
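As a sketch of how such bounds can be checked mechanically (the helper function and the example bounds are invented for illustration, not from the text):

```python
# Illustrative check that numeric response categories do not overlap.
# The helper and the example bounds are invented for demonstration.

def mutually_exclusive(categories):
    """Return True if no two (low, high) ranges share a value."""
    ordered = sorted(categories)
    return all(hi1 < lo2 for (_, hi1), (lo2, _) in zip(ordered, ordered[1:]))

# Overlapping bounds: a 50- or 65-year-old fits two categories.
print(mutually_exclusive([(35, 50), (50, 65), (65, 80)]))  # False

# Non-overlapping bounds: every age maps to exactly one category.
print(mutually_exclusive([(35, 49), (50, 64), (65, 80)]))  # True
```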

Motivation life bar

•Respondent attention is a limited resource.
•The respondent is likely looking for any reason to answer without reading and not feel bad about it.
•The credibility of the survey can decrease or increase the respondent's level of focus.
•Very few people will read every word.

Context effects

-Anything coming before or surrounding the question may impact the response
-Sometimes we need to use filler questions to bring people's minds back to neutral, unless you are dealing with people who have a great deal of knowledge

Don't make assumptions about respondents' literacy levels

-Write to a 6th-grade reading level
-Not just the words, but also *instructions* and *length* of the survey

6) Is the respondent's understanding of response categories likely to be influenced by more than words?

-Attitudinal and belief questions typically rely on vague qualifiers—the vaguer the question and answer categories, the greater the potential for measurement error
-Respondents faced with a situation in which they don't have an obvious answer see themselves in relation to other students (e.g. "I study more than most, so I should choose the top categories")

Avoid items that are too similar

-But if similar items are unavoidable, apologize upfront
-You can even tell them that you did not create the items
-If you explain something that seems odd, respondents will probably be understanding

8) Avoid bias from unequal comparisons

-Closed-ended questions with unordered categories may become unbalanced -True balance may be extremely difficult to achieve

If you want respondents to rate themselves on an attribute it is often good to provide a comparison group

-Compared to what? to who? -This helps with consistency

15) Choose question wordings that allow essential comparisons to be made with previously collected data

-Comparison of survey results with previously collected data often represents a major survey objective

Open-ended questions

-Could be used for exploratory purposes
-*Challenge*: Getting adequate answers because probing is not possible
-If you want a longer response, give them a larger textbox
-The fundamental problem is that the answer depends upon the extent to which respondents are willing to think hard about the question and write a complete answer; one solution is to provide possible answers in a closed-ended format, but surveyors might not know what the possible responses might be

Incentives: Payment

-Do not overpay, because the IRB will see this as coercive

16) Avoid asking respondents to say yes in order to mean no

-Don't include questions that contain double negatives
-People may miss the word "not" when reading quickly
-e.g. "this thing at this place does not taste good" requires mental gymnastics because survey demand is increased
-Also, reverse-coded items reduce Cronbach's alpha
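The Cronbach's alpha point can be illustrated with a toy computation; the helper, the data, and the scenario (a reverse-worded item left unrecoded) are invented for illustration and are not from the text.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)  # shape: (respondents, items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Toy 1-5 Likert responses where all three items measure the same trait.
base = np.array([1, 2, 3, 4, 5, 1, 3, 5])
aligned = np.column_stack([base, base, base])         # all items keyed the same way
misaligned = np.column_stack([base, base, 6 - base])  # third item reverse-worded, left unrecoded

print(cronbach_alpha(aligned))     # ≈ 1.0: perfectly consistent
print(cronbach_alpha(misaligned))  # negative: the unrecoded reverse-worded item deflates alpha
```

In practice reverse-worded items are recoded before computing alpha, but even then they tend to correlate less cleanly with the straightforwardly worded items, which is the reduction the card refers to.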

Be a tour guide and let them know what's coming up

-General to specific is the golden rule; otherwise context effects will get messy
-"First, I am going to ask you about your general attitudes about various cultures. Please respond only about your opinions. Later, we will ask about your particular experiences with these cultures"

Use specific attitudes if you want to predict specific behaviors

-If I am measuring your intention to be an organ donor, ask about YOU registering as an organ donor rather than a general "is it good for everyone to register as an organ donor," because the respondent might not have the intention to do so
-Just because they agree with a statement does not mean they themselves would behave this way

Distinguish neutral/don't know/NA

-If you do not have an N/A option, they might skip the question, and you will have missing data without knowing why it's missing (is it missing at random?)
-Caveat: Lazy participants might just select N/A to avoid thinking
-But you can instruct them to select neutral if they have no opinion (e.g., "What do you think of the Dodgers?")

Motivational life bar

-In the cover letter, tell the survey taker why they are taking it, what it's about, etc., because it will impact where the life bar begins
-*If you have a high motivational life bar, it is ok to have high survey demand, because they will get through it*
-You don't want the most demanding questions to come when the life bar is at its lowest

10) Eliminate check-all-that-apply question formats to reduce primacy effects

-It has been observed that respondents tend to "satisfice" in answering many types of survey questions—they go down the list and check answers until they feel they have provided a satisfactory response (selecting among the first few options presented)
-The same applies to recency effects; this cannot be resolved by switching up the order, because then you are introducing error everywhere
-They may also feel weird about checking them all
-Cannot determine intensity
-Ask each item individually instead (more questions, but more valid data)
-Exceptions: Ethnicity; list of allergens

9) State both sides of attitude scales in the question stems

-It is tempting to reduce the number of words in questions by mentioning only one side of an attitude scale when posing a question, e.g. "to what extent do you agree that..."
-Let the respondent know that disagreement is an acceptable answer ("To what extent do you disagree OR agree") and then make sure response options are in the same direction

4) Is the respondent willing to reveal the requested information?

-Just because they know an answer does not mean they are willing to provide it (e.g. income, drug use, theft)
-People are more likely to give honest answers to self-administered than to interviewer-administered questionnaires, but social desirability is still a problem

Keep strongly agree/disagree as neutral as possible

-Keep the statement near neutral and let respondents use the scale to add intensity
-e.g., avoid extreme stems like "This was the best training session ever"

5) Will the respondent feel motivated to answer each question?

-Motivation can be encouraged in many ways, ranging from incentives and follow-up reminders to respondent-friendly questionnaire design
-Sometimes the questions themselves are the source of a motivational problem, and no matter how much one does with other aspects of survey design, wording remains a major impediment to accomplishing the survey objectives

5) Avoid specificity that exceeds the respondent's potential for having an accurate, ready-made answer

-Nearly as troublesome as too much vagueness is the request for too much specificity
-Take care in determining whether categories are needed and which ones are appropriate, e.g. free response "how many books have you read in the past year" vs. less than 10, 11-25, 26-50, etc.
-Don't train them that it's okay to give inaccurate answers
-Note: If the event is novel and infrequent, exact counts are ok ("how many car accidents have you been in in the last 5 years?")

Closed-ended unordered questions

-No particular order; respondents are asked to pick the one that best describes their opinion, e.g. "which of these should be the company's priority?" [categorical?]
-Questions with unordered categories may consist of many different concepts which must be evaluated in relation to each other
-Choosing from among several categories is quite complex—it may require considerable effort (e.g. ranking, or absorbing all the details to identify the right choice)
-*But these are the types of question structures that can sometimes provide the most useful information to survey sponsors*

Measurement error

-Our goal is to use our survey instruments to extract information from people's heads that is as close to their real opinions, thoughts, attitudes, behaviors, and inclinations as possible
-The result of inaccurate responses that stem from poor question wording, poor interviewing, survey mode effects, and/or some aspects of the respondent's behavior

17) Avoid double-barreled questions

-Participants may feel differently about one part of the question than the other -Ask each question separately OR make it more general

18) Soften the impact of potentially objectionable questions

-People may skip questions they object to answering, or worse, decide not to answer any more questions
-These are subject to context effects (e.g., if they write about depression for 3 minutes, it will impact how they respond to later questions)
-These questions can be softened by changing the wording or adding buffer items to reduce context effects
-Or at least warn them and explain why you're asking
-Offer anonymity/confidentiality
-Use your survey to build up trust and THEN ask
-Also, if they write about something sad, leave them in a good mood by asking them to write about something happy afterward

12) Use cognitive design techniques to improve recall

-Respondents answer questions quite rapidly, spending as little time as possible deciding what answer to choose, so ask details about the thing of interest to spur recall
-But this comes at the risk of increased question length—*reserve for the most important survey questions*
-Get them back in that mindset; you may even have to ask questions they are forced to answer to put them in that mind space, rather than hoping they took the time to do this on their own (tradeoff: long, and it drains the life bar)

6) Use equal numbers of positive and negative categories for scalar questions

-Respondents draw information from the number of categories as well as from labels
-The midpoint of the categories can easily be interpreted as the neutral point
-e.g. if you have 4 positive and 1 negative, they will likely pick something positive
-This impacts the directionality of how you want them to answer

13) Provide appropriate time referents

-Shorten the time period and use a cognitive recall set of questions prior to asking for the answer
-When asking about *mundane* things, switch from asking people for a count to asking for a *general estimate* on average
-Sometimes it is okay to include "your best estimate is fine" to avoid nonresponse
-Exception: Novel + rare events
-Also: "Last week in its entirety" or "in an average week"

19) Avoid asking respondents to make unnecessary calculations

-Some will do the math, others will make an estimate
-Ask for the numerical information and reduce the burden on respondents (not percentages, but actual amounts)
-If you can do the calculation on your own, don't ask them to
-It will exhaust them, it will frustrate them, and they might get it wrong if they are doing the math in their heads

7) Distinguish undecided from neutral by placement at the end of the scale

-Sometimes attitudinal questions are posed without giving the option of a neutral opinion or no opinion at all (thus making them choose something they don't have a thought on) -Place undecided category as the last position (to separate it from neutral opinions)

Semantic differentials

-When time is of critical importance, these are great; good to put together at the last minute
-Test-retest reliability will be high, and internal consistency will be high
-Sometimes you do not need to come up with items; a semantic differential can be more than enough, and a good way to save time

2) Choose as few words as possible to pose the question

-The problem with long questions is that when people read questions they attempt to be efficient, thus giving uneven attention to each word
-Long questions/instructions will make the respondent believe it is ok to skip words
-Very few people will read every word

Put demographic questions at the end

-These do not change—it is unlikely that due to exhaustion you will answer these questions differently
-If you put these at the front, you might activate thoughts that you don't want active (priming)—"Oh, you asked me about my gender and then about baseball? I see what you're doing..."

2) To what extent do survey recipients already have an accurate, ready-made answer for the question they are being asked to report?

-They must recall information about the topic, make a judgment about the information they retrieve from memory, and then report a response
-This is influenced by different contextual considerations and processes
-The vaguer the question, the vaguer the categories, and the more remote these items are from people's experiences, the more likely a question is to produce inconsistent responses if we ask the same person to answer it at different times

14) Be sure each question is technically accurate

-This impacts the surveyor's credibility
-You would not do this intentionally, but you might be working at a job where the same survey has been used for many years and has not been updated
-People are looking for any reason not to care, so don't give them one

Closed-ended ordered questions

-This type of question is most useful when one has a well-defined concept for which an evaluative response is wanted, unencumbered by thoughts of alternative or competing ideas
-Scalar concepts (agree/disagree, favorable/unfavorable) [continuous?], but these are vague qualifiers, which means measurement error
-e.g. Likert 1-7 or -3 to +3
-Typically, questions with ordered categories consist of two basic concepts, with one presented in the stem of the question (do you agree or disagree with...) and another in the answer (strongly agree—disagree)

1) Choose simple over specialized words

-When a word exceeds 6-7 letters, chances are that a shorter and more easily understood word can be substituted, but do not automatically assume that all shorter words are acceptable (e.g. "deter" is shorter than "discourage" but less widely understood)
-Replace complex words with a string of shorter words
-Avoid jargon
-Pretest with actual respondents
-Otherwise, they will try to guess and get it wrong, they will get frustrated and end the survey, or worse, they will sabotage it

Partially closed-ended questions with unordered response categories

-Writing a partially open-ended question is a possible solution but does not lend itself well to the construction of variables for analysis
-e.g. "which of the following is your favorite sport to play?" football, basketball, tennis, baseball, other (please specify)
-Limiting choices to the first four means that some people will not be able to give the answer they want to provide; trying to list all of the alternatives means a potential quagmire of making sure to list a huge array of minor sports
-Problem: It is likely that fewer people will mention other choices if they are not listed; you cannot say "respondents to this survey are 10x more likely to prefer X"
-Solution: "Of the four sports listed below, which one do you like to play the most?"

8) Is changing a question acceptable to the survey sponsor?

-Writing questions is a difficult challenge because so many factors simultaneously influence whether a proposed question obtains accurate answers
-Questions with recognized defects cannot always be changed...sometimes a question has been used in another survey and the objective is to replicate the previous survey or make the new data comparable in some other way
-If a question is seriously flawed, ask the sponsor whether it can be changed

Avoid slang/jargon

-You can lose credibility
-Slang changes from group to group and over time
-If the goal is to increase credibility, put on the cover letter something like "We are a group of graduate students trying to blah blah blah"

Spread out your response sets

-You get more variance -Exception: Low education samples (stick to 3: Yes, No, Maybe)

Make sure distances between intervals are equal

-e.g. a 4-point scale forces people to make a choice, but the perceived distance between "disagree" and "agree" is NOT the same as between "strongly disagree" and "disagree"
-Exception: Frequencies

Principles for writing survey questions

1) Choose simple over specialized words
2) Choose as few words as possible to pose the question
3) Use complete sentences to ask questions
4) Avoid vague quantifiers when more precise estimates can be obtained
5) Avoid specificity that exceeds the respondent's potential for having an accurate, ready-made answer
6) Use equal numbers of positive and negative categories for scalar questions
7) Distinguish undecided from neutral by placement at the end of the scale
8) Avoid bias from unequal comparisons
9) State both sides of attitude scales in the question stems
10) Eliminate check-all-that-apply question formats to reduce primacy effects
11) Develop response categories that are mutually exclusive
12) Use cognitive design techniques to improve recall
13) Provide appropriate time referents
14) Be sure each question is technically accurate
15) Choose question wordings that allow essential comparisons to be made with previously collected data
16) Avoid asking respondents to say yes in order to mean no
17) Avoid double-barreled questions
18) Soften the impact of potentially objectionable questions
19) Avoid asking respondents to make unnecessary calculations

Criteria for assessing each question

1) Does the question require an answer?
2) To what extent do survey recipients already have an accurate, ready-made answer for the question they are being asked to report?
3) Can people accurately recall and report past behaviors?
4) Is the respondent willing to reveal the requested information?
5) Will the respondent feel motivated to answer each question?
6) Is the respondent's understanding of response categories likely to be influenced by more than words?
7) Is survey information being collected by more than one mode?
8) Is changing a question acceptable to the survey sponsor?

Three ways a survey question can be structured

1) Open-ended
2) Closed-ended ordered
3) Closed-ended unordered

3) Use complete sentences to ask questions

Don't attempt to minimize words by using incomplete sentences, because the sentence might be misunderstood

1) Does the question require an answer?

In order for an inquiry to constitute a survey question, it must require an answer from each person to whom the question is asked

Mental effort for closed ordered vs closed unordered

Mental effort is different for each of these types: ordered questions require envisioning a scale and figuring out where on that scale one fits, while unordered questions require comparing each discrete category with the others (usually more difficult because of the amount of information that must be processed and about which decisions must be made)

3) Can people accurately recall and report past behaviors?

People who write surveys often want respondents to provide far more detail about past behaviors than can be recalled—keep recall simple and related to recent events

