Health Research Methods
Steps in Using Existing Instruments:
1. Identify measurement instruments 2. Obtain a copy 3. Determine whether it is the right instrument 4. Complete the final steps
4 Necessary components of questionnaire design and construction:
1. Planning the pre-questionnaire 2. Drafting the questionnaire 3. Preparing the final questionnaire 4. Pretesting
The general steps for survey research (5):
1. a clearly delineated research problem 2. appropriate questions asked of respondents to gain information 3. a well-systematized data collection technique 4. generation of group-level statistics 5. results that are generalizable to the larger population
Criteria for sampling procedures:
1. clearly specifying the population 2. explicitly stating the unit of analysis 3. specifying a method to determine sample size 4. giving a detailed description of selection procedures
Telephone survey advantages and disadvantages:
Advantages: faster than mail or interview techniques, can cover a wide geographic area, less expensive than interviews Disadvantages: more expensive than mail surveys, respondents can simply hang up, usually limited to 15 minutes or less
Mail survey advantages and disadvantages:
Advantages: inexpensive, no interviewer bias, assurance of anonymity Disadvantages: low response rate, likelihood of unanswered questions, wait time for returns
E-mail survey advantages and disadvantages:
Advantages: low cost, fast turnaround time, wide geographic coverage Disadvantages: many respondents may simply delete the email, the survey may be answered by or forwarded to someone else, requires an up-to-date list of email addresses
Web survey advantages and disadvantages:
Advantages: low cost, very fast, anonymity can be assured Disadvantages: longer surveys have lower completion rates, restricted to internet users, same person may answer more than once
Interview survey advantages and disadvantages:
Advantages: personalization, higher response rate than mail questionnaires, ability to observe verbal and nonverbal behavior Disadvantages: lack of anonymity, high cost in time and money, openness to interviewer bias
Instrumentation
All measurement instruments used in a study
Semantic Differential Questions
Ask a question and then present a series of polar-opposite word pairs; the respondent indicates a position along the spectrum between them (e.g., -2 to +2, strong to weak)
Variable
Characteristic or attribute being measured
Semistructured Interviews
Contain a core of structured questions from which the interviewer may move in related directions for in-depth probing
Analytical Designs
Cross-sectional, case-control, or prospective. Addresses the relationship of the variable in question to other variables; implies more control; more sophisticated
Descriptive Designs
Cross-sectional, longitudinal, or group comparison. Describes what is; less sophisticated; uses basic calculations (frequencies and means)
Fairness
Cultural sensitivity and cultural competence; there are few or no ways to measure fairness
Define terms that could easily be misinterpreted (T or F)
T; terms that could easily be misinterpreted should be defined
Target double-barreled questions (two questions in one) (T or F)
F Avoid them
Use adjectives that fail to have an agreed-upon meaning (T or F)
F; do not use them unless their meaning is agreed upon
There should be ambiguity in the questions (T or F)
F; there should be NO ambiguity (uncertainty of meaning)
Phrase questions to be comprehended by some of those in the target population (T or F)
F comprehended by all
Never request information needed for subsequent questions first (T or F)
F do it first
Make sure questions are leading questions (T or F)
F do not want questions to be leading
More difficult questions should be ahead of simpler ones (T or F)
F; the opposite: simpler questions should come before more difficult ones
Put sensitive questions as well as open-ended ones near the beginning of the questionnaire (T or F)
F put them at the end
Do not underline or boldface a word even if special emphasis is demanded (T or F)
F; underline or boldface words that require special emphasis
Psychometric Qualities
Fairness (hardest to establish), validity, and reliability of the instruments you are using; important to internal validity
T or F: Sampling is not important to the quality of your study
False; sampling is important
T or F: Contemporary use of the word "random" is random
False; the contemporary, everyday use of "random" is not statistically random
Reliability
Measurement produces consistent results over time
Validity
Measures what it purports to measure
Do perfect studies exist?
No
How long should a focus group session last?
No longer than an hour
Steps in planning a survey study (15):
Plan the study, determine the overall design, choose the method of data collection, develop the data analysis plan, select the sample, develop the questions/instruments, pretest the survey, revise the survey, administer the survey, code the data, verify the data, enter the data, tabulate/calculate, analyze, and report the results
Measurement
Process of assigning numbers or labels according to a particular set of rules
Unstructured Interviews
Reserved for obtaining information that is very personal and/or potentially threatening
Avoid establishing a response set (T or F)
T
Be careful of double negatives (T or F)
T
Place questions in a logical order when possible (T or F)
T
Separate reliability-check question pairs (T or F)
T
Vary questions by length and type (T or F)
T
Watch for inadequate alternatives to a question (T or F)
T
Semantic Differential Scaling
Three elements: (1) the attitudinal concept to be measured, (2) a pair of opposite adjectives, (3) a series of undefined scale positions, e.g., good ----|----|----|----|---- bad
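For illustration, a minimal Python sketch of how responses to one semantic differential item might be coded numerically; the good/bad adjective pair and the +2 to -2 coding are assumptions for the example, not part of the notes above.

```python
# Minimal sketch (assumed coding): scoring one semantic differential item.
# Left-to-right positions between "good" and "bad" are coded +2 .. -2,
# so higher scores lean toward the favorable adjective.

ADJECTIVE_PAIR = ("good", "bad")       # the pair of opposite adjectives
SCALE_SCORES = [2, 1, 0, -1, -2]       # five undefined scale positions

def score_position(position_index: int) -> int:
    """Return the score for a marked position (0 = leftmost, 4 = rightmost)."""
    if not 0 <= position_index < len(SCALE_SCORES):
        raise ValueError("position_index must be between 0 and 4")
    return SCALE_SCORES[position_index]

# A respondent marks the second position from the "good" end:
print(score_position(1))   # 1 (leaning toward "good")
```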
Instrument
Tool used to measure a variable
List from best to worst sampling: Quasi Randomized, True Randomization, Convenience
True Randomization, Quasi Randomization, Convenience Sampling
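A minimal Python sketch contrasting true randomization with convenience sampling; the population list and sample size are hypothetical.

```python
# Minimal sketch (hypothetical data): true random vs. convenience sampling.
import random

population = [f"person_{i}" for i in range(1000)]   # assumed sampling frame

# True randomization: every member of the population has an equal chance
# of being selected.
random_sample = random.sample(population, k=50)

# Convenience sampling: take whoever is easiest to reach (here, simply the
# first 50 names on the list), which invites selection bias.
convenience_sample = population[:50]

print(len(random_sample), len(convenience_sample))   # 50 50
```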
Response Rate
The proportion of distributed surveys that are completed and returned; a higher response rate is desired
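As a worked example, a short sketch computing a response rate as completed returns divided by surveys distributed; the numbers are hypothetical.

```python
# Minimal sketch (hypothetical numbers): computing a survey response rate.
surveys_distributed = 400
usable_returns = 260

response_rate = usable_returns / surveys_distributed
print(f"Response rate: {response_rate:.1%}")   # Response rate: 65.0%
```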
Cover Letter 9 Elements
Who is conducting the study, the reason for the study, why it is important for the respondent to complete the survey, assurance that there are no right or wrong answers, assurance of confidentiality, assurance of anonymity, the length of time it will take, the date of return, and how to obtain the results
How do you choose the right methodology?
You use the methodology that your research questions dictate
Structured Interviews
a well-defined pattern is followed, similar to a questionnaire
Cross-sectional
at one point in time
The number of participants per focus group should be:
between 8 and 12
Response bias
can happen when the respondent deliberately falsifies their answers
Nominal Data
categorical, no ranking
Ordinal Data
categorical, ranked
Generally, questionnaire forms are:
closed, open, or a combination of the two
Interval
continuous, scaled
Ratio
continuous, scaled with absolute zero
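To tie together the four levels of measurement defined above (nominal, ordinal, interval, ratio), a minimal Python sketch listing an example variable for each level; the example variables are assumptions for illustration.

```python
# Minimal sketch (hypothetical examples): the four levels of measurement.
measurement_levels = {
    "nominal":  "blood type (A, B, AB, O) - categorical, no ranking",
    "ordinal":  "pain rating (mild < moderate < severe) - categorical, ranked",
    "interval": "temperature in Celsius - continuous, scaled, no true zero",
    "ratio":    "weight in kilograms - continuous, scaled, absolute zero",
}

for level, example in measurement_levels.items():
    print(f"{level:8s} -> {example}")
```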
Research designs falls into two broad categories:
descriptive and analytical
Multiple-choice questions
each potential answer is listed
Sentence completion questions
ex. I feel that they should _________
Branching questions
ex. If yes, then...
Recall order bias
occurs when a respondent primarily checks the response that easily comes to mind rather than giving it due consideration
Open-ended questions
placed at or near the end of the questionnaire
Recall bias
prior experience may influence a response
Restricted, or closed, form
provides fixed-alternative questions that can be answered by a simple "yes" or "no" or by checking an appropriate box. Includes dichotomous, multiple-choice, item-ranking, Likert, and semantic differential questions
Group comparison
simply compares groups on the issue; emphasizes what characteristics the groups possess
Likert Questions
strongly agree to strongly disagree
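A minimal sketch of one common way to code Likert responses numerically (strongly agree = 5 down to strongly disagree = 1); the coding scheme and the responses are assumptions for illustration.

```python
# Minimal sketch (assumed 5-point coding): scoring Likert responses.
LIKERT_SCORES = {
    "strongly agree": 5,
    "agree": 4,
    "neutral": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

responses = ["agree", "strongly agree", "neutral", "disagree"]   # hypothetical data
scores = [LIKERT_SCORES[r] for r in responses]
print(sum(scores) / len(scores))   # 3.5 (mean item score)
```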
Longitudinal
takes place over a period of time
Dichotomous questions
the answer comprises two parts, one of which is to be selected by the respondent (e.g., male or female). Usually categorical (nominal)
Rating questions
the respondent indicates a particular view about the psychological object (ex. very important, somewhat important, not important)
Response set bias
the respondent repeatedly answers all the question with the same response
Ranking questions
the respondents simply place the given answers in rank order (e.g., greatest problem = 1, smallest problem = 5)
Unrestricted, or open, form
the response categories are not specified, and the respondent is allowed to answer in his or her own words. Usually continuous (interval/ratio)
Flow plan
used to outline the design and subsequent implementation of a survey