Research Methods Questions
What are threats to internal validity?
- History = an unanticipated event occurred during the experiment that affected the outcome variable
- Maturation = levels of the outcome variable changed due to normal developmental processes
- Regression = because of imperfect reliability, subjects selected on the basis of extreme scores tend to regress toward the mean on subsequent tests
- Mortality = participants drop out of (or die during) the study before it ends
- Testing = the pre-test affects scores on the post-test
- Selection = groups are not equivalent at the start of the study
- Design contamination = subjects "compare notes" across groups and the control group ends up receiving some treatment
- John Henry Effect / Resentful Demoralization = subjects in the control group know they are not receiving the intervention, causing them to purposefully over- or underperform
What is a factorial design?
- If you want to manipulate multiple factors in the same experiment, a factorial design can be used.
- A factorial design simultaneously investigates the effects of all the independent variables on the dependent variables.
- The most basic factorial design is a 2x2 design, in which there are two factors and each of them has two levels.
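A minimal sketch of how a 2x2 design crosses its factors into conditions; the factor names below are invented for illustration, not from the notes:

```python
from itertools import product

# Hypothetical 2x2 factorial design: two factors, each with two levels.
factor_a = ["small", "large"]   # e.g. font size (assumed example)
factor_b = ["list", "grid"]     # e.g. results layout (assumed example)

# Crossing the factors gives one experimental condition per combination,
# so a 2x2 design yields four conditions.
conditions = list(product(factor_a, factor_b))
print(len(conditions), "conditions:", conditions)
```

Each subject (or group) would be assigned to one of the four cells, letting you test both main effects and their interaction.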
What is quantitative coding analysis interested in?
- Interested only in content characteristics related to specific hypotheses/research questions (what and how questions)
- Codes are developed and finalized before analysis begins
- Is deductive, which means it works from the top down
- Can deal with large, randomly-selected samples
- Results are numerical, statistically manipulable, and often generalizable
What is systematic sampling?
- It is simple and ensures even coverage of the sampling frame
- Hidden periodicity can create large errors
- Possible inadvertent clustering
What is the difference between manifest and latent content?
- Manifest content exists unambiguously in the message
  - It is easily observable and countable
- Latent content is conceptual and cannot be directly observed in the messages under analysis
  - It is difficult - if not impossible - to count
What is basic research?
- Scientific community is the primary audience
- Evaluators are other research peers
- High autonomy of researcher
- Highest priority is verified truth
- Purpose is to create knowledge
- Success is indicated by publication and impact on knowledge/scientists
What are myths associated with content analysis?
- You don't have to go through IRB if you do content analysis.
  - Truth: you may need to go through IRB if the material you're analyzing has not been released for public consumption
- Content analysis is an "easy out" for a master's paper study
  - Truth: content analysis can be extremely time-consuming and rigorous. Like any other method, a CA study can be high or low quality
What are the weaknesses of quantitative content analysis?
- Better at answering "what" questions than "why" questions
- Analysis limited by availability and content of materials
- Quality of study and findings depends on the quality of the coding scheme and how consistently the scheme is applied
- Can sometimes fall short of objectivity, depending on the quality of the coding scheme
- Inherently reductive, because it condenses and reduces data
- Time-consuming
(MM4) What is concurrent triangulation strategy?
- Data collection for qual and quan happens at the same time; ideally the two are weighted equally, but in practice one is usually weighted more than the other. The data are then merged together.
What is content analysis?
- Focuses on the features of recorded information
- The systematic, objective, quantitative analysis of message characteristics
- Tests hypotheses or answers questions about a body of messages
- The application of pre-determined "codes" is what sets it apart from qualitative content analysis
What are the benefits of randomization?
- Helps to mitigate bias, which increases the internal validity of your study
- Is not haphazard; must be carried out systematically
What are structured interviews?
- Includes a set of predefined questions
- Qs asked in the same order
- Usually have a limited number of responses
- Are used to minimize the effect of the instrument and the interviewer on the research results
- More friendly to positivist paradigms b/c good for generalizing and condensing data
- Can be analyzed quantitatively
What is applied research?
- Is for people who are not researchers
- Primary audience = practitioners, participants, supervisors
- Evaluators = practitioners and supervisors
- Autonomy of researchers = low to moderate
- Relevance is the highest priority
- The purpose is to resolve a practical problem
- Success is indicated by direct application to address a specific concern/problem
What are transformative designs?
- Not unique to mixed methods
- Marked by an explicit intent to address a social issue for a marginalized or underrepresented group and to engage in research that brings about change
- Usually framed by a theory; the social/activist goal informs everything from the question to the research design to the ways the research is shared
(MM5) What is concurrent embedded strategy?
- One data collection phase; both quan and qual data are collected, but one type is "nested" or embedded within the other --> one data source supports the other data source
- The second data source may address another question or seek info at another level of analysis
What does "going native" mean?
- Originally a derogatory term associated with colonialism and notions of primitive cultures representing earlier stages of human evolution
- Discussed more frequently now as "over-rapport"
- Researcher loses objectivity and the ability to critically analyze the culture being studied
- There is debate around whether this is a problem
What are threats to external validity?
- Poor sampling = sample is not actually representative of the population
- Hawthorne Effect = participants modify their behavior because they know they are being observed
- Setting (ecological validity) = real life is messy; researchers trade environmental control for authenticity. Will people behave the same way outside of the lab?
What are the characteristics of bad qs?
- Qs that arouse inappropriate emotional responses from subjects
- Double-barreled qs --> a question that incorporates 2 qs into one
- Complex qs --> too wordy, not concise or clear
What are the weaknesses associated with mixed methods research?
- Takes a lot longer
- More costly
- Requires a lot of expertise
- Can be more difficult to express results and methods
- Relatively new method
- Page restrictions
- Difficult to summarize the entire study
When should you use a case study?
- Typically answer what, how, and why questions
- Common uses: exploring a new area of inquiry, describing an interesting phenomenon, testing a theory or model in a real-world setting or in an extreme context, or following up on another study to explain previous results
What are the strengths of content analysis?
- Unobtrusive and non-reactive
- Often easy and inexpensive to access materials
- Relatively few ethical concerns
- Especially useful for analyzing historical material and documenting trends over time
- Establishing reliability is straightforward = material can easily be made available to other researchers; high reliability is possible for manifest content, potentially less for latent content
What is a case study?
- A description of a particular situation or event. The description of the case serves as a learning tool, providing a framework for discussion.
- The case can be a person, organization, or event
- Often used in multi/mixed-methods (quan or qual) research
- Defined by richness: lots of different sources of data, thick and rich description
- Cannot extract the data away from the setting = it's about the data and the setting
- A primary defining characteristic of a case study is that it focuses on a single instance of the unit of analysis (the single person, library, etc.)
What is the quantitative paradigm?
- Believes that there is a single reality that we can take apart and accurately describe
- The knower (researcher) is separate from the known
- It is possible to make time- and context-free generalizations
- There are real causes occurring before effects, and we can identify them
- Inquiry is value-free
- Is a positivist framework
What is the qualitative paradigm?
- Believes there are multiple constructed realities that are holistic and impossible to fully describe
- The knower and the known are interactive and inseparable
- Only time- and context-bound statements are possible
- All entities are mutually and simultaneously shaping one another; it is impossible to distinguish cause and effect
- Inquiry is value-bound
- Is a naturalist framework
What is qualitative analysis interested in?
- Themes and codes are allowed to emerge from the data through the analysis process
- Inductive (bottom up) --> building from data
- Typically requires relatively small, purposively selected samples
- Results are textual descriptions, typologies, and descriptive models
What are focus groups?
- A group of individuals selected and assembled by researchers to disclose and comment on the topic of the research
- Participants are encouraged to compare their views with each other, not just report them
- Group members usually challenge each other's views, which leads to a more nuanced perspective than could be discovered in individual interviews
- Their social nature mimics the setting in which people often form their opinions and attitudes -- these can shift in real time
- Usually used in a sequential explanatory or exploratory design
- Can be a stand-alone method but is a much stronger research tool if used in combination with other methods such as in-depth individual interviews, direct observation, or surveys
What are some common errors with interviews?
- Asking qs that are too ambiguous or too abstract
- Posing questions which imply one right answer or are closed
- Framing questions that fail to allow interviewees to think through and answer
- Not giving interviewees enough time to respond
- Not listening --> leads to failure to ask follow-up qs or to ask for clarification of a response
- Introducing too much variation between interviews
What are the strengths of focus groups?
- Data can arise from individuals and from interactions btw individuals
- Saves time and sometimes money vs. individual interviews
- Less power imbalance between researcher and participants
How do you choose people for your focus group?
- Define the control characteristics of participants
- Be mindful of how ppl interact with each other
- Segmentation approach = picking a group so that it represents a relatively distinct segment of the intended audience for the product or service that is the focus
- Want to choose strangers to avoid power imbalances
What are the weaknesses of focus groups?
- Easy to get off topic
- One or two people can dominate the discussion --> "group think"
- Confidentiality/sensitive topics
- Difficult to transcribe
What is a between-subjects design?
- Examines differences between individuals or groups
- Each group is exposed to a different treatment
- Quicker for each user, but requires a larger pool of subjects to account for individual differences
What is within-subjects design?
- Examines differences in a particular variable for individual subjects
- Each group is exposed to all treatments (sometimes in different orders)
- Removes individual differences, but adds concerns about the ordering of treatments and increases the time burden per subject
What are the strengths associated with mixed methods research?
- Gives you a more complete answer to your research question
- Not limited by choice of methods
- Reaches a broader audience
- Might help you find biases/mistakes/inconsistencies
- The strengths of each type of research complement the weaknesses of the other
How do you analyze focus group data?
- Harder to transcribe b/c multiple voices
- Analysis and coding care more about agreement and disagreement
- Need to balance the views of all participants, not just the views of the talkative ones
- Compare themes that emerge within each group, and then with those that emerged from other groups
- Need to distinguish between what participants believe is important and what they believe is interesting
What are semi-structured interviews?
- More flexible
- An interview guide rather than a set of qs
- Both closed and open-ended qs
- No set order of qs; they can be brought up based on the context of the interview
- Can change the wording of qs or omit qs
- More friendly to naturalist paradigms
- Useful if qs do not have simple or brief answers --> you want respondents to give explanations/examples
What are unstructured interviews?
- No predetermined questions
- Basically informal/conversational interviews
- Questions are generated based on the participant's responses and the intentions of the researcher
- Most flexible method
- Used to elicit people's social realities
What are some criticisms of case studies?
- Not generalizable; but a case study doesn't have to be generalizable to be useful or to contribute to the conversation. It can help us push at the limits of generalizable knowledge and can also be transferable
- Cannot help researchers develop predictive theories; but concrete, context-dependent knowledge and grounded theories are arguably more valuable
- Not rigorous; but they can be, if they follow accepted protocols for ensuring trustworthiness
- Vulnerable to confirmation bias
- Cannot be easily summarized, but that isn't the point
What are some issues with interviewing as a method?
- Researcher personality and emotions can greatly impact what data is obtained
- Power dynamics, and how the researcher chooses to represent themselves, can impact the data
- The recorder can be obtrusive/intimidating
- Question choice, wording, and order can impact responses and interviewee comfort
What is simple random sampling?
- Simple; meets statistical test requirements
- Small subgroups not always represented
- Possible inadvertent clustering
What are the different ways to mix data?
1. Connecting = the two strands are connected between the data analysis of the 1st phase and the data collection of the 2nd phase
2. Integrating = merging the data types: transforming one data type into another and making comparisons, or collecting both data types at the same time and then comparing
3. Embedding = collecting one type of data and then collecting another type to support that data
What are the steps to conduct content analysis?
1. Define your research question and key terms
2. Specify the units of analysis (what will you sample; within that sample, what elements will you code?)
3. Define the population from which units will be sampled and develop a sampling technique
4. Locate your data and, if necessary, format it
5. (For deductive coding only) Develop a coding scheme and carefully define each code
6. Conduct the coding and categorizing of your data. For inductive coding, you will develop and define codes during this step
7. Summarize results
8. Make inferences/conclusions
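Steps 5-7 above can be sketched as applying a predefined coding scheme to sampled units and then summarizing code frequencies. The codes and units below are invented for illustration:

```python
from collections import Counter

# Hypothetical coding scheme defined before analysis (step 5).
coding_scheme = {"help_seeking", "complaint", "praise"}

# Each unit of analysis, hand-coded by the researcher (step 6).
coded_units = [
    {"id": 1, "codes": ["help_seeking"]},
    {"id": 2, "codes": ["complaint", "help_seeking"]},
    {"id": 3, "codes": ["praise"]},
]

# Step 7: summarize results as code frequencies.
counts = Counter(code for unit in coded_units for code in unit["codes"])
assert set(counts) <= coding_scheme   # every applied code must be defined
print(counts.most_common())
```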
What are the different types of observation methods?
1. Direct or indirect = whether the researcher observed herself or relied on the observations of others
2. Structured or unstructured = whether observation guides were used
3. Participatory or non-participatory = whether the researcher was involved in the activity
4. Global or specific = whether the researcher attempted to observe everything, or focused on a small number of things
5. Overt or covert = whether the researcher's presence and intentions were known
6. Reactive or non-reactive = whether the participants changed their behavior in response to being observed
How do you choose which sampling method to use for your research?
1. If you want to generalize the results beyond the sample --> probability sampling (quantitative research, extensive study)
2. If you don't want to generalize the results beyond the sample --> nonprobability sampling (qualitative research, intensive study)
*Unless you already know the population parameters, you can never be entirely sure that you have a representative sample
What are some types of content analysis that librarians use?
1. Log analysis = looking at catalog interactions to get data about users and collections; can help librarians answer how users get to the site and which pages are most/least popular
2. Bibliometrics = measurements about books, which can be data not about the text itself, such as author affiliation, word frequency, or subject headings; better at answering "what" questions than "why" questions; very popular; used to measure the impact of individual authors, classify a set of texts, and describe patterns of text use
3. Big data sets = large data sets that can be used to reveal patterns, trends, and associations about human behavior and interactions. The 4 Vs of big data:
   - Volume = large amount of data
   - Velocity = data that used to be received and analyzed in batches is now arriving in real time
   - Variety = data exists in hundreds of formats
   - Variability = refers to either inconsistencies in data flows or the overall spread of the data
What are the types of nonprobability sampling?
1. Purposive sampling = elements are purposefully chosen because of some characteristic; choosing extreme cases or opposite cases, choosing cases to maximize or minimize variability
2. Quota sampling = determine which characteristics are of interest and set a quota for each level of that characteristic (similar to stratified)
3. Snowball sampling = initial participants identify more participants
4. Convenience sampling = choosing elements because they are easy to access
What are the types of probability sampling?
1. Simple random sampling = generate a random number list to choose participants
2. Systematic sampling = pick a random starting point, then take every nth element. Could fail if, for example, you choose every other person and every other person is male or shares some other characteristic that limits your participants
3. Stratified sampling = population divided into strata first, then sampled; strata should have some relationship to the study concepts or goals
4. Cluster sampling = initial sampling units are groups of elements; you can then sample all the elements in each chosen group, or sample further within each group
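The four techniques above can be sketched against a toy sampling frame; the frame, strata, and cluster boundaries below are invented for illustration:

```python
import random

random.seed(42)               # reproducible, for illustration only
frame = list(range(1, 101))   # hypothetical sampling frame of 100 elements
n = 10                        # desired sample size

# 1. Simple random sampling: every element has an equal chance.
simple = random.sample(frame, n)

# 2. Systematic sampling: random start, then every kth element.
k = len(frame) // n
start = random.randrange(k)
systematic = frame[start::k]

# 3. Stratified sampling: divide the frame into strata, sample within each.
strata = {"low": frame[:50], "high": frame[50:]}
stratified = [x for s in strata.values() for x in random.sample(s, n // 2)]

# 4. Cluster sampling: randomly choose whole groups, keep all their elements.
clusters = [frame[i:i + 10] for i in range(0, 100, 10)]
chosen = random.sample(clusters, 2)
cluster_sample = [x for c in chosen for x in c]
```

Note how the periodicity risk of systematic sampling shows up directly in the `frame[start::k]` slice: if every kth element shared a characteristic, the sample would too.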
What are the four factors that influence mixed methods research?
1. Timing = does data collection happen sequentially (in phases) or concurrently (at the same time)?
2. Weighting = are you giving equal weight to both types of data, or is the qual/quan data more important?
3. Mixing = are the two types of data kept separate (one end of the continuum), fully merged (the other end), or combined in some way in between?
4. Theorizing = does a larger theoretical design guide the research?
How do you ensure the quality of content analysis research?
1. To ensure reliability/dependability --> have multiple coders, use quantified reliability measures, publish your coding instruments/manuals
2. To ensure validity/credibility --> have multiple coders, triangulate your data sources, re-use codes from other studies, support your findings with direct quotes or examples
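One common quantified reliability measure for two coders is Cohen's kappa (chance-corrected agreement); a minimal sketch, with invented example codes:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' label lists."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of units where the coders match.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label rates.
    ca, cb = Counter(coder_a), Counter(coder_b)
    pe = sum(ca[label] * cb[label] for label in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical codes assigned by two coders to six units:
a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "no", "yes", "no"]
print(round(cohens_kappa(a, b), 3))
```

Values near 1 indicate strong agreement beyond chance; values near 0 mean the coders agree no more than chance would predict.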
What are the different roles that researchers can play during observation?
1. Complete participant (participants don't know that she is observing)
2. Observer-as-participant
3. Participant-as-observer
4. Complete observer
What are the types of survey error?
1. Coverage error = when not all members of the population have a known, nonzero chance of being included, and those who are excluded are different from those included on measures of interest
2. Sampling error = surveying some but not all of the members of a population
3. Nonresponse error = when people selected for the survey who do not respond are different from those who do
Why are theories important?
1. Organize and summarize knowledge over time
2. Communication and shared vocabulary
3. Clarify what is observed; help to interpret findings and understand relationships
4. Focus attention on important variables and relationships
5. Predict outcomes
6. Research heuristic = a good theory generates research
7. Generative = challenge existing cultural life and generate new ways of living
Theories need to be useful
What are the 5 possible reasons for selecting a particular case?
1. Representative or typical case = captures the circumstances and conditions of an everyday or commonplace situation
2. Critical case = essential for testing a well-formulated theory
3. Extreme or unique case
4. Revelatory case = illuminates previously inaccessible knowledge
5. Longitudinal case = can be repeatedly studied at several different points in time
What are the cons for observation?
1. The researcher may be seen as intrusive
2. Some information is not appropriate to report
3. Quality of observations is highly dependent on the researcher's skill level
4. Difficult to gain rapport with certain participants (e.g. children)
What are the primary purposes of content analysis?
1. To make inferences about the antecedents of communication = the who and why questions, such as characteristics of authorship, or analyzing traits of individuals or cultures
2. To describe and make inferences about the characteristics of communications = the how, what, and to whom questions, such as trends in communication content (tracking language over time), relating characteristics of audiences to the messages produced for them, and analyzing techniques of persuasion
3. To make inferences about the consequences of communications = asks the "with what effect" questions --> measures readability, analyzes the flow of information, assesses responses to communications
What are characteristics of experiments?
1. Variables are manipulated -- it's about control; the idea is that all the possibilities for variation are either controlled or varied systematically for participants
2. Randomization as a means to exert control
Experiments typically try to collect evidence for a correlational or causal relationship between two or more variables
(MM3) What is sequential transformative strategy?
2 phases that build on each other, either qual or quan; the 2nd phase builds on the 1st. Weight can be given to either phase; mixing is connected. A theoretical perspective is more important in guiding the study than the methods.
What is mixed methods research? What are its benefits?
A method of research that combines both qualitative and quantitative research. Its benefits: it combines the strengths of both types of research, and it is interdisciplinary, bringing together groups of researchers; more complex work requires more complex methodologies. The researcher must be familiar with both types of research.
What are the advantages and disadvantages of non-probability sampling?
Advantages: flexible, quick/cheap, tailored to research goals, good with a small population, typically higher participation
Disadvantages: greater bias risk, no way to assess certain statistical measures, may not be possible to generalize
Nonprobability sampling
Any form of sampling that doesn't meet the criteria for probability sampling. Sometimes the goal is still representativeness, but not always
(MM6) What is concurrent transformative strategy?
Concurrent collection of data and research is guided by a theoretical perspective
What is a pre-test control group design?
Diagram: Group 1: R O X1 O / Group 2: R O X2 O
- This means that group 1 was randomized, observed, received the first intervention, and then observed again. Group 2 was randomized, then observed, then received the second intervention and observed again.
- Comparing the data from the first and second observations --> comparisons across time.
What is a post-test control group design?
Diagram: Group 1: R X1 O / Group 2: R X2 O
- No pre-test observation is done.
- Pros: avoids concerns that a pre-test will impact the participants' responses
- Cons: you don't know where participants were at baseline, so there is no way of knowing if anything changed.
(MM2) What is sequential exploratory strategy?
First collecting qual data, and then collecting quan data. The quan data builds on the qual data -- embedding. Uses the qual data to, for example, build a new research instrument. Its primary use is to explore a new phenomenon.
(MM1) What is sequential explanatory strategy?
First collecting quantitative data and then qualitative data. They inform each other --> the qualitative data helps to explain the quantitative data. Is very straightforward but takes a long time. Best for explaining and interpreting relationships
What is the difference between a fixed and flexible design?
Fixed:
- All research elements decided in advance
- More common with quantitative designs
- Elements = research questions/definitions, data sources, etc.
Flexible:
- Research design allowed to evolve throughout the study
- Everything can be revised in light of new information
- More common with qualitative designs
How do you measure trustworthiness of qualitative research?
For measuring truth value: credibility = want to see the interview guide; does the data reflect the concept? To improve truth value = conduct a pilot test (a test run), ground the work in existing theories, triangulation
For measuring applicability: transferability = can it apply to another context?
For measuring consistency: dependability = does the researcher account for changes over the course of the research? To improve consistency = conduct a pilot test, provide details of methods, collect data from multiple sources, have multiple researchers analyze
For measuring neutrality: confirmability = assumes that the researcher cannot fully distance themselves from the subject. To improve neutrality = disclose biases and assumptions (in the conclusion)
How do you measure trustworthiness of quantitative research?
For measuring truth value: internal validity = trying to make sure tools are validated; are we measuring what we think we are measuring? To improve truth value = conduct a pilot test (a test run), ground the work in existing theories, triangulation
For measuring applicability: external validity/generalizability = does it apply to the entire population? To improve applicability = use a probability sampling technique, calculate the margin of error
For measuring consistency: reliability = ratio of true variation to measured variation. To improve consistency = conduct a pilot test, improve questions, conduct statistical reliability testing
For measuring neutrality: objectivity = assumes that the researcher should be as distant from the research as possible. To improve neutrality = document what you do, disclose conflicts of interest
What is qualitative research?
Goal = to discern how humans understand, experience, interpret, and produce the social world. Emphasis is placed on rich description, the actors' point of view, context, naturalism, and cases vs. variables.
- Can use observation, surveys, or analysis of existing content
- All types of qualitative data are usually coded to identify features
What is coverage error?
Occurs when your sampling frame doesn't cover the entire population, and excluded elements differ in meaningful ways from included elements. Solution = get a better sampling frame if possible
How does pragmatism relate to mixed methods research?
Pragmatism is focused on "what works" to solve practical problems. Theories and models are judged by their fruits and consequences or their relationship to other data. It holds that there is a reality outside of the mind, but we can never fully know it. When pragmatism informs MM research, it:
- Allows researchers to focus on the research problem and use whatever approaches work best to derive knowledge about it
- Encourages attention to the "what" and "how" of research based on intended consequences (the practical goal of research)
- Emphasizes context and action, which allows for social justice, participatory/emancipatory, and activist lenses
What are the pros of using observation as your research method?
Pros:
1. The researcher gains first-hand experience and can record information as it occurs
2. Unusual aspects, or aspects that are invisible to participants, can be observed
3. Allows for exploration of sensitive topics
What is a descriptive study?
Purpose = to describe situations, events, behaviors, beliefs, attitudes, processes etc.; key concepts are already defined
What is an explanatory study?
Purpose = to explain causal relationships; key concepts are defined and there are hypotheses about their relationships
What is an exploratory study?
Purpose = to explore a new area or an existing phenomenon in a new context; to begin to define new key features of a phenomenon; paves the way for future research
What is a predictive study?
Purpose = to try and predict/prove causal relationship
What are the types of quantitative and qualitative data collected during observation?
Qual data = verbal or written descriptions, field notes, images, etc.
Quan data = anything you can count, checklist data, ratings
What is naturalist/ethnographic research?
Research that takes place in real-world settings where the researcher does not attempt to manipulate the phenomenon of interest. Not all qualitative research is naturalistic.
Intent = to provide a detailed, in-depth description of everyday life and practice (thick description); often called ethnographic studies
Ethical implications = confidentiality and anonymity, informed consent, could observe questionable behavior
What are the differences between stratified and cluster sampling?
Stratified sampling:
- Few subgroups; you want to sample from all of them
- Little variation within subgroups
- Lots of variation between subgroups
- Random selection within each subgroup
- Usually more expensive; subgrouping information can be impossible to obtain
Cluster sampling (reduces cost):
- Many subgroups; you want to leave some groups out entirely
- Lots of variation within subgroups
- Little variation between subgroups
- Random selection of subgroups (and sometimes of elements within them too)
- Cheap and quick
- High error if subgroups are different from each other
- Statistical analysis is more difficult
What are think-alouds?
The goal of think-alouds is to capture the subject's cognitive processes via verbal reports of their thoughts during experiments, work processes, information-seeking occurrences, etc.
What is triangulation?
The term triangulation refers to the practice of using multiple sources of data or multiple approaches to analyzing data to enhance the credibility of a research study
What is sampling error?
This estimates the uncertainty resulting from the fact that you haven't sampled the entire population. It can be calculated based on the variation in your sample and sample size. Solutions = good sampling design and large samples
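The sampling-error calculation mentioned above can be sketched as the standard margin of error for a sample proportion; the survey numbers below are invented for illustration:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of error for a sample proportion.
    p = observed proportion, n = sample size,
    z = critical value (1.96 for a 95% confidence level)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical survey: 400 respondents, 50% answered "yes".
moe = margin_of_error(0.5, 400)
print(f"margin of error: +/-{moe:.3f}")
```

This shows both points from the note: the error depends only on the sample's variation (`p`) and size (`n`), and larger samples shrink it (quadrupling `n` halves the margin).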
What are the different types of experiments?
True experiment = there is a control or comparison group, or multiple measures; participants are randomly assigned
Quasi-experiment = there is a control or comparison group, or multiple measures; participants are not randomly assigned
Pre-experiment = there is no control or comparison group and no multiple measures
Paradigm
broad, foundational assumptions shared by researchers in the field
What are the benefits of concurrent data collection?
can code data and themes from qual and then count how many times it appears in the texts; could examine at multiple levels
What are the benefits of sequential data collection?
can look at outlying data; could use common themes in data from one phase to create an instrument for another phase
Theory
Describes, explains, predicts, or analyzes reality. Includes concepts and regularities in the relationships among concepts
What are probing qs?
Employed to ask subjects to elaborate on their answers to a given q
Probability Sampling
Every individual in the population has a known, nonzero chance of being sampled; selection is random; and you have to have a sampling frame
What is the critical incident technique?
looking at a single event, useful for collecting data about each event or behaviors that happen infrequently and so cannot be observed directly
How does internal validity relate to experiments?
- Internal validity for experiments = can we conclude that changes in one variable caused the observed changes in the outcome variable?
- High internal validity = strong evidence
- Low internal validity = weak or no evidence
What are essential questions?
Qs that address the central focus of the research
What are contrast questions?
qs that allow interviewees to discuss the meanings of situations and make comparisons across different situations
What are structural questions?
qs that attempt to find out how interviewees organize their knowledge
What are extra qs?
Qs that can be considered equivalent to certain essential qs but use different wording; in closed questionnaires they are used to gauge the reliability of responses
What are descriptive questions?
questions that allow interviewees to provide descriptions about their activities
Model
Simplifies reality.
Metatheory --> theory --> model
What is time-line interviewing?
The micro-moment timeline interviewing technique: participants describe an event and establish a timeline; the researcher then asks qs about each event in the timeline
What are the data types collected during think alouds?
Transcription of subject/researcher dialogue, video recording or screen capture, researcher notes, eye-tracking or mouse-tracking data, collateral or complementary methods
What is the sampling method for case studies?
typically non-probability sampling, if a case is chosen because of its theoretical dimensions, case selection is called theoretical sampling
What are throwaway qs?
used to develop rapport at beginning of interview, adjust pace, or switch up focus through the interview