NURS324

Accessible Population vs Target Population

*Target Population*: the entire population that the study is meant to generalize to/represent (ie all of Canada). *Accessible Population*: cases from the target population that the researcher has access to (ie the patients who live in the same city as the researcher)

Topic Coding vs Analytic Coding

*Topic Coding,* aka Substantive Codes: organized by substance of the topic being studied (ie smoking, Alzheimer's, mental health) *Analytic Coding,* aka Theoretical Codes: coded on how the substantive codes relate to each other (ie processes, types, techniques, turning points/boundaries, causes, consequences etc)

Editing Analysis (quizbit)

- A qualitative style of analysis where researchers interpret the text to find meaningful segments - a *category scheme* is then created, and data is organized into the categories - then search for relationships/patterns that connect the categories - used in grounded theory, phenomenology, hermeneutics, or ethnomethodology

Cultural Safety

- Addresses the inherent POWER imbalances between pt and HCP - addresses colonialism - aims to ensure clients do not feel inferior or helpless

Barriers to RU

- Communication difficulties - hard to locate relevant studies - poorly presented studies, with few implications for practical use - cost and time constraints - negative attitudes/resistance towards research - little time to read and implement research

Implementing Practice Change

- Consider "Characteristics" of EBP: is it too complex? is it accessible? is it expensive? - Consider Communication: how do we disseminate it? How do we mandate it gets followed? Who does who report to? --> Social system --> Auditing and feedback

2 Methods to improve dependability

- Inquiry Audits (external hired person who scrutinizes documents & data trail) - Stepwise Replication (splitting the research team in half to conduct the same experiment concurrently but separately)

Conceptual Model vs Theory?

- Made up of CONCEPTS (just like Theories), but the concepts are NOT linked/related in a logical system (remember: a theory is made up of CONCEPTS, and shows the relationships between them in a logical system)

Qualitative Study Elements

- Purposeful sampling (specific inclusion/exclusion criteria- finding participants that have experience in what we're studying) - Naturalistic Setting: going into the field - Data Saturation (continuing to recruit participants and collect data until no new info is coming up) - Data (Thematic) Analysis: ie open coding, highlighting transcripts of individual's life experience

One-group pretest-posttest design

- Quasiexperimental design except there is NO separate control group ("within subjects": participants are their own controls- what I used for the CBD study)

multiple correlation coefficient (R) (key quizbit)

- R, the statistic from a multiple regression analysis - R CANNOT be negative, only 0 to 1 - shows the *strength* of the relationship between multiple independent variables & a dependent variable, but NOT direction
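For a concrete sense of how R is obtained, here is a minimal sketch in Python/NumPy. The data, variable names, and predictors are made up for illustration; R is taken as the square root of R² from an ordinary least squares multiple regression.

```python
import numpy as np

# Hypothetical data: two independent variables (age, dose) and one dependent variable (pain score)
X = np.array([[25, 1.0], [40, 2.0], [55, 2.5], [60, 3.0], [35, 1.5], [50, 2.0]])
y = np.array([3.0, 5.0, 6.5, 7.0, 4.0, 5.5])

# Fit an ordinary least squares multiple regression (with an intercept column)
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# R^2 = 1 - SS_residual / SS_total; R is its (non-negative) square root
y_hat = X1 @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
R = np.sqrt(1 - ss_res / ss_tot)
print(f"multiple correlation coefficient R = {R:.2f}")  # always between 0 and 1
```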

3 ways to protect participants rights

- Risk benefit assessment - implementation of informed consent procedures - confidentiality measures

Symbolic Interactionism

- Symbolic Interactionism: an important tenet of grounded theory - we interact with each other within social, cultural, historical contexts etc

Coding: Synthesis of Themes and Categories

- Themes are very descriptive: can be verbs, adverbs, adjectives - why, how, in what way, by what means

Critiquing Statistics key quizbit helpful for assignment

- What descriptive statistics are used - Are these appropriate to the level of measurement? (see slides)

Hermeneutic Circle

- a Heideggerian method to analyze phenomenology - The 'circle' represents continual movement between the parts and the whole of the text while being analyzed - understand that you cannot separate yourself as researcher from the meaning of the text

Spradley's method (4 steps)

- a data analysis method for *Ethnographic* data 1. Domain Analysis - identifying domains (units of cultural knowledge, encompass other subcategories within) 2. Taxonomic Analysis - decide how many/which domains to study in depth - taxonomy = system of organizing & studying relationships btwn subcategories within a domain 3. Componential Analysis - look for similarities and differences between cultural subcategories in domain 4. Theme Analysis - domains are connected by 'cultural themes', holistic view of the culture developed

Strauss & Corbin Method

- a different coding method for Grounded Theory from Glaser & Strauss's - uses... 1. Open Coding (data broken into pieces) 2. *AXIAL CODING* (pieces are put into categories) 3. Selective Coding (integrating the categories around a core concept & refining them)

Framework

- a general orientation to understanding a phenomenon, useful to understand factors of interest - the contextual background of a study- often implicit and not fully explained, they assume you know the basics if you're researching. ie framework that outlines the necessary factors in good social support

Hypothesis Testing

- a process of disproving or rejecting the "null" hypothesis (remember the null hypothesis states that the research hypothesis/prediction is false, that there is no relationship between variables) - by rejecting the null, you support your research hypothesis!

convergent vs divergent validity (quizbit)

2 constructs have *Convergent Validity* when the constructs we believe are related actually ARE, in fact, related (ie we predict that people who volunteer really do have higher empathy, and we're right.) 2 constructs have *Divergent Validity* when the constructs we believe are unrelated are actually, in fact, unrelated (ie we predict that there is no correlation between having blond hair and IQ, and we are right).

theories

- a set of interrelated concepts to explain/predict phenomena - systematically explains the relationships between phenomena - systematic explanations of some aspect of the world - theories summarize existing knowledge into a coherent system, make research findings more meaningful - a theory is made up of CONCEPTS, and shows the relationships between them in a logical system ie theories explain the factors in detail which make up good social support

Bracketing

- a strategy used by qualitative researchers, in descriptive phenomenology (but NOT interpretive) - the act of setting aside personal interpretations to avoid bias/preconceptions - reflexive journalling helps - impartiality cannot fully be achieved

Immersion/crystallization analysis

- a style of analysis in which the researcher totally immerses in and reflects on the text - used usually in personal case reports, narrative literature

Contingency Table

- a table showing frequency, where the frequencies of two variables are *cross tabulated* (basically just meaning calculated in this table form) - i.e. two variables: gender and hand dominance, compared to see which group has more frequent R handers etc
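As an illustration (not from the course), a cross tabulation like this can be produced with pandas; the gender and hand-dominance data below are made up.

```python
import pandas as pd

# Hypothetical raw data for 8 participants
df = pd.DataFrame({
    "gender": ["F", "F", "M", "M", "F", "M", "F", "M"],
    "hand":   ["R", "R", "R", "L", "L", "R", "R", "R"],
})

# Cross-tabulate the two variables: each cell is a frequency count
table = pd.crosstab(df["gender"], df["hand"], margins=True)
print(table)
```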

Frequency Distribution

- a visual method of organizing raw data into a graph or chart - the numeric values in the dataset are ordered from lowest to highest, with the count/percentage of times each value was obtained
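A minimal sketch of building a frequency distribution in Python, assuming made-up pain scores:

```python
from collections import Counter

# Hypothetical raw data: pain scores (0-10) from 12 participants
scores = [3, 5, 5, 6, 3, 7, 5, 4, 6, 5, 3, 6]

# Order the values from lowest to highest and report each value's frequency and percentage
counts = Counter(scores)
n = len(scores)
for value in sorted(counts):
    freq = counts[value]
    print(f"score {value}: n = {freq}, {100 * freq / n:.1f}%")
```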

Husserl's phenomenology ("Descriptive Phenomenology")

- phenomenology that is eidetic (vivid, realistic) and descriptive: - studying lived experiences without losing the objective, relatable, and shareable part of experiences - "what do we know as persons" - not as subjective - there is still some understanding of solid reality

Solomon Four-Group Design

- pretest-posttest design with 4 groups: a) pretest, posttest (control) b) posttest only (control) c) treatment, posttest d) pretest, treatment, posttest - uses two treatment groups and two control groups - this lets researchers detect any bias from 'retaking' the test twice (pre and post)

Case Study Method

- purposeful sample selection (can be an individual or multiple "cases") - simultaneous data analysis cycles ("iterative process") - result: Thematic case-specific narratives (vignettes, highlights, descriptive)

Content Analysis

- quaLitative analysis method, not a specific approach or research tradition - organizing narrative data to identify prominent themes & patterns that come up in conversation

Issues with Qualitative Research

- researcher is an outsider coming into a personal/naturalistic space - Emergent Design (the need to shift focus/change methods continuously/during the study) - Researcher-Participant relationship (personal, can start to use researchers as therapists) - Reflexivity (we must recognize our own biases!)

Informed Consent Procedures

- signing a consent form - *process consent*: consent may need to be continually renegotiated as the study evolves (often in qualitative studies).

After-only nonequivalent control group design

- similar to the standard quasiexperiment, except participants are NOT randomly assigned (nonequivalent), and assessment is given only AFTER the intervention (posttest only) - exactly what the name sounds like, just remember nonequivalent means nonrandomized

What is *stability* and how do we test it?

- stability is the extent that an instrument yields the same value/results every time it's used on the SAME item - tested using *TEST-RETEST PROCEDURES* (measuring multiple times on the same samples/people over time) - stability contributes to a study's reliability coefficient

Historical Research

- studies PAST phenomena to guide our present & future (use old info to gain new info) - uses primary and secondary data sources - historical research is NOT the same as a review of literature about historical events - high concern for validity and reliability (authenticity, understanding the data within the context of the era) - can mesh quantitative AND qualitative research

Ways to use theories

- test a hypothesis based on one theory - test two competing theories in one study (quantitative) - find a problem and develop a study, THEN pick a framework (less strong study)

2 tests to ensure stability

- test-retest - parallel reliability!

Kuder-Richardson coefficient

- tests reliability, specifically homogeneity - used with *dichotomous items;* all items are compared simultaneously (typically computed with software) - aka KR-20
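A minimal sketch of the KR-20 calculation in Python/NumPy, using a made-up matrix of dichotomous (0/1) item responses; the formula assumed here is KR-20 = k/(k-1) x (1 - sum(p·q) / variance of total scores).

```python
import numpy as np

# Hypothetical dichotomous responses (1 = yes/correct, 0 = no/incorrect);
# rows = participants, columns = items
items = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 1],
    [0, 0, 0, 1, 0],
])

k = items.shape[1]                         # number of items
p = items.mean(axis=0)                     # proportion answering "1" on each item
q = 1 - p
total_var = items.sum(axis=1).var(ddof=1)  # variance of participants' total scores

kr20 = (k / (k - 1)) * (1 - np.sum(p * q) / total_var)
print(f"KR-20 = {kr20:.2f}")
```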

Item to total correlation

- the correlation between a particular item and the sum/average of all the other items on the scale - comparing the individual participant's score, to the group's average score without that individual - reliable items have strong correlations with the total score = indicates homogeneity & reliability
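A minimal sketch of corrected item-to-total correlations in Python/NumPy, assuming made-up Likert-type responses (each item is correlated with the sum of the remaining items):

```python
import numpy as np

# Hypothetical scale responses: rows = participants, columns = items
items = np.array([
    [4, 5, 4, 3],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

# Correlate each item with the sum of the REMAINING items (corrected item-total correlation)
for i in range(items.shape[1]):
    item = items[:, i]
    rest_total = items.sum(axis=1) - item
    r = np.corrcoef(item, rest_total)[0, 1]
    print(f"item {i + 1}: item-to-total r = {r:.2f}")
```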

Reliability

- the degree of accurate & consistent results over repetition - ie a thermometer measuring the same temperature gives the same reading repeatedly, or results stay the same with a different sample of participants - a highly reliable instrument to collect data has a lower error of measurement (the obtained scores are close to the true scores)

What is *equivalence* and how do we test it?

- the extent that researchers/observers rate behaviours similarly - measured by estimating *Interrater/interobserver reliability* (get 2+ equally trained observers to make independent observations and see how much their resulting data differs)

Credibility

- the extent that the *research methods* engender confidence in the truth of the data - the believability of the data - A credible qualitative study contributes to its trustworthiness.

Ecological Psychology

- the study of *environmental influence on human behaviour* - ie the multifactorial, bidirectional influences individuals have on their family, friends, neighborhood, politics, society etc and vice versa. (a psychological qualitative research tradition)

Ethnomethodology

- the study of how people make sense of their everyday surroundings, making assumptions and filling in gaps, so that they can behave in socially acceptable ways - studying social group's NORMS and ingrained assumptions that guide our behaviour, even tho they are taken for granted/not thought about - (a sociological qualitative research tradition)

Triangulation

- the use of multiple sources to draw conclusions. - "overcomes the intrinsic bias from single method/observer/theory studies" - The use of triangulation contributes to a study's credibility - can be in both qualitative and quantitative

Positivism Paradigm

- there is only one reality out there: nature is objective and reality exists outside of human observation, not a creation of the human mind. - *Determinism:* phenomena (like illness) are not random- they have causes. - knowledge can be measured - emphasizes the rational and scientific - uses mostly quantitative methods

Template analysis

- type of quaLitative analysis that includes developing a guide/'template' to sort narrative data - template may be revised throughout study as needed - common in ethnography, ethology, discourse analysis, or ethnoscience.

Critiquing within a Systematic Review

- use a FORMAL INSTRUMENT to evaluate the quality of each study included in the review - ie using a scoring tool (eg. "Primary Research Appraisal Tool") - this allows studies 'scores' to be easily compared

2 threats to statistical conclusion validity

- weak treatment/intervention - Low statistical power

How do we estimate a Beta Level

- An acceptable maximum beta (probability of a Type II error) is 0.2, ie 20/100 cases. - Beta is estimated through a "POWER ANALYSIS" (see earlier flashcard) - An acceptable minimum "power" is 0.8
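As a rough illustration of how a power analysis links alpha, power (1 - beta), and sample size, here is a minimal sketch assuming Python's statsmodels library and a hypothetical medium effect size of 0.5 for an independent-groups t-test (none of these numbers come from the course):

```python
# Minimal power-analysis sketch (assumes statsmodels is installed)
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# With alpha = 0.05 and desired power = 0.80 (so beta = 0.20),
# solve for the sample size needed per group.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"participants needed per group: {n_per_group:.0f}")
```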

Cronbach's Alpha

- An indicator of homogeneity/IC, reliability - assessed by examining the average correlation of each item and how it *relates* to all the other items on the scale - all items are simultaneously *compared* with one another - a 'score' is generated; ≥0.70 provides sufficient evidence of the HOMOGENEITY of the instrument
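A minimal sketch of Cronbach's alpha in Python/NumPy, using a made-up matrix of scale responses; the formula assumed here is alpha = k/(k-1) x (1 - sum of item variances / variance of total scores).

```python
import numpy as np

# Hypothetical scale data: rows = participants, columns = items
items = np.array([
    [4, 5, 4, 4],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
    [4, 4, 5, 4],
], dtype=float)

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1)        # variance of each individual item
total_var = items.sum(axis=1).var(ddof=1)    # variance of participants' total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")     # values >= 0.70 are usually considered acceptable
```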

Where to find background info?

- BMJ Best Practice - Dynamed - Omni (QU library tool) - Joanna Briggs databases - Cochrane Lib - Medline, CINAHL, PsycINFO

Techniques to improve credibility of qualitative data

- Prolonged engagement - persistent observation - peer debriefings - member checks

Discourse Analysis

- Study of human communication/language - meaning, rules, mechanisms, & structures of sentences - data is usually transcripts of conversations

Crossover Design

- subjects are exposed to 2+ conditions in random order - (ie participants are assigned both one week of a diet, then one week of exercise) - subjects serve as their own control, which means the groups have very high equivalence! - but this risks Carryover Effects - cannot be used for something like drugs, bc of drug interactions or long-acting drugs. - aka "repeated measures" design

Intuiting

- the second step of descriptive phenomenology - when researchers remain open to the meanings attributed to the phenomenon by those who have experienced it

Mixed Methods Research

- uses both quantitative and qualitative techniques - integrates their data - frames these procedures with theory and philosophy

Barriers to research utilization and EBP

- weak & unreplicated studies - nurses aren't well trained in research - organizational support lacking - lacking communication & collaboration between clinical practitioners and researchers

Whats stronger relationship: Correlation Coefficient of -0.9 or +0.2?

-0.9 is the stronger relationship (a strong inverse relationship). The larger the absolute value, the stronger the relationship; the sign only shows direction.

Nonequivalent Control Group Before-After Design

- a type of quasiexperiment -*identical to a regular before/after design, except participants are not randomized to their groups*, and are therefore "nonequivalent" - collects data before and after the intervention is given, but only one group is given the intervention - collection of data before the intervention is given provides some assessment/judgment of if the groups are initially equivalent

Why do we use mixed methods research?

- more complete results - to explain initial results - to explore before administering - to enhance experimental studies with qualitative methods - to involve participants - to develop, implement, and evaluate

Preexperimental Design

- no control group to compare - no randomization - no attempt to minimize/control extraneous variables

Control groups can be....

- no intervention - a different intervention - standard care - a placebo - different doses - a wait list

Research Ethics Boards

- require at least 5 members - require diversity of academia vs community, gender, and interprofessional representation (law, science, philosophy) - don't review scientific merit - may give feedback/suggestions - you must be granted ethics approval BEFORE research begins

Glaser & Strauss's Grounded Theory: CODING SUMMARY

Coding: organizes data into patterns/concepts *Substantive Codes*: a) Open Coding --> Level I --> Level II --> Level III b) Selective Coding --> Core Category ------> Basic Social Process *Theoretical Codes* coded on how the substantive codes relate to each other

Chapter 15

Data Analysis

Reliability (in a quaNtitative study) is roughly equatable/equivalent to __________ in a quaLitative study.

Dependability (the stability of qualitative data over time & conditions)

Experimental vs Nonexperimental Studies

Experimental: researchers actively introduce a treatment/give an intervention Nonexperimental: researchers just observe existing phenomena/behaviour, no interventions given. - both refer to *quanTitative studies*

TCPS2

Modern research guidelines (2002-2018) "TriCouncil Policy Statement" Involved multiple councils of arts, science, humanity - encompasses many policies from the previous codes - provides a "CORE" Tutorial (course on research ethics)

purposive, snowball, convenience, and quota sampling are random or nonrandom?

NONrandom- "NONPROBABILITY" sampling

Directional vs Nondirectional Hypothesis

Nondirectional: predicts the existence of a relationship, but not its direction - ie "there will be a difference in fatigue between parents of 1, 5, and 10 year old children" Directional: predicts the direction - ie "mothers of younger children will have higher levels of fatigue"

Whats better: prospective or retrospective studies?

PROSPECTIVE! - no chicken/egg debate of variables

ProQol-5

Professional Quality of Life Scale - measures compassion fatigue, burnout, and compassion satisfaction

parametric vs nonparametric tests (key quizbit)

Parametric Test: i) focuses on population parameters ii) require measurements on an *interval or ratio* scale iii) they involve assuming *normal distribution shape* - stronger, more flexible, and more preferred Nonparametric Test: i) do not estimate population parameters ii) measurements can be on an *ordinal or nominal* scale iii) less restrictive about assumptions of distribution shape - useful only if data cannot be measured on interval or ratio scale, or if data is very skewed/not a normal distribution

descriptive vs. inferential statistics

Descriptive Statistics: - merely describe & calculate data, listing and summarizing data in a presentable, efficient way Inferential Statistics: - used to draw inferences about the population from sample data, to interpret data/draw conclusions - based on laws of probability - offer a framework to decide if the sampling error is too high to provide reliable estimates of the population

2 Types of Evaluation of EBP

1. PROCESS Evaluation - during the implementation 2. OUTCOME/Summative Evaluation - after, were the goals met

3 key features of an "experiment"

1. manipulation of the independent variable by introducing a treatment or intervention 2. involves comparing an experimental group to a control/comparison group 3. requires *randomization* (where subjects are allotted to groups at random)

Narrative

goal: to tell someone else's life story - data is usually in-depth interviews

comparative analysis is used in what kind of study?

grounded theory

Key Informants

guides and interpreters of a culture in ethnography, help with decisions on who and what to sample

Research that uses preexisting records?

historical, meta analyses, secondary analyses

Sources of knowledge

in order from least to most reliable: 1. tradition, authority 2. clinical experience/intuition - contains bias 3. trial and error - good if situation is low risk 4. logic/reason - inductive & deductive 5. assembled information - ie benchmarking data or quality improvement and risk data 6. disciplined research

Psychometric Properties

include reliability and validity

lab experiments have good _______ternal validity, but not _______ternal validity.

lab: - total control of settings - increases internal validity but compromises external validity

External validity

the generalizability of study results to other samples and situations can it be applied to a larger population? Threatened by: - Selection Effects - Measurement Effects - Reactive Effects

Peer Debriefings vs Member Checks

Peer debriefings: researchers get feedback from PEERS (ie other researchers, academic acquaintances) Member checks: researchers get feedback from informants/PARTICIPANTS - these both contribute to a qualitative study's credibility

Quantitative vs. Qualitative Research

Quantitative: numbers/statistics Qualitative: descriptions from "being in the field" (observations).

Construct Validity

Recall: "How well the instrument is measuring the construct of interest" - more difficult with more abstract concepts (ie empathy) - requires theoretical, less empirical, personal judgement. - can kinda measure construct validity using the *known groups technique* or *factor analysis*

Shared Theories

a borrowed theory that has been proven to have appropriate relevance to nursing

Covariate

a confounding/extraneous variable, which is removed/controlled for in an ANCOVA analysis

Correlation Coefficient

a numerical value that describes the intensity and direction of a relationship, on a scale of -1 to 0 to +1
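A minimal sketch using SciPy with made-up paired data (hours of exercise vs. resting heart rate); the negative r it returns indicates an inverse relationship.

```python
from scipy.stats import pearsonr

# Hypothetical paired measurements for 7 participants
exercise = [0, 1, 2, 3, 4, 5, 6]        # hours of exercise per week
resting_hr = [80, 78, 75, 72, 70, 66, 63]  # resting heart rate (bpm)

r, p_value = pearsonr(exercise, resting_hr)
print(f"r = {r:.2f} (negative sign = inverse relationship), p = {p_value:.3f}")
```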

Literature Review

a scholarly research step that entails identifying and studying all existing studies on a topic to create a basis for new research a written summary of the current state of knowledge on your research problem

pilot study

a small study carried out to test the feasibility of a larger one

Themes: abstract entities

abstract concept that creates meaning in an experience - themes emerge from data

Convenience Sampling

aka accidental sampling. Sampling the most readily available & convenient group of people, ie using your classmates when you're a uni student, or sampling the first 50 people you meet on the street.

accuracy

all parts of a study flow from the problem statement

Correlation Coefficient of 1? -1? 0?

Perfect Positive Relationship: when the correlation coefficient is +1. Graph will go up to the right Perfect Negative/Inverse Relationship: when the correlation coefficient is -1. Graph will slope down to the right. Unrelated: a CC of 0 means the variables have NO relationship at all, points will be randomly scattered across the graph

Exploratory Sequential Design

QuaLitative first, use to build --> QuaNtitative intervention --> QuaNtitative data then collected --> interpretation pros: cons: eg. meeting with indigenous people to qualitatively assess their greatest issues with healthcare systems. let's say they answer with language barriers as a primary complaint. then using these interviews to guide the quantitative study of language prevalence.

Who is more likely to triangulate: qualitative or quantitative studies?

Qualitative (especially ethnographies) because they have to!

SAGE Video

Qualitative Research: seek to understand meaning/experiences in lives. - holistic, broad - SOCIALLY constructed, subjective (unlike quantitative) - close, personal relationship between researcher and participant (whereas quantitative seeks objectivity and avoiding bias) - takes place in a natural setting - the data collected in qualitative research are WORDS

Who usually has a larger sample size: phenomenologists or grounded theory researchers?

Grounded theorists usually have 20-30 ppl, while phenomenologists only have ~10.

Sensitivity Analysis (key quizbit)

an assessment to see if including or excluding low quality/non-rigorous studies changes conclusions or vastly shifts data outcomes - In an integrative review, you'd do the statistical analysis twice: once with the "weak" studies, and once without, then see if they change the conclusions.

Case Studies

an in depth assessment, description, and analysis of one individual/case - asks "how/why" the case works and how it exists within real-world context - unit of analysis: one single "case": this can be one event, a program, one activity, one community, one family, or an individual - you can also have multiple "units" within a case: ie studying both the ICU and the ER within the case of KGH

extraneous variable

any confounding variable extraneous to the purpose of the study. a variable other than the IV that might cause unwanted changes in the DV.

Comparative Analysis / Constant Comparison

in *grounded theory*, researchers collect data and analyze it basically at the same time: collecting and then returning for more data repeatedly until saturation is achieved. - new data is constantly compared to old data to search for patterns and variations

in qualitative research, the __________ is the instrument and the __________ is the expert

in qualitative research, the researcher is the instrument and the participant is the expert

Unstructured Observation

includes participant observation (the researcher enters a social group of interest and participates in its function while researching) with logs (daily events) & field notes (interpretations)

Collaborative research

a research project involving both clinical practitioners AND methodologic researchers

Integrative Reviews

a rigorous, systematic appraisal/critique of many research studies on the same topic, to "integrate" all evidence on the same phenomenon integrative reviews follow mostly the same steps as an actual research study: develop hypothesis, collect data, etc. - Meta Synthesis: integration of quaLitative studies - Meta Analysis: integration of quaNtitative studies

When undertaking a critique, it is appropriate to assume the position of....

a sceptic who demands evidence from the report that the conclusions are credible and significant.

Semantic Differential Technique

a series of scales with bipolar adjectives (ie good/bad, important/unimportant), on which participants rate their reactions to a particular phenomenon.

The Nuremberg Code

a set of 10 principles for human experimentation created as a result of the Nuremberg trials, which prosecuted unethical human experimentation during WWII. 1. Informed consent 2. likelihood of good outcome 3. based on prior successful animal studies 4. beneficence 5. benefits >> cost 6. experienced researchers 7. right to withdraw consent 8. stop if harm is occurring

Factor Analysis

a statistical analysis where related attributes are clustered together, and measurement of one attribute is used to estimate the measurement of another, correlated attribute - "statistical procedure for identifying unitary clusters of items or measures" - performed to determine whether the items on a scale/questionnaire measure what they're supposed to

Sampling Confirming and Disconfirming Cases

a strategy to enrich and challenge qualitative researcher's conceptualizations. As researchers start to notice trends/patterns in the data, they check their theory: confirming cases fit the trend and offer credibility. Disconfirming cases challenge the trend and may signify new insights or the need for revision and re-studying of their original theory. performed near the end of a study to refine.

Pretest-Posttest Design

a study where data is collected both before AND after the intervention/manipulation is applied aka "before-after" designs

Posttest-Only Design

a study where data is collected only once, after the intervention/treatment is applied

Sampling Distribution of the Mean

a theoretical distribution of the means of an infinite number of samples drawn from a population, from which you can estimate the real population's mean. approximately follows a normal distribution shape.
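A minimal simulation sketch in Python/NumPy (the population parameters and sample size are made up) showing how repeatedly drawing samples and collecting their means produces this distribution:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical population: systolic BP with mean 120 and SD 15
population = rng.normal(loc=120, scale=15, size=100_000)

# Draw many samples of n = 25 and record each sample's mean
sample_means = [rng.choice(population, size=25).mean() for _ in range(5_000)]

# The means cluster around the population mean, and their SD (the standard
# error of the mean) is roughly population SD / sqrt(n) = 15 / 5 = 3
print(f"mean of sample means: {np.mean(sample_means):.1f}")
print(f"SD of sample means (SEM): {np.std(sample_means):.1f}")
```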

Panel Studies

a type of longitudinal study where data is collected from the SAME sample of individuals over time

Reliability Coefficient (key summary of reliability)

a value from 0 to 1 that estimates how reliable a research instrument is; reliability is made up of: - stability - internal consistency/homogeneity - equivalence computed through procedures like - test-retest approach (stability) - parallel reliability testing (stability) - Cronbach's alpha technique (homogeneity) - split-half technique (homogeneity) - KR-20 (homogeneity) - item-to-total correlation - interrater approaches (equivalence)

Purposive Sampling

aka judgemental sampling, when researchers hand pick cases for the sample (ie if they want to interview experts in a field, it's up to the researcher to decide if they think the participant is knowledgeable enough). Assumes trust in the researcher's knowledge of the population, but lends itself to massive bias.

Extreme Case Sampling

intentionally selecting the most interesting/extreme/unusual outlier cases

Event Sampling

involves the selection of integral behaviours or events to be observed (ie only observing during shift change of nurses, whenever there are pediatric surgeries). Event sampling is better when the thing you want to observe happens infrequently. a sampling plan used in structured observational studies.

Can you pay a participant to participate in the study?

reimbursements for time, parking, travel costs etc are okay! there isn't really a clear set rule for what an appropriate honorarium is

Correlational Studies

studies that examine the relationships between/among variables, but do not do any manipulation/interventions = therefore nonexperimental research - aka "ex post facto" studies - can infer causality, but don't prove it! include a) retrospective correlational design b) prospective correlational design

Eligibility Criteria

the criteria by which people are selected for participation in a study. elements must meet the eligibility criteria to be included in the sample

Internal Validity

the degree to which outcomes (changes in the dependent variable) are due to the intervention (manipulation of the independent variable) the degree to which research is consistent within itself

methodological research

the development and evaluation of research instruments and methods

t-test vs. ANOVA vs. chi-squared (key quizbit)

*t-test* - tests significance of difference of means between 2 groups - can be "independent groups t-test" or "paired/dependent groups t-test" *ANOVA* - tests mean group differences between AND within 3+ groups - includes F ratio, repeated measures anova *chi-squared test:* - a nonparametric test to test the differences in proportion of cases that fall into groups - compares expected frequencies to observed frequencies
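Minimal SciPy sketches of all three tests, using made-up group scores and frequencies (group names and numbers are hypothetical):

```python
from scipy import stats

# Hypothetical anxiety scores for three groups
group_a = [12, 14, 11, 13, 15]
group_b = [16, 18, 17, 15, 19]
group_c = [14, 15, 13, 16, 14]

# t-test: difference in means between TWO groups
t, p = stats.ttest_ind(group_a, group_b)
print(f"t-test: t = {t:.2f}, p = {p:.3f}")

# ANOVA: differences in means among THREE or more groups (gives the F ratio)
f, p = stats.f_oneway(group_a, group_b, group_c)
print(f"ANOVA: F = {f:.2f}, p = {p:.3f}")

# chi-squared: difference in proportions of cases falling into groups
observed = [[20, 30],   # e.g. outcome present vs. absent ...
            [35, 15]]   # ... cross-tabulated by two exposure groups
chi2, p, dof, expected = stats.chi2_contingency(observed)
print(f"chi-squared = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```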

Validity

- "the degree to which an instrument measures what it is supposed to be measuring" - soundness of evidence- do the study methods measure the thing they're supposed to measure? - ie does this questionnaire actually measure depression, or is it measuring insecurity?

constancy

- 'cookbook' of *protocol*: explicit instructions that keeps all the researchers consistent in how they research - in Quantitative research

FINAL ASSIGNMENT

- 1 slide a minute is a good guideline - strengths/limitations might have one extra slide. - dissemination - number in text citation, but reference slide done in APA

in a Normal Distribution, what does the Standard Deviations look like?

- nearly all scores fall within 3 SDs on either side of the mean - 95% of all scores fall within the first 2 SDs

Schematic Models

- aka Conceptual Maps - visually depict a *conceptual model* using Images/symbols/diagrams

Between Subjects Design

- comparisons are made between different groups/participants

Patient Oriented Research

- engages patients as partners - focuses on pt outcomes and priorities

Leininger's Method (4 phases)

- for ETHNONURSING research 1. collecting and recording data 2. categorizing descriptors 3. searching for repetitive patterns 4. abstracting major themes

A research proposal must secure 2 things

- funding - ethics board approval

Homogeneity

- having strict inclusion/exclusion criteria - restricting the sample's characteristics to eliminate variability of extraneous variables - contributes to reliability - aka Internal Consistency in your textbook

Integrated Knowledge Translation (iKT)

- involving knowledge users alongside researchers so that it is more relevant/helpful to the knowledge users involve users in developing research question, selecting methods, data collection, outcomes, interpretation, dissemination (ie using social media and mail-out to reach as many knowledge users as possible!)

Maturation

- pts age & change (developmentally, biologically, psychologically) over time

Concepts

- symbolic representations/mental images of an abstract idea, inferred from behaviour or events - concepts make up both theories and conceptual models - ie, the "concept" of health as a whole is quite abstract

An alpha level of .05 means ...?

- that only 5 out of 100 samples would be false positives (the null hypothesis rejected when it should have been accepted) - 0.05 is the maximum acceptable alpha level: 6+ false positives per 100 is too many.

Justice in research encompasses...

- the right to fair and equitable treatment in the study - the right to inclusionary recruitment of participants!

Bricolage

- the skill/task of putting together data from many sources, performing diverse tasks (interviewing, observing, interpreting) - qualitative researchers are bricoleurs: more holistic, creative, intuitive

Standard Error of the Mean

- the standard deviation of the theoretical sampling distribution of the mean - estimates the average degree of sampling error - the smaller the SEM, the more accurate the estimate of the average population value

Criterion Related Validity

- using another external criterion as a benchmark for validity of the instrument - if the instrument's measurements correspond/align with the scores of the other criterion, it's more valid - we can use the external criterion to estimate a *Validity Coefficient* value, from 0 to 1; validity coefficients >0.7 (instrument and criterion scores match closely) are desirable. 2 Types of Criterion Related Validity: a) predictive validity b) concurrent validity

Epistemology

Studying the nature of knowledge

T/F: Bracketing is not used in interpretive phenomenology (hermeneutics)

TRUE! only in descriptive

quizbit

make a list of all the terms that refer to qualitative, and a list for all quantitative stuff

Do ALL collection of data require research ethics approval?

not for quality assurance studies, program evaluation, performance review, or educational purposes (ie me participating in QSSET course/prof evaluations)

Explanatory Sequential Design

quaNtitative first --> quaLitative second --> interpret QUAN data is "explained" by QUAL data Pros: easy Cons: takes longer eg. quantitative assessment of how many workplace injuries occur somewhere in a year. Then do qualitative interviews to assess WHY injuries occur/perceptions of what's causing them.

Research Hypothesis vs Null Hypothesis

research hypothesis predicts that there IS a relationship. Null Hypothesis predicts that there is NO relationship between variables.

equivalence

see slides.

What is sampling?

selecting a portion of the population to represent an entire aggregate of cases

Axial Coding

when open-coded pieces of data are grouped into categories (the second step in Strauss & Corbin's grounded theory coding)

Nonprobability Sampling

when elements (cases) are selected using NONrandom methods. Can sometimes be helpful, easier, and cheaper, but huge risk for bias.

The following slides refer to QUALITATIVE studies now

yes

Phenomenology Study Elements

1. purposeful sample selection: find ppl who have experienced the phenomenon to be studied 2. gather written data (ie memoirs) or oral interviews - achieve data saturation! - see slides

Cultural Humility

- acknowledge unique worldviews, diversity, and power imbalances - responding by being open, self aware, egoless, respectful, and supportive - lifelong learning - provide empowerment - accept you cannot be completely competent in all cultures

Anonymity vs. Confidentiality

-anonymity means that names and other pieces of information that can identify participants are never attached to the data - researchers never know identities! -confidentiality means that any information or data the participants provides is controlled in such a way that it is not revealed to others -only researcher has access to that data

3 Mixed Methods Designs (key quizbit)

1. Convergent 2. Explanatory 3. Exploratory

Types of research reports

1. Journal Articles 2. Theses & dissertations 3. Books 4. Conference presentations

3 Types of Qualitative Analysis

1. Template Analysis Style 2. Editing Analysis Style 3. Immersion/Crystallization Style

What is a research critique?

A careful, critical appraisal of a study's strengths and weaknesses. Allows us to draw conclusions about worth & significance of results.

Hawthorne Effect

A challenge in Experimentation: when participants know they're being studied, they change their behaviour (aka "reactivity")

Evidence Informed Practice

A clinical practice based on a balance of three elements: a) current scientific evidence b) your clinical experience c) the client's wishes/values

Repeated Measures ANOVA (key quizbit)

A one-way ANOVA that involves correlated groups of participants (ie using the SAME participants, measured at multiple points in TIME) this is analogous to a Paired t-test, but for 3+ groups.

iterative process

A process based on REPETITION of steps and procedures ie data analysis of case studies, must repeat a lot

Time Series Design

A quasiexperimental study where data is re-collected *over a period of time, multiple times*, before and after the treatment

Evidence Hierarchy

A ranked arrangement of the validity/ dependability of evidence in a study, based on the rigor of the method that produced it.

Primary Source Research Report

A report that uses primary sources: documents or data that were written or created during the time under study.

Analysis of Covariance (ANCOVA)

A statistical procedure used to test mean differences among groups on an outcome variable, while controlling for one or more extraneous variables ("covariates")

4 Research Paradigms (key quizbit)

A) Positivism: there is only one reality + knowledge can be measured B) Constructivism: there are multiple realities + knowledge needs to be interpreted C) Pragmatism: reality is constantly re negotiated + knowledge should be examined using the best tools to solve the problem D) Critical Social Theory: see textbook

The F Ratio is a value used in what statistical test

ANOVA

When a participant tends to always say yes or no (yay/naysayers) on a questionnaire, what response set bias is this

Acquiescence RS bias

Simple Random Sampling

All members of the accessible population are given a number, creating a "sampling frame" list, and then random numbers are picked to select participants randomly from the frame.
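A minimal sketch of the procedure in Python, assuming a hypothetical accessible population of 500 numbered patients:

```python
import random

# Hypothetical sampling frame: every member of the accessible population gets a number
sampling_frame = list(range(1, 501))

# Randomly pick 50 numbers from the frame (without replacement)
random.seed(7)
sample = random.sample(sampling_frame, k=50)
print(sample[:10])  # first 10 randomly selected participant numbers
```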

Alpha vs Beta Level of Significance

Alpha Level: the probability of making a Type 1 Error (false positive) Beta Level: the probability of making a Type 2 Error (false negative)

Cultural Humility

An acknowledgement of one's own barriers to true intercultural understanding, power imbalances, a lifelong commitment to self-evaluation and self-critique

10 QuaLitative research 'Traditions/Roots' (key summary)

Anthropology: 1. Ethnography (study of cultures) 2. Ethnoscience Philosophy: 3. Phenomenology 4. Hermeneutics Psychology: 5. Ethology 6. Ecological Psychology Sociology: 7. Grounded Theory 8. Ethnomethodology Sociolinguistics: 9. Discourse analysis History: 10. Historical Research

case control vs cohort study

Case-Control: - subjects with a certain trait (ie disease) are sought out and grouped together - create a control group - look retrospectively and compare their exposure - cheaper and immediate evidence, but not as strong evidence - a type of *cross sectional* RETROSPECTIVE design Cohort: - seek out subjects that have been exposed to something - create a control group who have not - follow both groups forward over time and see what develops, compare incidence between groups - a type of *longitudinal* PROSPECTIVE design

Causal vs Functional/Associative Relationships

Causal: cause and effect relationship Functional/Associative relationship: variables are connected, but one is not caused by the other

Knowledge to Action Framework (Jan's Guest Lect- maybe a quizbit)

Centre piece: New knowledge is created (ie research, literature syntheses, practice guidelines, tools, products) Outer circle: Action Cycle, aka application: the centre piece informs the outer circle of our nursing practice.

Consumer vs Producer

Consumer: reads research, applies findings to clinical practice Producer: person who designs/conducts the research

prolonged engagement, persistent observation, member checks, and peer debriefs all improve what kind of rigor?

Credibility (qualitative)

Predictive & concurrent validity are what type of validity

Criterion related validity

Chapter 17

Critiquing Research Reports

Cultural safety

Culturally appropriate health services to disadvantaged groups while stressing dignity and avoiding institutional racism, assimilation (forcing people to adopt a dominant culture), and repressive practices.

CINAHL database

Cumulative Index to Nursing and Allied Health Literature covers references to virtually all English-language nursing and allied health journals, books, book chapters, dissertations, and selected conference proceedings

T/F: the Tuskegee study violated justice and respect for persons, but maintained beneficence

False! violated all three. Withholding KNOWN treatment for a fatal disease is not maximizing outcomes, even if not doing active harm.

What data analysis method do Grounded Theorists use

GLASER AND STRAUSS'S METHOD - involves constant comparison and coding - key quizbit

are cohort and case controls experimental?

NO! no intervention is being given, no randomization (but there is a control = preexperimental)

Asking: "do you have pets? what kind?" uses what kind of NOIR

Nominal

Error of Measurement

Obtained Score = true score +/- error - Obtained score is the value that the researchers get (ie HR, anxiety scale) True score: the value that would be obtained if there was a way to get an infallible measure (hypothetical and unknowable) Error of measurement = difference between obtained and true score

Primary vs Secondary Sources

Primary: data that was collected specifically to answer the question you asked Secondary: using data that was maybe for something else but is indirectly helpful to answer your question

4 Types of Evaluation of Research

Process Evaluation (in-progress feedback from the people involved in the study while it's being conducted: participants, staff, key informants) SUMMATIVE EVALUATIONS: Impact Evaluation (evaluation at the end for immediate changes: has a behaviour changed, an attitude developed, knowledge increased etc) Cost Analysis (benefit/cost) Outcome Evaluation (were long term changes achieved)

Prolonged engagement vs persistent observation

Prolonged engagement: - collecting an adequate SCOPE of data - working with participants over a long PERIOD OF TIME (ie over 12 months) Persistent observation - collecting adequate DEPTH of data - working with participants for a large total NUMBER OF HOURS (ie 200 hours total) - investing enough time/hours to create trust, rapport, and deep understanding of participants

Pros and Cons of Questionnaires

Pros: inexpensive, anonymous option, no interview bias. Cons: ppl may not fill them out (low response rates), accessibility issues (language, literacy), less rich/deep responses than interviews.

Which type of study is more likely to contain "raw data?"

Qualitative! Quantitative reports almost never contain any raw data—data exactly in the form they were collected, which are numeric values. Qualitative reports, by contrast, are usually filled with rich verbatim passages directly from participants.

medication error reports, needle stick injury rates within a hospital etc are what kind of data

Quality improvement & risk data

4 aspects of research collaboration

Sharing (sharing responsibility & rewards) Partnerships (unique needs for both partners) Interdependence (both bring something and need something) Power (defined and shared)

4 Types of Probability Sampling

Simple Random, Stratified Random, Cluster, Systematic

Simple vs Complex Hypothesis

Simple: predicts a relationship between 1 independent variable and 1 dependent variable Complex: predicts a relationship between 2+ IV and 2+ DV

sampling where you specify that you're surveying every 7th person on a list is what kind

Systematic sampling (probability)

T-Tests, ANOVA, and Chi Squared are what kind of test?

T-test and ANOVA = Parametric Chi Squared = Nonparametric

Confidence Interval (CI)

The confidence that the true population value lies within the distribution A range of values, calculated from the sample observations, that is believed, with a particular probability, to contain the true value of a population parameter. A 95% confidence interval, for example, implies that were the estimation process repeated again and again, then 95 percent of the calculated intervals would be expected to contain the true parameter value. Note that the stated probability level refers to properties of the interval and not to the parameter itself which is not considered a random variable.
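A minimal sketch of computing a 95% CI for a mean in Python/SciPy, assuming made-up systolic blood pressure readings (mean ± t x SEM with n - 1 degrees of freedom):

```python
import numpy as np
from scipy import stats

# Hypothetical sample of systolic BP readings
sample = np.array([118, 124, 130, 115, 122, 128, 119, 125, 121, 127])

mean = sample.mean()
sem = stats.sem(sample)   # standard error of the mean
n = len(sample)

# 95% CI: mean +/- t* x SEM, using the t distribution with n - 1 df
low, high = stats.t.interval(0.95, df=n - 1, loc=mean, scale=sem)
print(f"mean = {mean:.1f}, 95% CI = ({low:.1f}, {high:.1f})")
```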

Auditability

The degree to which an outside person can follow the researchers methods/decisions/conclusions/process. - auditability is enhanced when researchers share *audit trails* and *decision trails* in reports

Content Validity

The degree to which the content of a test/parts of an instrument measure what they're supposed to, and are universal/widely representative of the content of the items being measured - ie when testing knowledge: do the questions on this quiz cover all the content that could be asked of this topic? - content validity can be sometimes calculated into a "content validity index," but mostly it's based on expert's judgement.

Confirmability

The degree to which the results are objective, neutral, unbiased, and could be confirmed by others - one of the criteria for a trustworthy qualitative study

implementation potential

The extent to which an innovation is amenable to implementation in a new setting, an assessment of which is often made in an evidence-based practice project. assess feasibility, transferability, cost/benefit ratio etc

Ontology

The nature of reality, being - qualitative research believes in multiple realities bc we all have different life experiences/worldviews and perceptions of reality

Degrees of Freedom

The number of individual scores that can vary without changing the sample mean. Statistically written as 'N-1' where N represents the number of subjects. calculated on a computer usually

what kind of sampling would a grounded theory study use?

Theoretical (qualitative) sampling (bc it's EMERGENT)

a within-subjects factorial design would be...

a crossover design

Iowa Model

acknowledges that RU/EBP starts with a "trigger" that pushes an institution to explore changes to their practice. a) Problem focused trigger: changes caused by issues in the institution: ie high infection rates, poor financial data, complaints. b) Knowledge focused trigger: changes caused by new literature published/guidelines by a board

post hoc tests (multiple comparison tests)

additional hypothesis tests that are done after an ANOVA to determine exactly WHICH mean differences are significant and which are not isolate differences between group means that are responsible for the ANOVA rejection or acceptance we use Post Hoc tests instead of doing a bunch of t-tests (group A versus group B, group A versus group C, and group B versus group C) to avoid the risk of a Type I error

Likert Scales (key quizbit)

aka "summated rating scales." Scales that make a statement about a phenomenon and ask participants to rate how much they agree or disagree on numerical scale. At the end, all their scores are added/"summed" up.

Worldview

beliefs and assumptions - abstract.

Using multiple sets of data across many diff countries is a technique to overcome threats to ________ validity.

external validity

T/F: Random sampling guarantees that the sample will be representative of the population

false! It does, however, guarantee that any differences are purely by chance and not bias.

T/F: a study requires that researchers disclose which participants have been placed in the control or intervention group

false! ie double blinding and placebos are a thing

a time series design is experimental

false, quasiexperimental bc it only uses one group

codes of ethics

formalized rules and standards that guide ethical research

When might you NOT want conceptual frameworks?

ie in a qualitative study when you want to hold back assumptions/conceptualizations of the phenomena, decreases judgement and bias ie in Phenomenology (learning about someone's lived experience), you must believe them

research misconduct

ie plagiarism, fabrication or falsification of data and results! differs from unethical conduct which refers to treatment of participants

Critique

important that a study provides a vivid/thick description of the environment, culture, time, and context that the study took place in, so you can decide if the findings could be transferred to your situation (the "FITTINGNESS")

inductive vs. deductive reasoning

inductive reasoning: develops/builds up larger generalizations from specific/individual observations - ie putting together many pieces to make a puzzle deductive reasoning: develops specific/individual predictions based off more larger/general principles - ie deconstructing the puzzle to look at individual pieces

Tacit Knowledge

information about a culture that is so deeply ingrained, members may not talk about it or be aware of it

empirical evidence

information/data we can verify with our senses - directly observable.

Instrumentation Threat

instrumentation threat is things like using an uncalibrated measurement tool, different scales for weights etc OR using the wrong instrument to measure the thing

Maximum Variation Sampling

intentionally selecting participants so the sample has a wide range of variation

Homogenous Sampling

intentionally selecting participants who are similar to decrease a sample's range of variation.

Informed Consent

legal principle that requires a researcher to inform participants about benefits & risks of participation, BEFORE they participate. this is a continuous, ongoing process of dialogue/Q&A during the research. No manipulation or coercion. the researcher themselves should NOT be the one giving info and collecting informed consent - should be an impartial physician, assistant, etc

Postpositivism

less strict than positivism - acknowledges the impossibility of total objectivity. however still seeks objectivity; understands that phenomena PROBABLY have causes, but not always.

what's better: longitudinal or cross-sectional?

longitudinal!

In Ordinal data, which central tendency is mostly used/reported?

median

Theoretical Frameworks for research

most abstract - worldview - framework - theories - concepts - variables most concrete

Double Blind Experiment

neither participants nor researchers know which participants belong to the control group vs. test group

Borrowed Theories

non-nursing specific theories that are borrowed/used by nursing researchers ie Theory of Stress and coping

Inferential Statistics

not just data presentation, but provides analysis and inferences about population from it - includes hypothesis testing and estimation of parameters

Stratified Random Sampling

population is divided into homogenous subgroups and THEN elements are selected from the groups AT RANDOM. (the probability sampling equivalent to nonprobability quota sampling)
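A minimal sketch in Python (the stratum names and sizes are hypothetical) of dividing the population into homogeneous strata and then randomly selecting the same proportion from each:

```python
import random

# Hypothetical population grouped into homogeneous strata by care unit
strata = {
    "ICU":      [f"ICU-{i}"      for i in range(1, 41)],
    "ER":       [f"ER-{i}"       for i in range(1, 81)],
    "Medicine": [f"Medicine-{i}" for i in range(1, 121)],
}

# Randomly select the same proportion (10%) from EACH stratum
random.seed(3)
sample = []
for unit, members in strata.items():
    k = max(1, round(0.10 * len(members)))
    sample.extend(random.sample(members, k=k))

print(sample)
```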

Descriptive Research

studies that summarize/describe phenomena - but don't have an intervention or control etc, therefore "nonexperimental" research

Phenomenology

studies the lived experiences of individuals: learning about people's LIFE experiences, and the meaning of their lives. - usually interview multiple people who have had a shared/similar experience (ie women who have experienced miscarriage) - seeks "what is the *essence* and meaning of this experience?"

Retrospective Correlational design

study that begins with a DEPENDENT variable (outcome) and then looks backwards for the cause or influencing risk factors - ie starting with people with cancer and comparing differences in their previous lifestyle to a control group of ppl without cancer

Statement of purpose

summarizes the overall goal of the study: introduces the population being studied, and the key variables

The protection of blood and tissue samples is under what code?

the Declaration of Helsinki

Risk/Benefit Assessment

the participants benefits and the societal benefits of research are weighed against the negative costs to the individual participant

Structured Observation

use a SAMPLING PLAN

Probability Sampling

uses RANDOM selection of elements/cases from the population. Provide better population representation, but are more expensive and complicated to run. Probability sampling permits researchers to estimate a sampling error value.

Parallel Reliability

using multiple questions on a questionnaire that ask the same thing, but worded differently to cut out outliers - part of equivalence in reliability

Research Utilization (RU)

using research findings to improve client care (narrower than EBP: unlike EBP, research utilization does not consider client values or clinical expertise)

How do grounded theory studies do sampling?

usually use Theoretical Sampling

4 types of Validity

1) construct validity 2) external validity 3) internal validity 4) statistical conclusion validity (quantitative studies)

Qualitative use of Purposive Sampling (5 Strategies)

1. Maximum Variation Sampling 2. Homogenous Sampling 3. Extreme Case Sampling 4. Criterion Sampling 5. Confirming and Disconfirming

5 Criteria to Establish Causality

1. *Covariation/Association*: the variables occur together 2. *Time-Order/Temporality*: the cause precedes the effect 3. *Coherence*: there is some pretty logical theoretical/biological explanation for their association 4. *Continuity*: the effect occurs pretty soon after the cause (not required but helpful to infer causality) 5. *Exclusion of other plausible alternatives*

3 Types of (Qualitative) Rigor

1. *Credibility:* - do participants recognize the experience as their own? (ie it hasn't been lost in the researchers translation) - has ample time been allotted to understand the phenomenon 2. *Auditability* - are we able to follow the researchers thinking - is the process documented/outlined? could I easily replicate the study? 3. *Fittingness*: - are the findings applicable and meaningful outside of the study situation? - is the research strategy compatible for the purpose of the study?

2 Types of Biophysiologic Measures

1. *In Vitro* data (in tubes, plates, etc, anywhere outside the body, ie urinalysis in a cup). 2. *In Vivo* data (measurements performed within/on the living body, ie blood pressure).

Interpretation Process

1. Analyzing credibility of results 2. Determining meaning 3. Considering their importance 4. determining generalizability (for quaNtitative) / transferability (for quaLitative) 5. assessing implications for nursing & future research

Questions to assess Research Collaboration Applicants

1. Appropriate expertise/experience 2. Appropriate engagement/commitment - do they have enough time to commit? - clear role responsibilities? 3. Appropriate environment/resources - infrastructure, facilities, staff, equipment for success?

2 perspectives of Wholistic Perspective in Ethnography

1. Behaviour: observing how the group acts: customs, day to day life, interactions 2. Cognitive: assessing the participants thoughts, ideas, knowledge, beliefs

Structure of Interviews

1. Completely Unstructured (conversation) 2. Semistructured Interviews (uses a broad topic guide) 3. Focus Group Interviews (discussion with a small group) 4. Life History Interviews (participants narrate their life experiences around one theme/topic) 5. Think Aloud Method (participants talk through their decisions as they make them) 6. Diaries (participants maintain daily records of some aspect of life) 7. Critical Incidents Technique (probes about the circumstances surrounding an incident of interest)

4 Intellectual Processes of Qualitative Analysis

1. Comprehending - make sense of data, see big picture phenomenon - obtained only once saturation is achieved 2. Synthesizing - sifting thru data, find what is normal for the phenomenon and what is variation 3. Theorizing - develop alternative explanations of the phenomenon and then see if they align with data - test and retest multiple theories until one best is found 4. Recontextualizing - exploring the now-developed theory's applicability to other groups/settings - these are NOT linear necessarily

Ottawa Model of Research Use

1. Consider the Practice environment 2. consider who the potential adopters of research are 3. Create the evidence-informed innovation and plan how to transfer/disseminate the info 4. Adoption of research for use 5. Outcomes (dont need deets but just purpose of the model and why it's used)

2 Scales to measure Reliability

1. Cronbach's Alpha 2. Composite Reliability

QuasiExperiment

1. DO have an intervention/treatment *2. but are either lacking a control group 3. OR they do not randomize* - quasiexperiments attempt to compensate for this by controlling extraneous variables

5 Threats to Internal Validity

1. History threat - contextual influences of the current events in the world (ie having a pandemic onset halfway through your study would fck up the post-test data!) 2. Selection threat - poor selection 3. Maturation threat - changes in the dependent variable due to the passing of time rather than the intervention (ie studying delayed development in children, wound healing,- it will improve on its own no matter what) 4. Mortality threat - caused by different levels of subject attrition between the experimental and control groups

3 Criteria to evaluate Qualitative Sampling

1. Informational Adequacy (quality and quantity of data, saturation reached, no thin spots). 2. Appropriateness (correct method was used for sampling, selection not biased, participants were not excluded). 3. Transferability (how well the study applies to a different population)

Data Collection for Nurse Researchers (3)

1. Self reports, 2. Observations, 3. Biophysiological Measures

Measurement Errors

1. Situational contaminants 2. response biases 3. transitory personal factors (ie fatigue) 4. Administration variations 5. Item sampling

3 Types of Response Set Bias

1. Social Desirability bias (tendency to give answers that are consistent with prevailing social views/please the majority). 2. Extreme Response bias (tendency to express the most extreme attitudes about everything, ie saying strongly agree or disagree always). 3. Acquiescence Response bias (tendency to either always agree with everything ("yea sayers") or always disagree ("nay sayers") no matter what the question is asking).

4 Dimensions of Data Collection

1. Structure (ie unscripted interviews, or mathematical test?) 2. Quantifiability (ie transcripts or score out of 100?) 3. Researcher Obtrusiveness (ie is the researcher very present? is the participant aware they're being studied? or just observed?) 4. Objectivity (how unbiased is the data collection?)

2 types of grounded theory

1. Substantive Theory - a theory grounded in data from one single study on one specific topic (ie PPD) - many substantive theory studies make up formal theory! - more specific: ie tailored clothes 2. Formal Grounded Theory - more abstract, general compilation of substantive theory studies - more broad: ie store bought clothes

3 Types of Longitudinal Studies

1. Trend Studies 2. Panel Studies 3. Follow Up Studies

Follow Up Studies

A type of longitudinal study that uses patients who have not yet experienced the outcome of interest; the study continues until this outcome occurs.

Ottawa Model

6 components: - environment - the adopters of said research - evidence based innovation (the "thing" being adopted) - strategies for innovation transference into practice - adoption of evidence - health/outcomes

A validity coefficient of greater than ____ indicates strong validity.

>0.7

audit trail vs decision trail

Audit Trail: - the systematic documentation of info, (ie interview transcripts, reflective notes, raw data, statistical analytic calculations) which allows an auditor to evaluate the trustworthiness of a study Decision Trail: - outlines the researchers decision rules/justification for categorizing data and making the inferences they did

Basic vs Applied nursing research

Basic: designed to provide general information, for the sake of knowledge Applied: designed to solve a specific problem

ethnographic researchers would want to develop an (etic/emic?) perspective.

EMIC- an inside, participatory perspective by living in the culture.

Emic Perspective vs Etic Perspective

Emic Perspective: insider's perspective - ie an ethnographic researcher wants to achieve an emic perspective - an emic perspective helps to discover "tacit knowledge" in a culture Etic Perspective: an outside perspective

T/F: the goal of a study is to prove a hypothesis is true

False: hypotheses are never truly "proven" right or wrong, just accepted or rejected based on support or not from the data.

social interactionism, core variables, substantive & formal theories are all part of what kind of qualitative theory

GROUNDED THEORY

Heidegger vs Husserl

Heidegger: interpretive phenomenology Husserl: descriptive phenomenology

Construct Validity (& 2 ways to measure it)

How well the instrument is measuring the construct of interest - more difficult with more abstract concepts (ie empathy) - relies more on theoretical and personal judgement than on empirical tests - can kinda measure construct validity using the *known groups technique* or *factor analysis*

3 Types of Nonprobability sampling

Includes Convenience sampling, Quota sampling, and Purposive sampling.

dependent vs independent groups t-test

Independent groups t-test: - if the two groups being compared are made up of different participants paired/dependent groups t-test: - if the groups being compared are made up of the same participants, receiving the interventions at different times & acting as own control.
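
(Illustrative sketch, not from the course notes: assuming Python with scipy installed and made-up scores, this shows which function matches each kind of t-test.)
```python
# Minimal sketch: independent vs dependent (paired) t-tests on made-up scores.
from scipy import stats

# Independent groups: different participants in each group
control      = [72, 75, 70, 68, 74, 71]
intervention = [78, 80, 76, 79, 77, 81]
t_ind, p_ind = stats.ttest_ind(intervention, control)

# Dependent (paired) groups: the same participants measured twice,
# acting as their own controls
before = [120, 135, 128, 140, 132]
after  = [115, 130, 125, 133, 129]
t_dep, p_dep = stats.ttest_rel(before, after)

print(f"independent t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"paired t      = {t_dep:.2f}, p = {p_dep:.3f}")
```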

Internal criticism vs External criticsm

Internal = ie Reliability External = ie validity

Which paradigm uses Determinism

Positivism

Clinical Trial Phases

Preclinical/Phase 0 Phase I: first small-scale testing in humans, focused on safety and pharmacokinetics (ie of a new cancer drug) Phase II: now focus on the target population (ie cancer pts) to assess efficacy Phase III: randomization now occurs, comparing the new therapy against the current standard of care in a large sample Phase IV: post-marketing monitoring of long-term sfx

Predictive Validity vs Concurrent Validity

Predictive Validity: - an instrument's ability to differentiate between participant's statuses based on a *future* external criterion - ie students with good marks in high school often also have good marks in post-secondary. therefore high school grades have strong predictive validity for uni GPA Concurrent Validity: - an instrument's ability to differentiate between participant's status *presently/currently* using an external criterion - ie pts take a psychological test to assess readiness for discharge. those patients are also assessed by nurses who declare if they think they're ready for discharge. If the tests and the nurses often come to the same judgement, the test has strong concurrent validity, based on the external criterion of expert nurse judgement.

in a study where Qualitative data was prioritized, and preceded supplementary Quantitative data, what would the shorthand notation be?

QUAL-->quan

template, editing, and immersion/crystallization analysis are for what kind of data?

QUALITATIVE data analysis! ie ethnography, ethnomethodology, phenomenology type studies

Research Site vs Setting

Site: - overall location - ie the community of Leaside, or Sunnybrook hospital - can be multisite Setting: - specific place where data collection occurs - ie people's homes within Leaside, or in patient's rooms in the hospital - a *naturalistic setting* = fieldwork in the "field", observing ppl in their homes - vs a formal laboratory setting

Conceptual Models of Nursing

The concepts in the nursing models are: person, environment, health, and nursing ie Moyra Allen's McGill Model of Nursing

Gaining entree

The process of gaining access to a study site and its participants through the cooperation of key *gatekeepers & stakeholders* of the selected community or site. in qualitative studies

Research Control

The process of holding/controlling constant confounding influences on the dependent variable so the relationship can be better understood

statistical conclusion validity

The strength of evidence that a relationship exists between the two variables. The extent to which we can be certain that the researcher has drawn accurate conclusions about the statistical significance of the research.

Ethnography

The study of CULTURES - the values, patterns, beliefs, myths, behaviors and experiences of a cultural group - participatory fieldwork- understanding the culture, learning from the community, rather than "studying" the people. Living from their worldview. - often also uses simultaneous data collection and analysis

critical discourse analysis

The use of linguistic analysis to explore and challenge the *ideologies, positions and values* and their producers - the language that speakers bring to a debate over issues of interest like social justice

What does it mean to 'interpret' research findings/results?

to search for broader meaning and implications of the results

Triangulation is _____titative

trick question- it's used in BOTH quantitative and qualitative studies!

T/F: the larger the sample, the more likely the sample will represent the population accurately

true!

descriptive theory

thoroughly describes a phenomenon

Hypothesis

a predicted relationship between 2 or more variables. A hypothesis is testable if you have an independent and dependent variable.

Situational Contaminant

conditions of the study that affect the measurements - ie participants change their behaviour with the awareness of an observer, temperature, time of day

Measures of Scientific Merit (summary)

QUANTITATIVE: Reliability, Validity. QUALITATIVE: Trustworthiness, Credibility, Dependability, Auditability, Triangulation

which model of RU/EBP is used for *individual* clinicians?

*Stetler Model*: for integration of EBP in individual clinician's practice

4 Levels of Measurement/Scales key quizbit

"NOIR" for Likert Scales in order of lowest to highest quality measurement: 1. *Nominal Measurement* - classifying objects into mutually exclusive categories based on attributes. - Often dichotomous, binary but not always - ie grouping participants by gender. "females" are coded as "1" and "males" are coded as "2" - numbers in nominal measurement don't actually hold mathematical numerical value, just placeholders for groups, no meaning. 2. *Ordinal Measurement*: - ranking objects based on their relative standing based on an attribute - ie ranking participants from most dependent to independent to perform ADLs: 1= requires complete assistance, 2= some assistance, 3= completely independent. 3. *Interval Measurement*: (aka Continuous) - ranking objects like in ordinal, PLUS indicating the distance between them - OR, as soon as there's more than 4 ordinal objects/options - ie raking participants temperatures in celsius, or in IQ score results. These use interval scales to measure difference between each ranking - but: interval measurement scales do not have a 'baseline' zero (even celcius zero is still an actual temperature, its not the absence of temperature, and zero IQ is meaningless/arbitrary- its not really the absence of intelligence) 4. *Ratio Measurement*: - like interval, but the scale has a rational, meaningful zero point - i.e Ranking participants from lightest to heaviest based in lbs (weight DOES have a zero, 0lbs is the absence of mass)

Symmetric Distribution vs Skewed Distribution

*Symmetric Distribution* - median and mean are the same - could be folded in half equally *Skewed Distribution* - peak(s) are asymmetric and/or one tail is longer than the other - Positive & Negative Skew - Unimodal (one peak), Bimodal/multimodal (2+ peaks)

What is a phenomenon?

"occurrences, circumstances, or facts that are perceptible by the senses" assessing lived experience to find day-to-day life and experiences that are maybe even taken for granted

3 Phases of Naturalistic Inquiry

(naturalistic inquiry: in the field, natural setting) 1. Orientation: to determine what is interesting/important about a phenomenon 2. Focused Exploration: close examination of these features 3. Confirmation and Closure: confirms findings

Compare the following Central Tendency Mean Median Mode

*Central Tendency*: a measurement that indicates the typical value/average result of the participant. Mean, median, and mode are all TYPES of central tendency. *Mode*: the number that occurs most frequently in the set/distribution *Median*: the middle number, that divides the scores in half. If there is an even number of values in the set, the median is the average of the two most middle numbers. Medians are insensitive to extreme outliers *Mean:* the average. Add up all the values and divide them by number of participants. Mean IS sensitive to outliers.
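
(Illustrative sketch, not from the course notes: Python's statistics module on a made-up score set, showing how an outlier pulls the mean but not the median.)
```python
# Minimal sketch: the three measures of central tendency on made-up scores.
from statistics import mean, median, mode

scores = [2, 3, 3, 4, 5, 6, 30]   # 30 is an extreme outlier

print(mean(scores))    # ~7.6 -> pulled upward by the outlier
print(median(scores))  # 4    -> insensitive to the outlier
print(mode(scores))    # 3    -> the most frequently occurring value
```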

Colaizzi vs Giorgi vs Van Kaam

*Colaizzi:* involves validation of the results by returning to participants for 'member checking' *Giorgi*: believes the researcher alone should analyze, no consultation of participants or external judges. *Van Kaam*: believes in intersubjective collaboration between many expert judges to agree

How to eliminate bias in quaLitative research?

*Reflexivity*: the process of reflecting critically on one's own values/biases that could sway data interpretation/collection

What is Homogeneity and how do we test it?

- aka Internal Consistency - Homogeneity reflects the extent to which an instrument is measuring the same ATTRIBUTES/CHARACTERISTICS across its items, and helps eliminate possible confounding variables measured using either a) *Split-Half Reliability Technique* (separating the items into two groups of 'even' and 'odd' and measuring them- if both groups' averages are similar, the instrument is measuring the same attributes), or b) the *Cronbach Alpha Coefficient* (splits the items into two groups in a bunch of different ways- not just even and odd- and tests the internal consistency like above). c) KR-20 for dichotomous items d) Item to total correlation
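
(Illustrative sketch, not from the course notes: assuming numpy and a tiny made-up item-score matrix, this is one common way a Cronbach alpha coefficient is computed by hand.)
```python
# Minimal sketch of a Cronbach's alpha calculation:
# rows = participants, columns = items on the same scale.
import numpy as np

items = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])

k = items.shape[1]                          # number of items
item_vars = items.var(axis=0, ddof=1)       # variance of each item
total_var = items.sum(axis=1).var(ddof=1)   # variance of participants' total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")    # closer to 1 = more internally consistent
```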

Constructivist Paradigm

- aka naturalistic - reality is constructed by humans and exists within a context, therefore different people have different interpretations of realities - there is no ultimate truth or falsity - knowledge is gained by understanding each other's realities/interpretation - uses mostly qualitative methods

Moderator effects

- aka subgroup effects - effects that are moderated by a third variable - ie if we are testing efficacy of a new drug, the effect size might be calculated separately within different subgroups of men vs women, pregnant vs non-pregnant participants

Participatory Action

- allowing the people who are LIVING the issue to spearhead the change - exploration, reflection, and action on social and health problems - the researcher is a consultant and aid, not an expert or leader! - gives a voice to the community to plan action that will help them - the final report/plan should be approved and reviewed by the people who are being studied - focuses on ACTION as an outcome

Multiple Regression

- allows them to use multiple (2+) independent variables/interventions to predict a dependent variable - dependent variables are interval or ratio level - independent variables ("predictor variables" when in multiple regression) are interval, ratio OR nominal level variables. - the resulting value from a multiple regression analysis is "R", the multiple correlation coefficient
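
(Illustrative sketch, not from the course notes: assuming Python with scikit-learn and made-up data, where one predictor is interval-level and one is nominal/dummy-coded; R is taken here as the square root of R-squared, so it runs 0 to 1 and carries no direction.)
```python
# Minimal sketch: multiple regression with 2 predictors, then R.
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[25, 1], [40, 0], [35, 1], [50, 0], [30, 1], [45, 0]])  # age + group (0/1)
y = np.array([5.1, 7.8, 6.0, 9.2, 5.5, 8.4])                          # interval/ratio outcome

model = LinearRegression().fit(X, y)
r_squared = model.score(X, y)   # proportion of variance in y explained by the predictors
R = r_squared ** 0.5            # multiple correlation coefficient (strength only)

print(f"R^2 = {r_squared:.2f}, R = {R:.2f}")
```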

Multivariate Statistics

- analyses with >3 variables - includes multiple regression and ANCOVA, the big 2 - other multivariate procedures are discriminant function analysis, logistic regression, factor analysis, multivariate analysis of variance (MANOVA), multivariate analysis of covariance (MANCOVA), path analysis, and LISREL.

Phenomenological Analyses

- analysis searching for common patterns of personal experiences - immerse, reread - extract significant statements - find relationships between themes - synthesize themes to describe a phenomenon - Include Colaizzi, Giorgi, Van Kaam (all Husserl's philosophy school of thought)

ANCOVA (analysis of covariance)

- basically ANOVA + Multiple Regression - often used when nonequivalent control group design is used

Journal Articles

- briefly describe the studies - intend to communicate how the study has contributed to knowledge (in clinical prac)

Correlation Matrix

- common way of showing the correlation coefficients (Pearson's r or Spearman's rho!) among several variables - table in which the variables are named on the top and along the side and the CC values among them are all shown
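
(Illustrative sketch, not from the course notes: assuming pandas and made-up variables; .corr() builds exactly this kind of table, with the variables on both axes.)
```python
# Minimal sketch: a correlation matrix from three made-up variables.
import pandas as pd

df = pd.DataFrame({
    "age": [25, 40, 35, 50, 30, 45],
    "bmi": [22, 27, 25, 30, 23, 28],
    "sbp": [115, 128, 122, 140, 118, 135],
})

print(df.corr(method="pearson"))   # Pearson's r in each cell
# df.corr(method="spearman")       # Spearman's rho instead, for ordinal data
```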

Within Subjects Design

- comparisons are made "within" the same participant: comparing the same people but in different times/conditions - a research design that uses each participant as his or her own control; for example, the behavior of an experimental participant before receiving treatment might be compared to his or her behavior after receiving treatment

Considerations in Knowledge Dissemination

- create a dissemination PLAN - consider who will use info, how, why, and if they'll even want to - dissemination messages should be clear/simple/action oriented, and tailored to the literacy and interests of the audience - evaluate dissemination success after

Variable

- elements that are observable through the senses/measurable - a characteristic that varies from person to person- takes on different values! ie blood pressure, BMI, creatinine clearance rates are variables that are used to measure the concept of health

Vulnerable Subjects

- maybe not able to make a truly informed decision about study participation (e.g., children); - may have diminished autonomy (e.g., prisoners) - may have circumstances that put them at higher risk of physical or psychological harm (ie pregnant women, traumatized pts) vulnerable subjects require additional protection

Heideggerian / Hermeneutic Phenomenology ("Interpretive Phenomenology")

- more *interpretive* and subjective than Husserl's: focus not just on describing experiences, but understanding it - "what is being?" - everyone truly lives a completely different perspective

Directions/Trends for nursing research in the future

- more focus on evidence BASED practice - more rigorous/confirmatory methods: more replication of the same study for consistency - systematic reviews (gathering of multiple studies on the same topic) - more transdisciplinary/interprofessional research - active dissemination (ie online distribution/accessibility) of findings - more cultural/SDoH focus

Factorial Design (quizbit)

- more than one independent variable (ie a group assigned both diet AND exercise as an intervention) - allows researchers to test both main effects and interaction effects

Refereed Journals

- nursing journals with a policy that submissions must be *"blindly reviewed"* (neither the identity of the author nor the reviewers is revealed) by 2+ peer experts in the field - refereed journals, which are peer reviewed, are more prestigious than nonrefereed ones.

Research Ethics Board

- performs external evaluations of a study's ethical actions - may be required if the research is being funded by a diff organization

Quantitative research steps

1. identify research purpose & question 2. review literature to find what is currently known & where the gaps are- give readers some background 3. create framework that relates concepts to each other 4. pick a study design 5. select sample and measure 6. analyze collected data and state whether the hypothesis was supported or not

Notations used in Mixed Methods

1. *Shorthand* - qual - quan 2. *Uppercase* - we capitalize what is prioritized: ie QUAL 3. *Lowercase* - we lowercase the method that is just supplementary, ie quan 4. *+* plus sign indicates concurrent methods 5. *-->* arrow indicates the sequence of methods

5 Types of Triangulation

1. *data source triangulation:* the use of multiple data sources (ie interviewing diverse groups: pts, doctors, nurses. looking at multiple maps for directions) 2. *investigator triangulation*: using more than one person to collect/analyze/interpret data, different researchers with different backgrounds/clinical expertise 3. *theoretical (theory) triangulation*: using multiple theories (perspectives) to interpret a set of data 4. *method triangulation:* using multiple research methods in the study to address the problem (ie observing participants AND doing interviews. ie using a qualitative AND quantitative method) 5. *Interdisciplinary triangulation*: the use of more than one discipline/field to study a topic (kind of like investigator triangulation)

Iowa Model of EBP

1. nurses are triggered by either a) a clinical problem, or b) a new piece of knowledge 2. decide if it's a priority 3. Consider current research and either a) use it, or b) conduct your own 4. implement into practice 5. evaluate

3 Types of Research Utilization

1. Instrumental (bedside, clinical practice changes) 2. Persuasive (policy changes, financial) 3. Conceptual (changing ideology, stigma, values)

3 types of coding

1. Open Coding - breaking down the data into its parts and comparing it for similarities and differences - ie taking highlights from transcripts and grouping quotes into "codes", like in the SVDP project 2. Axial Coding - groups similar codes into larger categories - codes are linked around a conceptual "axis" 3. Selective Coding - grouping categories into one big core category (the "central category", the main theme of the study) - integrating and refining all the categories (this is the Strauss & Corbin strategy for grounded theory?)

5 Dimensions to consider when reviewing a study

1. Substantive & theoretical dimensions (is this problem significant, relevant? clear, testable research question?) 2. methodologic (are the choices they made in methods to test their hypothesis logical? good fit? too much compromise for the sake of 'easier' research?) 3. ethical (participants wellbeing, privacy protected?) 4. interpretative (are the conclusions/implications provided good interpretations? are they transparent about limitations they faced?) 5. stylistic & presentation dimensions (is there any vital info missing? is it well written and logically laid out? ie IMRAD for quantitative)

Type 1 vs Type 2 Errors

1. True Positive 2. False Positive - Type I Error - when the null is incorrectly rejected 3. True Negative 4. False Negative - Type II Error - when the null is incorrectly accepted

5 steps of evidence informed practice

1. ask the burning clinical question 2. collect relevant and best evidence 3. critical appraisal of evidence, confirm validity/legitimacy 4. integrate all evidence 5. evaluate the practice decision or change

4 Stages of Adopting an Innovation

1. awareness 2. persuasion 3. occasional use 4. regular use

4 Steps of Descriptive Phenomenology

1. bracketing 2. Intuiting 3. analyzing 4. describing

When are critiques done?

1. by students to practice skills 2. as part of a bigger integrative review (ie meta-analysis) 3. by other researchers (peer reviews) to assist journal editors with publication decisions

4 Components of Evidence Informed Practice

1. clinical expertise 2. research based therapies, opinion leaders 3. available resources, client history, physical exam results 4. pt's values and preferences

Nursing Research

1. find a knowledge gap: a question we want answered where knowledge is absent 2. develop a research question & collect qual/quantitative data 3. distribute the knowledge in reports, presentations, and to clinicians/professionals 4. apply findings in clinical settings, alter policies & protocols 5. review and revise knowledge based on results from implementation of changes

4 Techniques to control for participant characteristics/"quirks" in a QuaNtitative study?

1. homogeneity 2. matching 3. statistical procedures (ie analysis of covariance) 4. randomization

Qualitative research steps

1. identify research purpose & question 2. find population that has experienced phenomenon of interest 3. interview/observe ppl 4. analyze data and look for patterns/recurring themes 5. review literature 6. conduct more interviews/observations over and over until no new themes occur ("saturation") 7. review literature again 8. summarize findings and describe the human experience

primary source

A document or physical object which was written or created during the time under study, a firsthand account

journal club

A formally organized group that meets periodically to share and critique contemporary research in nursing, with a goal of both learning about the research process and finding evidence for practice.

Cross Sectional Study

A study in which a representative cross section of the population is tested or surveyed at one specific time. people of different ages are compared with one another

prospective correlational design

A study that begins with an INDEPENDENT variable (presumed cause) and looks to see if it really does influence the dependent variable ie studying a group of smokers compared to nonsmokers and seeing who gets lung cancer

subject attrition

A threat to internal validity: when participants are lost from an experiment/drop out! - changes the nature of a group from that established prior to the introduction of the treatment, therefore destroying the equivalence of groups that had been established through random assignment

Trend Studies

A type of longitudinal research that takes random samples for data collection out of one defined population over time, but with different individuals making up the sample group each time

after-only design

Basically a normal RCT experimental design with a treatment group and a control group. BUT this design differs from the true experiment in that both groups are measured only AFTER the experimental treatment (posttest)

sampling where each attribute/condition is ensured to be represented in the control and experimental group first, then remaining spots are randomly filled in is...

Block sampling (probability)

conceptual vs theoretical framework

CONCEPTUAL: more *inductive (developing own framework)* , constructed from a review of literature. based on what we find, we propose relationships in the phenomena THEORETICAL: more *deductive (tests an existing framework)*. philosophical, answers WHY the phenomena occurs. see text

CHSRF

Canadian health services research foundation funds chairs and programs that do nursing research

what kind of research uses the iterative process?

Case studies! basically repetition of data collection

Clinician vs Scholar

Clinicians: provide safe, competent, evidence informed care based on client needs Scholar: nurses who are committed to continuous learning, critical inquiry, application of research evidence to practice Nurses should be both clinicians and scholars as per CNO competencies

Operational Definition vs Conceptual Definition

Conceptual: the 'abstract', theoretical meaning of the thing you're actually trying to measure. ie. anxiety, depression Operational: the mechanisms used to measure the conceptual definitions, ie questions on a questionnaire used to measure anxiety quantitatively.

CONSORT

Consolidated Standards of Reporting Trials website to guide what to include in your studies - what were the inclusion/exclusion criteria? - how many participants? how many were assessed but excluded and why? - how many participants were lost during the study? etc

Hermeneutic Data Analysis (2 types)

Deikelmann's Method: - searches for *Constitutive Patterns,* relationships among themes Benner's Method: - searching 3 things: i) *paradigm cases* (strong examples of the phenomenon being studied, used early in analysis) ii) thematic analysis (comparing similarities & differences across cases) iii) analysis of exemplars (examples that highlight specific aspects of a paradigm case)

Dependent vs Independent Variable

Dependent: the behaviour/characteristic/value that the research is measuring Independent: the presumed cause/influence on the dependent variable- the thing that is changed in a control study

Response Set Biases

Enduring personal characteristics (ie being always agreeable) of the participants that affect their questionnaire results, independent of the actual question's content.

block randomization

Ensures each characteristic/condition is equally represented in the intervention & control groups: treatment blocks are created, each containing one random order of the conditions, so every condition is presented once before any condition is repeated; subjects are then assigned to fill each successive treatment block

spradley and leininger's methods are used to analyze data in what kind of study?

Ethnography

ethnomethodology vs ethnography vs ethology vs ethnoscience

Ethnomethodology: study of social NORMS & tacit knowledge Ethnography: study of CULTURES Ethology: study of human BEHAVIOUR Ethnoscience: study of a culture's COGNITION

Evidence Based vs Evidence Informed

Evidence Based: old term, ignores context, focus only on science. includes only clinical expertise & research, ignores #3 and #4 on next card. Evidence Informed: incorporates evidence from research, clinical expertise, client preferences, available resources etc. Evidence informed care is based off of evidence-based information, but accounts for context. Merges science AND art.

Generalizability vs Transferability

Generalizability: the extent to which findings can be applied to other groups and settings *in QuaNtitative studies* - generalizability doesn't matter unless the study is proven valid and reliable Transferability: the extent to which findings can be applied to other groups and settings *in QuaLitative studies* - a part of trustworthiness - involves thick description

Glaser & Strauss vs Strauss & Corbin

Glaser & Strauss - uses constant comparison of data to find emerging patterns - uses Open, Selective, & Theoretical coding - outcome: explains how a Basic Social Problem is processed in a social setting Strauss & Corbin - breaks down data: ie taking apart a single sentence, incident etc. - uses Open, Axial, & Selective coding - outcome: a full description of a concept

The Tuskegee Study

Initiated in 1932 and lasted 40 years!! recruited black men to study Syphilis, never told them they had it, did not treat the disease, just observed. an unethical study about syphilis in which subjects were denied treatment so that the effects of the disease could be studied.

4 Types of Ethnography

Institutional ethnography: assessing the culture of an organization/institution/building: ie a hospital culture Feminist ethnography: assess values/beliefs/behaviours that affect and oppress women Ethnogeriatrics: focuses on health and aging issues of elders, the cultural beliefs, values and practices of seniors Critical Ethnography: focus on beliefs/behaviour that limits human freedom, social justice, and democracy in a culture

which model of RU/EBP is used for groups/whole organizations?

Iowa Model (of evidence based practice to promote quality care) Ottawa Model (of research use)

Proportionate Sample

Lets say you are sampling 100 people from the school of nursing. The whole student population is 85% white, 10% Asian, and 5% black. A *proportionate sample* would have 85, 10, and 5 of each ethnicity respectively. About accurately representing the population, not evening out the percentage.

3 Levels of Open Coding

Level I: "in vivo" codes, derived directly from vivid language that "grab" Level II: condensing the level I codes into similar categories Level III: "theoretical constructs", abstract codes that add scope beyond local meanings

Main effect vs Interaction effect

Main effect: an effect directly resulting from the intervention Interaction effect: an effect resulting from the interaction of two combined treatments

In Ratio data, which central tendency is mostly used/reported?

Mean

What is Measurement

Measurement: a set of rules/numeric values assigned to objects to represent varying degrees of an attribute - assigning numeric values (quantifying) the qualities of objects - no attribute has inherent numeric value, humans invent them to measure concepts - ie assigned measurement of attributes like pain, depression, confidence

Face Validity

Measures whether a test LOOKS like it tests what it is supposed to test - ie the participants can tell what the instrument is measuring

Meta-analyses vs Metasyntheses (key quizbit)

Meta Synthesis: - mass integration of *quaLitative* studies - seeks to retain the essence and unique contributions of each study included - uses comparative analysis and interpretive synthesis to create new interpretations Meta Analysis: - mass integration of *quaNtitative* studies - uses statistical analysis of data across multiple studies to create new results

Odds Ratio (OR) Formula (key quizbit)

OR = (a x d) / (b x c) an OR of 1 means there is no association between the exposure and the outcome. an OR >1 or <1 means the odds of the outcome differ between the exposed and unexposed groups (whether that difference is statistically significant is judged by the confidence interval) used in retrospective HARM STUDIES, when we look at if exposure to something harmed ppl (see the worked sketch below)
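
(Worked sketch, not from the course notes: made-up counts in a 2x2 harm-study table, just to show where a, b, c, d sit in the formula.)
```python
# 2x2 table:            outcome present   outcome absent
#   exposed                  a = 30            b = 70
#   not exposed              c = 10            d = 90
a, b, c, d = 30, 70, 10, 90

odds_ratio = (a * d) / (b * c)
print(odds_ratio)   # ~3.86 -> the exposed group has roughly 3.9x the odds of the outcome
```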

AND/OR in a database

OR: asks the computer to search for articles about EITHER/BOTH TOPICS AND: asks the computer to search for articles that talks about BOTH- the crossover in the venn diagram.

define Ontology, epistemology and research paradigm (key quizbit)

Ontology + epistemology = paradigm Ontology: examines the nature of reality- interested in WHAT is reality? a) there is only one reality b) there are multiple realities c) reality is constantly changing/redefined/debated Epistemology: examines HOW we can examine reality: how can I know reality? a) knowledge can be measured with tools b) reality must be interpreted c) reality should be examined with whatever tools are best suited for the problem Paradigm: a worldview, perspective on the "real" world. A paradigm guides how you perform research

Open Coding vs Selective Coding

Open Coding: including all data: ie verbatim words from the participants, which are then sorted into categories. Selective Coding: only including data that is related to the core category - BOTH are types of SUBSTANTIVE coding

Debriefing Sessions

Opportunity for participants to air concerns after data collection

Asking: "how many years of post secondary education do you have? 1-2, 3-4, or 5+?" is what kind of NOIR

Ordinal (could be ratio, but they provide explicit categories)

Cochrane Collaboration

Organization founded by Archie Cochrane - a database of systematic reviews to systematically inform health-care decisions & evaluate evidence - Cochrane proposed the "Evidence hierarchy," ie with RCT at the top and weaker qualitative studies below.

PICO(T) Question

P- patient population I- intervention C- comparison/control O- outcome T- time

PICOT example

P: seniors >65y with cancer I: shared decision making / advance directives C: other (maybe: patients without advanced directives or SDM) O: improved family relations = "in elderly patients with cancer, does shared decision making and/or advance directives improve family relations?"

Pros and Cons of Mixed Methods

PROS: - manages strengths of both qual and quant design - more evidence total - encourages diverse worldview/paradigms - practical - broadens skillsets CONS: - requires skills in both qual and quantitative research - takes more time - takes more resources

2 Correlation Index Tools

Pearson's r (product-moment correlation coefficient) - used for INTERVAL or RATIO measures - can test whether a correlation is significantly different from zero Spearman's Rank Order correlation (Spearman's Rho) - Used for ORDINAL measures
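
(Illustrative sketch, not from the course notes: assuming scipy and made-up values, matching each coefficient to its level of measurement.)
```python
# Minimal sketch: Pearson's r for interval/ratio data, Spearman's rho for ordinal data.
from scipy import stats

hours_exercised = [1, 2, 3, 4, 5, 6]         # interval/ratio
resting_hr      = [80, 78, 74, 72, 70, 66]   # interval/ratio
r, p_r = stats.pearsonr(hours_exercised, resting_hr)

pain_rank     = [1, 2, 3, 4, 5]              # ordinal rankings
mobility_rank = [2, 1, 4, 3, 5]              # ordinal rankings
rho, p_rho = stats.spearmanr(pain_rank, mobility_rank)

print(f"Pearson r = {r:.2f} (p = {p_r:.3f}), Spearman rho = {rho:.2f}")
```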

Considerations when selecting Research Collaboration applicants

Peer Review Manual: Adjudication Criteria and Interpretation Guidelines 1. Concept (is the research significant and impactful?) 2.Feasibility (Approaches and Methods? Do we have the Expertise, Experience and Resources?)

Phases of a Quantitative Study

Phase 1: Conceptual Phase - Literature Review - develop framework and definitions - formulate hypothesis Phase 2: Design and Planning Phase - Research design - Intervention protocol if experimental - specify the population - develop a sampling plan - ethical concerns to protect rights - pilot study if necessary Phase 3: Empirical Phase - collecting data - coding the data (preparing data for analysis) Phase 4: Analytic Phase - statistical analysis & interpretation of significance Phase 5: Dissemination Phase - communicating findings, promoting the use of findings

when quantitative and qualitative data are concurrently collected, what shorthand notation is used when quantitative results are prioritized?

QUAN+qual

Research vs Quality Improvement vs Program Evaluation

RESEARCH: - wants to generate new knowledge that is generalizable to wide populations - tests brand new innovations (ie pilot testing of interventions) - disseminated as widely as possible to entire scientific community QI: - wants to improve a local, internal practice (not interested in generalization) in multiple directions - to improve a practice that currently is in place - disseminations is only within the organization - ie Queen's QSETT evaluations of profs PROGRAM EVALUATION: - like QI, but has a pre-defined goal: moving towards an endpoint. - looks at future programming - communicate to the organization, usually just to the decisionmakers/those who commissioned the evaluation

Research Questions vs Hypothesis

Research Question: specific question (ends in question mark!) that the researcher wants to ANSWER when addressing the research Problem - neutrally stated: no implied outcome or answer, no bias Hypothesis: the researcher's prediction about relationships between research variables. DOES make a prediction

Negative Skew

When the graphs tail is longer going to the LEFT - ie graphing "age of death" would have more people dying when older (a feature in Asymmetric Distribution in a Frequency Distribution)

research problem vs problem statement

Research problem: the reality of the issue, the actual condition Problem Statement: states the problem and purpose of study Statement of purpose: overall study goal Research Aims/Objectives: specific accomplishments to reach by conducting the study

8 key ethical principles (by the tri-council)

Respect for... 1. human dignity 2. free & informed consent 3. vulnerable persons 4. privacy and confidentiality 5. justice & inclusiveness 6. balancing harms & benefits 7. minimizing harm 8. maximizing benefits

Sensitivity vs Specificity

SENSITIVITY - an instrument's ability to correctly identify a case, and not let it slip by undetected - aka the instrument's *rate of true positives* SPECIFICITY - an instrument's ability to weed out noncases - aka *rate of true negatives*

Split half technique vs Known Groups technique vs Stepwise replication

SPLIT HALF = quantitative method to improve homogeneity in reliability. Splits groups into even and odd and compares them! KNOWN GROUPS = quantitative method to test Construct Validity. groups are separated into what is expected to differ and tested STEPWISE = qualitative method to improve dependability. splitting research team in half and doing the same experiment separately

a high p value indicates...

NO statistical significance - a high p value means the result could easily be due to chance. It is a LOW p value (below the alpha level, ie <0.05) that indicates a statistically significant result.

In most databases, you begin with a ___________ search

SUBJECT search but other databases can also use TEXTWORD searches and AUTHOR searches

Statistical/Null Hypothesis vs Research Hypothesis

Statistical/Null: expresses ABSENCE of a relationship (only used in statistical testing) Research: predicts that there IS a relationship

Study participants vs informants

Study participants - more passive role - in both qualitative and quantitative Informants: - more active role, in qualitative studies - ie key informants Respondents: - participants who provide information specifically via interview/answering question

Cluster Sampling

Subgroups are created and sampled successively, getting smaller. I.e because there is no Sampling Frame list of all nursing students in Canada, researchers may draw a random sample of nursing schools, then randomly sample students from selected schools. aka 'multistage sampling.'

Positive Skew

When the graphs tail is longer going to the RIGHT - ie income distribution, graphed, would show most people in low-mid class, with a small number of the 1% very rich on the right (a feature in Asymmetric Distribution in a Frequency Distribution)

Nonexperimental Research

the collection of data WITHOUT introducing any intervention/treatment (ie just surveying) Includes: a) descriptive research b) correlational studies (retrospective and prospective)

IMRAD Format

The organization of a research report into four main sections + 0. Abstract (overall synopsis/summary of whole article) 1. Introduction (explains the context & question being asked) 2. Method (explains strategies to test/collect data) 3. Results (findings) 4. Discussion (what this means, interpretation) + 5. Reference list IMRAD is mostly used for QUANTITATIVE journals

Lowering the risk of a Type I error increases the risk of a Type II error: t/f

True! The stricter the criterion for rejecting a null hypothesis, the greater the probability of accepting a false null hypothesis.

T/F: both the researcher and the participant get one copy of a signed waiver/consent form

True! copied and stored.

T/F: A study requires participants receive a contact sheet of researchers

True! requires an explanation of whom to contact with questions about any area of the study

Trustworthiness

Trustworthiness is evaluated using - credibility - dependability - confirmability - transferability - triangulation - a qualitative measure!!

Credibility (in a qualitative study) is roughly equatable/equivalent to __________ in a quantitative study.

Validity!

Reliability vs Validity

Validity: - Accuracy, the study measures what it says it does Reliability: - measured from 0 to 1 - Consistency, repetition of the same results - attributes: stability, equivalence, homogeneity/internal consistency

Axiology

Values and value judgments (ethics) - researchers and participants have their own values and beliefs- we must acknowledge this in qualitative research

3 Measures of Variability (key quizbit)

Variability: how spread out the data are 1. *Range:* the distance between the highest and lowest score (highest - lowest). - con: is extremely skewed by extreme outliers. 2. *Standard Deviation*: How much, on average, scores deviate from the mean. - a smaller SD generally means more homogeneity 3. *Interquartile Range*: the middle 50% of scores in a distribution: between the 25th and 75th percent
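
(Illustrative sketch, not from the course notes: assuming numpy and made-up scores, computing all three measures of variability.)
```python
# Minimal sketch: range, standard deviation, and interquartile range.
import numpy as np

scores = np.array([55, 60, 62, 65, 68, 70, 72, 75, 80, 95])

value_range = scores.max() - scores.min()   # highest minus lowest (skewed by outliers)
sd = scores.std(ddof=1)                     # average deviation from the mean
q1, q3 = np.percentile(scores, [25, 75])
iqr = q3 - q1                               # spread of the middle 50% of scores

print(f"range = {value_range}, SD = {sd:.1f}, IQR = {iqr:.1f}")
```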

Why is it harder to maintain confidentiality in quaLitative studies?

Verbatim quotes from participants may reveal personal information

Ethnoscience

While ethnography focuses on a cultural group as a whole, *ethnoscience* focuses on the *cognitive* world of the culture - ie semantic rules of the culture, shared meanings, how the culture THINKS as a whole -aka Cognitive Anthropology

Hermeneutics

While phenomenology studies lived experiences of ppl, *hermeneutics* is the science of interpretation - focus on how the *social and historical contexts shaped how the individual INTERPRETS* their experience/world - using lived experiences of individuals to understand the political, social, and historical CONTEXTS that they occurred in - a philosophical qualitative study tradition

Disproportionate Sample

You might now select 60, 20, and 20 of each ethnicity, or an even 33 of each ethnicity. Disproportionate sampling is to ensure more adequate representation of viewpoints from minority groups.

Snowball Sampling

a TYPE of convenience sampling, where current participants are asked to recommend/refer other potential participants so you end up sampling a bunch of people who are already friends/know each other. Helpful if you're looking for niche characteristics in eligibility criteria (ie people who are afraid of clowns are hard to find except by recommendation).

Basic Social Process

a TYPE of core category in selective coding in grounded theory something which evolves over time!! an evolutionary process of coping, adaptation

Research Critique

a careful, critical appraisal of strengths and limitations of a journal/piece of research. considering the worth of its evidence for nursing practice.

Normal Distribution

a certain type of frequency distribution in a BELLCURVE shape - symmetric - unimodal - not crazy high peaked or very flat- just moderate.

What is a literature review

a comprehensive summary of current scientific evidence/knowledge on a topic/research problem - provide context before presenting a study - justify need for study - identify gaps

phenomenology was born as a critique/counter to

a critique of positivism: there's more than one way to understand the world

Emergent Design

a design that unfolds/changes/evolves during/throughout the course of a quaLitative study, as the researcher makes ongoing redesign decisions reflecting what has already been learned in the field.

How to reduce response set biases? (key quizbit)

be careful with your wording of questions, create a non-judgemental, honest atmosphere, assure confidentiality or anonymity, and use *Counterbalancing*, when you switch between positively and negatively worded questions.

what kind of data provides info on things like the rates of procedures (ie Caesarian sections) or rates of infections

benchmarking data

beneficence vs nonmaleficence

beneficence: active performance of some good nonmaleficence: protection of participants from harm

instrumental utilization vs conceptual utilization

both are types of research utilization, on opposite ends of the continuum instrumental utilization: the direct use of specific, tangible innovations from research conceptual utilization: more theoretical/ideological changes in thinking about an issue, due to research

Research Utilization vs Evidence Based Practice

both use research as a basis for clinical decision making, but.... *RU* starts with research results, which are evaluated for possible use in clinical practice. *EBP* starts with a clinical problem in mind, and searches for research evidence to solve it - de-emphasizes traditions and focuses on following data evidence.

Vignettes

brief descriptions of people/situations, baby anecdotes, which participants are asked to reflect on/react to

How to ensure randomization in systematic sampling? (key quizbit formula)

by computing the *SAMPLING INTERVAL,* ie taking the total number of the population (ie 100) and dividing it by the desired size of the participant group (ie 20) = every 5th person is selected, then choosing a RANDOM starting point within that first interval. *Sampling Interval = Total Population / Desired Sample Size* (see the sketch below)
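
(Illustrative sketch, not from the course notes: a made-up numbered population of 100, showing the interval calculation plus a random starting point.)
```python
# Minimal sketch of systematic sampling.
import random

population = list(range(1, 101))    # 100 people, numbered 1-100
desired_sample_size = 20

k = len(population) // desired_sample_size   # sampling interval = 100 / 20 = 5
start = random.randint(0, k - 1)             # random start keeps the selection unbiased
sample = population[start::k]                # every 5th person from the random start

print(f"interval = {k}, sample = {sample}")
```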

Participatory action research (PAR)

close collaboration with groups/communities that are oppressed or vulnerable to a dominant culture. Research methods are less important, more focus on Emergent Process to generate community empowerment, solidarity and participation. PAR is used to collect knowledge to be used for political exertion. Tied to feminist and critical research/ethnology.

Longitudinal Design

collecting data 2+ times over an extended period of time - ie interviewing the same kid at 5, 10, and 15 years old throughout his life

Cross Sectional Design

collecting data at ONE short period in time (snapshot) - ie interviewing a 5, 10, and 15 year old at the same time and comparing their answers - helps remove maturation effect!

known-groups technique and factor analysis are used to test what?

construct validity

Conceptual Files

creating physical files sorted by category, and then sorting all materials into the categories, so you can retrieve all the content on a particular topic by reviewing the file

final exam:

cumulative 100 mc Q's (100 marks) 6 short answer Q's (25 marks)

concealment vs deception

deception: disclosing false information or withholding info from participants concealment: collecting data without participant's knowledge or consent (ie watching the participant's behaviour in the waiting room)

Thick Description

detailed, rich, thorough description of the CONTEXT (research setting, observed processes) of the research, so others can understand and make inferences about contextual similarities. a thick description enhances transferability of a QuaLitative study

Administration Variations

differences in the methods of data collection between participants , causing altered measurements - ie taking vitals on one patient before dinner, and one after dinner

Theoretical Sampling vs Purposeful Sampling

different! Purposeful sampling selects participants according to predetermined criteria; Theoretical sampling discovers categories, their properties, and their interrelationships: searching for which group/subgroup to turn to next. Groups are chosen as needed for their theoretical relevance as the study progresses.

Directional vs Nondirectional Hypothesis

directional: specifies the direction of the relationship. Nondirectional states only a correlation/relationship between variables

Emic vs Etic Perspectives

emic: perspective from a person within the culture (insider's view) etic: perspective of a behavior or belief by an observer/outsider/researcher of the culture - more culturally neutral and can be used to generalize and apply to other societies

Item Sampling

errors in measurement caused by the particular sample of items selected ie a student's score on a 100-question test will be influenced to a certain extent by WHICH 100 questions are included, out of the 500 possible questions in total.

Ethnographic Analysis

ethnographers enter the field and search for patterns of behaviour Includes Spradley's and Leininger's methods

An archery target with all the arrows clustered in the top left hand corner is

high reliability, but low validity

Matching

matching up individual participants in opposite groups (intervention & control) by extraneous variables (ie age, race, gender, etc) to make the different sample groups comparable creating similar pairs! cons: hard to do with more than 2 or 3 variables,

How to adjust for bias in disproportionate sampling?

mathematically adjust the *WEIGHTING* of the groups to best estimate the real world population values

Statistically Significant

means the obtained results are probably true, and unlikely to be due to chance fluctuations 'significant' is a mathematical term, it does NOT mean the results are meaningful/important

Visual Analogue Scale (VAS)

measures subjective experiences (ie pain scale), participants rate along a straight line (actual line, with no markers on it) on a bipolar continuum

Known Groups Technique

measuring two groups whose results are EXPECTED/highly likely to differ (ie fear of labour in primiparas vs multiparas). If they don't differ, then construct validity of the instrument should be questioned

Q Sort

method where participants sort a set of statements into groups/piles along a bipolar continuum according to specified criteria/instructions. (usually 60-100 cards sorted into 9-11 piles). I.e, the cards all have personality traits written on them, and participants sort them into piles along the continuum from "exactly like me" to "not at all like me."

In Nominal data, which central tendency is mostly used/reported?

mode

evidence-based practice

nursing care provided that is supported by sound scientific rationale (but does not consider the other patient and nurse elements, unlike Evidence INFORMED practice)

joan tranmer's guest lecture

on a study of how night shift work of nurses affects their cardiovascular health

Self Reports

oral interviews or written questionnaire. Important but susceptible to errors of human self reporting.

ladder of abstraction

organization of concepts in sequence from the most concrete and individual to the most general places the most concrete words on the lower rungs, and arranges words on the upper rungs as they increase in abstraction

*Constant Comparative Method*

part of glazer & strauss's grounded theory method - comparing elements from one data source to another (ie from one participant's interview to anothers), repeatedly, until all elements have been compared to every source & patterns arise - involves *"Fit"*, comparing characteristics of data pieces to see if they "fit" in certain categories

Self determination

participants right to control their own actions: ie refuse participation, or not answer some questions self determination must be respected under "Respect for human dignity"

Which type of study is more flexible and less linear?

quaLitative

Convergent Design

quan & quant --> merge results --> interpret Pro: can collect data and analyze at the same time Con: may be difficult to merge with different sample sizes, eg: if you were assessing the success of a new policy on a floor quantitatively, and also assessing staff perceptions/attitudes of the policy qualitatively

Effect Size

quantifies the magnitude of the relationship between the independent and dependent variables - the degree of association between the variables, the impact of an intervention on an outcome. - ie "how much did the outcome change from the control group to the experimented-on group?" - not just the existence of a relationship, but the strength/magnitude of the relationship correlation. effect sizes of studies are measured when performing a *meta analysis* (a type of integrative review)

Systematic Sampling

randomly selecting every nth (ie 10th) person on a list. Basically the same as simple random sampling except you actually specify the numeric sampling interval.

How to eliminate bias in quaNtitative research?

randomness!

Saturation

redundancy of information - data no longer shows any new themes or categories, no new information. This means you have collected all the data you can. helps contribute to trustworthiness in qualitative studies

Bracketing

researcher IDs personal biases about a phenomenon, and sets them aside when working with the participants a technique one can use to enhance reflexivity: "bracketing" our thoughts in so we can look at them objectively from the outside and not let them interfere during our work

Data Saturation

sampling until no new information is provided (redundancy)

Criterion Sampling

selecting only cases that meet a predetermined criteria

Psychosocial Scales

self report tools to quantitatively measure human characteristics (attitudes, needs, perceptions, mood) etc on a numerical scale

Most common data collection method for nurses

self reports

Quota Sampling

separating the target population into homogenous groups/strata, and then nonrandomly picking a certain number of participants from each stratum. This ensures a diverse sample and doesn't over- or underrepresent certain populations

Grounded Theory

studies the socio-psychological processes in SOCIAL interactions/settings/structures - researcher "ground" their theory in data and context, not own values. - constructs a theory to explain what is going on among this group of people in this time and place -interviews 20-30 people - seeks to discover a *core variable* that explains what's going on in the social interaction - believes that meaning cannot be separated from context (social, historical, cultural context) - people use interpretive processes to handle and change meanings in dealing with their situations

Belmont Report (1979)

summarizes the basic ethical principles and guidelines for the protection of human subjects of research. 1. *Respect for persons* (voluntary consent, self-determination/choices, autonomy) 2. *Beneficence* ("do no harm" maximize beneficial outcomes, minimize costs) 3. *Justice* (equal recruitment & treatment of participants, fair distribution of potential costs & benefit) KEY QUIZBIT

Quasi-Statistics

tabulation of frequency, except it illustrates THEMES or patterns in the data

Transitory Personal Factors

temporary individual factors or states that influence study measurements - ie participant is feeling tired that day, hasn't eaten breakfast, just went through a breakup

Declaration of Helsinki (1964)

the World Medical Association established recommendations guiding medical doctors in biomedical research involving human participants, AND protects their data, blood samples, human tissue.

statistical power

the ability of the study to detect a true relationship; the likelihood of finding a statistically significant difference when a true difference exists

Cultural Competence

the ability to work and communicate effectively with people from diverse backgrounds

Composite reliability

the aggregate reliability of two or more items or judges' ratings (see text); basically just an alternative to Cronbach's alpha
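Since this card frames composite reliability as an alternative to Cronbach's alpha, here is a small sketch of the alpha calculation itself on made-up item scores, using the standard formula alpha = k/(k-1) × (1 − Σ item variances / variance of the total score):

```python
import numpy as np

# Hypothetical scores: 5 respondents x 4 scale items
items = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)       # variance of each item
total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score

alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(round(alpha, 3))   # closer to 1 when the items "hang together" well
```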

What is an 'element'?

the basic unit of a population from which a sample is drawn, ie one human, one household, one hospital

What is Determinism

the belief that events are not haphazard/random, but rather the result of direct causes; part of positivism

reactivity

the change in behaviour due to a participant knowing they're being observed (they're "REACTING" differently - the Hawthorne Effect); this threatens INTERNAL validity

Sampling Error

the fluctuation of the value of a statistic taken from a research sample compared to the actual population value (ie the sample's average HR was 90, but the target population's average HR (which we can't really know) would've been 102).
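A tiny simulation of that fluctuation (all numbers made up): repeated samples drawn from the same simulated population each give a slightly different mean, and none matches the population mean exactly:

```python
import random

random.seed(1)

# Hypothetical population of 10,000 heart rates with a true mean near 102
population = [random.gauss(102, 12) for _ in range(10_000)]
population_mean = sum(population) / len(population)

for i in range(3):
    sample = random.sample(population, 30)        # draw a sample of n = 30
    sample_mean = sum(sample) / len(sample)
    print(f"sample {i + 1}: mean = {sample_mean:.1f} "
          f"(population mean = {population_mean:.1f})")
```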

Research Problem

the issue/phenomenon that a researcher wants to address/fix, through systematic inquiry, usually surrounding clinical outcomes, social issues, theories, and relevant literature. Usually starts broad and narrows to a question that aligns with a paradigm of choice.

Randomization

the most effective controller of extraneous variables bc it controls for influencing factors without the researchers having to identify and measure them. Requires: allocation concealment, strict adherence, and collection of baseline data before allocation (so participants don't even know what group they're in yet)
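A bare-bones sketch of random allocation to two groups (the participant IDs are hypothetical; a real trial would also need allocation concealment, eg a third party or sealed opaque envelopes, which a script alone does not provide):

```python
import random

participants = [f"P{i:02d}" for i in range(1, 21)]   # hypothetical participant IDs
random.shuffle(participants)                          # put them in random order

half = len(participants) // 2
intervention_group = participants[:half]
control_group = participants[half:]

print("Intervention:", intervention_group)
print("Control:     ", control_group)
```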

Core Category

the pattern of behaviour/the problem that is relevant to the study (ie smoking). In selective coding, you only code data that is related to this core category

Research Utilization:

the process by which knowledge generated from research becomes incorporated into clinical practice RU is a spectrum from instrumental to conceptual utilization

Research Question

the question that will be answered by the study, to address the research problem.

F ratio

the ratio of the between-groups variance to the within-groups variance in an ANOVA test. If the F-ratio exceeds the critical value (ie the p-value is less than the alpha level), we reject the null hypothesis (and the research prediction is supported)
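A quick sketch using scipy's one-way ANOVA on made-up outcome scores for three groups; f_oneway returns the F-ratio and its p-value, and the null hypothesis is rejected when p < alpha:

```python
from scipy import stats

# Hypothetical outcome scores for three groups
group_a = [12, 14, 11, 15, 13]
group_b = [18, 17, 19, 16, 20]
group_c = [13, 12, 14, 15, 13]

f_ratio, p_value = stats.f_oneway(group_a, group_b, group_c)
alpha = 0.05
print(f"F = {f_ratio:.2f}, p = {p_value:.4f}")
print("reject null" if p_value < alpha else "fail to reject null")
```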

Full Disclosure

the researcher has fully described the study, the person's right to refuse participation, and possible risks and benefits to the prospective participant.

Time Sampling

the researcher selects the time periods/frames during which observations will occur (ie for 30 seconds at 2-minute intervals, in the morning, etc). Time sampling provides a better representation of the observed behaviours. A sampling plan used in structured observational studies

Research Design

the researcher's overall plan to answer the research question - outlines the intervention, nature of any comparisons, methods to control extraneous variables, timing and location of data collection

Theoretical Sampling (key quizbit)

the selection of participants based on emerging findings as the study progresses, to ensure representation of important themes (an emergent process). Involves selecting key informants with expertise/experience, guiding the researchers to select good subsequent samples and maximize information collection. Because selection is not random, theoretical sampling is used only in qualitative research, like *grounded theory studies* (continuous research: going back and resampling the 'thin' areas of the theory)

Ethology

the study of *human BEHAVIOUR* in its natural context; observation of universal behavioural tendencies (a psychological qualitative tradition)

Sampling Bias

the systematic over or underrepresentation of some segment of the population in a sample during research

Response Set Biases

the tendency of some participants to respond to items in characteristic ways, independently of item content - enduring individual characteristics of participants (ie that's just their personality, has nothing to do with study environment/conditions)

How can we collect data accurately if full disclosure would compromise the study/cause bias?

use *Covert Data Collection*

How to strengthen a retrospective study?

use a *case control design* - comparing a "case" (ie a person with cancer) to a control (a person without cancer) who has very similar background characteristics (ie similar family hx, lifestyle, etc), which allows researchers to rule out some extraneous traits

Structured Self Reports

use a formal instrument (ie Questionnaire, Interview Schedule). May use either open-ended or closed-ended questions

bracketing is used in ______ptive phenomonology, but not _________ptive phenomenology

used in descriptive, but NOT interpretive

Power Analysis

used to estimate how big your sample size needs to be: - researchers estimate how big group differences will be on the outcomes - the estimate might be based on previous research, on personal experience, or on other factors - if expected differences are large, you don't need a large sample to ensure that the differences will be revealed in a statistical analysis - but if small differences are predicted, large samples are needed; more participants are required to achieve adequate power
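A sketch of what that estimate can look like for a two-group comparison, using statsmodels' TTestIndPower (the effect sizes, alpha of .05, and desired power of .80 are assumed values): the smaller the expected effect, the larger the required sample.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Medium expected effect (Cohen's d = 0.5)
n_medium = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"d = 0.5: ~{n_medium:.0f} participants per group")

# Small expected effect (Cohen's d = 0.2) -> much larger sample needed
n_small = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.80)
print(f"d = 0.2: ~{n_small:.0f} participants per group")
```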

critique assignment

useful resources: AGREE II Instrument, CASP.uk.net, Equator Network, and Chapters 18 and 19 in the textbook. Important to mention the independent and dependent variables in the TITLE

