Health Promotion Program Evaluation
1. Defining Program Evaluation
- (book definition) Systematic collection of information about a health promotion program for the purpose of answering questions and making decisions about the program - to know whether the program worked and whether its objectives were met
Validity
- Ability of evaluation instruments to accurately measure what the evaluator wants to measure.
Reliability
- Ability of evaluation instruments to provide consistent results each time they are used.
Evaluation Frameworks
- CDC Evaluation Framework - RE-AIM - Institute of Medicine Evaluation Framework - these frameworks outline the steps to follow when evaluating
3. Evaluation Framework
- Consistent approach, structure, and format that helps program participants, staff, and other stakeholders understand the thinking that went into the evaluation, the types of questions asked, how the information was collected, and the type of report that might be expected.
Time Frame for Evaluation
- Continuous program improvement - Evaluation should continue as long as the stakeholders seek to improve the program - Allow time to conduct analyses and complete the report within the funding deadline
Cultural relevance
- Evaluation methods and materials have been developed to be culturally acceptable and applicable to the priority population.
Finding and Working with an Evaluator
- Helps the program staff determine what the evaluation methods should be and how the resulting data will be analyzed - Evaluators can be found at universities and colleges and through the American Evaluation Association
What would you want to know if you were evaluating your program?
- How many people took advantage of the program and benefited from it - could be assessed by conducting a post-program survey
How do I get started with program evaluation?
- Ideally, program evaluation starts when the program is being designed --- Critical task: To design the program so that all program components are in tight alignment and congruent with one another --- Including a skilled program evaluator on the team during program design can help to ensure proper alignment of program components
Questions Guiding Program Evaluation
- If this program is effective, what specific measurable changes in the priority population will we see? - What kinds of data are needed to determine whether the SMART objectives were met? - How and where can this information be obtained? - What resources are available for collecting this data, analyzing it, and reporting it? - When and how often will data be collected to make periodic revisions? - What decisions will be made based on the evaluation findings? - What type of report would be most useful for program planners, funders, stakeholders, and the priority population?
4. Gather Credible Evidence
- Involves the process of collecting, managing, organizing, analyzing, synthesizing, and summarizing the data in order to answer the evaluation questions - we should not be starting from scratch here - data collection was discussed earlier this semester (in what context?) - determine what information is needed to evaluate the program
Data Collection Methods: Document Review
- Medical records - Policy review
Finding Good Measures
- Peer-reviewed articles - Healthy People 2020 (what was used as their health indicators?) - Obesity: NCCOR - CDC Program Performance and Evaluation Office (PPEO)
Program Evaluation Designs
- Post-test for program group - Pretest and post-test for program group - Time series - Pretest and post-test with a control or comparison group - Time series with a control or comparison group
- Listed from weakest to strongest in terms of evidence that the program was effective
- Post-test only: data collected just once - weak because there is no baseline to compare against, so change cannot be measured
- Pretest and post-test: two time points, measured before and after the program - the pretest could draw on the needs assessment
- Time series: adds another time point - either a follow-up to see whether the positive behavior stuck (long-term outcome) or a point between pre and post (process)
- The bottom two designs add a control or comparison group - challenges include participants losing interest (attrition) and the added time
- Key considerations: how many data collection points, and who the data are being collected on
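The strongest designs above compare the change in the program group against the change in a comparison group. That logic can be sketched as a simple difference-in-differences calculation; the mean scores below are hypothetical, made up only to illustrate the arithmetic:

```python
def difference_in_differences(program_pre, program_post, comparison_pre, comparison_post):
    """Estimate the program effect as the change in the program group
    minus the change in the comparison group."""
    program_change = program_post - program_pre
    comparison_change = comparison_post - comparison_pre
    return program_change - comparison_change

# Hypothetical mean health-knowledge scores (0-100 scale):
effect = difference_in_differences(program_pre=60, program_post=75,
                                   comparison_pre=61, comparison_post=64)
# program group improved by 15 points, comparison group by 3,
# giving an estimated program effect of 12 points
```

The comparison group is what separates change caused by the program from change that would have happened anyway.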
Ethical Considerations
- Program participants have rights that need to be protected - Projects must be reviewed by an institutional review board (IRB) - Informed consent must be obtained - Providing confidentiality or anonymity is necessary and collected data should be safely stored
Data Collection Methods: Interviews and Focus Groups
- Qualitative data
RE-AIM
- Reach, Effectiveness, Adoption, Implementation, Maintenance - Recognizes the importance of internal and external validity - Useful for estimating public health impact, comparing different health policies, and identifying areas for integration of policies.
Evaluation Costs
- Related to the complexity of the program being evaluated, the expertise and credentials of the program evaluator, and the program's internal resources and expertise - Typically range from 5% to 20% of the program budget depending on the need for a strong evaluation design
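The 5%-20% rule of thumb above translates directly into a budget range. A minimal sketch, using a hypothetical $100,000 program budget:

```python
def evaluation_cost_range(program_budget, low=0.05, high=0.20):
    """Return the typical (low, high) evaluation cost range,
    assuming evaluation runs 5% to 20% of the program budget."""
    return (program_budget * low, program_budget * high)

# Hypothetical program budget of $100,000:
low, high = evaluation_cost_range(100_000)
# evaluation would typically cost between $5,000 and $20,000,
# with the high end reflecting a stronger evaluation design
```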
Data Collection Considerations
- Reliability and Validity - Cost - Resources and Finding Good Measures
Types of Data Collection
- Survey questionnaire - Direct Measurement of Health Behaviors or Health Outcomes - Observational measurement - Document Review - Interviews - Focus Groups
Why evaluate a health promotion program?
- To know what changes need to be made to the program
Centers for Disease Control and Prevention Evaluation Framework
6 steps: 1. Engage stakeholders 2. Describe the program 3. Focus the evaluation design 4. Gather credible evidence 5. Justify conclusions 6. Use and share lessons learned
Why evaluate a health promotion program?
Adds great value and benefits to programs - Helps to improve the program during implementation and for the next time the program is offered - Helps to ensure the program is actually helping the priority population - Demonstrates the true return on investment to funders and stakeholders - Helps to improve the odds of sustainability and continued funding
Data Collection Methods: Observational Measurement
Observe certain health behaviors - Examine changes in environments - Do stores still sell cigarettes? - Do locations display media strategies? - Has quality improved in the park?
Data Collection Methods: Direct Measurement
Direct measurement of health behaviors or health outcomes - Results of screening tests - Physical activity - Height and weight - Blood pressure
2. Types of Evaluation
Formative - Guides program development and improvement, especially during implementation, to avoid pitfalls - focuses on creating and informing the program (outputs, needs assessment, process evaluation) Summative - Measures the short- and long-term changes that occurred as a result of the program - covers changes in knowledge and in health outcomes (impact evaluation for immediate effects, outcome evaluation for longer-term change)
Summative Evaluation
Impact Evaluation - Measures the immediate effect of a health promotion program and the extent to which the program's objectives were met Outcome Evaluation - Measures the longer-term changes in people during or after their participation in the health promotion program
Formative Evaluation
Needs Assessment - Identifies the health needs of a priority population Process Evaluation - Assesses the implementation process of a program - Measures things such as program reach, retention, fidelity of implementation, and perceptions of program quality - process evaluation tracks how the program is going in real time - who is attending? Who is coming back?
Data Collection Methods: Surveys
Survey questionnaires - Administer via: face to face, mail, phone, or electronically - Individual-level measures or indicators - Knowledge, attitudes, perceptions, self-reported health behaviors - Examples: knowledge of the Dietary Guidelines for Americans; self-report of screening
Objective Example: By the end of the program, there will be a 15% reduction in the number of college students who use e-cigarettes.
What evaluation design might be effective? - Pretest and post-test What data collection measure might be effective? - Survey (a concern could be recall bias or untruthful self-report)
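Checking the objective above comes down to a percent-reduction calculation between the pretest and post-test survey counts. A minimal sketch, with hypothetical counts of e-cigarette users among the surveyed students:

```python
def percent_reduction(baseline_count, followup_count):
    """Percent reduction from the pretest count to the post-test count."""
    return (baseline_count - followup_count) / baseline_count * 100

# Hypothetical survey counts of e-cigarette users:
reduction = percent_reduction(baseline_count=200, followup_count=170)
# 200 users at pretest, 170 at post-test -> a 15% reduction
objective_met = reduction >= 15  # SMART objective: at least a 15% reduction
```

Because the data are self-reported, recall bias or untruthful answers could inflate the apparent reduction, which is why the survey's validity matters here.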