Chapter 13 - Evaluation Research


Question 1 - Needs Assessment

-A needs assessment attempts to answer the question of whether a new program is needed.
-How do we identify need? Collect data to determine how many people in a community need services and what level of services or personnel already exists to meet that need.
-Often uses survey research methods (e.g., surveys, focus groups, interviews, census data).
-As in other forms of evaluation research, it is a good idea to use multiple indicators and perspectives.

Cost Benefits (CB) vs. Cost Effectiveness (CE) Analysis

-CB analysis is a type of evaluation research that compares a program's costs with its benefits (both in dollar value), yielding either a net gain or a net loss.
-CB: ex ante analysis vs. ex post analysis (ex post is more reliable because it is based on actual costs, not estimates)
-Direct costs vs. opportunity costs
-Monetizing benefits
-Timing of benefits (predicting the future)
-Costs/benefits to whom? Depends (client, agency, society, government)
-Cost-effectiveness (CE) analysis is an alternative that compares dollar costs with outcomes/benefits not expressed in dollars (e.g., quality of life)
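The net gain/loss arithmetic above can be sketched with hypothetical figures (every dollar amount below is invented for illustration, not drawn from the text):

```python
# Cost-benefit (CB) sketch with made-up numbers: compare total costs
# (direct + opportunity) against monetized benefits to get a net gain or loss.

direct_costs = 120_000        # e.g., staff salaries, rent, materials
opportunity_costs = 30_000    # e.g., volunteer time valued at market wages
monetized_benefits = 200_000  # e.g., projected earnings gains for clients

total_costs = direct_costs + opportunity_costs
net_benefit = monetized_benefits - total_costs          # positive -> net gain
benefit_cost_ratio = monetized_benefits / total_costs   # > 1 -> benefits exceed costs

print(f"Net benefit: ${net_benefit:,}")                 # Net benefit: $50,000
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # Benefit-cost ratio: 1.33
```

An ex ante analysis would fill these variables with estimates; an ex post analysis would fill them with actual figures, which is why it is more reliable.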

Cost Effectiveness (CE) Analysis

-Compares program costs (in dollars) with program effects measured in whatever unit is appropriate (e.g., self-esteem, achievement)
-Raises the question of how to put a value on an outcome
-Can be oversimplified
-Should not let dollars be the final word on a program's worth (the value of effectiveness is not always measurable in dollars)
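A CE comparison keeps outcomes in their natural units and compares cost per unit of outcome across programs. A minimal sketch, with hypothetical program names and numbers:

```python
# Cost-effectiveness (CE) sketch: outcomes stay in natural units (here, the
# number of clients showing a meaningful self-esteem gain), not dollars.

programs = {
    "Program A": {"cost": 90_000, "outcomes": 300},  # 300 clients improved
    "Program B": {"cost": 60_000, "outcomes": 150},  # 150 clients improved
}

cost_per_outcome = {
    name: p["cost"] / p["outcomes"] for name, p in programs.items()
}

# Program A costs $300 per improved client and Program B $400, so A is more
# cost-effective even though its total budget is larger.
for name, cpo in cost_per_outcome.items():
    print(f"{name}: ${cpo:.0f} per improved client")
```

Note the limitation flagged above: the ratio says which program buys more of the measured outcome per dollar, not whether that outcome captures the program's full worth.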

Question 4 - Efficiency Analysis

-Efficiency analysis compares program effects to costs. Are the benefits worth the costs?
-Program funders often require some sort of efficiency analysis. This can be either a cost-benefit analysis or a cost-effectiveness analysis (see later slides).

Evaluation Research

-Is a way of supplying valid and reliable evidence on the operation of programs/practices
-Supports evidence-based practice
-Unit of analysis is programs/organizations
-Politics: involves control of social resources and power, so it is value laden/biased; stakeholders include funders, sponsors, policymakers, clients, and workers

Implementation Monitoring/Process Evaluation

-Is evaluation research of ongoing programs
-Is the target population being served? ("take-up rate," e.g., food stamps)
-Are the planned services being delivered? (implementation)
-Is the quality of the services adequate? (client satisfaction)
Many programs are not effective because they were never implemented as planned!

Question 3 - Outcome Evaluation

-Outcome evaluation examines the extent to which a treatment or program has an effect. Did the program/intervention work?
-The evaluator compares what happened after a program with what would have happened had there been no program. An experimental design is the best way to maximize internal validity; quasi-experimental designs may also be used.

Question 2 - Process Evaluation

-Process evaluation involves evaluation research that investigates the process of service delivery.
-The evaluator examines how the program is operating, often to identify which program activities lead to desired program outcomes.
-If the findings of a process evaluation are used to help shape and refine the program (see later slides), it is also called formative evaluation.

Alternative Designs

-Quasi-experimental designs
-Use of matched groups (to equalize sample differences)
-Cohort groups (beware of group differences; threat of history)
-Regression discontinuity design (threats of selection and statistical regression)
-Statistical control (not a design, but a method)
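The regression discontinuity idea can be illustrated with a toy simulation (the cutoff, the `fit_line` helper, and all data below are fabricated for illustration): clients scoring below a needs-score cutoff receive the program, and the jump between the regression lines fitted on each side of the cutoff estimates the program effect.

```python
# Toy regression discontinuity (RD) sketch with fabricated, noise-free data.

def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

cutoff = 50
# (needs_score, outcome): the outcome trends with the score, and cases below
# the cutoff (treated) carry a built-in +5 program effect we hope to recover
cases = [(s, 0.5 * s + (5 if s < cutoff else 0)) for s in range(30, 71)]

below = [(s, y) for s, y in cases if s < cutoff]   # treated
above = [(s, y) for s, y in cases if s >= cutoff]  # untreated

b_slope, b_int = fit_line([s for s, _ in below], [y for _, y in below])
a_slope, a_int = fit_line([s for s, _ in above], [y for _, y in above])

# Discontinuity (jump) between the fitted lines at the cutoff = estimated effect
effect = (b_slope * cutoff + b_int) - (a_slope * cutoff + a_int)
print(f"Estimated effect at cutoff: {effect:.1f}")
```

Because assignment is determined entirely by the needs score, comparing the fitted lines at the cutoff sidesteps the selection threat; a naive comparison of treated vs. untreated group means would instead be confounded by the score trend.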

The Logic Model

A schematic representation of the various components that make up a social service program. Basic logic model components include:
-Social problem
-Target population
-Assumptions (theoretically based)
-Inputs
-Activities
-Outputs
-Outcomes

Why evaluate? 3 Major reasons

Administrative purposes
-assess daily operations
-assess the most efficient means of running a program or agency
Impact assessment
-what, if any, effect the program is having
-impact: how well the program is meeting its goals
Testing hypotheses/evaluating a practice approach

Evaluation Research - Four Questions

Answers 4 questions:
1. Is the program needed? (needs assessment)
2. How does the program operate? (process evaluation, also called implementation or formative evaluation)
3. What is the program's impact? (outcome evaluation)
4. How efficient is the program? (efficiency analysis)

Specific Decisions for Evaluation Projects

Black box or program theory: Do we care how the program gets results? (Theory-driven evaluation can be either descriptive or prescriptive.)
Researcher or stakeholder orientation: Whose goals matter most? In evaluations, programs often set the research questions (different approaches: stakeholder, social science, and integrative).
Quantitative or qualitative methods: Which methods provide the best answers? Depends on the goals/purpose.
Simple or complex outcomes: How complicated should the findings be? Most evaluation research measures multiple outcomes to understand program impact, but complex outcomes pose challenges.

Designs for Evaluation Research

Evaluation research is often concerned with cause-and-effect relationships.
Randomized experimental designs are best for establishing causal relationships and controlling validity threats (see textbook for review), but they are not always feasible to use:
-ethical issues with denying treatment or asking clients to wait for treatment
-opposition from clinicians
-cost

Evaluation Basics (cont'd)

Feedback: information about outputs or outcomes that affects inputs to the program
Target population: the population the program is designed to serve
Stakeholders: individuals or groups who have some interest in the program (e.g., clients, staff, funders, the public)

Formative Evaluation Research

Focuses on providing information to guide the planning, implementation, and operation of specific programs (= "foundation")
Goal: to ensure smooth operation of the program
-Gather data on target population needs
-Examine existing services
-Plan specific intervention strategies
-Specify staff skills needed to deliver the program
-Determine whether the program is feasible as planned

Implications for Evidence Based Practice

In-depth, ongoing program evaluation facilitates both the generation of evidence about a program's impacts and an understanding of what contributes to a program's success. Outcome (summative) evaluations provide some level of evidence about the effectiveness of the program for the target population; process (formative) evaluations can identify mechanisms of a program's implementation that contribute to its success (or lack of success).

Evaluation Basics

Inputs: resources, raw materials, clients, and staff that go into a program; usually the independent variables (e.g., type of counseling)
Program process: the complete treatment or service delivered by the program
Outputs: direct products of the program's service delivery. The evaluator identifies both intermediate outputs and final outputs (referred to as service completions).
-Intermediate outputs: # of assessments completed per month, # of hours of individual counseling provided per month, etc.
-Service completions/final outputs: child attends 80% of individual and group counseling sessions; family completes a psychoeducational program

Purpose of Evaluation Research

Is the use of scientific research methods to:
1. plan intervention programs
2. monitor the implementation of new programs
3. monitor the operation of existing programs
4. determine how effectively programs or clinical practices achieve their goals
In short, evaluation research is social work research conducted for a distinctive purpose: to investigate social programs.

Evaluation Research - Differences

It is applied research, NOT a separate set of techniques; there are some (5) differences and similarities.
Differences:
1. Findings have immediate applicability
2. Goals are shaped by funders/stakeholders
3. Judgmental quality
4. Priority of program vs. research
5. Dissemination of results

Why do programs fail to be successful?

The logic model allows us to see where the breakdown occurs:
-Input failure = insufficient inputs
-Program process failure = program activities fail to set the causal process in motion; incomplete, insufficient, or poorly designed
-Theory failure = cause and effect are not theoretically linked

Evaluation Basics (cont'd)

Outcomes (intermediate and final): the impact of the program process on the clients served. Goals are typically the dependent variables or desired outcomes.
-May not lend themselves to evaluation if vague
-May need to use proximate/intermediate goals, which can be realized in the short term and are related to the achievement of long-term/final goals (e.g., Head Start)
-The evaluator defines the kinds of changes expected to occur as a result of the program, including both intermediate and final outcomes
-Intermediate outcomes: reduction in psychiatric symptoms, increased anger management skills, improved coping skills, increased parental knowledge, improved job skills
-Service completions/final outcomes: child can function independently in school, home, and community; employment

What are some barriers to use of evaluation research?

-Poor design may yield results that are not clear-cut
-Poor communication
-Evaluators may fail to advocate for adoption of findings
-Resistance to change (need to disseminate findings and involve users of programs)
-Ethical concerns

Evaluation Research in a Diverse Society

Researchers and agency administrators often define evaluation questions from their own perspective, which does not necessarily reflect the perspective of the participants.
Grouping participants into categories (race, gender, class, etc.) raises its own issues.
There are also ethical concerns in evaluating costs and benefits.

Evaluation Research - Similarities

Similarities:
1. Look for cause-and-effect relationships
2. Need to take ethics into account
3. Use the same research methods to answer evaluation questions:
-designs (e.g., experimental, observational)
-data collection methods (e.g., surveys, interviews)
-data analyses

