Evaluation


Program Theory by Bickman (1987)

"A plausible and sensible model of how a program is supposed to work"

Pragmatism in Action: CDC Framework: "Gather credible evidence"

"Compiling information that stakeholders perceive as trustworthy and relevant for answering their questions. Such evidence can be experimental or observational, qualitative or quantitative, or it can include a mixture of methods... Whether a body of evidence is credible to stakeholders might depend on such factors as how the questions were posed, sources of information, conditions of data collection, reliability of measurement, validity of interpretations, and quality control procedures."

Not AEA Statement

"The generalized opposition to use of experimental and quasi-experimental methods evinced in the AEA statement is unjustified, speciously argued, and represents neither the methodological norms in the evaluation field nor the views of the large segment of the AEA membership..."

Social science theory

"attempts to provide generalizable and verifiable knowledge about principles that shape social behavior"

methodological argument

"method is the service of substance-- never the master" 1. identify inquiry purposes 2. identify your questions 3. select a method that best suits your purposes and questions

Alkin (2004) on Evaluation Theory

"offers a set of rules, prescriptions, prohibitions, and guiding frameworks that specify what a good or proper evaluation is and how evaluation should be done"

AEA Statement

"we believe the proposed priority manifests fundamental misunderstandings about (1) the types of studies capable of determining causality, (2) the methods capable of achieving scientific rigor, and (3) the types of studies that support policy and program decisions. We would like to help avoid the political, ethical, and financial disaster that could well attend implementation of the proposed priority."

4 Steps to the Logic of Evaluation

(1) Select the criteria of merit (2) Set standards of performance based on those criteria (3) gather data pertaining to the evaluand's performance (4) Integrate results into a final value judgement
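
As a hedged illustration only, the four steps can be sketched as a tiny program; the criteria, standards, and performance values below are hypothetical and not drawn from any real evaluation.

```python
# A minimal sketch of the four-step logic of evaluation.
# All criteria, standards, and scores are hypothetical illustrations.

# Step 1: select the criteria of merit (for an imaginary tutoring program)
criteria = ["attendance_rate", "test_score_gain", "participant_satisfaction"]

# Step 2: set standards of performance based on those criteria
standards = {"attendance_rate": 0.80, "test_score_gain": 5.0, "participant_satisfaction": 4.0}

# Step 3: gather data pertaining to the evaluand's performance
performance = {"attendance_rate": 0.86, "test_score_gain": 3.2, "participant_satisfaction": 4.4}

# Step 4: integrate results into a final value judgement
met = {c: performance[c] >= standards[c] for c in criteria}
if all(met.values()):
    judgement = "meets the standards on every criterion"
else:
    judgement = "falls short on: " + ", ".join(c for c, ok in met.items() if not ok)

print(judgement)  # falls short on: test_score_gain
```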

How does evaluation focus the study?

It identifies the background of the evaluand and the purpose of the evaluation. Then it identifies evaluation stakeholders and audiences. Lastly, it develops key questions that will guide the evaluation.

Challenge of the gold standard (7 things)

1. AEA vs. Not AEA statement 2. theoretical 3. practical 4. methodological 5. ethical 6. ideological 7. political

Evaluation Theory, Shadish AEA Presidential Address 1998 Main Points (8 things)

1. all evaluators should know evaluation theory because it is central to our professional identity 2. it is what we talk about more than anything else 3. it gives rise to our most trenchant debates 4. it gives us the language we use for talking to ourselves and others 5. it encompasses what evaluators care about most 6. perhaps most important, it is what makes us different from other professions 7. it is important to make evaluation theory the very heart of our identity 8. every profession needs a unique knowledge base. For evaluation, evaluation theory is that knowledge base

Logic models do what four things?

1. define shared language and vision 2. provide a framework for evaluation activities 3. communicate how your program is expected to work to others 4. help ensure there are logical connections between activities and intended outcomes

what are some political games by evaluand at the beginning of the evaluation? (4 things)

1. denying the need for evaluation 2. claiming the evaluation will take too much time away from their normal workload 3. claiming the evaluation is a good thing, but introducing delaying tactics 4. seeking to form close personal relationships with the evaluator to convince the evaluator to trust him or her

political games by evaluand during interpretation (3 things)

1. denying the problem exists 2. downplaying the importance of the problem or attributing it to others or to forces beyond their control 3. arguing that the information is now irrelevant because things have changed

Program theory-driven evaluation science (3 things)

1. develop program theory of change 2. formulate and prioritize evaluation questions 3. answer questions

CDC 6 steps

1. engage stakeholders 2. describe the program 3. focus the evaluation design 4. gather credible evidence 5. justify conclusions 6. ensure use and share lessons learned.

Arguments against RCTs (6 things)

1. ethical argument 2. feasibility argument 3. the 'other methods' argument 4. CDC argument 5. the 'privileging the elite' argument 6. the 'honoring complexity' argument

Roles for Theory in Evaluation according to Donaldson & Lipsey (2006) (3 things)

1. evaluation theory 2. program theory 3. social science theory

Four conclusions about program theory as an evaluation tool

1. forces evaluators to think before acting 2. helps tailor evaluations to answer important questions 3. improves evaluation design sensitivity and validity 4. helps evaluators meet professional evaluation standards

political games by evaluator during design (2 things)

1. insisting evaluations be quantitative ("stats don't lie") 2. using the "experts know best" line (evaluators do not trust those being evaluated and want to catch them out)

arguments for RCTs (4 things)

1. internal validity argument 2. affirmative action argument 3. democratic theory argument 4. ethical argument

political games by evaluators during interpretation (4 things)

1. not stating or shifting the measurement standards 2. applying unstated criteria to decision making 3. applying unstated values and ideological filters to the data interpretation 4. ignoring findings of evaluations

political games by evaluand during data collection (3 things)

1. omitting or distorting the information they are asked to provide so they do not look bad 2. providing the evaluator with huge amounts of information so it is difficult to sort out what is relevant and what is not (snow job) 3. coming up with new data at the end

Three theories of knowledge:

1. post-positivist 2. constructivist 3. pragmatist

Types of evaluation questions (five)

1. program process 2. program outcomes 3. attributing outcomes to the program 4. links between processes and outcomes 5. explanations

What is the experimental design of the "gold standard"? (3 things)

1. random assignment 2. experimental control 3. ruling out threats to validity

What are the three roots of the evaluation theory tree?

1. social accountability 2. social inquiry 3. epistemology

Four different ways of thinking about Evaluation Theory

1. stage model 2. evaluation theory tree 3. revised eval tree (with social justice included) 4. purpose focused

Purpose of CDC framework (five)

1. summarize and organize essential elements of program evaluation 2. provide a common frame of reference for conducting evaluations 3. clarify the steps in program evaluation 4. review standards for effective program evaluation 5. address misconceptions about the purposes and methods of program evaluation

Guiding Principles for Evaluator (five)

1. systematic inquiry 2. competence 3. integrity/honesty 4. respect for people 5. responsibility for general and public welfare

Three groups of stakeholders

1. those involved in program operations (sponsors, collaborators, managers, etc.) 2. those served or affected by the program (clients, family members, etc.) 3. primary users of the evaluation (the people who are in a position to do something regarding the program)

stage model of evaluation theory development (three stages)

1. truth 2. use 3. integration

CDC standards (four)

1. utility standards 2. feasibility standards 3. propriety standards 4. accuracy standards

Four components of a logic model: 1. input 2. activities 3. output 4. outcome

1. what you put in 2. what you do 3. what you get out of it 4. what changes as a result

gathering credible evidence CDC step IV

Compiling information that stakeholders perceive as trustworthy and relevant for answering their questions. This enhances the evaluation's utility and accuracy and gives priority to the most defensible information sources. This can be done by choosing indicators that meaningfully address the evaluation questions, and by establishing clear procedures and training staff to collect high-quality information.

How does research focus the study?

Develops a problem statement, reviews the literature on the topic, develops a theory-based hypothesis or research question, identifies terms and definitions, identifies variables to be studied

Key differences between research and evaluation in one flashcard

E particularizes, R generalizes; E is designed to improve something, R is designed to prove something; E provides the basis for decision making, R provides the basis for drawing conclusions; E asks "so what?", R asks "what's so?"; E asks how well it works, R asks how it works; E asks what is valuable, R asks what is.

Evaluation vs. Research (Audience)

E: Clients (internal and external) R: other researchers

Ensuring use and sharing lessons learned CDC step VI

Ensuring that stakeholders are aware of the evaluation procedures and findings, and ensuring that the findings are considered in decisions or actions that affect the program. Also, ensuring that those who participated in the evaluation process have had a beneficial experience. This ensures that the evaluation achieves its main purpose-- being useful. This can be done by designing the evaluation to achieve intended use by intended users, providing continuous feedback to stakeholders, and scheduling follow-up meetings with intended users to facilitate the transfer of evaluation conclusions into appropriate actions or decisions.

Donaldson & Christie definition of evaluation

Evaluation generates information for decision making, often answering the bottom-line question, "Does it work?" Follow-up questions to this basic question, frequently asked by those evaluating, are: why does it work, for whom does it work best, under what conditions does it work, and how do we make it better? Evaluations provide program stakeholders with defensible answers to these important questions.

systematic inquiry

Evaluators conduct systematic, data-based inquiries

What does it mean to engage stakeholders? CDC Step I

Fostering input, participation, and power-sharing among those persons who have an investment in the conduct of the evaluation and its findings. This helps increase the chances that the evaluation will be useful, and it can improve the evaluation's credibility and clarify roles and responsibilities. How to do this: consult insiders, incorporate less powerful groups, and coordinate stakeholder input throughout the process of design, operation, and use.

affirmative action argument

In the past there has been a paucity of available evidence about the effects of different educational strategies, policies, programs and supports. RCTs were de-prioritized and rarely funded in evaluation; greater focus and funding for RCTs are required to redress the balance.

What are the branches of the evaluation theory tree? (3 major ones and 1 additional one in the revision)

Major ones: 1. use 2. methods 3. valuing. Revised addition: 4. social justice

Justifying conclusions CDC step V

Making claims regarding the program that are warranted on the basis of data that have been compared against pertinent and defensible ideas of merit, value, or significance. This reinforces conclusions central to the evaluation's utility and accuracy. This can be done by using appropriate methods of analysis and synthesis to summarize findings, interpreting the significance of results for deciding what the findings mean, and generating alternative explanations for findings and indicating why these explanations should be discounted.

Focusing the evaluation design CDC Step III

Planning in advance where the evaluation is headed and what steps will be taken. This provides an investment in quality and increases the chances that the evaluation will succeed by identifying procedures that are practical, viable, and cost-effective. Ways to do this: meet with stakeholders to clarify the real intent or purpose of the evaluation; learn which persons are in a position to actually use the findings, then orient the plan to meet their needs.

ethical argument

Position that suggests it is not ethical to provide services if you do not know whether they make things worse

internal validity argument

RCTs are high in this. They are high in this because random assignment reduces the likelihood of systematic bias in the assignment of participants to treatment and control groups. Consequently, this rules out numerous rival hypotheses for any effects that are observed.
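
A minimal sketch of the idea, assuming a made-up pool of participants with a single hypothetical baseline attribute: randomly assigning people to treatment and control groups tends to balance both measured and unmeasured characteristics, which is why systematic assignment bias becomes unlikely.

```python
# Sketch of random assignment; participants and "prior_score" are hypothetical.
import random
import statistics

participants = [{"id": i, "prior_score": random.gauss(70, 10)} for i in range(200)]

random.shuffle(participants)          # random assignment to conditions
treatment = participants[:100]
control = participants[100:]

# Because assignment ignored every participant characteristic, the two groups'
# baseline means are expected to be close, so later outcome differences are
# harder to attribute to who happened to be placed in which group.
print(round(statistics.mean(p["prior_score"] for p in treatment), 1))
print(round(statistics.mean(p["prior_score"] for p in control), 1))
```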

Main purpose difference between evaluation and research

Research is more focused on advancing knowledge in the field, whereas evaluation is more focused on assessing the merit or worth of a program. Applied research can serve both purposes.

Evaluation vs. Research (Data Analysis)

Same-- inferential and descriptive statistics; content analysis/grounded theory

Evaluation vs. Research (data collection methods)

Same: tests, surveys/questionnaires, observation, interviews, archival analysis, unobtrusive measures

Evaluation vs. Research (Study Design)

Same: naturalistic/qualitative, experimental/quantitative. Difference: evaluation is bounded by the organization's timeframe requirements, whereas research is based on the researcher's timeline and available funding

Evaluation vs. Research (Reliability & Validity)

Same: pilot testing, member checks, controlling variables through designs, test/retest reliability measures. Difference: evaluation is rooted in values and politics, and generalizability is not a major goal or concern; research attempts to be objective and value-free and seeks to establish generalizable findings

who are the major theorists on the valuing branch of the evaluation theory tree?

Scriven, Guba & Lincoln, and Robert Stake

Describe the program CDC Step II

Scrutinizing the features of the program being evaluated. Description includes information regarding the way the program was intended to function and the way it actually was implemented. This improves the evaluation's fairness and accuracy and permits a balanced assessment of strengths and weaknesses. How to do this: list specific expectations as goals, objectives, and criteria for success; characterize the needs addressed by the program; assess the program's maturity or stage of development; analyze the context within which the program operates

the 'honoring complexity' argument

Simplicity and certainty are what governments seek. Complexity and uncertainty are what we [qualitative evaluators] habitually deliver...The government wants to know what works, and we have to tell them that nothing works everywhere, and that their best bet is to fully understand why this is so

Scriven Definition on Evaluation

The evaluation process normally involves some identification of relevant standards of merit, worth, or value; some investigation of the performance of evaluands (that which is under study) on these standards; and some integration or synthesis of the results to achieve an overall evaluation or set of associated evaluations. In short: identifying relevant standards of merit, investigating the performance of evaluands on these standards, and integrating the results to achieve an overall evaluation (conclusion)

democratic theory argument

There is a need for accurate information about program consequences if representative democracies are actually able to serve the will of the people. By virtue of their ability to eliminate alternative explanations, RCTs provide the most accurate means of establishing causal connections between public actions and social consequences.

Preskill & Torres's definition of evaluation

We envision evaluative inquiry as an ongoing process for investigating and understanding critical organizational issues. It is an approach to learning that is fully integrated with an organization's work practices. In short: a process for investigating and understanding critical organizational issues

What's a logic model?

a graphic representation of a program showing the intended relationships between investments, activities, and results. Logic models typically refer to a program's inputs, activities, outputs, and outcomes
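
As a rough sketch (not any standard tool), a logic model's four components can be captured in a simple data structure; the program name and example entries below are hypothetical.

```python
# Minimal sketch of a logic model as a data structure; values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)      # what you put in
    activities: list = field(default_factory=list)  # what you do
    outputs: list = field(default_factory=list)     # what you get out of it
    outcomes: list = field(default_factory=list)    # what changes as a result

tutoring_program = LogicModel(
    inputs=["funding", "volunteer tutors", "classroom space"],
    activities=["weekly tutoring sessions", "tutor training"],
    outputs=["number of sessions held", "students served"],
    outcomes=["improved test scores", "increased student confidence"],
)
print(tutoring_program.outcomes)
```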

ethics

a set of values and beliefs that guide choices

Weiss's Definition of evaluation

an evaluation is examining and weighing a phenomenon (a person, a thing, an idea) against some explicit or implicit yardstick... as a means of contributing to the improvement of the program or policy

political games of evaluators during data collection (1 thing)

collecting information off the record then allowing that information to enter into the interpretation phase

pragmatist

consider consequences of research; pick the best methods for the context; methodological pluralism

Fundamental difference between research and evaluation

control; who is asking the questions? Who has control over the data?

attributing outcome to the program

determining if the program caused any of the observed changes; making a causal claim. Very challenging-- attributing cause to the program is one of the hardest things to do. This is what stakeholders want most, however. Example: do participants in the program differ on outcomes as a result of participation in the program?

purpose of research

develop new knowledge, seek conclusions, seek new laws and theories, topic and questions are determined by researcher.

responsibility for general and public welfare

evaluators articulate and take into account the diversity of general and public interests and values

integrity/honesty

evaluators display honesty and integrity in their own behavior, and attempt to ensure the honesty and integrity of the entire evaluation process

competence

evaluators provide competent performance to stakeholders

respect for people

evaluators respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders

scientific credibility

governed by scientific principles

Explanations

how did the program achieve its results? Challenging question; focuses on factors that contributed to results. Qualitative methods are useful tools to inform answers to this type of evaluation question. For example: how/why did the online portion increase test scores?

systematic assessment

indicates nature of evaluation procedures; importance placed on formality and rigor; applies equally to qualitative and quantitative research

credible evidence

information that stakeholders perceive as trustworthy and relevant for answering their questions

What would be a logic model for a family vacation? Inputs? Activities? Outcomes?

Inputs: family members, budget, car, camping equipment. Activities: drive to state park, set up camp, cook, play, talk, laugh, hike. Outcomes: family members learn about each other; family bonds; family has a good time

propriety standards

intended to ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results (culture)

feasibility standards

intended to ensure that an evaluation will be realistic, prudent, diplomatic, and frugal

accuracy standards

intended to ensure that an evaluation will reveal and convey technically adequate information about the features that determine worth or merit of the program being evaluated

utility standards

intended to ensure that an evaluation will serve the information needs of intended users

program outcomes

is the program achieving the desired outcomes? This can answer questions about outcomes but does not provide evidence to attribute effects to the intervention. For example, we can tell whether math scores improved, but we do not know why they improved. Was it really the intervention?

formative evaluation

main purpose is improvement. Formative evaluation is related to learning about the program and involves the evaluation of program operations and processes, asking questions such as: Is this program being implemented as it was planned? How is it doing that? What are the processes involved? It concentrates on studying program processes. The focus is on fidelity of program implementation, or simply on learning what the program is doing in general (no template to compare it to)

Summative evaluation

main purpose is making decisions about a program, either cancelling or extending it. Uses judgement about how well a program is working in order to guide decision-making. It answers questions such as: Is the program doing what it is supposed to in terms of end results and objectives? Is the program generating the benefits expected from the investment in it? Focuses on program outcomes: Are the participants gaining the benefits they were intended to receive? Should we keep doing it?

feasibility argument against RCT

position that argues RCTs cannot be feasibly used in a large number of situations

CDC argument against RCTs

position that argues credible evidence is something that is relevant or credible to the stakeholders with whom you are interacting

ethical argument against RCT

position that argues services cannot morally be withheld from needy populations

the 'other methods' argument against RCT

position that argues there are other methods that can reliably establish causality

Purpose of evaluation

provide information for learning and decision making (intention is use), seek to describe particular phenomena, and undertaken at the behest of a client-- service oriented

the 'privileging the elite' argument

questions about the causal effects of social interventions are characteristically those of policy and decision makers, while other stakeholders have other legitimate and important questions. This is radically undemocratic

Do questions drive the methods or do the methods drive the questions?

Questions drive the methods!

credibility

the perceived accuracy or believability of an evaluator, evaluation process, or information produced in an evaluation

constructivist

there are multiple subjective realities; acknowledge and consider bias; cause and effect are impossible to distinguish

post-positivist

there is a single reality; reality can be studied objectively; causation is observable

program process

these questions look for information about what the program does and how it does it, including whether it does what it planned to do. Relatively easy to answer; they are based on monitoring data and outputs. For example, did participants attend their classes and meet with their mentors?

highlights from abstinence only education evaluation case study (mathematica 2007)

this evaluation followed many steps in the CDC framework: engaging stakeholders, creating advisory groups to help with design, and justifying conclusions. Findings were used to make decisions about future funding; policy was changed based on the results, leading to defunding of the abstinence-only program.

explicit standards

those that are explicitly stated. For example, they may be tied to funding, and they may be chosen to determine the final evaluation judgement. One example of an explicit standard is an education program requiring X number of students to meet a specific standard

implicit standards

those that are implicit, embedded in the way the evaluation is done. For example, implicitly no one believes the goal can be met, so the program creates its own rubric.

Patton's definition of evaluation

utilization-focused evaluation (as opposed to program evaluation in general) is evaluation done for and with specific, intended primary users for specific, intended uses.

links between processes and outcomes

which program components/services are related to better or poorer outcomes? For example: did the online portion increase test scores more than the in-person?

evaluation corruptibility

1. willingness to twist the truth and produce positive findings 2. intrusion of unsubstantiated opinions because of sloppy, capricious, or unprofessional practices 3. shaded evaluation findings as a result of prejudices or notions 4. inducements to clients or participants 5. failure to honor commitments

