Chapter 14: Evaluation in Healthcare Education
evidence based practice
"the conscientious use of current best evidence in making decisions about patient care" "a lifelong problem-solving approach to clinical practice that integrates . . . the most relevant and best research . . . one's own clinical expertise . . . and patient preferences and values"
practice-based evidence
"the systematic collection of data about cli- ent progress generated during treatment to enhance the quality and outcomes of care"
focus of evaluation
(1) audience, (2) purpose, (3) questions, (4) scope, and (5) resources
purpose for conducting data analysis
(1) to organize data so that they can provide meaningful information and (2) to provide answers to evaluation questions
process (formative) evaluation
Am I giving the client time to ask questions? • Is the information I am giving orally consistent with the written material? • Does the client look bored? Is the room too warm? • Should I include more opportunities for return demonstrations?
reporting evaluation results
Be audience focused. • Stick to the evaluation purpose. • Use the data as intended.
Reasons for not reporting evaluation results
Ignorance of who should receive the results • Belief that the results are not important or will not be used • Lack of ability to translate findings into language useful in producing a final report • Fear that results will be misused
impact
N/A • Overall self-care and health maintenance
data collecting methods
Observation • Interview • Questionnaire or written examination • Record review • Secondary analysis of existing databases
process
Patient assimilation of information during teaching • Patient education interventions
content
Patient information retention after teaching • Patient/family performance following learning
outcome
Patient use of information in day-to-day life • Patient/family performance at home
Outcome (summative) evaluation
Purpose is to determine the effects of teaching efforts.
total program evaluation
Purpose is to determine the extent to which all activities for an entire department or program meet or exceed the goals originally established.
impact evaluation
Purpose is to determine the relative effects of education on the institution or the community.
content evaluation
Purpose is to determine whether or not learners have acquired the knowledge or skills taught.
process (formative) evaluation
Purpose is to make adjustments in educational activities.
impact evaluation
Scope of ___ is broader, more complex, and usually more long term than that of process, content, or outcome evaluation. • Use with programs teaching content critical to the client's well-being.
content evaluation
To what degree did the learners learn what was presented? • To what degree did learners achieve specific objectives?
evaluation instruments
Use existing ___ if possible. • Need strong reliability and validity. • Checklists for teaching and evaluation are helpful.
Outcome (summative) evaluation
___ criteria measure more long-term change. • Data are collected six months after baseline data are obtained.
reflective practice
a process that helps the individual "to understand the meaning of a problematic situation, which is the relationship between causes, actions, and consequences"
evaluation
a systematic investigation of the worth or value of something
internal evidence
data generated from a diligently conducted quality improvement project or EBP implementation project within a specific practice setting or with a specific population
assessment
focuses on initially gathering, summarizing, interpreting, and using data to decide a direction for action.
time series design
include only one group of learners from whom evaluative data are collected at several points in time, both before and after receiving an educational intervention.
mixed methods or pluralistic designs
include participants from diverse settings or perspectives, or require both program processes and outcomes to be included in the evaluation; because these designs are comprehensive, resource intensive, and long term in nature, they are most appropriate for program evaluation.
level 0
learner's dissatisfaction & readiness to learn (needs assessment)
level IV
learner's maintained performance & attitude (ongoing; impact)
level I
learner's participation & satisfaction during intervention (initial; process)
level III
learner's performance & attitude in daily setting (long-term; outcome)
level II
learner's performance & satisfaction after intervention (initial; process)
reflection on action
occurs when the health professional introspectively analyzes a practice activity after its completion to gain insights for the future
reflection in action
occurs when the health professional introspectively considers a practice activity while performing it so that change for improvement can be made at that moment.
conducting evaluation
Pilot test first. • Include extra time. • Keep a sense of humor.
external evidence
evidence generated through rigorous research, reflecting that it is intended to be generalizable or transferable beyond the specific study setting or sample.
program evaluation
the scope of ___ encompasses all aspects of an educational activity: process, content, outcome, and impact.
designing evaluation
• Design structure • Evaluation versus research • Evaluation methods • Types of data to collect • From whom or what to collect data • How, when, and where to collect data • Who is to collect the data • Evaluation instruments
barriers to evaluation
• Lack of clarity • Lack of ability • Fear of punishment or loss of self-esteem