
10-Step Evaluation for Training and Performance Improvement
October 2018 | 344 pages | SAGE Publications, Inc
Written with a learning-by-doing approach in mind, 10-Step Evaluation for Training and Performance Improvement gives students actionable instruction for identifying, planning, and implementing a client-based program evaluation. The book introduces readers to multiple evaluation frameworks and uses problem-based learning to guide them through a 10-step evaluation process. As students read the chapters, they produce specific deliverables that culminate in a completed evaluation project.
 
Introduction
Performance improvement and evaluation  
What is evaluation?  
What is not evaluation?  
How does evaluation compare with research?  
Program evaluation in the HPI context  
Different evaluation designs used in program evaluation  
Descriptive case study type evaluation design  
Frameworks for conducting evaluations in the HPI context  
The 10-step evaluation procedure  
 
Chapter One: Identify an Evaluand (Step 1) and Its Stakeholders (Step 2)
Identify a performance improvement intervention as an evaluand  
Use the 5W1H method to understand the intervention program  
Ask why the intervention program was implemented  
Check if program goals are based on needs  
Sell evaluation to the client  
Identify three groups of stakeholders  
 
Chapter Two: Identify the Purpose of Evaluation (Step 3)
Differentiate evaluation from needs assessment  
Gather information about the evaluation purpose  
Assess stakeholders’ needs for the program and the evaluation  
Determine if the evaluation is a formative or summative type  
Determine if the evaluation is goal-based or goal-free  
Determine if the evaluation is merit-focused or worth-focused  
Keep in mind using a system-focused evaluation approach  
Write an evaluation purpose statement  
 
Chapter Three: Assess Evaluation Feasibility and Risk Factors
Incorporate macro-level tasks into micro-level steps  
Assess feasibility of the evaluation project  
List project assumptions  
Estimate tasks and time involving stakeholders  
Assess risk factors for the evaluation project  
 
Chapter Four: Write a Statement of Work
Prepare a statement of work for the evaluation  
Determine sections to be included in a statement of work  
Develop a Gantt chart  
Review a sample statement of work  
 
Chapter Five: Develop a Program Logic Model (Step 4)
Apply a theory-based, if-then logic to developing a program  
Review United Way’s program outcome model  
Review Kellogg Foundation’s program logic model  
Review Brinkerhoff’s training impact model compared to the four-level training evaluation framework  
Compare elements used in different frameworks  
Develop a program logic model  
Develop a training impact model  
 
Chapter Six: Determine Dimensions and Importance Weighting (Step 5)
Think about dimensions of the evaluand to investigate  
Start with the stakeholders’ needs  
Relate the purpose of evaluation to the program logic model elements  
Incorporate relevant theoretical frameworks and professional standards  
Write dimensional evaluation questions  
Determine importance weighting based on usage of dimensional findings  
Recognize a black box, gray box, or clear box evaluation  
Finalize the number of dimensions  
 
Chapter Seven: Determine Data Collection Methods (Step 6)
Determine evaluation designs for dimensional evaluations  
Select data collection methods that allow direct measures of each dimension  
Apply critical multiplism  
Triangulate multiple sets of data  
Select appropriate methods when using the four-level training evaluation model  
Select appropriate methods when using Brinkerhoff’s Success Case Method  
Review an example of data collection methods  
Use an iterative design approach  
Assess feasibility and risk factors again  
Conduct formative meta-evaluations  
 
Chapter Eight: Write an Evaluation Proposal and Get Approval
Determine sections to be included in an evaluation proposal  
Review a sample evaluation proposal  
 
Chapter Nine: Develop Data Collection Instruments I – Self-Administered Surveys (Step 7)
Comply with IRB requirements  
Use informed consent forms  
Determine materials to be developed for different data collection methods  
Distinguish anonymity vs. confidentiality  
Develop materials for conducting self-administered surveys  
Determine whether to use closed-ended questions, open-ended questions, or a mix of both types  
Ask specific questions that measure the quality of a dimension  
Design survey items using a question or statement format  
Recognize nominal, ordinal, interval, and ratio scales  
Decide whether to include or omit a midpoint in the Likert scale  
Decide whether to use ascending or descending order of the Likert scale options  
Follow other guidelines for developing survey items  
Develop survey items that measure a construct  
Test validity and reliability of a survey instrument  
Conduct formative meta-evaluations  
 
Chapter Ten: Develop Data Collection Instruments II – Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests (Step 7)
Determine whether to use a structured, unstructured, or semi-structured interview  
Develop materials for conducting interviews or focus groups  
Solicit interview volunteers at the end of a self-administered web-based survey  
Develop materials for conducting observations  
Develop materials for conducting extant data reviews  
Develop materials for administering tests  
Conduct formative meta-evaluations  
 
Chapter Eleven: Collect Data (Step 8)
Follow professional and ethical guidelines  
What would you do?  
Use strategies to collect data successfully and ethically  
Use strategies when collecting data from self-administered surveys  
Use strategies when collecting data from interviews and focus groups  
Use strategies when collecting data from observations and tests  
Use strategies to ensure anonymity or confidentiality of data  
Conduct formative meta-evaluations  
 
Chapter Twelve: Analyze Data with Rubrics (Step 9)
Use evidence-based practice  
Keep in mind: evaluation = measurement + valuation with rubrics  
Apply the same or different weighting to the multiple sets of data  
Analyze structured survey data with rubrics  
Analyze unstructured survey or interview data with rubrics  
Analyze semi-structured survey or interview data with rubrics  
Analyze data obtained from observations, extant data reviews, and tests with rubrics  
Determine the number of levels and labels for your rubrics  
Triangulate results obtained from multiple sources for each dimension  
Conduct formative meta-evaluations  
 
Chapter Thirteen: Draw Conclusions (Step 10)
Revisit formative or summative use of evaluation findings  
Develop a synthesis rubric  
Draw evidence-based conclusions and recommendations  
Conduct formative meta-evaluations  
 
Chapter Fourteen: Write a Final Report and Conduct a Summative Meta-Evaluation
Extend the evaluation proposal to a final report  
Present dimensional results in the Evaluation Results section  
Present supporting information in appendices  
Present conclusions  
Report the findings ethically  
Conduct a summative meta-evaluation  
Report limitations  
Write an executive summary  
Present the final report to stakeholders  
Follow up with stakeholders  
Present complete sections in a final report  

Supplements

Instructor Teaching Site

Password-protected Instructor Resources include:

·         Chapter quizzes, including pre-written, editable multiple-choice and short-answer questions, help assess students’ progress and understanding.

·         Editable, chapter-specific Microsoft® PowerPoint® slides offer ease and flexibility in creating a multimedia presentation for your course.

·         A sample syllabus provides a suggested model for structuring your evaluation course.

·         All figures and tables from the book are available for download.


Student Study Site

The open-access Student Study Site includes downloadable versions of the templates and worksheets in the book:

·         Sample Statement of Work for a real evaluation (Ch. 4)

·         Gantt chart template for planning and scheduling (Ch. 4)

·         Sample evaluation proposal and final report (Ch. 8)

·         Worksheet for identifying, planning, and conducting an evaluation project (Appendix B)


“This is a very well written book. It is easy to read, follow, and the application of the material from chapter to chapter is well constructed.”

Charles E. Moreland
Barry University

“Yonnie Chyung has clearly and concisely discussed a ten-step process for evaluation that will appeal to scholarly practitioners across multiple disciplines. Incorporated throughout the text are user-friendly examples, tables, and samples.”

Jennifer Fellabaum-Toston
University of Missouri

“10-Step Evaluation for Training and Performance Improvement provides tools for practitioners, students, professors, evaluators, and so many more to address questions as they relate to practical program evaluation. This text offers a solid theoretical framework while offering practicality and readability to its audiences. The tools provided within the text share a best practice point of view that are easily adaptable to many situations and various environments.”

Mary Leah Coco
Louisiana Department of Transportation and Development

"This book was an exceptional point-by-point, systematic process for my students to develop project-based learning cases of their own. Overall, it was a practical application to program evaluation."

Dr. Suzanne Ensmann
The University of Tampa

Key Features

  • The book is uniquely written to facilitate project-based learning, helping readers learn program evaluation by doing it on their own.
  • Unlike other evaluation books, this one explains how to conduct a program evaluation for specific fields such as HPI, HRD, and T&D.
  • Each chapter provides directions through "Now, your turn" features, which include suggestions on which templates to use.
  • Pedagogical features, including examples presented in tables, figures, and exhibits, as well as a sample SOW and proposals, offer evaluators benchmarks that reinforce concepts.

Paperback | ISBN: 9781544323961 | $70.00