The Evaluation
- Josephine Akinwumiju
- Dec 2, 2025
Updated: Dec 6, 2025
Over the past 90 days, I’ve had the opportunity to closely examine and immerse myself in the curriculum and overall learning experience we provide to new hires. In short, a redesign is needed. We have a unique opportunity to design an engaging experience that reflects our organization's commitment to excellence. This project focuses on redesigning NEO 100, NEO 200, Agency, and NEO SIM to create a streamlined, accessible, and interactive training experience that supports learner success from day one.
The Project Goal
Based on what I’ve learned, the goal of this project is to enhance the quality, consistency, and accessibility of STP’s training materials, beginning with NEO 100 & 200, Agency, and NEO SIM, to create a revitalized learning experience that shortens new hire ramp-up time and increases retention of KPHC standard workflows.
The Project Objectives
By the end of the project, we aim to:
Redesign and launch NEO 100, NEO 200, and Agency with 100% accurate workflows and materials that meet adult-learning, accessibility, and instructional design standards (UDL, WCAG 2.1 AA).
Include at least four interactive or experiential activities in each redesigned course.
Achieve a minimum 80% learner mastery rate on in-class knowledge checks, simulations, or skill demonstrations.
Deliver a post-training retention Web-Based Training (WBT) 2–3 months later, with at least 80% of learners scoring 80% or higher.
Ensure at least 85% of new hires report feeling prepared for their first week during onboarding evaluations.
Implement a maintenance workflow with quarterly reviews, a standardized update intake process, and a dashboard tracking status, owner, and last review date for all NEO and Agency materials.
The Evaluation Model
While Kirkpatrick's evaluation model is the one most commonly used within the organization, this project aligns more closely with Kaufman’s model, which expands evaluation beyond individual learning outcomes to include organizational value and system-wide impact.
The Evaluation Plan
With that model in mind, to understand how well the redesign is working, I’m evaluating it in three phases: pre-project, ongoing, and post-project. The pre-project phase gathers feedback from past learners on where the old training fell short or left them feeling unsupported. The ongoing phase gives those same learners space to share input as we design the new experience, allowing us to make adjustments in real time. The post-project phase brings together past learners, current new hires experiencing both versions of training, and future learners who will only receive the updated content. This structure helps us check our progress, confirm that our objectives are being met, and get a clear before-and-after comparison of learner confidence and preparedness, while still allowing new learners to shape the process.
One key evaluation instrument is a structured pre- and post-training survey designed to measure confidence in key workflows, perceived clarity of instruction, and overall preparedness.
As with any project, this work is meant to be an iterative process. The pre- and ongoing evaluations will help us confirm that we are moving in the right direction, while the post-project evaluation ensures that our goals have been fully met. Most importantly, this process allows us to build a training experience that truly supports our learners. By staying responsive to the data and continuously improving each component, we can create a strong, thoughtful, and effective onboarding journey that sets every new hire up for success.