SIOP Annual Conference

Impact Evaluation: From Employee Training to Leadership Development

Saturday, April 25, 12:30PM-1:20PM
Cancelled due to COVID-19 Policies

Organizations continue to spend more on learning and development (L&D) to build their workforce for business success. However, there is little evidence that this spending is producing more capable talent, and most evaluation efforts fail to demonstrate the impact of L&D practices. To help L&D professionals evaluate the impact of their efforts effectively, this session draws on the combined expertise of a diverse panel of experts across L&D areas, from employee training and high-potential programs to leadership development. Panelists will share insights on current practices and future opportunities in impact evaluation, and on ways to build value using evaluation data.

Panelists

Carolyn Hill-Fotouhi

Associate Director of Workforce Analytics, Merck & Co., Inc.

Dr. Neelima Paranjpey

Senior Consultant, Vaya Group

Dr. Eric A. Surface

CEO and Principal Scientist, ALPS Insights

Dr. Chia-Lin Ho (Chair)

Independent Consultant, Ho Leadership & Talent Management Associates

RECENT INSIGHTS

Improving Instructor Impact on Learning with Analytics

Each of us can recall an instructor who made learning engaging, relevant, and impactful, inspiring us to apply what we learned. Unfortunately, each of us can also recall an instructor who failed in one or more of these areas. Instructors are force multipliers, reaching hundreds, if not thousands, of learners and shaping both their learning experience and their motivation to transfer. So, how can we improve instructor impact on learning?


Two Fundamental Questions L&D Stakeholders Should Answer to Improve Learning

During recent conference presentations and webinars focused on analytics, big data and evaluation, we noticed audience members asking, “What questions should I be asking [and answering with evaluation data and analytics]?” Speakers typically answer these questions one of two ways: either by recommending collecting specific types or “levels” of data, as if all relevant questions for all learning and development stakeholders should be immediately identified and addressed; or by recommending collecting and tagging as much data as possible so the data analysts figure it out, as if the important questions will only emerge from analyzing all the data after the fact.


Qualitatively Different Measurement for Training Reactions

Training evaluation should provide insights not only about the effectiveness of training but also about how it can be improved for learners and organizations. In this context, the term “insights” implies a deep understanding of learning, the training process and its outcomes as well as evaluation procedures – designing, measuring, collecting, integrating and analyzing data from both a formative and summative perspective.


Beyond Levels: Building Value Using Learning and Development Data

For many learning and development (L&D) professionals, training evaluation practices remain mired in the muck. What ends up being evaluated hasn’t changed much over the past two or three decades. We almost universally measure whether trainees liked the training, most of us measure whether they learned something, and beyond that, evaluation is a mix of “We’d like to do that” and “We’re not sure what to make of the data we get.” Perhaps more critically, in one recent national survey, nearly two-thirds of L&D professionals did not see their learning evaluation efforts as effective in meeting their organization’s business goals.


The Key To Quality Comments is Asking the Right Questions

Anyone who has participated in a training event is familiar with open-ended survey items like this one: “Please provide any additional comments you have about the training program you just completed.” After getting into the rhythm of clicking bubble after bubble in response to closed-ended survey items, many trainees hit a roadblock when presented with a blank text box and asked to provide feedback in their own words.


Evaluating Instructional Behaviors for Improved Training Outcomes

ALPS Solutions engaged in a series of studies to understand why instructors were having such a large impact on student outcomes in the Special Operations Forces (SOF) community. If training is supposed to be a standardized experience, then the instructor to whom a student is assigned should not create a variable experience across classes. The goal of this research was to identify and reduce that variability to create a more standardized, positive experience.


Let’s Connect.

Ready to learn how ALPS Insights can help your organization improve?
