TK2020 Event
Create More Value With Your Learning Evaluation, Analytics, and Feedback (LEAF) Practice
Wednesday, February 5, 11:30 AM – 12:00 PM
Optimizing your LEAF practice is your best opportunity to improve learning and its impact. Fewer than half of organizations report that evaluation helps them meet their learning and business goals. Data alone doesn’t create value; people acting on data create value. Our ALPS Ibex™ platform drives effective, purpose-driven evaluation, empowering L&D stakeholders with insights and creating a culture of continuous improvement in the workplace. Examples will demonstrate how using ALPS Ibex helps L&D stakeholders Act on Insights™ to drive improvement and impact.
Applications on the Job:
- Understand how improving LEAF practices can be a game changer in your organization.
- Support your evaluation use cases in the ALPS Ibex platform.
- Empower L&D stakeholders to create value with evaluation data.
Presenters
Dr. Reanna Poncheri Harman
Vice President and Director of Consulting
Dr. Reanna Poncheri Harman has supported organizational leaders in their efforts to develop and implement L&D initiatives since 2004. Reanna is passionate about helping organizations focus their L&D efforts on creating alignment between on-the-job needs and what is taught in learning programs. She is particularly interested in leveraging the power of qualitative data collected through open-ended items on organizational surveys and through interviews and focus groups, and she helps clients interpret qualitative input in combination with quantitative data to inform organizational change.

Reanna has experience leading all phases of applied research and consulting projects related to work analysis, training needs assessment, training evaluation, and curriculum development. She has extensive experience designing focus group protocols and organizational surveys, moderating focus groups, and conducting qualitative analyses. Reanna participates in the design and development of software tools to support consulting projects involving training evaluation and training needs assessment and works with the ALPS Team on the development of ALPS Ibex. She has served as a Principal Investigator, Associate Principal Investigator, and Project Lead for military, government, corporate, and non-profit organizations.

Reanna has presented at several conferences, including those of the American Psychological Association (APA), the Society for Industrial and Organizational Psychology (SIOP), the American Council on the Teaching of Foreign Languages (ACTFL), and the Language Testing Research Colloquium (LTRC). She has published in peer-reviewed journals including Organizational Research Methods and the Journal of Applied Psychology. She received her M.S. and Ph.D. in Industrial/Organizational Psychology from North Carolina State University in 2008 and her B.S. in Business from Mt. St. Mary’s University in 2003.
Mrs. Elisabeth Gnida Dezern
Data and Product Development Manager
Mrs. Elisabeth Gnida Dezern ensures quality survey design, data collection, data processing, and reporting for military, government, corporate, and non-profit organizations. As part of her role, Elisabeth works closely with clients to advise on and manage the data collection logistics associated with training evaluation efforts. She is passionate about advising clients on best practices for managing data collection logistics and providing timely support for all client requests.

Elisabeth draws on her knowledge of and experience with clients in her role as the Product Development Manager for ALPS Ibex. She works closely with a team of database programmers to develop ALPS Ibex, a unique and engaging web-based tool that helps organizations create value with learning and transfer data, and she also provides customer support and leads training sessions for the platform. Elisabeth is proficient in many software programs, including Microsoft Excel, InDesign, Qualtrics, SurveyMonkey, Remark, and Adobe Acrobat, and she uses those skills to inform the design and development of ALPS Ibex. She received her B.A. in General Psychology from North Carolina State University in 2012.
RECENT INSIGHTS
Improving Instructor Impact on Learning with Analytics
Each of us can recall an instructor who made learning engaging, relevant, and impactful, inspiring us to apply what we learned. Unfortunately, each of us can also recall an instructor who failed in one or more of these areas. Instructors are force multipliers, reaching hundreds, if not thousands, of learners and shaping both their learning experience and their motivation to transfer. So, how can we improve instructor impact on learning?
Two Fundamental Questions L&D Stakeholders Should Answer to Improve Learning
During recent conference presentations and webinars focused on analytics, big data, and evaluation, we noticed audience members asking, “What questions should I be asking [and answering with evaluation data and analytics]?” Speakers typically answer in one of two ways: either by recommending collecting specific types or “levels” of data, as if all relevant questions for all learning and development stakeholders could be identified and addressed up front; or by recommending collecting and tagging as much data as possible and letting the data analysts figure it out, as if the important questions will only emerge from analyzing all the data after the fact.
Qualitatively Different Measurement for Training Reactions
Training evaluation should provide insights not only about the effectiveness of training but also about how it can be improved for learners and organizations. In this context, the term “insights” implies a deep understanding of learning, the training process and its outcomes, and evaluation procedures – designing, measuring, collecting, integrating, and analyzing data from both a formative and a summative perspective.
Beyond Levels: Building Value Using Learning and Development Data
For many learning and development (L&D) professionals, training evaluation practices remain mired in the muck. What gets evaluated hasn’t changed much over the past two or three decades: we almost universally measure whether trainees liked the training, most of us measure whether they learned something, and beyond that, evaluation is a mix of “We’d like to do that” and “We’re not sure what to make of the data we get.” Perhaps more critically, in one recent national survey, nearly two-thirds of L&D professionals did not see their learning evaluation efforts as effective in meeting their organization’s business goals.
The Key To Quality Comments is Asking the Right Questions
Anyone who has participated in a training event is familiar with open-ended survey items like this one: “Please provide any additional comments you have about the training program you just completed.” After settling into the rhythm of clicking bubble after bubble in response to closed-ended survey items, many trainees hit a roadblock when presented with a blank text box and asked to provide feedback in their own words.
Evaluating Instructional Behaviors for Improved Training Outcomes
ALPS Solutions conducted a series of studies to understand why instructors were having such a large impact on student outcomes in the Special Operations Forces (SOF) community. If training is supposed to be a standardized experience, then the instructor a student is assigned to should not cause students’ experiences to vary across classes. The goal of this research was to identify and reduce that variability to create a more standardized and positive experience.
Let’s Connect.
Ready to learn how ALPS Insights can help your organization improve?