The Key to Quality Comments: Asking the Right Question

BY REANNA P. HARMAN, Ph.D. | Originally Published in Training Industry Magazine

Anyone who has participated in a training event is familiar with open-ended survey items like this one: “Please provide any additional comments you have about the training program you just completed.”

After getting into the rhythm of clicking bubble after bubble in response to closed-ended survey items, many trainees hit a roadblock when presented with a blank box and asked to provide feedback in their own words. While trainees tend to complete all or most of the closed-ended items on organizational surveys, the same is not true for open-ended items. A recently published series of studies investigated responses to open-ended comments on training evaluation surveys and found that only 28 to 53 percent of trainees provided a response to open-ended items.

Reaction measures are widely used in the training industry for evaluation and are typically collected on surveys that include both closed-ended items and open-ended items. But why are there fewer responses to open-ended items, and what are the practical implications?

At the annual meeting of the Society for Industrial and Organizational Psychology in 2007, Poncheri and Thompson presented a study showing that open-ended items are viewed as less important than closed-ended items on web-based surveys. Additionally, open-ended items carry a higher response burden for trainees when compared to closed-ended items because it takes more effort for the trainee to respond. If trainees think that providing comments is not important, or that it takes a lot of effort to respond, the combination could be enough to discourage response.

Despite what trainees may perceive, training managers and other decision makers highly value input provided in response to open-ended items because the comments are in the trainees’ own words. Comments are rich and can be powerful. Even a single highly negative, critical comment may sway a decision maker to cancel or significantly change a program, while highly positive comments can help market training or justify additional investment.

Because of their value, comments should be interpreted with care. Research shows that comments tend to be negative in tone and that dissatisfied trainees are more likely to provide them. Comments most likely do not represent the opinions of all trainees, but may reflect input from the most dissatisfied ones. Training managers who read comments should interpret them accordingly, as they may only be hearing from the dissatisfied trainees.

Knowing how to get the most out of comments can provide key strategic advantages for organizations if they are used and interpreted appropriately. Here are four evidence-based tips for training managers and other key decision makers to improve training survey practices and better leverage the feedback provided.

EACH SURVEY ITEM SHOULD HAVE A CLEAR LINK TO A GAP IN ORGANIZATIONAL KNOWLEDGE OR A STRATEGIC PURPOSE.

1 | Define Your Needs Before Designing Your Items

Any survey item, whether closed-ended or open-ended, should be aligned with your needs. Surveying for the sake of surveying is not good practice, as it can frustrate respondents and waste valuable company resources. A study presented at the International Convention of Psychological Science by Hause, Hendrickson, and Brooks in 2015 showed that only 5 percent of comments provided in response to a global financial survey were useful (i.e., able to lead to organizational change). Each survey item should have a clear link to a gap in organizational knowledge or a strategic purpose. This makes designing the items, getting feedback and acting on the results streamlined and effective.

2 | Alter Items to Focus on the Dialogue You Control

The intent of open-ended items on surveys is to provide a format where trainees can have a “voice” and provide feedback in their own words. However, this does not mean that the item must be unstructured or vague. It is better if the item signals to trainees what kind of information the organization is seeking. Survey designers often include a “catch all” item at the end of a survey to make sure respondents have been given a chance to comment, but will responses to these items provide any value? Studies have shown that altering item wording can have an impact on response. There is also limited evidence that item wording may reduce biased responding by encouraging both positive and negative feedback.

3 | Be Transparent and Frame the Instructions to Elicit Valuable Feedback

Do not let trainees assume that the open-ended items are not important. Use item instructions to tell them how important they are and what action will be taken.

Poncheri and Thompson presented a study at SIOP in 2007, titled “Open-Ended Comments: To Require or Not to Require?” In this experimental study, undergraduate students were assigned to three conditions:

  1. A control condition (standard web-based survey where no items were required or encouraged).
  2. An experimental condition where open-ended items were required (if the item was skipped, an error would be shown much like the error that happens on surveys when required closed-ended items are skipped).
  3. An experimental condition where comments were encouraged (the instructions communicated that the survey designers found the comments important).

The two experimental conditions led to a similar pattern of results: increased response rates and more favorable perceptions of open-ended items, without adversely affecting reactions to taking the survey.

In a 2015 presentation at ICPS titled “Employee Input from Across the Globe: ‘Nudging’ for More Useful Ideas,” Hause, Hendrickson, and Brooks used an approach called nudging. Nudging involves providing respondents with information in the item instructions about the kind of responses the organization is seeking (e.g., specific descriptions, realistic solutions), as well as example comments. This global study found that nudging leads to comments that are five times more useful when compared to a control group. These findings support the importance of being transparent about the kind of feedback being sought: communicate that comments are important and give trainees more guidance on the kind of feedback that is most valuable.

4 | Use Technology to Customize the Experience

Although some training surveys are still delivered via paper and pencil, web surveying is increasingly common, and the tools web platforms provide should be leveraged to improve the use of open-ended items. Item branching can be used to probe for more information based on responses to closed-ended items. If a trainee marks “very dissatisfied” with the training schedule, they can be branched to an open-ended item that targets that area. This limits the number of trainees asked to respond to the open-ended item and creates a customized survey experience that should encourage a more valuable contribution based on their training experience. These items often have higher response rates because they target the trainee’s specific experience.
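The branching rule described above reduces to a simple conditional: when a closed-ended rating falls at or below a dissatisfaction threshold, route the trainee to a targeted open-ended follow-up instead of (or in addition to) a generic catch-all box. A minimal sketch, assuming hypothetical item names and wording (no specific survey platform's API is implied):

```python
# Minimal sketch of survey item branching (hypothetical items and wording).
# Ratings are assumed to be on a 1-5 scale, where 1 = "very dissatisfied".

FOLLOW_UPS = {
    "training_schedule": (
        "You indicated dissatisfaction with the training schedule. "
        "What specific changes would improve it?"
    ),
    "course_materials": (
        "You indicated dissatisfaction with the course materials. "
        "What was missing or unclear?"
    ),
}

def next_item(item_id: str, rating: int, threshold: int = 2):
    """Return a targeted open-ended prompt when the rating is low enough,
    or None to continue with the standard survey flow."""
    if rating <= threshold and item_id in FOLLOW_UPS:
        return FOLLOW_UPS[item_id]
    return None
```

A satisfied trainee never sees the follow-up, so only those with a specific grievance are asked to elaborate, which is what makes the resulting comments more targeted.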

Asking the right questions is essential to receiving quality comments, but many other factors contribute to the usefulness of feedback provided in response to open-ended items on training surveys. Any behavior (such as commenting on a survey) is influenced by characteristics of the trainee as well as factors in the environment. Although research shows there are differences in who comments on training surveys, this article has provided some key tips for encouraging all trainees to respond and provide quality feedback.

Dr. Reanna Poncheri Harman is vice president and director of consulting practice at ALPS Solutions. Her research focuses on training evaluation, training and organizational surveys, and needs assessment.
