
How to Improve Training: Conduct "Voice of the Learner" Sessions

Why use "Voice of the Learner Sessions" to improve training and development

Conducting Voice of the Learner sessions will enable you to see how your learners are actually interacting with your online learning experience and to get their honest thoughts and reactions: it's usability testing by a friendlier, more learner-centric name. By conducting Voice of the Learner sessions, you'll understand how your site is actually working for learners and be able to make informed decisions on how to improve training and development before you launch the program to your full cohort.

Here's the cool part: it really doesn't take a whole lot of time. You only need feedback from about three to five participants before you know what to improve next. Since a typical feedback session runs only about 50 minutes, it's easy to find the wrinkles to iron out to improve training and provide a better overall online learning experience.

What you’re really doing in a Voice of the Learner session is revealing what is working or what isn’t from the learner's point of view rather than your own. You can explore what draws a learner into the content and how they go through it to find out what grabs their attention or what is confusing. You can give them tasks to find out if your content flow and activities are coming across the way you need them to.

Why is this important to improve training in the workplace?

Within a few sessions you can catch issues and improve usability before rolling out a whole new course to learners. Would you publish an important white paper without having someone else proofread it? You might be able to get away without help, but you should always have someone check the work: just one new set of eyes on the site can tell you how to make it better.

Holding sessions will save both you and your learners time and grief. By the end of the sessions, you'll have listened to the actual users of the online learning experience, refocused on relevance, and adjusted to their needs. For example, you can find out:

  • Is there anything that is confusing them?
  • What is grabbing their attention?
  • Are they finding what they expected to find?
  • Is there anything missing?
  • Can they complete what they are supposed to?
  • Are they skipping over anything?
  • What questions or concerns does it leave them with?
  • What worked well and what was useful?

Download a sample Usability Testing template. 


Arranging the Voice of the Learner sessions

Find three to five participants to hold sessions with. If you can't find all of your participants right away, that's okay; you'll start getting useful information within your first sessions. Aim for a mix of participants by level of experience, familiarity with the content, demographics, and country of origin.

Sessions need to run no longer than about 50 minutes, and they can be held and recorded via web conference.

Include one team member to facilitate the test and one to take notes. The facilitator will lead the session while the note taker records observations and the participant’s reactions.

Holding the sessions

Follow a test script for each session. A script covers the welcome and introductions, informs participants of procedures and expectations, and helps the facilitator guide the participant through test scenarios with tasks and follow-up questions.

Help your participant feel comfortable. Introduce whoever is facilitating the session, then ask the participant a few questions about themselves. You can use this as an opportunity to learn a little more about their experience level, where they are from, and what they do.

Inform them that they can't do anything wrong here. They won't hurt your feelings, and their reactions and feedback will help you improve the site. Make sure they know that they can skip questions or leave the session at any time, and ask for their permission to record the session.

Use the think-aloud method to observe learners while they explore the site or try to perform key tasks. Ask them to voice their reactions, what they are thinking or reading, and any questions they have as they go along.

As they go through the site, give them a few moments to describe their first reactions to what they are experiencing or trying to do, then ask them about it. If they get stuck, let them voice their thinking, ask them what they expected to do (or see) there, then politely help them move on to the next section or task.

Watch how the participant reacts and ask follow-up questions, for example: "What makes you react that way?" or "What gives you pause here?" Try not to ask leading questions like "Do you think you would use this?" as they go through the site. Instead, follow up on what they are actually doing and thinking.

Always close the session by thanking the participant and letting them know that you very much appreciate their time and help.

Using learner feedback to improve training and development

After your sessions are finished, set aside time with your team to debrief so that you can identify themes across the sessions and decide on the next most important improvements to make.
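
If it helps, here's a minimal sketch in Python of one way to tally recurring themes across your session notes during the debrief; the participants and theme tags are hypothetical, and your note taker's shorthand will differ:

    from collections import Counter

    # Hypothetical tagged observations from three sessions; each entry is
    # (participant, theme tag) in whatever shorthand your note taker uses.
    notes = [
        ("P1", "too much text on homepage"),
        ("P1", "unclear where to start"),
        ("P2", "unclear where to start"),
        ("P2", "liked the intro video"),
        ("P3", "too much text on homepage"),
    ]

    # Count how often each theme recurs, most frequent first.
    themes = Counter(tag for _, tag in notes)
    for theme, count in themes.most_common():
        print(f"{count} x {theme}")

Themes mentioned by more than one participant are usually the ones worth acting on first.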

For example, here are some findings from a professional development course we worked on with a client, and how we improved it based on learner feedback:

  • Voice of the Learner Feedback: Learners expressed the desire to click out of wordy announcements when they landed on the course homepage. There was too much information, and nobody wanted to read it.
  • How the Course Was Improved: The wordy announcement was replaced with a short video introducing the course. The homepage was simplified and made to feel more inviting, using brighter colors and photos of people with natural expressions to improve the visual design.
  • Voice of the Learner Feedback: It was also ambiguous where to start or where to go next. Learners had difficulty understanding what the content titles were telling them and how the visuals were meant to communicate a sense of order or progression through the course.
  • How the Course Was Improved: Where to start was communicated better by simply changing the relevant title to “Get Started.”
  • Voice of the Learner Feedback: The course stated in its description that "This is not a course." The phrase was intended as a clever way to frame how the course experience would go; however, sessions showed that it was more confusing than it was worth.
  • How the Course Was Improved: The "This is not a course" description was removed and replaced with one that simply explains the purpose of the course.
  • Voice of the Learner Feedback: Learners appreciated that not all of the content was text: content in different formats, including audio and SME e-learning videos, added life to the learner experience.
  • How the Course Was Improved: We kept the diversity of content moving forward.
  • Voice of the Learner Feedback: Learners also voiced frustration with redundant and repeated content. They wanted content to get to the point quickly and didn't need unnecessary re-explanation. Learners didn't want to waste time; they wanted to keep moving forward.
  • How the Course Was Improved: Activities were updated with relevant introductions and seeded with examples.

What we've learned from past sessions

Here are some of the things we've learned from conducting Voice of the Learner sessions that can help you avoid pitfalls and instead focus on success with your courses:

Common pitfalls:

  • Too much content all at once. As a result, learners turn their attention away and don't read it.
  • Impersonal. It doesn't speak to them and doesn't answer the questions "What's in it for me?" or "How do I apply it?"
  • Ambiguous. Labels aren’t communicating where to start or where to go next, the visuals aren’t giving them the right cues, and the next steps just aren’t getting across.
  • Confusing. Descriptions try to be too clever or just aren’t gaining their attention or interest. There’s no line of sight for where they are headed or what the content is building towards.
  • Diminishes trust. Content seems irrelevant or redundant. Learners aren’t getting what they expect in return for their clicks, time, and effort. 
  • No guiding examples or responses. Most learners prefer to have some kind of model to follow and to see how other people have responded; most don't want to be the first one to post. They're motivated by knowing that they've helped someone or that they'll get timely feedback.

Successes:

  • Content is concise and relevant. It focuses on the learner and what's in it for them.
  • Simple to wrap your mind around. It feels inviting and eyes are drawn to where learners need to begin.
  • Gains attention and motivates. Learners respond well to pleasing visuals, positive faces, and welcoming videos. Both polished and homegrown videos can gain attention, amplify excitement, and boost learner engagement within a course.
  • Respects the learner and their trust. Learners know what they are getting into. They experience content that flows easily, is consistent, reliable, and is focused on their needs. 

Answers to two important questions about holding "Voice of the Learner" sessions

1. How do I keep and organize notes during the sessions?

Organizing your notes is straightforward. You can capture the thoughts of your participants following the model of this Voice of the Learner template.

Create a row in a spreadsheet for each area of the site that participants will encounter and for each main idea you want to focus on, for example: About Me (participant characteristics), Best Parts, Areas to Improve, Homepage, Overview, Navigation, Get Started, Week 1, Week 2, Visual Design, Other, and so on.

Then create columns for Participants, Themes, Ideas for Changes, and Decisions on Actions to Take. After you collect notes from your sessions, you'll identify emergent themes and issues, which go into the Themes column. You'll use these, combined with your raw participant notes, to debrief with your team, during which you'll generate your Ideas for Changes and Decisions on Actions to Take.
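
If it's useful, here is a minimal sketch in Python that generates that note-taking grid as a CSV file you can open in any spreadsheet tool; the filename is just an example:

    import csv

    # Row labels (areas of the site and focus ideas) and column headers,
    # following the template described above.
    areas = ["About Me", "Best Parts", "Areas to Improve", "Homepage",
             "Overview", "Navigation", "Get Started", "Week 1", "Week 2",
             "Visual Design", "Other"]
    columns = ["Area", "Participants", "Themes", "Ideas for Changes",
               "Decisions on Actions to Take"]

    # Write an empty grid; "voice_of_the_learner_notes.csv" is an
    # example filename.
    with open("voice_of_the_learner_notes.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)
        for area in areas:
            writer.writerow([area] + [""] * (len(columns) - 1))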

When evaluating feedback and making improvements to training and development, always focus on the learners. The notes you collect are going to inform incremental changes to the most relevant issues and get to the heart of a better learning experience.

2. Can testing 3-5 participants provide enough valid information to address how to improve training?

As soon as you begin, you'll start getting insights into how someone with fresh eyes uses your site, what they are thinking as they use it, what confuses or delights them, and what questions they may have. You'll find that this is very valuable qualitative information as part of an ongoing process of inquiry (what John Dewey called "warranted assertibility"). Voice of the Learner sessions are an early start on that process, which you can continue with a pilot, ongoing feedback, and improvements to training and development.

The Nielsen Norman Group's research suggests that five participants in a round of testing is generally appropriate, but that the right number of participants in any one round may be as low as two in order to optimize the benefit you get from your sessions. It's a pragmatic approach that will still generate useful information and save you hours early on in your process. Here's a bit of what NN Group actually had to say about it:

“The answer is five, except when it's not. Most arguments for using more test participants are wrong, but some tests should be bigger and some smaller.”

“For really low-overhead projects, it's often optimal to test as few as two users per study.”
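
For context, the "five users" guidance rests on a simple model published by Nielsen and Landauer: if a single participant has roughly a 31% chance of surfacing any given usability problem, the expected share of problems found after n sessions is 1 - (1 - 0.31)^n. Here's a quick sketch in Python that shows the diminishing returns:

    # Expected share of usability problems surfaced after n sessions,
    # using the Nielsen-Landauer model with L = 0.31 (the average chance
    # that one participant reveals any given problem).
    L = 0.31
    for n in range(1, 9):
        print(n, f"{1 - (1 - L) ** n:.0%}")

By the fifth participant you'd expect to have surfaced roughly 84% of the problems, which is why a handful of sessions per round goes such a long way.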

In his book Rocket Surgery Made Easy, Steve Krug also makes the case for testing with a small number of participants. The point is not to prove statistical significance but to go out there, find problems, and fix them. A small sample simply gets you going and saves time so you can debrief and quickly move on to fixing things. Here's what he had to say:

“What do I tell people who say, ‘But if you’re only testing three people at a time, it can’t be statistically valid. You can’t prove anything that way.’? Here’s what you should say to them: You’re absolutely right. Testing with so few people can’t possibly produce statistically valid results. The samples are way too small to even bother with statistics. But the point of this testing is not to prove anything; the point is to identify major problems and make the thing better by fixing them.”

Both NN Group and Krug make great points supported by their research and experience. Starting with a small number of participants tends to be optimal for generating early, actionable feedback (especially when you’re operating a very lean process). Start with three, but if you have the time for five participants, go ahead and schedule those additional sessions. 


Summary

To realize the full potential of your training and development programs, it's crucial to harness the insights gained through "Voice of the Learner" sessions. By understanding how your learners engage with online learning experiences, you can make well-informed decisions to improve training and development before initiatives are launched more broadly.

The efficiency of these sessions, requiring feedback from just three to five participants in sessions of about 50 minutes each, makes it possible to quickly identify and resolve potential issues and keep the design learner-centered. We've seen how addressing learner feedback from past sessions significantly improves the effectiveness of training courses. To assist with this process, download our sample Usability Testing Results template and start capturing insights from the very individuals who matter most: your learners.

Contact us with any questions on how to conduct these sessions, and we invite you to download our white paper on how collaborative learning can be used as a tool to improve training and development, too. 

 

Nick Iverson is on Intrepid's Learning Experience Design team.