Conducting Voice of the Learner sessions will enable you to see how your learners are actually interacting with your online learning experience and to get their honest thoughts and reactions—it's usability testing by a friendlier, and more learner-centric, name. By conducting Voice of the Learner sessions, you’ll understand how your site is actually working for learners and be able to make informed decisions on how to improve training and development before you launch the program to your full cohort.
Here’s the cool part: it doesn’t take much time. You only need feedback from about three to five participants to know what you can improve next. Since a typical feedback session runs only about 50 minutes, it’s easy to find the wrinkles to iron out, improve training, and provide a better overall online learning experience.
What you’re really doing in a Voice of the Learner session is revealing what is working or what isn’t from the learner's point of view rather than your own. You can explore what draws a learner into the content and how they go through it to find out what grabs their attention or what is confusing. You can give them tasks to find out if your content flow and activities are coming across the way you need them to.
Within a few sessions you can catch issues and improve usability before rolling out a whole new course to learners. Would you publish an important white paper without having someone else proofread it? You might be able to get away without help, but you should always have someone check the work: just one new set of eyes on the site can tell you how to make it better.
Holding sessions will save both you and your learners time and grief. By the end of the sessions, you’ll have listened to the actual users of the online learning experience, refocused on relevance, and adjusted to their needs. For example, you can find out:
Download a sample Usability Testing template.
Find three to five participants to hold sessions with. If you can’t find all of your participants right away, that’s okay; you’ll start getting useful information within your first sessions. Aim for a mix of participants by level of experience, familiarity with the content, demographics, and country of origin.
Sessions only need to run about 50 minutes, and they can be held and recorded via web conference.
Include one team member to facilitate the test and one to take notes. The facilitator will lead the session while the note taker records observations and the participant’s reactions.
Follow a test script for each session. A script covers the welcome and introductions, informs participants of procedures and expectations, and helps the facilitator guide the participant through test scenarios with tasks and follow-up questions.
Help your participant feel comfortable. Introduce who will be facilitating the session, then ask the participant a few questions about themselves. This is an opportunity to learn a little more about their experience level, where they’re from, and what they do.
Inform them that they can’t do anything wrong here. They won’t hurt your feelings and will be helping you improve the site with their reactions and feedback. Make sure they know that they can skip questions or leave the session at any time. Ask for permission from them to record the session.
Use the think-aloud method to observe learners while they explore the site or try to perform key tasks. Ask them to voice their reactions, what they’re thinking or reading, and any questions that come up as they go along.
As they go through the site, give them a few moments to describe their first reactions to what they’re experiencing or trying to do, then ask them about it. If they get stuck, let them voice their thinking, ask what they expected to do (or see) there, then politely help them move on to the next section or task.
Watch how the participant reacts and ask follow-up questions, for example, “What makes you react that way?” or “What gives you pause here?” Try not to ask leading questions like “Do you think you would use this?” as they go through the site. Instead, follow up on what they are actually doing and thinking.
Always close the session by thanking the participant and letting them know you greatly appreciate their time and help.
After your sessions are finished, set aside time with your team to debrief so you can identify themes across the sessions and decide on the most important improvements to make next.
For example, here are some findings from a professional development course we worked on with a client, and how we improved it based on learner feedback:
Here are some of the things we’ve learned from conducting Voice of the Learner sessions that can help you avoid pitfalls and instead focus on success with your courses:
Organization of your notes is straightforward. You can capture the thoughts of your participants following the model of this Voice of the Learner template.
Create a row in a spreadsheet for each area of the site participants will encounter and each main idea you want to focus on, for example: About Me (participant characteristics), Best Parts, Areas to Improve, Homepage, Overview, Navigation, Get Started, Week 1, Week 2, Visual Design, Other, and so on.
Then create columns for Participants, Themes, Ideas for Changes, and Decisions on Actions to Take. After you collect notes from your sessions, you’ll identify emergent themes and issues, which go into the Themes column. You’ll combine these with your raw participant notes to debrief with your team, during which you’ll generate Ideas for Changes and Decisions on Actions to Take.
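If you prefer to generate the note-taking template rather than build it by hand, the structure above can be sketched as a small script. This is just an illustrative sketch: the area names and the output filename are assumptions based on the examples in this article, so adjust both to match your own course.

```python
import csv

# Hypothetical site areas from the examples above -- replace with your own.
areas = [
    "About Me (participant characteristics)", "Best Parts", "Areas to Improve",
    "Homepage", "Overview", "Navigation", "Get Started", "Week 1", "Week 2",
    "Visual Design", "Other",
]

# One column per kind of note, as described above.
columns = ["Area", "Participants", "Themes", "Ideas for Changes",
           "Decisions on Actions to Take"]

# Write a CSV you can open in any spreadsheet tool (filename is an assumption).
with open("voice_of_the_learner_notes.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(columns)
    for area in areas:
        writer.writerow([area, "", "", "", ""])  # empty cells to fill during sessions
```

Opening the resulting CSV in your spreadsheet tool gives you one row per site area with blank cells ready for session notes.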
When evaluating feedback and making improvements to training and development, always focus on the learners. The notes you collect are going to inform incremental changes to the most relevant issues and get to the heart of a better learning experience.
As soon as you begin you’ll start getting insights about how someone with fresh eyes uses your site, what they are thinking when they use it, what confuses or delights them, and questions they may have. You’ll find this is very valuable qualitative information as part of an ongoing process of inquiry (what John Dewey called warranted assertability). Voice of the Learner sessions are an early start on that process, which you can continue with a pilot, ongoing feedback, and improvements to training and development.
The Nielsen Norman Group’s research suggests that five participants per round of testing is generally appropriate, but the right number for any one round may be as low as two to optimize the benefits you get from your sessions. It’s a pragmatic approach that will still generate useful information and save you hours early in your process. Here’s a bit of what NN Group actually had to say about it:
“The answer is five, except when it's not. Most arguments for using more test participants are wrong, but some tests should be bigger and some smaller.”
“For really low-overhead projects, it's often optimal to test as few as two users per study.”
In his book Rocket Surgery Made Easy, Steve Krug also makes the case for using a small number of participants. The point is not to prove statistical significance but to go out and find problems, then fix them. A small sample simply gets you going and saves time so you can debrief and quickly move on to fixing things. Here’s what he had to say:
“What do I tell people who say, ‘But if you’re only testing three people at a time, it can’t be statistically valid. You can’t prove anything that way.’? Here’s what you should say to them: You’re absolutely right. Testing with so few people can’t possibly produce statistically valid results. The samples are way too small to even bother with statistics. But the point of this testing is not to prove anything; the point is to identify major problems and make the thing better by fixing them.”
Both NN Group and Krug make great points supported by their research and experience. Starting with a small number of participants tends to be optimal for generating early, actionable feedback (especially when you’re operating a very lean process). Start with three, but if you have the time for five participants, go ahead and schedule those additional sessions.
To realize the full potential of your training and development programs, it's crucial to harness the invaluable insights gained through "Voice of the Learner" sessions. By comprehending how your learners engage with online learning experiences, you can make well-informed decisions to improve training and development before initiatives are launched more broadly.
The efficiency of these sessions, requiring feedback from just three to five participants in approximately 50 minutes each, facilitates swift identification and resolution of potential issues, ensuring a learner-centered design. Learning from past sessions, we've observed how addressing learner feedback significantly improves the effectiveness of training courses. To assist with this process, download our sample Usability Testing Results template and start capturing insights from the very individuals who matter most: your learners.
Contact us with any questions on how to conduct these sessions, and we invite you to download our white paper on how collaborative learning can be used as a tool to improve training and development, too.
Nick Iverson is on Intrepid's Learning Experience Design team.