Are you in a corporate training environment? Dick Carlson in his mild-mannered way muses on how learners feel about training (Learner Feedback? You Can’t STAND Learner Feedback!).
Dick and I have some differences — I think dogs ought to have noses that they themselves can see — but not in this area. The core of Dick’s post is the ultimate assessment: can you now accomplish whatever this training was supposed to equip you to accomplish?
(Yes, that does mean “that you couldn’t accomplish before due to a lack of skill or knowledge.” Don’t be cute.)
Because — if we start with a true skill deficit that prevented you from producing worthwhile results — that’s vastly more important than whether the training fit your purported learning style, whether the ratio of graphics to text was in a given range, and whether the person helping you called herself a trainer, a teacher, a facilitator, a coach, or the Bluemantle Pursuivant.
If you need to learn how to recover from an airplane stall or how to control paragraph borders through a class in CSS, learning assessment comes down to two words: show me.
With all that, I do think that how the learner feels about what’s going on does influence the learning situation. I just want to make clear: that’s very different from saying that those feelings matter in terms of assessing the learning.
High profile? You bet your assessment.
I was once in charge of instructor training and evaluation for an enormous, multi-year training project. In the final phase, we trained over 2,000 sales reps to use a laptop-based, custom application. 90% of them had never used a personal computer.
Which was a drawback, because the client decided that as long as the sales reps were coming for training on the custom application, we should “take advantage of the opportunity” to teach them email.
And word processing. And spreadsheets. And a presentation package. And connection to two different mainframe applications using simple, friendly 3270 emulation software.
In a total of five days (one 3-day session, a 2-day follow-on one month later).
Our client training group was half a dozen people, so we hired some 30 contractors and trained them as instructors. I mention the contractors because we needed a high degree of consistency in the training. When a group of sales reps returned for Session 2, we needed to be confident that they’d mastered the skills in Session 1.
(If the informal learning zealots knew how we electrified the fences within which the instructors could improvise, they’d have more conniptions than a social media guru who discovered her iPhone is really a Palm Pre in drag.)
We used a relentlessly hands-on approach with lots of coaching, as well as “keep quiet and make them think” guidance for the instructor. The skills focused on important real-world tasks, not power-user trivia: open an account. Cancel an order. Add a new contract.
We conducted nearly 600 classroom-days of training, and we had participants complete end-of-day feedback after 80% of them. I never pretended this was a learning assessment. I’m not sure it was an assessment at all, though we might have called the summary an assessment, because our client liked that kind of thing. We had 10 or so questions with a 1-to-4 scale and a few Goldilocks questions (“too slow / too fast / just right”), as well as space for free-form comments.
Why bother?
I made the analogy with checking vital signs at the doctor’s or in the hospital. Temperature, pulse rate, blood pressure, and respiration rate aren’t conclusive, but they help point the way to possible problems, so you can work on identifying causes and solutions.
So if we asked how well someone felt she could transmit her sales calls, we knew about the drawbacks of self-reported data. And we had an instructor who observed the transmit exercise. We were looking for an indication that on the whole, class by class, participants felt they could do this.
(Over time, we found that when this self-reporting fell below about 3 on the four-point scale, it was nearly always due to… let’s say, opportunity for the instructor to improve.)
When we asked the Goldilocks question about pace, it wasn’t because we believed they knew more about pacing than we did. We wanted to hear how they felt about the pace. And if the reported score drifted significantly toward “too fast” or “too slow,” we’d decide to check further. (2,204 Session 1 evaluations, by the way, put pace at 3.2, where 1 was “too slow” and 5 was “too fast.”)
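(A side note for the spreadsheet-minded: here is a minimal sketch of that vital-signs check, written in Python. It’s purely illustrative; the data layout, names, and thresholds are assumptions made for the example, not anything we actually used on the project.)

# Illustrative sketch of the "vital signs" heuristic: average the
# end-of-day scores class by class and flag any class that deserves
# a closer look. All names and thresholds here are assumptions.

def flag_classes(evaluations, skill_floor=3.0, pace_midpoint=3.0, pace_drift=0.5):
    """Return {class_id: [reasons]} for classes whose averages look off.

    evaluations maps a class_id to a list of (skill, pace) pairs:
    skill on the 1-to-4 self-report scale, pace on the Goldilocks
    scale where the midpoint means "just right".
    """
    flags = {}
    for class_id, responses in evaluations.items():
        skill_avg = sum(skill for skill, _ in responses) / len(responses)
        pace_avg = sum(pace for _, pace in responses) / len(responses)
        reasons = []
        if skill_avg < skill_floor:  # below ~3 often pointed to the instructor
            reasons.append(f"self-reported skill averaged {skill_avg:.1f}")
        if abs(pace_avg - pace_midpoint) > pace_drift:  # drifting off "just right"
            direction = "too fast" if pace_avg > pace_midpoint else "too slow"
            reasons.append(f"pace drifting {direction} ({pace_avg:.1f})")
        if reasons:
            flags[class_id] = reasons
    return flags

# Example: one healthy class, one worth a closer look.
scores = {
    "session1-A": [(3.5, 3.1), (3.8, 2.9), (3.2, 3.0)],
    "session1-B": [(2.4, 4.1), (2.8, 3.9), (2.6, 4.2)],
}
print(flag_classes(scores))  # only session1-B gets flagged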
Naturally, to keep in good standing with the National Association for Smile-Sheet Usage, we had free-form comments as well. We asked “What did you like best?” and “What did you like least?” (In earlier phases of this project, we asked them to list three things they liked and three they didn’t. Almost no one listed three. When we let them decide for themselves what they wanted to list, the total number per 100 replies went up.)
Early in the project, our client services team sat around one evening, pulling out some of the comment sheets and reading them aloud. It was my boss at the time who found this gem, under “What did you like best?”
My instructor made me feel
safe to be dumb.
Everybody laughed. Then everybody smiled. And then everybody realized we had a new vision of what success in our project would mean.
We wanted the learners to feel safe to be dumb. Safe to ask questions about things they didn’t understand. Safe to be puzzled. Because if they felt safe, they felt comfortable in asking for help. And if they felt comfortable asking, that meant they felt pretty sure that we could help them to learn what they needed to learn.
What about weaving their feedback into the instructional design? In general, newcomers to a field don’t know much about that field, which means they’re not especially well equipped to figure out optimal ways to learn.
Please note: I am not at all saying newcomers can’t make decisions about their own learning. In fact, I think they should make ’em. In a situation like this, though, my client wasn’t the individual learner. It was (fictionally named) Caesar International, and it had thousands of people who needed to learn to apply a new sales-force system as efficiently as possible.
Mainly procedural skills. Low familiarity with computers, let alone these particular applications. High degree of apprehension.
(By the way, Ward Cunningham installed WikiWikiWeb online eight months after our project ended, so don’t go all social-media Monday-morning-quarterback on me.)
I felt, and still feel, that our design was good. So did the Caesar brass: within six months of the end of the project, Caesar saw a nearly 25% increase in market share for its #1 product, and the honchos said that resulted from successfully training the reps to use the new sales software on the job.
When you feel safe to be dumb, you don’t stay dumb long.
CC-licensed images:
Yes / no assessment by nidhug (Morten Wulff).
“Cover-the-content” adapted from this photo by antwerpenR (Roger Price).
The safe-to-feel-dumb thing is huge. I’ve worked on a couple of projects for a very beginner-level audience and had oddly similar conversations with the clients:
—
Client: It’s like xyz for dummies, but of course we can’t call it that.
Me: Why not? (Aside from the obvious trademark issues)
Client: People would be offended if we called them dummies.
—
Which I find funny, because those books have sold how many gazillions? I’ve tried to explain that the dummies books promise exactly the opposite — they promise to NOT make you feel stupid.
Try to learn about wine as a beginner from Wine Enthusiast magazine, and you are going to constantly be confronted with information you don’t have context for, and it’s going to make you feel dumb. Wine for Dummies, on the other hand, promises that you won’t get any vocabulary or concepts that you aren’t prepared for. And, if you do know a little of it already, you actually get a little flush of superiority as an added bonus.
Julie,
Among the contract instructors for our Caesar International project was a very bright guy who’d been head of a computer user group in the area and who regularly taught beginning classes for the type of computer the sales reps were learning. Despite our coaching and the example of our many successful instructors (whom he observed and team-taught with), he didn’t quite work out. At one point, a sales rep said to him, “If I ask you a question, will I get an answer?”
We didn’t keep him long.
Speaking of wine, I read Andrea Immer’s Great Wine Made Simple. Frankly, I didn’t care for her style, which was a bit too peppy for me. But her approach — that was something else. To help you understand oaked versus unoaked, for example, she gave lists of chardonnays in several price ranges. Buy one from the oaked column, one from the unoaked. Drink some of each. The difference is the oak. Clear, direct, practical, and tailored to the newcomer’s perspective.