From Dean Shareski’s Ideas and Thoughts blog, an energizing presentation by Chris Lehmann. He’s the principal of Philadelphia’s Science Leadership Academy, working under a time crunch (20 slides, 15 seconds per slide), and having a great time.
I found myself connecting what he says to the world of work; I’ll keep that to myself till after you hear Chris. (Note: You can’t see all Lehmann’s slides, so I’ve posted them below the video clip.)
I actually stopped the video a few times to scribble stuff down.
“Good data costs a lot more than we want to spend.”
That’s true for schools, and it’s also true in the world of work. There’s a lot of lip service paid to Kirkpatrick’s levels and to ROI, but in reality we can’t afford to assess everything at Level IV. And if we do a full ROI assessment on whether to devote a day and a half of our own time to learning some new technology, we’re going to end up getting to spend more time with our families.
I absolutely believe in the value of data; it’s a requirement for performance improvement. But as I listened to Chris Lehmann, I realized that often we’re in great shape if we have good enough data. Claude Lineberry (as energetic a guy as Lehmann) hammered home the point that businesses don’t do control groups. Some data, carefully chosen, is a hell of a lot better than no data, which is what many people run with all the time.
“Tests and quizzes as dipsticks…”
When I get gas for my car, I always get a fill-up; I calculate the mileage and record it in a booklet I keep in the glove compartment. This is a kind of dipstick — it’s one stream of data that I can glance at, and if I see a variation from my car’s typical performance, then I go looking for more data and for causes.
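That glove-compartment log can be sketched as a tiny script. This is only an illustration of the dipstick idea, one stream of data glanced at, with a closer look triggered only by a deviation; the fill-up figures and the 15% threshold are my own assumptions, not anything from the post.

```python
def mpg(miles_driven, gallons):
    """Miles per gallon for a single fill-up."""
    return miles_driven / gallons

def check_dipstick(fillups, threshold=0.15):
    """Flag fill-ups whose mileage deviates from the running average
    by more than `threshold` (a fraction). Returns their indices."""
    flagged = []
    history = []
    for i, (miles, gallons) in enumerate(fillups):
        value = mpg(miles, gallons)
        if history:
            typical = sum(history) / len(history)
            if abs(value - typical) / typical > threshold:
                flagged.append(i)
        history.append(value)
    return flagged

# Illustrative records: (miles driven, gallons). The last one is off.
fillups = [(300, 10), (310, 10), (305, 10), (240, 10)]
print(check_dipstick(fillups))  # [3]
```

The point of the sketch is that the dipstick itself is cheap; only the flagged reading sends you looking for more data and for causes.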
Lehmann is pushing back against treating tests as goals. As he talked, I thought of the painful annual corporate ritual, the performance review. More than once in my career, I was asked to create a list of what I’d done so my boss could “update” my goals. In other words, I was working backward from accomplishments to goals.
Which, I suppose, is better than being slammed for not doing stuff people forgot about nine months ago. The Platonic ideal, where you and your manager (or, you poor schmo, your “leader”) regularly look at what you’re doing, what you’re getting done, and what needs to get done: I don’t know how often that happens, but when it does, it’s the dipstick model in action.
“You want to see what kids have learned, give them a project.”
As Lehmann points out, we adults learn when we’re trying to solve something, which means we’re trying to achieve a result. A depressing amount of corporate “learning” involves passive reception: listening to presentations, clicking through page-turners, reading documents. Nothing happens, which means probably no new neural connections are forming, and few old ones are getting stronger.
Working on a specific outcome probably leaves gaps in your learning. You can hear someone saying, “Okay, great, you got the web page menus to work entirely with CSS, but you don’t know how to do A, B, and C.” There are two different assessments there: was the point to get the menus working, or to do A, B, and C?
I contend that much of the time, getting X accomplished is the way to go. If afterward you feel you don’t have the right result, you go back and redefine X. I have seen perfectly harmless people subjected to a one-hour lecture on the step-by-step telephone switch, only to learn afterward that their telephone-company employer did not actually own any step-by-step switches; the last one had been replaced more than 10 years before, by computers.
But it was “good for them” to learn about the switches.
2 thoughts on “Schools: dipsticks and demonstration”
Great post, Dave – what really brought the message home for me was “you want to see what kids have learned, give them a project”.
In face-to-face training there are many activities (games, theatrical projects, poster presentations, etc.) that play an important role in knowledge retention. Unfortunately, these types of activities are often not included in e-learning. You have got me thinking now . . .
Matthew, for me there’s been a logical and steady progression: training → learning → performance.
The first centers much more on the content and the instructor (even if it’s computer-based); the second, more on the individual; the third, on the individual at work.
The challenge is reducing the problem of transfer — or, even better, connecting the learning experience so clearly and directly to the job that “training” doesn’t seem like an outside event.