Complex learning, step by step

This entry is part 1 of 21 in the series Ten Steps to Complex Learning (the book).

I’ve been (slowly) reading Ten Steps to Complex Learning, by Jeroen J. G. van Merriënboer and Paul A. Kirschner.  The subtitle explains why: A Systematic Approach to Four-Component Instructional Design.

I read a lot about the death of instructional design, the end of training, and the New Jerusalem of learning that’s due any day.  Certainly a lot of superstition and nonsense gets daubed with the label “instructional design,” like a kind of cognitive Clearasil.  Still, I can’t help thinking that few people are going to learn to manage power-generation stations, conduct clinical trials, sell aircraft engines, or produce FEMA-acceptable flood elevation certificates solely through self-guided learning.

So I decided to plow through this book, which I’ve described with a bit of humor as being written in a language very much like English: the prose is dense, and very academic.  So far it’s worth the effort, and I’m going to summarize key parts here.

(Key part: something I pay enough attention to that I make a note on paper as I’m reading.  This is an ancient custom among my people.)

Van Merriënboer and Kirschner aren’t shy:

The fundamental problem facing the field of instructional design these days is the inability of education and training to achieve transfer of learning.

Which is something like AIG not being able to actually insure anything, isn’t it?

One point the authors make is that most complex skills require the learner to coordinate a range of “qualitatively different constituent skills.”  That last phrase is important to them: not only is the whole of a complex task more than its parts, but the constituent skills are not parts of the larger task so much as aspects of it.  They’re not sub-skills that you add together to make up the Big Skill.

Which, they argue, makes the analytic approach of much traditional instructional design counterproductive.  For example, what they call the transfer paradox comes into play: the instructional methods that work best for isolated objectives often work poorly for integrated objectives.

To make that plainer: we spend too much time fiddling around with nice, clear, low-level objectives.  Then we lack time and money (and, perhaps, the will) to develop integrated learning.  Then we wonder why the training/learning function has such a dismal reputation.

But those isolated ones are what we tend to grab onto, because it’s easier to design around them, easier to create test items, and easier to cram them into an LMS (“Lessons Mean Simplicity”).

Learning to use the Amtrak reservation system is a complicated task, but maybe not all that complex.  Learning to act on traveler’s questions is also complicated.  Developing training for either set of skills is inherently less difficult than developing holistic training for an effective Amtrak reservation agent–but that’s what Amtrak’s really looking for.

The usual answer to the problem often seems to be “watchful waiting.” The performers go from training to the job, and we hope that their random encounters with reality end up filling the gaps.

Van Merriënboer and Kirschner want to grapple directly with such complex learning problems.  The model they advocate sees four main components to a learning blueprint:

  • The learning tasks that someone needs to master.  (Strictly speaking, I’d say these are the on-the-job tasks which the person currently doesn’t know how to do, but it’s not my model.)
  • The supportive information that comes into play when you’re working with skills that are performed differently from problem to problem.  These skills, which they call schema-based, benefit from things like mental models of the overall domain (e.g., pharma research) and cognitive strategies for working in that domain.
  • The procedural information that guides those skills that are performed the same way from problem to problem.  This is the how-to knowledge (e.g., using the clinical trials database) that’s a routine part of the overall task.
  • Part-task practice to strengthen and automate certain “recurrent constituent skills.”

Van Merriënboer and Kirschner argue that people can only perform certain constituent skills (which are aspects of the larger task, remember) if those people have a certain level of knowledge about the larger domain.  “Select an appropriate database,” as they point out, doesn’t make any sense if you don’t know what makes databases appropriate to the search you’d like to perform.

To foster integration and avoid compartmentalization, their model includes an emphasis on inductive learning: you as the learner work with specific problems so that you build and improve mental models of the principles behind them.

…all learning tasks [should] differ from each other on all dimensions that also differ in the real world, such as the context…in which the task is performed, the way in which the task is presented, the saliency of the defining characteristics, and so forth.  This allows the learners to abstract more general information from the details of each single task.

That’s how we learn a great deal of what we know.  And, yes, a good deal of that happens informally, though I don’t see that as an argument against trying to create learning situations where the informal can happen more predictably and more rapidly.

Related to this idea, the authors advocate always having learners work with whole tasks.  That might mean starting with simple cases or examples.  Other approaches include providing support (say, a process overview for the clinical-trial system) and guidance (a job aid for forming database queries).  They also make use of task classes, by which they mean categories of tasks.  In their ongoing database-query example, one task class covers searches where the concepts are clear, the keywords appear in a specific database, the search involves only a few terms, and the result includes a limited number of articles.

I’d call that the “clear, simple search” class.

You can imagine the other extreme: a poorly phrased request involving unclear concepts, with little knowledge of the appropriate databases, calling for complex search queries and producing large numbers of relevant articles which require further analysis.

How many task classes do you need?  That seems to depend on the range of variation between the Clear Simple Search class and the Nightmare Search class.
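If it helps to see that in more concrete form, here’s a minimal sketch (my own illustration, not anything from the book) that treats a task class as a set of values along the dimensions in the database-query example, and orders the classes from the Clear Simple Search end toward the Nightmare Search end:

```python
# A rough sketch, not from the book: a task class described by the dimensions
# van Merriënboer and Kirschner vary in their database-query example.
from dataclasses import dataclass


@dataclass
class TaskClass:
    name: str
    concepts_clear: bool      # is the request phrased around clear concepts?
    database_known: bool      # do we know which database(s) to search?
    few_search_terms: bool    # does the query involve only a few terms?
    small_result_set: bool    # does the search return a limited number of articles?

    def difficulty(self) -> int:
        """Count how many dimensions sit at the hard end of their range."""
        dimensions = (self.concepts_clear, self.database_known,
                      self.few_search_terms, self.small_result_set)
        return sum(1 for d in dimensions if not d)


clear_simple_search = TaskClass("clear, simple search", True, True, True, True)
nightmare_search = TaskClass("nightmare search", False, False, False, False)

# Learning tasks get sequenced from the easiest class toward the hardest.
for tc in sorted([clear_simple_search, nightmare_search], key=TaskClass.difficulty):
    print(f"{tc.name}: difficulty {tc.difficulty()}")
```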

There’s a lot more going on; without intending to, I guess I’m starting another series.

Figuring things out (the plodding edition)

This entry is part 1 of 3 in the series Figuring Things Out.

Here’s how Joe Harless helped me figure things out.

When I took his Job Aid Workshop, he recommended a technique for analyzing tasks.  Joe called this paradigming.  It works best on procedural stuff, though I’ve also used it to find my way around very complicated systems.

Section 1:  complex theoretical discussion

Each step in an on-the-job task has two parts.  (I like to say I’m a Reform Behaviorist, so you might see some behavioral-psych roots here.)

  • The stimulus, meaning the starting state.
  • The response, or what you do when you perceive the stimulus.

As in:

  • S: email from Queen Elizabeth (notice: it’s a noun–a thing)
  • R: open email (a verb–an action)

The response leads to a new stimulus (opened email), which calls for a new action.

  • S: opened email (mail that has been opened—Ss are always nouns)
  • R: decide next action

“Decide?”  Yes; I use that a lot.  It’s a good launch pad for chains of activity or decisions.

  • S: “Read text” (the quotes show it’s a decision—a noun)
  • S: “Open attachment”
  • S: “Forward message”

And so on.  More later; one more complex theoretical idea awaits:

In paradigming, there are only three kinds of steps:
the basic chain, the discrimination, and the generalization.

Section 2: three examples

The basic chain is simply a sequence of actions with no decision making.

[Diagram: a basic chain, a sequence of S-R steps with no decision making]

At a higher level, of course, you might collapse a lot of behavior into a single step:

  • S1: Vacant land — R: purchase land.
  • S2: Purchased land — R: construct 12,000-square-foot house.
  • S3: Completed house — R: furnish tastefully.

Sometimes, you need to distinguish between different stimuli that each call for a different response.  That’s a discrimination.

[Diagram: a discrimination, where different stimuli each call for a different response]

Remember, this is a step in a larger process.  In the example, the previous step might have been “receipt from purchase” and the response might have been “identify form of payment.”

What you’re discriminating among here are the different possibilities for form of payment.  I left some out because the image would get too large, but you’d put in as many stimuli as exist: check, debit card, form not legible, and so on.

The third kind of behavior is generalization: you have more than one stimulus and they all lead to the same response.

[Diagram: a generalization, where several stimuli all lead to the same response]
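Since these three step types are really just small, regular structures, here’s a minimal sketch of how you might capture them while analyzing a task.  It’s my own illustration, not anything from Joe’s workshop materials, and the specific stimuli and responses are made up to fill out the examples above:

```python
# A rough sketch of paradigming as data: steps are stimulus-response pairs,
# and the three step types (chain, discrimination, generalization) are just
# different shapes those pairs can take. Names and examples are mine, not
# Harless's.
from dataclasses import dataclass


@dataclass
class Step:
    stimulus: str   # the starting state (always a noun)
    response: str   # what you do when you perceive the stimulus (always a verb)


# Basic chain: a sequence of steps with no decision making.
# Each response produces the stimulus for the next step.
chain = [
    Step("vacant land", "purchase land"),
    Step("purchased land", "construct 12,000-square-foot house"),
    Step("completed house", "furnish tastefully"),
]

# Discrimination: different stimuli, each calling for a different response.
# (The responses here are hypothetical fill-ins for the form-of-payment example.)
payment_discrimination = {
    "cash": "count cash and make change",
    "check": "verify ID and accept check",
    "debit card": "run card through terminal",
    "form not legible": "ask customer to clarify",
}

# Generalization: more than one stimulus, all leading to the same response.
generalization = {
    ("email from traveler", "voicemail from traveler", "walk-up question"): "log the inquiry",
}


def show_chain(steps):
    """Print a chain so the S -> R hand-off is visible."""
    for step in steps:
        print(f"S: {step.stimulus}  ->  R: {step.response}")


show_chain(chain)
```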

Section 3: So what?

For me, paradigming offers several payoffs:

  • It’s a great way to track down loose ends and uncertainties in a complicated process.
  • It ensures that I don’t forget about something that puzzled me.
  • It magically becomes the scaffold for job aids.

I’ll create an example or two of paradigming in action, and of that scaffolding, in another post.

Rummler and Brache: Improving Performance

This entry is part 1 of 4 in the series Improving Performance (the book).

Following the recent death of co-author Geary Rummler, I’m reading Improving Performance: How to Manage the White Space on the Organization Chart.  This post is the first in a series based on that book and on the implications of that white space.

I’ve read a lot of what Rummler wrote; I took the Performance Analysis Workshop he and Tom Gilbert designed; I was lucky enough to be invited to a Rummler-led session for PAW grads (where I saw among other things a professional-development system for officers on an ocean freighter).   I often work within a single department of a client organization and often with people on the front line, where the organization meets its external customers.  Going through this book, as they say in Congress, “revises and extends” my viewpoint.

Geary Rummler and Alan Brache argue that true performance improvement demands a systematic view of the entire organization.

The traditional view of an organization is the organization chart.   Take a look at one for an organization you know well.   You see the CEO or other chief honcho; the main levels of the chain of command; the prime departments.

What’s missing?   Customers.   Products and services.   The processes that produce the products and services.

In small or new organizations, this vertical view [the traditional organization chart] is not a major problem because everybody in the organization knows each other and needs to understand other functions.   However, as time passes and the organization becomes more complex, as the environment changes, and as technology becomes more complicated, this view of the organization becomes a liability.

Traditional organizations lead to silos built around departments.  Silos make it nearly impossible to resolve interdepartmental issues at low or middle levels.  Functions get better at meeting their own goals (“manufacturing hit its numbers”), but that doesn’t necessarily help the organization as a whole.

As the authors emphasize many times, the greatest opportunities for performance improvement often lie in the functional interfaces — those points at which the baton (for example, production specs) is being passed from one department to another.

An organization chart shows who and with whom, but not the what, why, and how of the business.   In real life, bosses are often managing the organization chart, not the business. Rummler and Brache see a different ideal:

A primary contribution of a manager at the second level or above is to manage interfaces.  The boxes already have managers; the senior manager adds value by managing the white space between the boxes.

Organizations are adaptive systems.  Chapter 2 lists 10 features of the organization as a system.  For example, it converts various inputs into products and services, guided by internal criteria and feedback as well as by feedback from the market.

Any organization that survives, they argue, has adapted — but the health of the organization depends on how well it’s adapted.   “18 months after Peters and Waterman published their list of excellent companies, one third of them had dropped off the list.”

What messages do I draw from this?

  • If an organization’s an adaptive system, so are its components.
  • What matters is first what gets accomplished, and then how that happens.
  • In an organization, learning (in the broad sense) and training (in the focused sense) need to connect to both individual and organizational needs.

More than anything, the value of that systemic view comes through in an interview ASTD conducted with Rummler last year:

(ASTD) What are some of the things that currently frustrate you about the learning and development profession?

The same thing that frustrated me 45 years ago–the fact that it’s a solution in search of a problem. People have developed all this wonderful stuff around learning and development, and it’s become a thing in and of itself rather than something that exists to help people be more effective in their jobs.

Bad management makes it worse because managers read the magazines, see the fads, and call the training people to say, “I want us to try this.”  There’s no corrective force in that relationship. In fact, training has become in many ways the enabler for bad management because now the default solution is to fix the people. You’ve got vendors inventing things, business publications promoting them, managers reading them and thinking they should be doing this, and the training department going along with it all too eagerly. It is a whole business.

“Fix the people” isn’t all that far removed from “teach them what they need to know.”

Memory, learning, and great-uncle Gillies

This entry is part 1 of 9 in the series The Brain Rules.

This post is part of the Working/Learning blog carnival for April, 2008, hosted this month by Manish Mohan, who blogs at Life, the Universe, and Everything about eLearning and Content Development. It’s the second run of the carnival; the first was in March 2008.

I’ve been reading John Medina’s Brain Rules. I’m also trying to relate his rules to learning and to things that affect my work. In other words, using his rules as a framework, what can I do with them?

I’ve decided to start with rule six, “remember to repeat.” Why this one? Because last Wednesday was the 262nd anniversary of the Battle of Culloden.

‘Twas love of our prince drove us on to Drumossie
But in scarcely the time that it takes me to tell
The flower of our country lay scorched by an army
As ruthless and red as the embers of hell…

Although I don’t weep over the defeat of Bonnie Prince Charlie, neither do I let April 16 pass unnoticed. Why is that?

Medina writes about how we move information from short-term to long-term memory. Nothing much new: repetition and restatement. One of the principles that we know (but don’t always capitalize on) is spacing out the input. Or as I like to call it, three times 20 is more than 60.

If you’ve got a given amount of time to learn something, you’ll almost certainly learn better and more thoroughly by spacing out your exposure. Instead of cramming for two hours, try four sessions of 30 minutes each. As the descendant of Scottish Highlanders, I’ve certainly spaced out my exposure to stories of the Jacobite rebellions and songs about “The ’45.”

Medina also says that when information is retrieved from long-term memory, it’s not fixed as if it were a book pulled from a library shelf. It’s almost a repetition of the initial learning — the information is once again labile, malleable, something we can re-work.

That means when it’s re-stored, it’s been changed. Not always leading to greater accuracy.

Which brings in my great uncle. Actually, Gillies Mhor MacBain is my great-great-great-great-great-great-grand-uncle, if I can trust a genealogical history called The Mabou Pioneers. Gillies fought for Prince Charlie and died at Culloden.

Google his name, and you’ll find dozens of accounts saying that he was 6 foot 4, that he killed at least 13 redcoats, and that an English officer tried in vain to have Gillies spared because of his bravery.

Who knows what really happened? The story of Gillies MacBain has been told and retold. Details were lost on the battlefield and over the years; without a doubt, new details have been supplied. They’ve altered the cultural memory the way recall and reconsolidation can alter your personal memory.

Over time new information in the brain reshapes what’s already there. We can “remember” things that never happened.

That suggests things we can do, in the world of learning at work, to increase the value of that reworking and reconsolidation. Focus the learning on what’s important to the job, for example. Create support and structures to ease recall and increase accuracy.

Think hard about questions like:

  • What’s our rationale for a three-day workshop?
    • Does it make sense to firehose information this way?
  • If we must have one, how do we design for spaced input?
    • Can we break up topics and interweave them?
  • Are we focusing on tasks rather than on content?
    • Even (or especially) for concepts and principles, can we provide opportunities to work with them, apply them in job-relevant contexts?
  • How do we design, create, or organize information externally to make it easy to retrieve and apply as needed?

I spent more time than expected thinking through this post as I was writing it. While I don’t see Medina’s brain rules as the fulcrum of all knowledge, I like the idea of trying to apply them to the blog carnival themes of “work at learning; learning at work.” So I think this post will be a first in a series based on Medina’s rules. Feel free to chime in.

Old book photo by alpoma / Alejandro Polanco.
Brain funnel image by Beth Kanter.

Intro to “Software Training Isn’t”

This entry is part 1 of 1 in the series Software Training Isn't.

Many projects I’ve worked on have involved helping a client’s employees learn computer applications as part of their job — sometimes off-the-shelf applications like a word processor, often large custom applications like Amtrak’s reservation system.

A fundamental principle for such projects: software training isn’t.

Yes, the client wants training. And typically the learners don’t know how to do their work with new or upgraded software. So there’s a genuine skill-knowledge gap.

But that gap isn’t between the learner and the software; it’s between the learner and his work. Or as Tom Gilbert would say, between the worker and his accomplishments — the results he produces on the job.

This is more than a semantic difference; I think it’s a beacon for helping people learn.

This post introduces a series I call Software Training Isn’t. I’ll talk about things my coworkers and I learned on a huge project for Caesar International (my fictional name for a Fortune 100 consumer-products company). The series will include:

  • The Intro: you’re reading it.
  • Water-Ski with Caesar: delivering 12,000 student-days of training in four months.
  • A Computer, Not a Way of Life: mastering basic skills without ever hearing the word “binary.”
  • They’re Applications; Apply Them: “How to Use Excel” isn’t an objective, it’s a sentence fragment.
  • The Way Things Work: narrowing the gap between the classroom and the real world.
  • She Doesn’t Work for Us?: preparing and sustaining effective instructors.
  • Liking, Learning, Using: tracking progress and demonstrating success.

You may know the adage, “Good judgment comes from experience. Experience comes from bad judgment.” I hope this series offers some experience you can use to enhance your own good judgment.

Note: I’ve fictionalized aspects of the client I call Caesar International, but not anything relevant to the backgrounds and abilities of the learners, to the design and delivery of the training, or to the results we produced. This was far and away the largest project ever undertaken by the Client Training team at GE Information Services, for which I was the chief instructional designer. It was also far and away our greatest success.