Driving Behavior Change with Workflow Learning

This blog is excerpted from the Performance Matters Podcast episode titled “Experience Matters | Sam’s Club” in which Bob Mosher interviews Jennifer Buchanan (Senior Director II, Field Learning & Development at Sam’s Club) about driving behavior change through a shop-floor workflow learning approach.

Bob Mosher (BM): We are honored to have a remarkable learning leader with us today, Jennifer Buchanan, who is the Senior Director of Field Learning and Development at Sam's Club (where I am a card-carrying member). It's a remarkable organization with great service, which is because of your good work, Jennifer.

You wrote a wonderful article for ATD, the Association for Talent Development, called “Training in the Flow of Work”, which is obviously a sweet spot for this podcast and for me. We'd like to dig a bit deeper into that during this session. The article pivots on this program you call “Manager in Training” (MIT is the acronym in the article). Give us an overview of the intent of the program and what motivated you to take this to the flow of work vs. a more traditional approach that is often used in classic manager training programs.

Jennifer Buchanan (JB): The Manager in Training program prepares our high-potential team leads for future careers in management with the company. The program takes them on a journey through the work groups in the club (Fresh, Members, Specialty, Merch, and Curbside). At the end of the program, through the flow of work training, they can receive up to a semester’s worth of college credit [through] our free tuition program at five different universities. 

What made us decide to do the program? When I first started, we began having conversations with different leaders and associates in the club (conducting focus groups). At the top of the list for everyone was the Manager in Training program. I heard things about it, but when we really started to dig in, we found about five or six different stop-and-start versions of the program. So, there was something making it unsustainable. The content itself was fine, but something wasn't sticking.

There was also the fact that it leveraged a 300-page binder and 170-page sponsor guide. We actually had (sitting beside my desk) all of the papers piled up. I'm five foot two, so not very tall, but that pile was taller than me. Wow. Right there, we knew we had to find a way to deliver it that was going to be more digitally enabled; something that was going to make the content more exciting and that was going to make it stick.

The turning point was realizing that it was very focused on tasks. It was almost like we weren't giving people credit: people can think critically, and we can open their minds. We could show this in a whole new light. That's when the content transformation started. Then, with the digital piece, it just seemed so obvious. We needed to integrate this into our existing digital ecosystem.

The key element in all of this is that we wanted associates to have an opportunity to immediately apply what they learned. That was really the gap with having that 300-page binder. I mean, those [binders] are in every company everywhere, in every role I've ever had. But really, the “click” is how do I create learning in a way that people can go out and practice it immediately and retain it?

BM: Love the “stickiness” aspect, and context is a huge pivot. We've talked to so many folks in this series who have said that content was almost never the gripe; rather, it was the lack of application and stickiness. To your point, I've walked through [Sam’s Club] stores and the digital footprint there is remarkable, so what seems intuitive to you makes it more interesting that we still haven't made that jump [to technology] in other areas.

You go into four guiding principles in your article, so would you take us a bit deeper into those? [Workflow learning] is such a transformational change for an L&D team or department.

JB: With the behavioral-based learning framework and our view on learning:

  • We want it to be personalized.
  • We want it to be a journey.
  • We want it to be customized.
  • We want to bring the associates closer to work.

When you put it down on paper, it's so obvious: yes, of course we should be doing this. But I think as L&D folks, sometimes it tends to be more [request based]:

“We would like training.”

“Okay, please fill out this needs assessment form.”

And then we go off and make training, right? Six months later, the business has flown by us and we are delivering outdated training. We really wanted to reengineer a new way to think about it. We knew if we started with the behaviors that we wanted to change, then we could try from there. When we frame it up that way, it makes a lot more sense to the business, but also to the team. I said to the team, “We want to be enablers of the business, right? We don't just want to be order takers.” If we really want to drive change, we ultimately need to track the behavior. We need to have deep conversations with the business about what behavior we are trying to change, and then let's drive from that behavior. Then we can create the training or the experience. Sometimes when you have those conversations, you realize you actually don't need training. You realize it might be a different issue. I could create the best training program in the world, but if it's a performance or talent issue, my training is not going to solve that problem. So, it's really getting down to the core of what it is. That is basically how we started to think about all of this.

Also, why wouldn't we do that for our own L&D team? If that's what we're doing for our associates, meaning we're trying to open their minds and encourage curiosity and critical thinking, as an L&D team, we need to evolve our way of thinking as well.

BM: Brilliant. From behavior back! We talk about pivoting on “apply” here all the time. It's amazing how for years in L&D, we've focused on knowledge first and then hoped that those binders and those amazing times in the classroom would somehow transfer to the workflow.

This behavioral-based learning framework—how does this work for you guys? Can you take us through sort of “a day in the life” for the learner?

JB: A couple things: first, we wanted to go from technical to more conceptual when we think about how we frame up the learning with this framework. Second, we wanted to first define the behavior we wanted to change, and then create a common language around that behavior. So, there was a lot of change management on the back end, before we ever got to creating any type of content. Our conversations with the business were along the lines of, “Put yourself in the future and nothing stands in your way. How is the world going to look different after they get this training, or they get this experience?” That's what enabled us to create the journey.

When you think about the journey itself on the floor, an associate will have in their handheld a landing page that recognizes them as an MIT associate. It's going to frame up their learning. It's going to track where they are in the program. It's going to show what type of activity they have: do they have an immersion, do they have an activity with their handheld, do they have a debrief, do they have a reflection? A concrete example would be that we have a digital voice response assistant called “Ask Sam”. To put the learning in the context of the Club, it might say, “Use Ask Sam to get a spec sheet on a product.” The associate would then have to pull up that spec sheet and evaluate it for accuracy. Perhaps they would have to show another associate how to do that, and then there would be a debrief and reflection at the end. They're basically mirroring—in the context of the Club—what their work would look like day to day.

Another example would be at the jewelry counter. We’re trying to do things to make learning more interactive and fun, right? If you work at the jewelry counter, you need to know the gemstones. Instead of just having five paragraphs about the different gemstones, how about we have a gemstone matching game? That's a lot more fun, and that's something that they can do right there in the flow of work to learn the information. So that's really how we brought it to life.

I think the other piece that I would say on this, and people can slap my L&D hand, is community over content. If you're able to create the community, the SMEs come along, the content will flow in, and you create that community of learners. That's what creates the buzz around the training. You really have to be able to do that to drive the behavior change.

BM: Yes, relevance and application. In my career, we used to have to beg SMEs for time to help us do our job and get our training built. But when you make this shift, there's buy-in. They understand the relevance. It's what they do every day, and it's about helping others do what they do, including themselves. Their desire to help and be a part of this is just a dramatically different thing.

So, getting people up to speed is one thing (on the gemstones and other things), but remaining competent in the world we live in today is crazy. The rate of change is exponential, especially in retail with supply chain issues. Given all the challenges they're faced with today, for employees to come into work every day on the shop floor and do what we call Apply, Solve, and Change (vs. New and More) is the hardest work it's ever been in my recollection. How does your solution flow into that part of the world and their work?

JB: First, it's integrated into the flow of work and into the digital ecosystem. Training is not a separate siloed area: it's part of the experience. Also, our framework is very similar to yours (maybe our words are a little bit different). We like to say Activate, Apply, Demonstrate, and Integrate throughout the journey and throughout the behavioral learning framework. The way that is organized on the back end is where that contextualization and curation of all the content comes into play, because that's how you keep that content organized and relevant. You don't want an ecosystem of content that you can't keep track of, and we try to think of that within the context of the Club, within all the work groups of the Club. The connectivity of those pieces allows us to keep content up to date. I would also say, just walk the floor with your SME, so they have the same experience [as learners] and really put themselves in the shoes of the associate, because that's how you really see the impact of what you create. 

BM: Let’s go a bit deeper. You used two great words: curation and maintenance. When it comes to traditional instruction, we've got almost a waterfall design. We have iterations of our content (e.g., versions 1/2/3, alpha/beta, 100/200/300—all this kind of stuff). It's almost an academic model, frankly. But we're talking about a whole different world here. We're talking about things at the moment of need in the immediacy of the day. How did you help your L&D department and those you engaged in this journey keep content current in your organization? It's a very different way of thinking and it blindsides most L&D folks when they cross into workflow learning because of the immediacy with which a lot of this content needs to be kept current. It almost swamps their ship in maintenance alone. How did you do this differently?

JB: You can't own everything within the actual content. Instead, what you own is how you evolve that [content] into a training and make that a great experience on the floor for the associate—in partnership with the SME. I think of that game “telephone” where one person says something, and then it goes down the line to others, and when the last person says it, it's not the same thing anymore. That's what happens if you don’t have a partnership.

Also, with the SME part of it, it's back to that piece around keeping things within the context of the Club. All of that training needs to fit within those work groups of the Club, and that enables the associate to cross train into those other areas of the Club—because we have all that connectivity between the content, and we can see where all the pieces fit together.

In L&D right now, because of what's happening in the outside world, we are having to immediately pivot and create content. I didn't have a crystal ball, so I didn't know about COVID. What was interesting is that it changed the way members wanted to shop: they didn't want to come into the Club. So then curbside pickup was created, and we needed training for that, which needed to be integrated into our digital tools. I think no matter how quick you are on your content development models, you still really have to pay attention to what's happening outside in the environment around you to pick up on the signals and have the strategic foresight to plan. You may not know exactly what you're going to create, but you need to know that you're going to need that bandwidth in the future.

BM: You know Jennifer, there are so many things you did right here that set you up for success. When we’ve talked to other leaders since COVID, the ones that have done really well were doing workflow learning beforehand (like you). That shift to immediacy and the way you're so in tune with the flow of the business, plus your ability to adapt and go with it in a more agile (if I may use that word) way, I think puts you way ahead of the game.

Listen to the full episode for Bob and Jennifer’s complete conversation!

Subscribe to The Performance Matters Podcast to stay up to date on all the latest conversations and guests in the 5 Moments space.

Visit our website for additional resources: workshops and courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper.

Attend our upcoming Summit.

Copyright © 2022 by APPLY Synergies, LLC
All Rights Reserved.

Why Technology Matters with Workflow Learning

This blog is excerpted from the Performance Matters Podcast episode titled “Why Technology Matters with Workflow Learning” in which Con Gottfredson, Ph.D., Rw.E. invites his colleagues Carol Stroud and Sue Reber to discuss the critical role that technology plays in enabling the 5 Moments of Need framework.

Conrad Gottfredson (CG): Our focus is making sure that organizations can implement the 5 Moments of Need framework and enable workflow learning to ensure that people learn to perform. That is solved through what we call a Digital Coach, which makes sure that within 2 clicks and 10 seconds, performers have what they need to do their jobs. Also, there is Targeted Training—only for tasks with a Critical Impact of Failure that merits stopping work to learn. All of this is enabled through technology. Today, we want to help you look at technology through the framework of the 5 Moments and figure out how to see it all. Carol, I'd like you to briefly introduce yourself and then tell us how we can look at this technology landscape with all that's going on and make sense of it.

Carol Stroud (CS): That's a big question, Con. Hi, my name is Carol Stroud and I've been working with this methodology for almost 14 years, working closely with Con on lots of different projects, but also being out and about working to implement it in different organizations. I've certainly experienced many kinds of environments in terms of the different puzzle pieces that fit for each one. From there, I started to understand that I needed to learn the methodology. I did that and then found that once I started to apply it in an organization, there was a ripple effect. It wasn't just about doing the methodology, because the outcomes were so different: I wasn't doing eLearning anymore. It was a very different focus. I had to figure out how we could actually implement the methodology across a variety of different areas. One thing that we always hit upon was the technology issue. How well did it fit in an organization? Did they have what they needed? What did we actually need to implement in order to produce a successful solution?

With all that experience and working with Sue and Con on different projects, we decided to put together an implementation framework for the 5 Moments of Need. That framework—captured on strategic, tactical, and technical levels—helped us start to wrap our arms around what implementation really takes. Then, another layer went on top of that implementation framework, which was a maturity model that lets us help people figure out where their organization is in terms of maturity and implementation. There are four levels from the very beginning to the very high end, and descriptions across those four levels to explain what it looks like when you're at the very beginning of the process, and what it looks like when you're rockin’ at the other end.

Technology is a huge piece of that conversation. In our process of putting together what we call “technology ecosystems”, Bob Mosher, Con, Sue, and I had some spirited conversations about how to break those down. We came up with three main areas to address when we talk about technology:

  1. Content, solution development, and maintenance. How do you create your content and maintain it?
  2. Delivery and optimization. How do you actually deliver the content or the solution and optimize it?
  3. Track, measure, report. How do you approach these key elements?

We carved out those areas in this way because when we look at the maturity levels of organizations, in some cases, all people can do is produce content. Maybe it sits in a PowerPoint and maybe it’s delivered via email once it becomes a PDF. Even though it’s low-tech, it still provides task-level support; it still works and can be effective. This example gives us some of the background as to why we look at the three different areas: content development, content delivery, and how to actually track, measure, and report.

We put these lenses on top of all sorts of different technologies. As Con said, initially, we were talking about a Digital Coach and Targeted Training. But as we started to learn more and talk to different people, we figured out that there's way more to this (things like adaptive learning and different tools we can use to embed learning in the flow of work, which make for a comprehensive series of Venn diagrams that we've put together). Our overall intent for this work was to put some categories together and group certain types of technologies into, for example, a Digital Coach category or an LCMS category, etc. We wanted to map how all sorts of different technology comes together and overlaps in a full 5 Moments of Need solution. We just needed that foundational framework for us to be able to speak the same language. Often, when talking about technology, people have their own perspectives and might assume that someone means only “this” (a very high-end, integrated solution) when in actual fact, the organization might not have that capability (they only have a very low-end, unintegrated solution). The question becomes, “How can we produce good solutions using ALL types of technology?”

CG: Technology is important. Certainly, it's vital. But not all organizations have access to all the technology they need to do everything they want to do. One of the great advantages of working with Sue Reber is that she's always looking, as Carol does, at how to implement and take advantage of the technology an organization already has. Sue, as you have worked with organizations and considered all the technology out there, what's the greatest challenge you see as organizations work to step into 5 Moments of Need capability (and to get the technology in place to help them do that)?

Sue Reber (SR): Organizations have this tendency to get wrapped up in the technology: what is it that we're going to build this 5 Moments of Need solution in?! And that can really hurt you when you're trying to come up with a good solution, because the methodology is really based around a workflow, right? You can use any technology. You just need to start with what you have available and go from there. You don't need to be thinking about technology before you've actually implemented the methodology and applied it to get to your workflow solution.

CG: Carol, I remember watching you at a time when there was no technology, and you took the methodology and solved the problem with a book of answers.

CS: I did live that dream, but it speaks so directly to what Sue was saying. We were pushing for a technology solution and designing and working that way, but about six weeks out, it became apparent we weren't going to get there. I still had the task of providing support to people at the time they were going to need it, so I looked at the design of what we had put together (intending for it to be put into the technology) and decided we could actually create a print version. We had a workflow. We had tasks identified. We knew what people needed to do and what they needed to know about to go ahead and do it. So, within those six weeks, we were able to put together a print solution, which was still effective because it followed the methodology. It supported workers during the opening days of their new hospital. Interestingly, it quickly went from print to a digital version as a PDF that went onto their SharePoint site. Gradually, as the organization matured, different technologies were used. But that's a great example of “start where you are.” Have a solid design based on the methodology and the Rapid Workflow Analysis, and then learn to build as you move through those stages of maturity in the implementation framework.

CG: Sue, if I want to build a Digital Coach and you tell me that methodology is the key, what do I need to have? What do I need to be able to do with that methodology using technology?

SR: You need to be able to support a workflow. We have a workflow map because everything is based on the workflow with our methodology. And through that workflow, within 2 clicks and 10 seconds, we can get to the support for every task, and all the support for a task is available from the task itself. There is also immediate access to any supporting resources for that task. This structure is going to be consistent. And the other critical piece is being able to support all different audiences. Whether someone is an expert or just starting out, we want to make sure to get them the support they need as quickly as possible so they can get right back to work. If you think about it that way, the technology itself is not as important. And you know, Carol talked about a print version. I've built Digital Coaches in PowerPoint, we've built Digital Coaches in SharePoint, and we've used a whole host of different technologies (from low tech to high tech) to build effective support. But the key thing is that it's all built around the workflow and what people do on the job. Ultimately, you will evolve to where you're employing software designed specifically to help you create, maintain, and deliver a Digital Coach. But what we want you to understand is that it's methodology that matters. You can start from where you are. You can look internally at technology, and you can prove it out and grow. As your functional requirements increase, you can look at other technologies, right?

CG: Yes, but those technologies must be founded upon the right methodology. Is that what you're talking about, Sue?

SR: Yes. They're not the driver for the solution. The workflow is the driver for the solution: what people need to do on the job.

CG: As we talk about technology, it's exciting stuff. I mean, there are amazing things happening technologically. But we also see a lot of decisions being made around technology with no forethought. So, what is it that we need to do with that technology from a methodology perspective? It is just so crucial that we are driving our technology decisions based upon what it is that we need to be able to do. Technology enables methodology, and methodology ensures that technology is worth the investment.

Listen to the full episode for more guidance around leveraging technology to enable the 5 Moments of Need framework in your organization!

Subscribe to The Performance Matters Podcast to stay up-to-date on all the latest conversations and guests in the 5 Moments space.

Visit our website for additional resources: workshops and courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper.

Copyright © 2022 by APPLY Synergies, LLC

All Rights Reserved.

A Bit of Workflow Learning Hogwash

by Conrad Gottfredson, Ph.D., Rw.E.

Bob Mosher just invited me to read through the Harvard Business Review (HBR) article “Build Learning into Your Employees’ Workflow.” I jumped right into it and discovered a well-intentioned academic researcher who proffers some good insights but also some misunderstandings that we on the farm often call “hogwash”*. These misunderstandings are so common in our discussions regarding workflow learning that they need to be addressed.

The article begins by declaring that “Learning and Development (L&D) programs are critical for the success of any organization. These programs…ensure that employees have the skills and capabilities necessary to do their jobs well…”.

But in 90% of organizations today, L&D programs are failing to do this, which the article acknowledges with this statement:

“Unfortunately, many organizations struggle to demonstrate a return on their L&D investments. In fact, one estimate found that only 10% of the $200 billion spent every year on corporate training and development in the United States delivers real results.”

Here are three reasons the article provides that only partially account for this failure:

1. Trainings typically take place outside of the organization, making it difficult to translate what is learned in the classroom into real workplace applications.

2. Trainings tend to require the learner to invest a substantial amount of their own time, while still being expected to fulfill all their regular work duties.

3. The onus for applying the learning is typically placed on the learner, with minimal follow-up from the instructor once the training has concluded.

These are primarily training transfer issues that do matter. Transfer is where performers first meet the real moments of Apply and Solve in their flow of work. But the article ignores a critical phase that follows these transfer challenges. Once performers have navigated the rugged journey of transferring what they have learned to their specific work environments, the moments of Apply and Solve are joined by the moments of Change, Learn More, and Learn New in the ongoing flow of work. Here, performers must adapt to an ever-changing work environment, which requires them to unlearn (Change), and then relearn (New and More) while they are working.

Here is where the HBR article really goes awry. Stopping work to learn is costly no matter where it occurs. Workflow learning isn’t just about embedding microlearning events into the workflow. This is actually the hogwash part of this article. Performers are still interrupting their work to learn in those micro events, which have their own set of challenges. The cognitive responsibility to transfer what performers have learned to their work is certainly easier in the context of work, but there is still the cognitive requirement of transfer. True workflow learning eliminates the transfer phase.

True workflow learning occurs to the degree that people learn while actually doing their work rather than stopping work to learn.

Learning while working requires a Digital Coach that provides 2-click/10-second access to all the resources people need to perform effectively at the job task level. As performers land on the step-by-step instructions and follow them, drawing upon other supporting resources as needed, they are doing their work and learning at the same time. 
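
To make that structure concrete, here is a minimal sketch in Python (with hypothetical names; it is not a description of any particular Digital Coach product) showing how task-level support can be organized so that the step-by-step instructions and supporting resources all hang off the task itself, two “clicks” away:

# A minimal sketch of a Digital Coach structure (hypothetical names):
# every task is reachable in two "clicks" (process -> task), and the task
# itself carries its step-by-step instructions and supporting resources.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Resource:
    title: str
    link: str  # job aid, policy, checklist, video, etc.

@dataclass
class Task:
    name: str
    steps: List[str]  # step-by-step instructions
    resources: List[Resource] = field(default_factory=list)

@dataclass
class Process:
    name: str
    tasks: Dict[str, Task] = field(default_factory=dict)

@dataclass
class WorkflowMap:
    processes: Dict[str, Process] = field(default_factory=dict)

    def support_for(self, process_name: str, task_name: str) -> Task:
        # Click 1: the process. Click 2: the task. Everything else hangs off the task.
        return self.processes[process_name].tasks[task_name]

coach = WorkflowMap(processes={
    "Onboard a new hire": Process(
        name="Onboard a new hire",
        tasks={
            "Set up system access": Task(
                name="Set up system access",
                steps=["Submit the access request", "Verify the role profile", "Confirm with the new hire"],
                resources=[Resource("Access request job aid", "https://example.org/job-aid")],
            )
        },
    )
})

# A performer mid-task pulls up the first step and keeps working.
print(coach.support_for("Onboard a new hire", "Set up system access").steps[0])

The point of the sketch is the shape, not the tooling: whether the support lives in a purpose-built EPSS, in SharePoint, or even in print, it is organized around the tasks in the workflow.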

This doesn’t require stopping work for reflection time, and nudges happen naturally. Whenever workers need to perform a task (Apply), resolve a challenge (Solve), or learn something New or More, the workflow naturally nudges them to access the Digital Coach that has been designed to support learning while working at all 5 Moments of Need. 

The following are the two criteria that govern true workflow learning:

1. The degree to which the workflow learning solution is embedded in the flow of work 
2. The degree to which performers are NOT required to interrupt (stop) their work to learn   

Where the HBR article and others get it wrong is that they only address the degree to which learning is embedded in the workflow: they ignore the power and potential of learning in the flow of work while actually working. The following graphic shows how these two criteria can help sort through the maze of discussion around workflow learning.


If you read an article or hear any discussion about workflow learning in which it is defined only by creating or placing learning events in the workflow—where learners must still stop working to learn—then you can count it as hogwash. 

* Here’s one definition of hogwash (if you’ve not had experience feeding pigs): noun. refuse given to hogs; swill. any worthless stuff. meaningless or insincere talk, writing, etc.; nonsense; bunk

Be sure to visit our website for these additional resources: workshops and courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper. Already shifting to a performance-first approach? Bring your team to our 2023 Summit! Registration is now open.


Copyright © 2022 by APPLY Synergies, LLC 

All Rights Reserved. 


Why Don't We Weigh Them?

 Written By: Gloria J. Gery*

Throughout the years, folks have developed measures for training effectiveness, satisfaction, and learning. All kinds of approaches from smile sheets to yardsticks have evolved. When I was running a data processing training organization at a large insurance company, I once got disgusted with the statistics I had to submit each month. As the functional manager, I was responsible for use of facilities, instructor resources, equipment, and assuring "value" for the training dollars spent on IT technical professionals and management -- and the user community. My monthly report had to list items such as:

  • number of student days
  • student/instructor ratio
  • number of "no shows", drop-outs, and last minute cancellations
  • dollars charged back to departments using training
  • percent utilization of facilities
  • cost per student-day
  • average "satisfaction" scores on our "smile sheet" evaluations
  • on-time completion and on-cost development of new courses
  • actual vs. planned operational budget expenditures.

At a meeting one day, I suggested a new measurement criterion.

"Why don't we weigh the students and report on a cost per pound?"

A deep quiet overcame the meeting. It was finally broken by a softly spoken question.

"What?"

I guess I was being given a chance to reconsider, but I didn't take it.

"Why don't we install a scale in the entry way," I said, "like the one they use for cattle. We can have each student stand on the scale before entering class each day. We can then calculate the return on our investment by volume."

Needless to say, this attitude was a subject for much discussion both on that day and on my annual appraisal. While I wasn't exactly serious, the idea didn't seem any more irrelevant than some of the success indicators I was reporting on monthly.

None of the measurements I was supposed to take asked if anyone learned anything or if our interventions changed their performance.

One of the men who worked with me was angry about my attitude. He said: "Do you know what your problem is?" (Note: it's always a bad sign when somebody starts talking about "what your problem is.")

"No", I responded.

"You're trying to get the right numbers instead of making the numbers come out right!" he said.

I am still working on a response to that one. But I long ago gave up trying to make the numbers come out right in favor of finding the best way to measure what we're trying to accomplish.

Today, I encourage different measures. It's much easier to actually employ these assessments in a performance support environment because the connection between performance support and the actual work context is so much more direct than the distance between training events and work performance. That very statement says a lot, doesn't it?

Let me share some of the objectives and measurements that rule my work today.

  • decreased time to understanding
  • decreased time to performance
  • reduced performance cycle times (associated with a task, process, customer interaction, deliverable, creation, etc.)
  • reduced implementation costs (for a system, product, new process, etc.)
  • reduced support costs (number of coaches per group)
  • reduced handoffs of work, calls, problems to others
  • increased customer satisfaction with organization representatives as measured by surveys, follow-up calls, complaint activity
  • quality improvements
  • ability to shift work to less experienced employees or to customers
  • reduced transaction costs
  • decreasing the gap between less experienced and star performers
  • competitive differentiation as reported by customers
  • organizational flexibility
  • increased performer confidence -- and confidence by those they work or interact with
When an organization can accomplish something like institutionalizing best practice into the work situation and make performance less a focus of individual competence and more a function of the environment itself, weighing people just doesn't come to mind for me. Does it for you?


*Originally published in CBT Solutions Magazine, May/June 1997



Develop Performance Objectives Aligned to the Workflow

By: Conrad Gottfredson, Ph.D., Rw.E.

When I shifted my mindset from learning to “performance-first” in 1984, the way I viewed and created learning objectives changed. “Performance-first” pushed me to design solutions that enabled knowledge-enriched performance in the flow of work, which in turn required me to ask and answer four key questions about job performance:
  1. What is the fundamental unit of job performance?
  2. What is the role of knowledge in enabling effective job performance?
  3. What is a job skill?
  4. How does all of this influence performance-first instructional design practices?

What is the Fundamental Unit of Job Performance?

No matter the role, all work is composed of a group of workflow processes, each with a set of job tasks. The tasks that make up each process have steps that are procedural (algorithmic) and/or principle-based (heuristic).

For example, the following workflow map shows the processes for a company’s leaders who are responsible for leading sales teams:



Behind each of the processes shown in the map above, there is a grouping of job tasks. The following graphic shows those task groupings:

Focusing on tasks as the fundamental units of job performance provides the optimum framework for aligning all performance support and learning instruction. We know from cognitive research that the way performers encode skills into memory affects how efficiently and effectively they retrieve and translate those skills to action (see Efficiency in Learning: Evidence-Based Guidelines to Manage Cognitive Load, by Ruth Clark, Frank Nguyen, and John Sweller, Pfeiffer, 2006). In simpler words, how we train people affects how readily they can transfer what they have learned to their specific work environment. For example, when you were taught the letters of the alphabet, you were most likely taught those letters in a sequential order. You therefore encoded those letters in that order in your long-term memory. Now, if I ask you to retrieve those letters in a different order - say backwards by every third letter - you will likely struggle. This is exactly what happens when the training people receive isn’t directly aligned with how they perform their work. It’s like they are running into a brick wall. This is why, after learners complete their training/learning experience and return to the workplace, they are often greeted with comments like, “Forget what you have learned in class. We’ll show you how it’s really done.”

Performers need to learn according to how they will perform in their flow of work. Job tasks are the fundamental units of that work, and when task training is aligned with the workflow and supported by a Digital Coach (i.e., an EPSS), transfer can happen in the blink of an eye.

What is the Role of Knowledge in Enabling Effective Job Performance?

Today’s speed of change favors an organization whose workforce is inherently adaptive. The workflow is the best schoolroom for developing this critical skill. Why? Because any object or situation experienced by an individual, in the flow of work, is unlikely to recur in exactly the same form and context. Every time performers effectively respond to a recurring but altered situation, while working, they are practicing the skill of adaptive response.  

Research by Albert Bandura demonstrates that successful adaptive performance not only increases the adaptive capacity of a workforce, but also its workers’ overall self-efficacy. Increased self-efficacy fuels more effective performance. (Cognitive Therapy and Research, Vol. 1, No. 4, 1977, pp. 287-310)

So how does all this address the need for knowledge-enriched experience? Performance without requisite knowledge is sterile and mechanical, but when knowledge is infused into a job task, performers are better able to successfully adapt to change. Knowledge contributes to better decision making while adapting. All of this opens the door to the benefits identified by Bandura.

A performance-first approach to instructional design begins with the identification of job tasks. Then, those tasks are mapped to workflow processes. The next step is to identify the key knowledge topics (i.e., concepts) that support meaningful performance of those job tasks. In the sales leadership example shown above, we identified 65 job tasks that make up 9 workflow processes. The table below shows the supporting knowledge topics for two of those processes. These are topics that performers need to understand in the context of the specific tasks. For example, leaders need to understand what the organization means by “Bench Strength” as they perform tasks 1, 2, and 5 in the “Build Your Bench” process.
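
As a rough illustration, here is a minimal sketch in Python (hypothetical names, with only the “Bench Strength” topic filled in from the example above) of how that task-to-topic mapping might be represented for a single process:

# A sketch of mapping supporting knowledge topics to the tasks they support
# within one workflow process. Only "Bench Strength" is shown; a real map
# would list every topic from the table.
from typing import Dict, List

class ProcessKnowledgeMap:
    def __init__(self, process: str, tasks: List[str], topics: Dict[str, List[int]]):
        self.process = process
        self.tasks = tasks      # ordered job tasks (numbered from 1)
        self.topics = topics    # topic -> numbers of the tasks it supports

    def topics_for_task(self, task_number: int) -> List[str]:
        # The knowledge a performer needs in the context of this specific task.
        return [topic for topic, nums in self.topics.items() if task_number in nums]

build_your_bench = ProcessKnowledgeMap(
    process="Build Your Bench",
    tasks=[
        "Identify talent in the market",
        "Build a relationship with a potential candidate",
        "Justify a position",
        "Secure position approval",
        "Interview and close an offer",
        "Manage succession planning",
    ],
    topics={"Bench Strength": [1, 2, 5]},  # understood while performing tasks 1, 2, and 5
)

print(build_your_bench.topics_for_task(2))  # -> ['Bench Strength']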



What is a Job Skill?

There are many ways to define a skill. Here’s one way in the context of workflow learning: “A person has mastered a skill when they can successfully and consistently perform a job task with full understanding of its requisite supporting knowledge.” This is a tactical definition in that it allows the consistent identification of skills that can be specifically targeted for support, training, and measurement.  

Thus, a skill set is an integrated set of tasks and associated concepts (i.e., supporting knowledge) that comprise a specific workflow process. A learning module is generally composed of a series of lessons, each focusing on a task and its supporting knowledge.

The following example shows these tactical relationships: 



How Does All of This Influence Performance-First Instructional Design Practices?

The great danger of all learning and instructional theory research is the academic backdrop of researchers and their subjects. But there is much to be learned from academically focused research. As a graduate student, I was highly influenced by the research of David Ausubel, who, for me, set the research standard for the value and use of advance organizers. Ausubel believed that one of the key roles of an advance organizer is to trigger previous knowledge and experience as well as prepare learners to look for and process new knowledge and experience. Workflow performance objectives can and should serve these two purposes.

When I answered the three previous questions (1. What is the fundamental unit of job performance? 2. What is the role of knowledge in enabling effective job performance? 3. What is a job skill?) forty years ago, it fundamentally shifted the way I thought about objectives. Once tasks and their supporting knowledge topics were identified and aligned with the flow of work, I found I had all I needed to write meaningful learning objectives.

NOTE: I need to pause here and acknowledge the high probability that some readers are going to push back on what I’m going to write next. Please take a deep breath and brace yourself.

A performance-first approach to instructional design focuses on the workflow first and designs the performance support solution ahead of its associated learning experience solution. As stated earlier, these two solutions need to be integrated into a cohesive overall solution.   

Inherent in the design of the learning experience is establishing the scope and sequence of the learning modules and the lessons within them. This is where the magic happens when training is aligned with the workflow. At the completion of every module, learners need to be able to do two things:
  1. Successfully perform the job tasks that comprise the workflow process represented in the module.
  2. Demonstrate understanding of the supporting knowledge associated with those tasks.
More specifically, every lesson in a module has the primary objective of enabling learners to successfully perform a specific task with the help of performance support. In addition, learners need to develop their understanding of the supporting knowledge related to that task in the context of performing it.

The following is an example of an objectives statement taken from a module overview:

In this module you are going to learn how to Build Your Bench. Specifically, you will learn how to:
  • Identify talent in the market
  • Build a relationship with a potential candidate
  • Justify a position
  • Secure position approval
  • Interview and close an offer
  • Manage succession planning

And here is an example taken from a lesson overview within the module for “Build Your Bench”:

In the following lesson you will learn how to Identify Talent in the Market. To do this you must first understand: 
  • Bench strength/roles
  • High performing leadership/seller behaviors
  • The competitive/market landscape 
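
Because these overview statements fall directly out of the task-and-knowledge structure, they can even be assembled mechanically. Here is a small sketch in Python (hypothetical helper names, not a production authoring tool) that generates module and lesson overviews in the pattern of the two examples above:

# A sketch of deriving objective statements from the workflow structure:
# a module maps to a process and its tasks; a lesson maps to a task and
# its supporting knowledge topics.
from typing import Dict, List

def module_overview(process: str, tasks: List[str]) -> str:
    lines = [f"In this module you are going to learn how to {process}. "
             "Specifically, you will learn how to:"]
    lines += [f"  • {task}" for task in tasks]
    return "\n".join(lines)

def lesson_overview(task: str, topics: List[str]) -> str:
    lines = [f"In the following lesson you will learn how to {task}. "
             "To do this you must first understand:"]
    lines += [f"  • {topic}" for topic in topics]
    return "\n".join(lines)

build_your_bench: Dict[str, List[str]] = {
    "Identify Talent in the Market": [
        "Bench strength/roles",
        "High performing leadership/seller behaviors",
        "The competitive/market landscape",
    ],
    # ...the remaining tasks and their supporting knowledge topics...
}

print(module_overview("Build Your Bench", list(build_your_bench)))
print(lesson_overview("Identify Talent in the Market",
                      build_your_bench["Identify Talent in the Market"]))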

Here’s the good news. In the realm of workflow learning, it is absolutely possible to verify (measure) successful performance of each and every task via usage data and micro-polling gathered by a Digital Coach (i.e., EPSS). In addition, adaptive learning systems can most certainly reinforce knowledge learning while gathering measurement data to verify ongoing understanding of the supporting knowledge associated with those tasks. 
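
As a sketch of what that measurement might look like (hypothetical event and field names; any real Digital Coach or adaptive learning system will have its own), usage data and micro-poll responses can be rolled up per task:

# A sketch of rolling up Digital Coach usage data and micro-polling to
# verify task performance and understanding (hypothetical names).
from collections import defaultdict
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class UsageEvent:
    performer: str
    task: str
    completed: bool          # did the performer finish the task with support?

@dataclass
class MicroPoll:
    performer: str
    task: str
    understood: bool         # quick check on the supporting knowledge

def verify_tasks(events: List[UsageEvent], polls: List[MicroPoll]) -> Dict[str, dict]:
    # Per task: supported performances completed, and micro-poll responses
    # confirming understanding of the supporting knowledge.
    summary: Dict[str, dict] = defaultdict(lambda: {"performed": 0, "completed": 0,
                                                    "polled": 0, "understood": 0})
    for e in events:
        summary[e.task]["performed"] += 1
        summary[e.task]["completed"] += int(e.completed)
    for p in polls:
        summary[p.task]["polled"] += 1
        summary[p.task]["understood"] += int(p.understood)
    return dict(summary)

report = verify_tasks(
    [UsageEvent("A12", "Identify Talent in the Market", True)],
    [MicroPoll("A12", "Identify Talent in the Market", True)],
)
print(report["Identify Talent in the Market"])
# -> {'performed': 1, 'completed': 1, 'polled': 1, 'understood': 1}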

This approach to objectives isn’t theoretical. It has been proven by 40+ years of real-world experience (Rw.E.) developing comprehensive performance-first solutions that span chronologically, culturally, linguistically, and logistically diverse audiences in settings ranging from small to large international corporations, government agencies, and religious organizations. Thank you for investing your valuable time reading this blog. I hope it hasn’t been too stressful, and I welcome any questions or comments you have.


Copyright © 2022 by APPLY Synergies, LLC 

All Rights Reserved. 

Insights to Unlock Performance Support for Your Organization

This blog is excerpted from the Performance Matters Podcast episode titled “Insights to Unlock Performance Support for Your Organization” in which Bob Mosher and Dr. Katie Coates, Director of Learning at McKinsey & Company, discuss her latest research findings on the true power of performance support.

Bob Mosher (BM): I'm incredibly honored to have Dr. Katie Coates of McKinsey & Company with me today. Katie it is so wonderful to have you here—welcome!

Katie Coates (KC):  Thank you so much, Bob. It's great to be with you today.

BM: And a recently accredited PhD, correct?

KC: Well, I'm finishing things up. But, yes, shortly.

BM: That’s absolutely remarkable. Do you mind jumping in and sharing a bit about your dissertation process?

KC: Yeah, so there was some rigor around this; it followed the qualitative research methodologies. I had a committee of professors who were with me step by step, looking at what I was doing and giving me feedback. It then had to be approved by our research board to make sure that everything I was doing was ethical. So, it was really within the university's confines, monitored and watched by experts in qualitative research.

But the first thing you do in a dissertation is to come up with “what's the problem?” “What's the question?” And my question was around adoption and really focusing on those upfront decisions that leaders make. What are the events and experiences that lead Senior Learning and Development professionals to adopt and implement performance support? I wasn't looking at end-user adoption, but at the upfront decisions that were being made. So, you frame that question, you do the lit review that I talked about, and my argument was that performance support is effective and can have an impact. I just don't understand why more organizations aren't doing it.

And that's really it.

BM: Wow, this is all so badly needed.  So, let's get to it. What are some of your key findings?!

KC: Sure! So I think that there are still a lot of myths about performance support out there in the world. One of the things I continually heard was, “Well, it's good for help desks, good for very procedural activities.” So, I wanted to test that.

And I didn't say that to my participants; it was an open-ended question. I had 36 different examples from the 17 interviews, and then I synthesized that down to nine. I'm not going to go through everything, but of course it's about access to consistent work processes that increase efficiency and quality. We know that's a big part of it, along with supporting professionals' time to competency: during onboarding, giving them the right tools to help them. There are examples of how this is used in soft skills as well. I mean, I hate to use the term soft skills, but you know, it's not the harder stuff: those power skills, if you will.

BM: Love it.

KC: Yeah. There was one example in an organization where they had a people leadership hub. It was like, “Okay, how do I hire people? How do I develop people? How do I review their performance? How do I handle different scenarios?” Now, there are some procedural things in there, but there were a lot of things around interviewing, coaching, and giving feedback, with performance support pieces to really support that whole process. So, lots of different types - regulations and compliance, hybrid working models when people were working from home and didn't have the water cooler, or the person next to them, to ask. Those were the types of examples. So many things out there. That was one big lesson learned.

BM: You know, what I love about that is it so resonates with where we are today with the pandemic situation. But these challenges have been around forever; I just think the circumstances have exacerbated them and exposed the cracks in the dam that might have been there anyway. Compliance, these kinds of things, the hybrid worker, the disruptive workforce, keeping up with the rate of change: those have always been in every organization you walk into. But the stress, the time pressure, and the anxiousness around those things were something we kind of swept under the rug, frankly, in some ways. You found that these people ran right at it with this kind of an approach.

KC: Oh, there were a couple of amazing examples around how performance support showed up during COVID. One in particular that's really interesting was a hospital and its learning team. They had gone to a conference, and they learned about performance support, and they really wanted to do it. But the way they described it, to get it in the door, would have been a very difficult process for them—lots of bureaucratic red tape. Then, when COVID hit, they shut down their in-person Academy, but they still had to train nurses moving from one ward to another: maybe they had been working in the ICU, and now they had to work in the COVID ward. They had the baseline skills, but there were some differences. So, they did have some things that they had to teach them. And the learning manager picked up the phone and called her boss and said, “I think we need to do performance support now.” And they went and took a quick proposal to the leaders, and the leader said, “Yes, that makes total sense. Bring it in.” They then worked with a team of external experts and in 10 days they produced a performance support solution. It was amazing, absolutely amazing. And now they all love it. They have doctors coming and saying, “When do I get my performance support?”

BM: Happens every time.

So, my friend, I've known you through this whole journey and you seem to have emerged more passionate than when you started this research.  It must have reinforced your own experience. And then on top of that you got to talk to these wonderful leaders across all those industries and you saw again and again the impact and enthusiasm.

Here's my question, Katie. In your opinion, you've known so many leaders, and you've met so many L&D professionals. If you look at our industry, why do you think we continue to lag in putting performance first? Where do you think that comes from? And how do we break that cycle?

KC: Yeah, I do think one big thing we've talked about is that there are a lot of learning professionals who grew up creating learning experiences. And that is fun. They learned the ISD model and they are passionate about it. They're really focused on that.

And so I think that's one piece where we need to help them understand that this too can be fun! Maybe even more fun due to the impact that it can actually have!

BM: We know we are currently in the industry minority, but I have faith that, because of wonderful work and research like yours, and people like you, our voice is getting stronger. We are becoming a cohort and a community, which is what we need.

I can't thank you enough for your time today, for your dedication, for your friendship, and for your leadership. This is not for the faint of heart, but as you've so well demonstrated in your research, it is well worth doing. So let's all work to be a bit more like you and do more of this. Katie, thank you so much!

Listen to the full episode for more on Dr. Coates’ research and her organization’s performance support journey.

Subscribe to The Performance Matters Podcast to stay up-to-date on all the latest conversations and guests in The 5 Moments space.

Visit our website for additional resources such as workshops and courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper.

Copyright © 2022 by APPLY Synergies, LLC 

All Rights Reserved.