Insights to Unlock Performance Support for Your Organization

This blog is excerpted from the Performance Matters Podcast episode titled “Insights to Unlock Performance Support for Your Organization,” in which Bob Mosher and Dr. Katie Coates, Director of Learning at McKinsey & Company, discuss her latest research findings on the true power of performance support.

Bob Mosher (BM): I'm incredibly honored to have Dr. Katie Coates of McKinsey & Company with me today. Katie, it is so wonderful to have you here—welcome!

Katie Coates (KC): Thank you so much, Bob. It's great to be with you today.

BM: And a recently conferred PhD, correct?

KC: Well, I'm finishing things up. But, yes, shortly.

BM: That’s absolutely remarkable. Do you mind jumping in and sharing a bit about your dissertation process?

KC: Yeah, so there was some rigor around this; it followed qualitative research methodologies. I had a committee of professors who were with me step by step, looking at what I was doing and giving me feedback. It then had to be approved by our research board to make sure that everything I was doing was ethical. So, it was really within the university's confines, monitored and watched by experts in qualitative research.

But the first thing you do in a dissertation is come up with “What's the problem? What's the question?” And my question was around adoption, really focusing on those upfront decisions that leaders make: What are the events and experiences that lead senior Learning and Development professionals to adopt and implement performance support? I wasn't looking at end-user adoption, but at the upfront decisions being made. So, you frame that question, you do the lit review that I talked about, and my argument was that performance support is effective and can have an impact. I just don't understand why more organizations aren't doing it.

And that's really it.

BM: Wow, this is all so badly needed. So, let's get to it. What are some of your key findings?!

KC: Sure! So I think there are still a lot of myths about performance support out there in the world. One thing I continually heard was, “Well, it's good for help desks, good for very procedural activities.” So, I wanted to test that.

And I didn't say that to my participants; it was an open-ended question. I had 36 different examples from the 17 interviews, and then I synthesized those down to nine categories. I'm not going to go through everything, but of course it's about access to consistent work processes that increase efficiency and quality. We know that's a big part of it, and so is shortening professionals' time to competency: during onboarding, giving them the right tools to help them. There are examples of how this is used in soft skills as well. I mean, I hate to use the term soft skills, but you know, it's not the harder thing; those power skills, if you will.

BM: Love it.

KC: Yeah. There was one example in an organization where they had a people leadership hub. It was like, “Okay, how do I hire people? How do I develop people? How do I review their performance? How do I handle different scenarios?” Now, there are some procedural things in there, but there were a lot of things around interviewing, coaching, and giving feedback, with performance support pieces to really support that whole process. So, lots of different types: regulations and compliance, hybrid working models where people were working from home and didn't have the water cooler, or the person next to them to ask. So, these were the types of examples. So many things out there. That was one big lesson learned.

BM: You know what I love about that: it so resonates with where we are today with the pandemic situation. But these challenges have been around forever; I just think the circumstance has exacerbated them and exposed the cracks in the dam that were there anyway. Compliance, the hybrid worker, the disrupted workforce, keeping up with the rate of change: those have always been in every organization you walk into. But the nature of the stress, the time pressure, or the anxiousness around those things was just something we kind of swept under the rug, frankly, in some ways. Yet you found that these people ran right at it with this kind of approach.

KC: Oh, there were a couple of amazing examples around how performance support showed up during COVID. One in particular that's really interesting was a hospital and its learning team. They had gone to a conference, and they learned about performance support, and they really wanted to do it. But the way they described it, getting it in the door would have been a very difficult process for them—lots of bureaucratic red tape. Then, when COVID hit, they shut down their in-person Academy, but they still had to train nurses from one ward to work in another: maybe they were working in the ICU and now had to work in the COVID ward. They had the baseline skills, but there were some differences, so there were some things they had to teach them. And the learning manager picked up the phone and called her boss and said, “I think we need to do performance support now.” They took a quick proposal to the leaders, and the leader said, “Yes, that makes total sense. Bring it in.” They then worked with a team of external experts, and in 10 days they produced a performance support solution. It was amazing, absolutely amazing. And now they all love it. They have doctors coming and saying, “When do I get my performance support?”

BM: Happens every time.

So, my friend, I've known you through this whole journey, and you seem to have emerged more passionate than when you started this research. It must have reinforced your own experience. And then, on top of that, you got to talk to these wonderful leaders across all those industries, and you saw again and again the impact and enthusiasm.

Here's my question, Katie. In your opinion, you've known so many leaders, and you've met so many L&D professionals. If you look at our industry, why do you think we continue to lag in putting performance first? Where do you think that comes from? And how do we break that cycle?

KC: Yeah, I do think one big thing we've talked about is that there are a lot of learning professionals who grew up creating learning experiences. And that is fun. They learned the ISD model and they are passionate about it. They're really focused on that.

And so I think that's one piece where we need to help them understand that this too can be fun! Maybe even more fun due to the impact that it can actually have!

BM: We know we are currently in the industry minority, but I have faith that because of wonderful work and research like yours, and people like you, our voice is getting stronger. We are becoming a cohort and a community, which is what we need.

I can't thank you enough for your time today, for your dedication, for your friendship, and for your leadership. This is not for the faint of heart, but as you've so well demonstrated in your research, it is well worth doing. So let's all work to be a bit more like you and do more of this. Katie, thank you so much!

Listen to the full episode for more on Dr. Coates’ research and her organization’s performance support journey.

Subscribe to The Performance Matters Podcast to stay up-to-date on all the latest conversations and guests in The 5 Moments space.

Visit our website for additional resources such as: workshops and courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper.

Methodology Matters: Rethinking Learning Objectives

By: Dr. Conrad Gottfredson

Note: This blog is the second in a series of articles looking at traditional Instructional Systems Design (ISD) practices through the lens of a “performance-first” mindset. Click here to access the introductory blog to this series: “Methodology Matters: A Performance-Based Instructional Design Methodology”.

In 1978, I took my first graduate-level course in Instructional Systems Design. At the time, I was an Undergraduate Research Trainee (URT) for the department of Instructional Science. The first lesson in that course taught me how to write measurable learning objectives. Our textbook was titled Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction (written by Robert Mager). For the next six years, I wrote hundreds of Mager-perfect objectives. I was a true believer. But after graduate school, when I awakened to the realities of workplace learning, things changed. As I looked at learning objectives through a “performance-first” lens, I recognized that I needed to rethink their role.

Fast forward to more current times, when our team was asked to help an organization restructure a key course to meet the requirements of all 5 Moments of Need. It was a traditional 5-day course with over a thousand slides and 270 traditional learning objectives. Focused on safety, this course had the overall performance requirement of enabling participants with the skills they needed to safely secure information, facilities, and people.

The first thing we did was conduct a Rapid Workflow Analysis (RWA). We identified 56 job tasks that participants needed to perform in their security roles and the 51 supporting knowledge topics they needed to understand as they performed the various steps within those 56 tasks.

We also conducted a Critical Impact of Failure Analysis with key stakeholders to rate every task and knowledge topic using a modified version of the following rubric:
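As a hedged sketch of how such a rating can be represented in code (the level names below, apart from the “significant” and “catastrophic” labels mentioned in the findings, are assumptions, as are the example tasks):

```python
# Illustrative sketch only: the actual rubric was a modified version used
# with stakeholders. Level names other than "significant" and
# "catastrophic" are assumptions, as are the example task ratings.
from enum import IntEnum

class ImpactOfFailure(IntEnum):
    NEGLIGIBLE = 1
    MODERATE = 2
    SIGNIFICANT = 3
    CATASTROPHIC = 4

# Hypothetical ratings for a few security-course job tasks.
task_ratings = {
    "Secure classified information": ImpactOfFailure.CATASTROPHIC,
    "Issue visitor badges": ImpactOfFailure.MODERATE,
    "Respond to a facility breach": ImpactOfFailure.SIGNIFICANT,
}

# High-risk tasks are those rated significant or above.
high_risk = sorted(task for task, rating in task_ratings.items()
                   if rating >= ImpactOfFailure.SIGNIFICANT)
print(high_risk)
```

Rating every task and topic this way is what later lets a design team filter for the high-risk items that a course must not miss.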


To check the instructional health of the existing course, we mapped the 270 learning objectives to the tasks and knowledge topics that our SMEs identified during the RWA. Here’s what we found:
  1. More than 80% of the learning objectives were focused on knowledge rather than performance. Only 52 of the 270 learning objectives related directly to actual job tasks. 
  2. Significant workflow performance areas were missed. The existing 52 performance-focused learning objectives addressed only 30% of the job tasks we identified through the RWA.  
  3. There was a significant imbalance of learning objectives for knowledge topics. Although the remaining 218 objectives missed only 15% of the 51 supporting knowledge topics, they were micro-focused and contributed to cognitive overload: 20% of the knowledge topics carried 10 to 25 learning objectives each.
  4. The most significant indictment of the course design was that 70% of the high-risk job tasks (where the critical impact of failure was significant to catastrophic) had been missed entirely (one of which included the potential for loss of life).
  5. Lastly, we determined that 40% of the 270 learning objectives could be met exclusively within the workflow – as people worked – rather than during the 5-day course.
In our experience, these kinds of differences aren’t anomalies. A performance-first approach to solution design doesn’t even consider learning objectives at the outset. They may or may not come later in the design and development of activities that support knowledge and task-skill development.
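The audit arithmetic behind findings like these can be sketched in a few lines. The objective-to-task mappings below are invented for illustration, not the real course data (which mapped 270 objectives against 56 RWA-identified tasks):

```python
# Minimal sketch of an instructional-health audit: tag each learning
# objective as knowledge- or performance-focused, link performance
# objectives to job tasks, then compute the coverage percentages.
# All data below is hypothetical.

objectives = [
    {"id": 1, "focus": "performance", "task": "secure-facility"},
    {"id": 2, "focus": "knowledge", "task": None},
    {"id": 3, "focus": "knowledge", "task": None},
    {"id": 4, "focus": "performance", "task": "secure-facility"},
]
job_tasks = {"secure-facility", "screen-visitors", "report-incident"}

performance_objs = [o for o in objectives if o["focus"] == "performance"]
covered_tasks = {o["task"] for o in performance_objs if o["task"] in job_tasks}

knowledge_share = 100 * (len(objectives) - len(performance_objs)) / len(objectives)
task_coverage = 100 * len(covered_tasks) / len(job_tasks)

print(f"{knowledge_share:.0f}% of objectives are knowledge-focused")
print(f"{task_coverage:.0f}% of job tasks have a performance objective")
```

Run against a real course inventory, percentages like these make the knowledge-versus-performance imbalance visible at a glance.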

There are four realities of performance-focused design that should guide how we approach writing learning objectives:

1. Knowing doesn’t guarantee performing.
2. Job performance is best developed and supported at the job task level.
3. Effective performance must be supported with knowledge.
4. Real learning requires experience.

Knowing Doesn’t Guarantee Performing

My sister taught me how to kiss. I want to be CRYSTAL CLEAR here: she didn’t train me. She gave me some helpful pointers that I thought I understood. But the night I made my first attempt to move from simply understanding to actual application, I learned the vital lesson that Sophocles taught more than two millennia earlier: “One must learn by doing the thing; for though you think you know it, you have no certainty until you try.”

Knowing about something doesn’t guarantee that effective performance will follow. We have no certainty of the ability to perform until we have actually acted upon what we have learned. At the moment of my first kiss, I gained absolute certainty that knowing about kissing doesn’t guarantee successful performance.

The requirement to write “measurable” learning objectives has pushed designers to use verbs that lean heavily towards knowledge acquisition rather than on-the-job performance. Why? Because it is easier to measure knowledge acquisition than performance. Only one of the six levels of Bloom’s Taxonomy purports to directly address performance (i.e., application). Even then, many of the recommended verbs are limiting when it comes to describing true on-the-job performance.

Job Performance is Best Developed and Supported at the Job Task Level

Here’s what we have learned during our past 40 years of focusing on performance first:

We learn best at the job task level.
We remember best at the job task level.
We perform best at the job task level.
We measure best at the job task level.

A job task has a set of steps that, when followed, lead or contribute to a specific result. Those steps can be procedural or principle-based (for soft skills). The following table provides examples of both procedural and principle-based tasks.  

| Procedural Workflow Tasks | Principle-Based Workflow Tasks |
| --- | --- |
| Contact the injured or ill employee. | Establish performance expectations. |
| Arrange for a case management meeting. | Align employees’ goals. |
| Hold a meeting. | Develop employees’ job descriptions. |
| Engage in and communicate about your treatment plan. | Set company expectations. |
| Document ongoing management in the employee health record. | Set educational goals. |
| Maintain connection with an employee off work (manager/supervisor). | Conduct one-on-one meetings. |
| Gather case information from the manager/supervisor. | Conduct annual performance appraisals. |
| Request medical documentation. | Discuss employees’ impact on workplace and culture. |
| Provide medical documentation. | Provide quarterly goals updates. |
| Receive medical documentation. | Conduct department meetings. |
| Send reports. | Promote learning. |
| Report injury, illness, and/or challenges for remaining at work. | Provide support and resources. |
| Conduct a triage assessment. | Monitor employees’ progress. |
| Determine the appropriate EDMP stream. | Assign mentorship opportunities. |
| Enroll an employee. | Review comparative reports. |
| Make a triage report. | Plan job shadowing opportunities. |
| Identify barriers to returning to/staying at work. | Set employee development plans. |
| Obtain medical assessment and/or treatment. | Motivate employees. |
| Resolve wage and benefit issues. | Provide networking opportunities. |
| Assemble the case team. | Empower employees. |
| Refer to healthcare services. | Celebrate employees’ success. |

A critical initial step in a performance-first approach to instructional design is to identify the job tasks that a work team needs to perform in their flow of work; then, organize those tasks into workflow processes that represent how the work is done. It is at this job task level that the work is performed. These tasks should become the performance targets we adopt in the solutions we develop.

Effective Performance Must be Supported by Knowledge

A performance-first approach doesn’t ignore the acquisition of knowledge. Knowledge and experience are fundamental to effective performance in the flow of work. We know that knowledge is best retained and retrieved when it is anchored to specific areas of performance (e.g., job tasks). And in a performance-first approach, a specific skill is the combination of a job task with its supporting knowledge. 

The following example is excerpted from a Learning Experience and Performance (LEaP) plan. It shows a set of skills that regional sales directors need to grow their markets via external activities. The supporting knowledge topics are mapped to each of the tasks. For example, the skill of “network in your region” requires performers to complete the steps of the first task with an understanding of the first four supporting knowledge topics in the lower half of the table.

The point here is that although a performance-first approach focuses on the ability of a work team to successfully perform job tasks, effective performance also requires each task to be performed with an understanding of the key knowledge that supports it.
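The task-plus-knowledge pairing that defines a skill can be sketched as a simple data structure. The “network in your region” task name comes from the example above; the steps and knowledge-topic names below are invented for illustration, not taken from the actual LEaP plan:

```python
# Illustrative sketch: in a performance-first model, a "skill" pairs one
# job task with the knowledge topics that support performing its steps.
# Steps and topic names here are assumptions, not the real plan's content.
from dataclasses import dataclass, field

@dataclass
class Skill:
    task: str                       # the job task (the performance target)
    steps: list[str]                # procedural or principle-based steps
    supporting_knowledge: list[str] = field(default_factory=list)

network_in_region = Skill(
    task="Network in your region",
    steps=[
        "Identify key industry events in the region",
        "Prepare an introduction tailored to each audience",
        "Follow up with new contacts within one week",
    ],
    supporting_knowledge=[
        "Regional market landscape",
        "Company value proposition",
        "Competitor positioning",
        "Referral etiquette",
    ],
)
```

Anchoring knowledge topics to the task they support, rather than listing them independently, is what keeps the knowledge retrievable at the moment of performance.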

Real Learning Requires Experience

The learning solutions we create (synchronous or asynchronous) – whether eLearning, virtual learning, micro-learning, instructor-led or any other type of learning – represent just the beginning of the learning process. Real learning occurs in the flow of work, over time, where experience is developed. Expertise requires experience. When I had my heart valve replaced, I wasn’t concerned about the classes my surgical team members had taken. I wanted to know about their experience: how many surgeries they had done and their success rates.

Ask yourself, “Do learning objectives that are written upfront (to guide the design and development of the solutions we create and implement) truly address the continuous development of experience in the flow of work? Do they naturally lead us to skill development that is task-based and reinforced with supporting knowledge? Do they ensure that we address the full range of performance requirements?”

It is our responsibility to constantly challenge our traditions against the backdrop of the here and now. I know this blog is questioning an area of instructional design that is a long-standing and deeply held practice. Please know that my intention here has been to provide a view for you to consider. We have found what we believe is a better, faster, and more reliable way: Rapid Workflow Analysis (RWA). It provides a prioritized view of the job tasks and related supporting knowledge that work teams need to perform and understand to work effectively. More to come on this RWA process in another blog.

Learn More.