ROI is Measured in the Workflow

This blog is generated from the Performance Matters Podcast episode titled ROI is Measured in the Workflow. In it, Bob Mosher and Dr. Con Gottfredson are joined by APPLY Synergies’ Executive Director of Consulting Services, Sara Chizzo. Together, they explore how embedding learning in the workflow empowers learning professionals to finally measure the impact, effectiveness, and return on investment of learning solutions.

Bob Mosher (BM): Welcome back to another Performance Matters podcast. I can't tell you how excited I am about two things: 1) the measurement topic, which we really need to talk about after an event I recently attended, and 2) the people who are with me. They are two of the most remarkable colleagues that I'm fortunate enough to work with. First, the famous Dr. Conrad Gottfredson. Welcome back. Good to have you here.

Con Gottfredson (CG): Good to be back.

BM: And this next person just rings our bell. I've been fortunate to know her professionally and watch her work for about 20 years, and now we are fortunate to have her as part of our group. I'd like to introduce you to Sara Chizzo. Sara, welcome to the podcast.

Sara Chizzo (SC): Thank you. It is a pleasure to be here and a pleasure to work with you both.

BM: It's just a dream come true for us. Sara has a remarkable history in measurement. Sara, can you tell us a little bit about your journey?

SC: Absolutely. About 25 years ago, I pivoted into the learning and development space, working for a technical training company called Productivity Points. That was my first foray into professional learning and into understanding that companies actually pay for this kind of stuff—for external providers and experts to help them train and develop their employees. After a few years of doing that, I got a little bit frustrated. My largest account was Motorola Solutions. At the end of their contract term, they came back to us and asked some questions: What did we train on during this period, and what was the return on that investment? First, we couldn't even provide them with accurate global information about what we'd actually trained their people to do; the systems didn't talk to each other. Worse, we didn't have any way of measuring the impact, the effectiveness, or the return on that sizeable investment. It was about that time that I joined a colleague of mine named Kent Barnett, who started a company called KnowledgeAdvisors with the goal of bringing additional discipline to the space. We wanted to provide companies with more information and data so that they could understand whether their programs were moving the dial. So, that was my journey. I’ve spent about 18 years in measurement and analytics for learning.

BM: Spectacular, and great work. We are really excited to have you here, because now we move into the workflow. Now we move into performance. Sara, why don’t you share a couple of interesting stats? We're going to frame this discussion with some things we've heard lately that, frankly, have been a little troubling. So, why don’t you give us a couple to start?

SC: I'm first going to totally indict myself and the work that I did (or didn't) do over the last 18 years with the first two data points. I think it's important to look at the data about measurement in the learning and development space to understand where we are and where we have been able to move the dial. So, I want to share a couple of things. The first is a data point from a McKinsey & Company study: learning leaders say that only 25% of their programs improve performance. I remember the first time I read that. I asked myself, “What are we doing?!” What are we doing if, absent any information, our learning leaders are instinctively saying that three quarters of our programs aren't doing anything? What are our stakeholders paying for? The second data point is from a follow-up study that the Performative team did, where they asked learning leaders a bunch of additional questions. Ninety-seven percent of those polled (I think it was a sample size of about 250) said that there was waste somewhere in the learning process, but they had no idea where. Essentially, we were providing learning that wasn't netting an impact, either on the individual's performance or on that of the organization (talent or business outcomes). And now that I've had the benefit of looking at this through the 5 Moments of Need (5 MoN) lens, I cannot help but go back to the source: how we think about supporting our learners at the time of their work, as they're performing.

If you think about the first data point—three quarters of programs not improving performance—it's because we're not taking a performance lens to the development of our programs and solutions in the first place. We're not designing for performance. So, if we're not designing for performance, we shouldn't be surprised when three quarters of our programmatic investments don't net a result. And to the second point about waste in the process, there's far too much of a disconnect between the actual work being done and where we're trying to support learners in doing that work. Of course there's going to be waste in the process, because the learning is not happening while the work is happening. So, those two data points really jumped out at me within the context of workflow learning and the 5 MoN, because I think both could be dramatically and positively impacted with the right design approach up front.

BM: Perfect. Con, what do you think about when someone says, “But you know, Con, it's busy. It's hard. There are just so many influencers out there after the training. My training could get lost in that. It's not a fair measure of my training, because the learner’s manager, the learner’s discipline, how soon they try to practice when they get back…all those things are unfair to measure my training against because those influencers cloud the measure.” So, what's your answer to that? What do you think that points to?

CG: Every time I hear that, it tells me they're chasing a skunk down a hole. They're in a training mindset. They're certainly not thinking, designing, building, and implementing around enabling effective job performance. Gloria Gery saw this. What a remarkable visionary she was. In the 90s, when she wrote her book Electronic Performance Support Systems, she was very clear. In indicting what was being done under the umbrella of training, she said that it wasn't leading to performance and that it needed to. And what she saw was that you can't measure impact if you're trying to get there from the training alone, because too much happens after the fact. By the time you measure learning, those learners have had to access a lot of other things to get where they need to be, because the training wasn't enough. That's the bottom line: the training was not enough to get them to productive performance. So, what do they do? They rely on other people, they work through other systems, and they do other kinds of things to achieve performance. Gloria said that if we build a performance support system that supports people as they do their work, that system—in the work and the workplace—gives us the ability to gather data and make those direct connections. Back to your question: when I hear that, it lets me know they're looking only at the training, when the only way learners get to productive performance is by involving other things [besides training].

BM: I’ve shared my frustration around what I heard recently, which is, “Well, then let's just back off this. I'm sick of chasing the ROI thing. It's hard.” That's like a fireman saying, “Well, I just don't want to know if the fire is out because it's hard to sift around and dig in the rubble. It's hard to get into the dirt and the afterburn because it's messy.” But until we get to that level and understand that there are embers there, we don't go beyond just throwing water on the fire. For so long in training, it has been just that—we see the flame go out, we assume the learner did well, that they liked the experience, that they feel they can apply what they learned, and that what they heard was relevant. So, we pick up our trucks and leave, but the learner is left with the mess of work and the reality of how messy, hard, and volatile the workplace is. So, we have to get beyond training and training alone. I want to be careful: we're not condemning training as an entity. We're just saying that, for too long, it's been a safe place and our only answer to this world we need to journey into a bit more.

Con, talk a bit about this workflow thing and why it's been such a missing part of our analysis and our understanding for so long. Don't SMEs give us that? They tell us all that's important when we put them in a room.

CG: They could give us that if we asked the right questions. We generally go into a task analysis, or whatever analysis we're doing, with the mindset that we're going to build a training solution—not that we're going to enable effective performance on the job. And how are we going to enable effective job performance without facing the workflow? That's where performance occurs. Gloria Gery called the workflow the “performance zone”. So, you have to face that. When you're dealing with effective job performance, the measurable objectives include the ability to complete a job task, whatever that task is—and we can measure that, we can gather data around that—but we've got to be able to face it. And you can't face it without stepping into the workflow and mapping it.

Let me just say this. Sara, you mentioned waste. When we look at onboarding programs and we step into the workflow, and we build workflow solutions that support people as they move through the training, as they move through that transfer phase, as they move into and begin to sustain performance in the flow of work—the moments of Apply, Solve, and Change—when we build that kind of a solution, we consistently see that time to proficiency is cut in half. I mean, we just saw a client take an 18-month time to proficiency down to 5 months, because they focused on performance. They brought in the power of workflow learning and they ended up with people being able to perform more effectively, more productively, and with less oversight. They were able to measure and demonstrate all those things because they were facing the workflow and designing and building and measuring around that.

BM: Sara, as you have learned more about this and you think about the world you came from, why does a Digital Coach excite you? As you look at L&D wanting to get to KPIs and other things we've talked about for so long, what does a Digital Coach add to a measurement conversation? Why do you think it gives us a different level of impact and approach?

SC: I think it’s really important to remember what we're in service of as learning professionals within our organizations. We are in service of the business. We are in service of performance. What gets me excited, in thinking about how we provide solutions that allow our learners to optimize their work—while they are building their capabilities and their ability to do their work as effectively as possible—is that it really solves the measurement problem, right? The problems we've been measuring so far—waste, scrap learning, misalignment—are almost entirely assuaged when you actually have a Digital Coach sitting shoulder-to-shoulder with you in the workflow. Because whatever my work is, I’m going to encounter a moment when I don't know how to do something. I'm struggling, and if I can quickly get the answer to my question and the support that I need, then I can get back to my work. Then the work stoppage is a matter of minutes. Compare that to other types of measures: for example, 60 or 90 days after a learning event, we hear from learners that they were able to use only 3% of the training we provided back on the job. How is that useful and helpful for a company in a very competitive industry that’s trying to improve profit margins and overall competitiveness? So, that's what excites me.

The data points that I shared at the beginning of this conversation should be a call to action. We need to build our training differently. We oftentimes translate those data points to mean we need to find out what's happening with our learners so we can either improve the front-end training or provide them with some type of a job aid on the back end. Neither of those things is going to solve the issue. We need to robustly support them while they're doing their work.
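To make the arithmetic behind Sara's contrast concrete, here is a minimal sketch in Python. Every figure in it is a hypothetical assumption chosen for illustration (the per-learner cost, loaded hourly rate, and stoppage time are invented); only the 3%-applied share echoes the example above:

```python
# A minimal sketch with hypothetical figures; only the 3% applied share
# echoes the example in the conversation above.

# Scrap-learning view: most of a formal training spend never gets applied.
training_cost_per_learner = 1_500.00  # hypothetical fully loaded cost (USD)
applied_share = 0.03                  # learners report using 3% back on the job
scrap_waste = training_cost_per_learner * (1 - applied_share)

# Workflow view: a Digital Coach turns a struggle into minutes of stoppage.
loaded_hourly_rate = 60.00            # hypothetical loaded labor rate (USD/hour)
stoppage_minutes = 5                  # "a matter of minutes"
stoppage_cost = loaded_hourly_rate * (stoppage_minutes / 60)

print(f"Scrap-learning waste per learner:    ${scrap_waste:,.2f}")   # $1,455.00
print(f"Cost of one supported work stoppage: ${stoppage_cost:.2f}")  # $5.00
```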

BM: Anyone who can work “assuaged” into the answer to a question is in a whole new world for me. That was stunning, Sara. Love that.

So, let's talk about something I heard recently. I’d love to get your reactions. One measure we use is confidence, or self-efficacy: the feeling that, beyond just remembering everything, I have the metacognitive skills and the tools—it's a combination of both—to enable and improve my performance. Recently, in a more traditional analysis, a learning professional learned that practice (aka “doing”) builds confidence, so their solution (to your point, Sara) was to go back to the front end—in a training mindset—and build more practice into the class. I agree that practice is better than 50 more PowerPoint slides. I'm all for that because of the cognitive load, etc. But that's confusing practice with true confidence building. Because even after I leave, having practiced a lot, I still just don't know. I'm entering the world of real work.

Con, run at this practice thing for me for just a second.

CG: Well, years ago, we did some work for the world's largest manufacturer of pumps. It was a European company and when they approached us, they said, “When our people finish their onboarding training, they are so confident. They leave, we survey them, and they're very confident. Even at six months, they are still confident, but something happens at the one-year mark. They suddenly say their training and onboarding experience was terrible and they have no real confidence in what we did for them. Can you explain why?” And I said, “Well, it takes them a year to figure out that what you were doing for them didn't help them.”

Albert Bandura did the most salient body of research on self-efficacy and confidence building as it relates to learning. He found that the best way to build self-efficacy is for people to perform effectively as soon as possible and to recover when they make a mistake. Performing effectively in a controlled classroom is very different from performing effectively in the workflow. As you said the other day, Bob, the most powerful practice is work. That, to me, is a profound statement in and of itself. Work is practice: it's applying, it's doing the work in the workflow. The moment a person successfully performs on the job, that's when their confidence is reinforced and grows. The sooner we can enable effective performance in the flow of work, the sooner we begin to build that self-efficacy. The sooner people can recover when they make a mistake, the more we build that self-efficacy. Self-efficacy, according to Bandura, is at the heart of employee engagement. That's where we've got to focus. It’s not about more practice in the classroom. It's about making sure that when people step into the workflow, they have the help they need from the Digital Coach (aka EPSS) to perform effectively on the job.

BM: If you want to practice anything, practice with your Digital Coach. I don't want you to leave training because you did 10 practice runs of the same activity and got to the point where you think you can do it in your real work. What if, instead, I had you do 10 practice runs with a Digital Coach, so that when you leave, you know where to find what you need—you know how to recover? Failure is a remarkable teacher, and it's going to happen in the workflow. What if I shorten your time to remediate, or help you avoid failure altogether, by teaching you the practices of using a Digital Coach while performing, so you do things correctly? That's where performance improves and confidence is raised. So, it's confidence in my ability to troubleshoot and survive in the workflow vs. confidence in my ability to memorize well.

SC: I'll give you an example. In between a couple of tours of duty in the learning measurement space, I went to work for one of the most renowned business schools in the world, which also provides leadership development to corporations; that was the business unit I worked in. I was immediately going to be taking over a team, and I had some things I really needed to address with its members. On this issue of confidence, coming into a role like that, I remember going through my onboarding process and being quite stressed about my personal brand and my reputation. I was asking myself, “Am I going to be able to hang with these people who are pretty incredible leaders?! Because we do world-class leadership development!” I was worried I might mess up the first performance conversation, or the first time I had to coach somebody. I would have felt 100% more confident if I had come away from that onboarding with curated resources in a Digital Coach, aligned to the work I was doing. I had experience doing the work I was being asked to do, but everything's a little bit different company to company, and I hadn't flexed certain muscles in a while. So, I think that’s where we need to think about the confidence component. Onboarding is a great example. We want folks to feel confident coming out of onboarding, but we want them to feel confident because they know they're going to be well supported.

BM: And I think we confuse support with training or learning sometimes, meaning we don't think it's the same. I don't think a learner looks through the lens of “this is a training asset”, or “this is a support asset”. They look at it as a performance asset. So, if it helps me perform, and I learn while doing, that's training (in a way). Right, Con?

CG: Gloria Gery referred to that as unconscious learning. What she observed is that when you're in the workflow and you're doing your job, you're learning. If you have a tool to help you do that job, you are learning. It's not conscious, and you’re not in a classroom. But let's admit this: no matter how powerful and wonderful a training class is, when a person leaves that class, they are not competent. They are not proficient. They're ready to start. They're at the beginning stage, but expertise is developed over time through experience in the flow of work. So, if you want somebody who has expertise, it's not going to come from the classroom alone. It's going to come from the classroom combined with the workflow and experience over time. And that's real learning. Real learning happens in transfer. Real learning happens in those real-world practice activities, Bob, that you mentioned, when people are doing their work.

BM: Let's change the narrative. If we want to measure performance, let's live at the point of performance. Let's not live only in the weeks, days, and months before performance and try to correlate. Until we make this pivot—until we understand the workflow through analysis, until we enable it with a Digital Coach, until we understand the architecture and design of the performance support pyramid, criticality, and all the things we've talked about in so many podcasts before—we're never going to get to those higher levels of Kirkpatrick and Phillips. We can talk about them all we want, but when we go to the C-suite, the leaders who demand these metrics are going to poke at them and shoot holes in them.

The exciting thing about what we do—about these podcasts, about the clients we're blessed to work with, and about folks like you, Sara—is that we know the future in this is now. It's no longer something to just talk about. It's no longer something to walk away from. Yes, it's hard, but it's doable. Sara, I love what you say sometimes: I think we've made measurement harder than it really is. I've heard you say that over and over again. We complicate this, partially because we don't understand the narrative. It gets simpler when you understand it from this perspective.

SC: One of the things Con has been schooling me on, which I've been so excited to incorporate into our thinking around measurement, is the partner to performance and productivity: the work stoppage piece. In the example I gave you earlier, when I had a job change, what did I do in the absence of a Digital Coach? I made good friends with the best sales manager in that entire organization. Every time I needed something, I called him, I emailed him, I texted him, I Slacked him, etc., and got my support through his coaching. That is incredibly costly, not only in terms of my own work stoppage; I was also causing his work stoppage. Ultimately, I got a lot of the answers I needed, but at what cost? We can amplify performance and increase productivity dramatically without that work stoppage. That's the important piece—the partner to productivity in the ROI conversation—that I'm excited to talk about now.

CG: Across the board, the cost of stopping work to learn doubles the cost of learning per employee. It just doubles that cost. And it's a real cost because people are stopping work. Our goal is to enable and sustain measurable, effective job performance in a way that minimizes interruption of the work that employees are hired to do. That requires us to step into the workflow and support people in the workflow, which at the same time enables us to measure what is happening in that workflow and directly demonstrate that what we're doing is making a difference in terms of people being able to perform effectively.
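Con's doubling claim lends itself to the same kind of back-of-the-envelope sketch. The numbers below are hypothetical assumptions chosen only to illustrate the shape of the calculation, not measurements from any client:

```python
# Hypothetical illustration of "stopping work to learn doubles the cost of
# learning per employee." All inputs are assumed for illustration.
direct_learning_cost = 2_000.00  # hypothetical spend on content and delivery (USD)
loaded_hourly_rate = 60.00       # hypothetical loaded labor rate (USD/hour)
hours_of_stopped_work = 33.0     # hypothetical time away from the job to learn

work_stoppage_cost = loaded_hourly_rate * hours_of_stopped_work
total_cost_per_employee = direct_learning_cost + work_stoppage_cost

print(f"Direct learning cost:    ${direct_learning_cost:,.2f}")
print(f"Work-stoppage cost:      ${work_stoppage_cost:,.2f}")  # about the same again
print(f"Total cost per employee: ${total_cost_per_employee:,.2f}")
print(f"Multiplier vs. direct cost: {total_cost_per_employee / direct_learning_cost:.2f}x")
```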

BM: You know, I don't think we can go any further than that. That summed it up perfectly. And this is why I'm finally excited about this conversation. It's been the elephant in the room for my 40 years of doing this. In the last 10 or 20 years, Con, working with you and getting into the workflow the way we can now, the narrative has changed. We can do this, but we as an industry must choose to change our deliverable, our approach, and the conversation with the business—and step up to doing ROI. We can’t give up on it.

Thank you so much. You're both spectacular. Great podcast. Great conversations.


Embrace the Benefits of Safe Failure

by Conrad Gottfredson, Ph.D., RwE

Any formal learning solution that lacks effective ongoing performance support leaves in its aftermath random acts of failure. This failure generally goes undetected by the organization unless its consequences are visible.

Even then, the distance between training and these subsequent failure points is often great enough to allow plausible deniability of any culpability on the part of the learning solution. A key reason why we don’t see this failure is that the “grading” traditions of most school systems have oriented learners to do everything they can to avoid failure. When we throw them over the wall of our formal learning events into the real world of job performance, they tend to work hard to compensate for the limitations of those inadequate learning solutions. When they fail, they usually fail quietly.

Learning from Mistakes

From our earliest experience in formal education, we have been oriented to get things right and avoid making mistakes. Certainly, those of us who design and develop learning solutions should pursue effective performance as the primary indicator of success.

Yet, there’s a profound lesson to learn from former executive chairman and CEO of Cisco Systems John T. Chambers. When he interviewed potential leaders for his company, he rightly asked first about results and walked through what they had done right. But his next question was, “Can you tell me about your failures?” Chambers looked for candidness about the mistakes they’d made, but then wanted to know, “What would you do differently this time?”

Chambers understood that we’re a product of the challenges we face in life; how we handle those challenges probably has more to do with what we accomplish than our successes do.

Thomas Edison credited failure coupled with determination as the pathway to his success: “Genius? Nothing! Sticking to it is the genius! I’ve failed my way to success.” (1)

Now, no learning professional wants to take a chance on failure when the consequences are significant to catastrophic. This is where an approach called “critical impact of failure analysis” can help sort out tasks where failure can be a safe learning experience. Think about times when you have failed—where that failure didn’t harm anyone or anything. It might have been uncomfortable, but you learned from it, right?

Safe Failure

Learning through “safe” failure is most certainly a contributor to personal growth; therefore, our learning methodology ought to include identifying skills that people can safely learn while working, with the help of a Digital Coach, so that if failure happens, they learn from it and recover in the workflow. Here is an example of how instructionally powerful safe failure can be:

Recently, I was at a family home in southern Utah with my grandson. I asked him to load and turn on the dishwasher. Here is a 30-second video of his life-changing learning experience:

Watch this short video!

Although Joseph had been taught by his mom and dad never to do what he did, he still made the mistake. After this safe failure experience, he will never make that mistake again. Through safe failure, he learned in one of the most instructionally powerful ways possible.

Again, no learning professional wants to take a chance on failure when the consequences are significant to catastrophic. But in our experience, on average, half of the skills taught in corporate courses can be learned safely in the flow of work, without the need for employees to stop the work they have been hired to do. If an employee fails to complete a task successfully, that failure can be a safe and instructionally impactful learning experience. All that is needed is a Digital Coach providing access to the support required to quickly recover.

Spend a few minutes studying the following rating scale:


Figure 1: Critical Impact of Failure Scale 

Consider the implications of identifying skills that score in the 1 to 3 range on the scale above. For these skills, an effectively designed Digital Coach provides a safety net that allows complete transformation of the classroom. How? By delivering 2-click, 10-second access to just what's needed to enable learning in the workflow, you can take these lower-rated skills out of the classroom, allowing greater instructional focus on the remaining higher-rated skills.

Without this, most courses cram in too much content for the allotted time. To cover it all, proper instructional methodology is often sidelined, which forces instructors into presenter mode—even though the critical impact ratings for some of that content call for a significant investment in methodology.
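To make the triage concrete, here is a minimal sketch in Python of how skills might be routed once they have been rated. It assumes a numeric rating where 1 to 3 means failure is safe, per the discussion above; the skill names and ratings themselves are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Skill:
    name: str
    impact_rating: int  # critical impact of failure; 1-3 = safe, per the scale above

SAFE_FAILURE_MAX = 3  # ratings of 1-3: failure is a safe learning experience

def triage(skills: list[Skill]) -> tuple[list[Skill], list[Skill]]:
    """Split skills into workflow-learned (Digital Coach) vs. classroom-taught."""
    workflow = [s for s in skills if s.impact_rating <= SAFE_FAILURE_MAX]
    classroom = [s for s in skills if s.impact_rating > SAFE_FAILURE_MAX]
    return workflow, classroom

# Hypothetical skills and ratings, invented for illustration.
course_skills = [
    Skill("Run the weekly usage report", 1),
    Skill("Configure a customer sandbox", 3),
    Skill("Approve a production change", 5),
]

learn_in_workflow, keep_in_classroom = triage(course_skills)
print("Learn in the workflow:", [s.name for s in learn_in_workflow])
print("Keep in the classroom:", [s.name for s in keep_in_classroom])
```

In practice, the ratings come from the critical impact of failure analysis itself; the point of the sketch is simply that once skills are rated, the classroom/workflow split falls out mechanically.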

Learning in the Context of Work

Critical impact of failure analysis has proven to be the means for safely removing an average of 50% of content from the learning queue and putting it into the workflow to be learned at the moment of Apply. This is actually the optimal environment for learning content and skills, provided the consequences of failure aren't significant to catastrophic. The real world, not the classroom, provides legitimate context and pressing need.

The closer a learner is to the place and moment of Apply, the more open and ready that learner is to learn. Consider your own learning mindset while in the workflow compared to when you step away from it to learn in the fabricated environment of a classroom or an eLearning course. At which of those moments are you most motivated to learn and ready to engage mentally, emotionally, and physically?

In closing, 1) experience confirms that we are most attuned to learning when we are in the context of our work, and 2) research shows that our work is the environment in which learning is most naturally optimized. Here's the good news: we can confidently push the learning of "safe failure" skills into the workflow, to be exclusively learned there with the help of a Digital Coach. If performers make mistakes, they learn from them in a very powerful way. Pushing "safe failure" skills into the workflow also allows us to give greater instructional attention to skills where the critical impact of failure is significant to catastrophic. By doing this, we responsibly mitigate potential failure points rather than leaving failure up to chance—something we should never do.

(1) https://www.inspiringquotes.us/author/3870-thomas-a-edison/about-genius 

Subscribe to The Performance Matters Podcast to stay up to date on all the latest conversations and guests in the 5 Moments space. 

Visit our website for additional resources: Certificate courses, an eBook on workflow learning, and our latest EnABLE Methodology white paper. 

Join the conversation on LinkedIn. 

Be part of the Workflow Learning & Performance Alliance.

Copyright © 2023 by APPLY Synergies, LLC
All Rights Reserved.