Bring Out the Talent: A Learning and Development Podcast

From Designer to Engineer, New Roles for Learning Professionals

March 14, 2022 | Maria Melfa & Jocelyn Allen | Season 1, Episode 20

Learning engineering may not be new (it emerged in the 1960s), but new applications of AI in our industry, specifically adaptive learning, bring increased focus to that specialized skill set. How will learning analytics and adaptive solutions radically improve the speed and quality of learning?

In this episode of "Bring Out the Talent," we speak with Michael Noble, PhD, President (Americas) of Area9 Lyceum, the industry leader in adaptive learning. We discuss the emergence (or re-emergence) of a unique role for learning engineers, how that role differs from that of an instructional designer or LX designer, and why it's not just another trend. Innovations in technology are creating demand for increased competencies in data analysis and data-driven design, with a much more ambitious aim: to accelerate expert performance and mastery in our organizations.

Tune in and discover the next big shift in the future of learning.

Maria Melfa: [00:00:05] Welcome everyone to Bring Out The Talent. My name is Maria Melfa, and I am the President and CEO of The Training Associates, otherwise known as TTA.

 

Jocelyn Allen: [00:00:14] Hi everyone, I'm Jocelyn Allen. I'm a Talent Recruitment Manager here at TTA. And once again, we're so excited to have you here with us. Back again by popular demand, we have with us one more time our Director of Learning Solutions, John Laverdure. Hey, John.

 

John Laverdure: [00:00:29] Hello there, Jocelyn. Great to be back.

 

Maria Melfa: [00:00:31] So excited to have you join us again. And we're even more excited for one of our favorite guests of all time, Dr. Michael J. Noble. Dr. Noble is the President of Area9 Lyceum. He is a former Chief Learning Strategist and Chief Operating Officer at TTA, and he worked as the Managing Director of Learning Solutions at Franklin Covey. In that role, he oversaw a library of leadership and personal effectiveness programs offered globally in both live and digital modalities. Prior to that, for most of his career, nearly 19 years, Dr. Noble was with Allen Communication Learning Services. He began as a summer intern and progressed to the role of Executive Vice President and Chief Learning Officer. He has experience with mobile, micro, adaptive, social, curated, virtual, and other pioneering solutions. In 2015, he was recognized as a Utah Business CXO of the Year. As a thought leader, Dr. Noble has presented at dozens of conferences and been a keynote speaker at international events. He is also a regular contributor to industry publications. He holds a Ph.D. from the University of Louisiana at Lafayette, as well as M.A. and B.A. degrees from Brigham Young University in Utah. Welcome, Michael J.

 

Jocelyn Allen: [00:02:06] Yeah Michael Noble!

 

Michael Noble: [00:02:09] Thank you very much for having me. It's great to be back home with TTA. I have a ton of respect for the work that you do and the fantastic podcast that you're running, so I'm honored to be a guest.

 

Maria Melfa: [00:02:22] So, Michael, we hear of this new term “learning engineer”. What exactly is that, and what skills are needed as a learning engineer?

 

Michael Noble: [00:02:31] So I think this question is a good one, right? Because we've seen a lot of change in our industry over the last 20 years. It used to be that you'd say, "Oh, I'm an instructional designer," and people would ask, "Well, what does an instructional designer do?" Right? And you would explain, "Oh, I'm not a programmer, I'm not a graphic designer. I'm an expert in instruction." Right? And then we had the old-school instructional designers who were more one-man bands: I can do a little bit of this, I can do a little bit of development, I can also do instructional strategy. Then the pendulum swung to specialists, right? "Oh, I'm a learning strategist, or I'm a performance consultant. I do the thought work, but not so much the production or the curriculum development work." Right? We have learning technologists that focus on, you know, just LMSs and LXPs. And more recently, with the LXPs, we have learning experience designers that are looking at moments of need and how we design a comprehensive learner experience that's in the flow of work. So it may be that we've had a little bit of saturation with new terminology, but one of the reasons why the new terms keep coming up is because the job keeps changing.

 

Michael Noble: [00:03:46] There is more to be done. The technology keeps changing. And when we talk about a learning engineer, there are a couple of things that I would point to, two things that I would say differentiate learning engineering from other kinds of instructional design. One would be an understanding of data and learning analytics. Most people, when they talk about instructional design, refer to the ADDIE model, right? Analyze, design, develop, implement, evaluate. It's not really an instructional model; it's just catchy, and so it went viral. Learning engineers do not subscribe to that model, right? They actually look at data first: they aggregate the information you have and then analyze it. So the first thing I'd say is that it's a focus on data and on understanding learning analytics. The second thing I'd say is understanding the difference in how the material is created. If you think about creating material for an instructor, or creating material for a learning experience, there's a sequence and flow to that. If you're creating material that will be delivered by artificial intelligence, it needs a different format, right? Those are the two big differences I would point out.

 

Jocelyn Allen: [00:05:04] It's interesting to see where our technology is going, especially in the last couple of years, where that sort of adaptation is really all we've had a chance to focus on. How does authoring adaptive learning programs differ from the other types of authoring, as you were starting to mention, and why does this need a more specialized skill set?

 

Michael Noble: [00:05:25] I touched on that in a way in my last answer, but let's dive a little bit deeper there. We work with subject matter experts, and our goal is to make that content as easy as possible for our learners. We want to distribute it. We want it to be at their fingertips. We want that to be easy. We talk about engagement, and we want that to be all about, "OK, how do I capture their attention? How do I keep them engaged?" And we have a certain level of measurement that we do. Mostly, if we're honest, and I don't mean to be cynical about my own field of specialization, but if we're honest with ourselves, a little bit of it is, "Hey, let's hope we don't embarrass ourselves, because this is going out enterprise-wide. The executives are going to see it. Everyone's going to see this, and we're going to get all kinds of flack if people don't like our e-learning." Right? That is really our motive for most of what we do, and that's a little bit cynical, I admit. With authoring adaptive learning, you're looking at different criteria. For example, the whole goal of adaptive is to focus on what you don't know. Instead of serving the same experience to everybody, we're going to give you an experience just for you. So, people talk about Netflix and learning and, "Oh, wouldn't it be cool if we had an algorithm that did this?" Right? Well, that's great in terms of course recommendations, but Netflix isn't going to assemble a movie based exactly on what I want, a brand-new movie that no one else has ever seen, that's just for me.

 

Michael Noble: [00:06:58] Right? But that's exactly what we're trying to do with adaptive learning. And so, to do that, you have to offer the content in small one-to-two-minute chunks, right? Chunks that are aligned with activities, because we put practice before content delivery in terms of our approach. And so, it's that emphasis. And then we pay a lot of attention to predictive validity, because we care about the data: for this question or activity, how well does it really assess that learning objective? Right? And there's an art to that, right? A lot of people have written and talked about how to write higher-validity questions, and we're really committed to a science-based approach that takes the best of that and leverages it. And so, it's not as much about "Hey, let's try not to embarrass ourselves," but "Let's do something that's really going to make an impact, where I will show you the reports on how we changed people's understanding, how long they took, all that kind of data." That's a long answer to a simple question, and I could go on probably for the whole podcast. I'm going to pause there. Can I ask you guys a question?
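To make the predictive-validity point concrete, here is a minimal sketch of one classical item-analysis statistic: the point-biserial correlation between getting a question right and the learner's score on the remaining questions. This is a generic technique, not Area9's method, and the response data is invented for illustration.

```python
# A minimal sketch (not Area9's method) of classical item analysis:
# the point-biserial correlation between getting one item right and
# the learner's score on the remaining items. Higher values suggest
# the item discriminates well for the objective it is meant to assess.
import numpy as np

def item_discrimination(responses: np.ndarray) -> np.ndarray:
    """responses: learners x items matrix of 0/1 correctness."""
    n_items = responses.shape[1]
    stats = []
    for i in range(n_items):
        item = responses[:, i]
        rest = responses.sum(axis=1) - item      # total score excluding this item
        if item.std() == 0 or rest.std() == 0:   # no variance -> undefined
            stats.append(float("nan"))
        else:
            stats.append(float(np.corrcoef(item, rest)[0, 1]))
    return np.array(stats)

# Hypothetical response log: 6 learners, 4 questions
log = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
])
print(item_discrimination(log))  # low or negative values flag questions to rework
```

In this toy data, the third question correlates negatively with overall performance, which is exactly the kind of question a learning engineer would rewrite or retire.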

 

Jocelyn Allen: [00:08:10] Please? 

 

Michael Noble: [00:08:11] Sure. You have an extensive network of talent in this space. I want to know what kinds of conversations you might be having with talent or customers about design skill sets when you're talking to customers about what they need, right? Maybe this is a question for John, specifically. You know, how do you drill down and get to the type of work that needs to be done to meet the customer's needs?

 

John Laverdure: [00:08:38] Yeah, there are a lot of components to that. It's really everything from understanding the deliverable, what the outcome of the training is going to be, what they currently have for infrastructure and talent, and whether they expect it to be a collaborative exercise or something more independent. Also whether the organization has any particular constraints or preferences in the realm of, you know, work style: is it that more waterfall ADDIE model, or is it a more iterative approach? Because we've run into that, you know, where people have a certain methodology that they're very used to working in, and as soon as they veer off that methodology, it's a very different experience for them and it can throw them. So, there's a lot that goes into it. A lot of the staff augmentation solutions that we're engaged in are usually pretty prescriptive: we get a really tight requirement around particular authoring tools, skill sets that they need, industry experience they need, et cetera. We don't have clients approaching us for adaptive solutions; it's literally never the asset that's being worked on.

 

Maria Melfa: [00:10:01] Meaning that they don't necessarily come to you for that, but that might be the solution. Yeah, I think so, I think that's kind of...

 

Michael Noble: [00:10:10] That's the answer I would expect, right? This is an emerging technology. It's not like you went to instructional design school and you had a detailed analytics class, right? Because there are different ways of accomplishing this, and the field is really emerging. A few months ago, I was presenting at a conference, a NATO technology training conference. And it was interesting to me because one of the keynote speakers was from the ADL, the governing body for SCORM, the standard that we use to integrate with different learning platforms. And she talked about learning engineering, and I was kind of surprised, because I'm like, "Oh, maybe it's not just us? They're talking about this new field." And she defined it in exactly the way that I would have defined it. But I do think there is not a ton of consensus, and I think it's worth noting: if I'm listening to this podcast and, let's say, I'm a professional in the field, this is something I can explore and get ahead of. Is this the kind of thing that interests you? There has been a trend, I guess, over the last couple of years where you'll see mythbusters who are debunking things. You know, the easy target is learning styles, but there are other kinds of things built into our practices that aren't necessarily science-based.

 

Michael Noble: [00:11:46] ADDIE is also kind of an easy target. But what that signals is that it's easier for us now to have access to data and evidence than we ever had before. And so, you know, I think a certain part of our audience is going to be listening to this and maybe hearing about adaptive or learning engineering for the first time. It's almost like, "OK, given the fact that there's so much specialization in our field," right? In some ways, it opens the door to different kinds of talent. Right? So an organization like TTA has a real advantage, because there's not one particular skill set; you can really tap into the broader range of backgrounds and experience that's out there. We find it's a change management issue with organizations, right? If you're used to typical e-learning and suddenly you get what you think is a test thrown at you, it takes some explanation to say, "Oh, you're going to get a bunch of questions and activities to complete. It's diagnostic. It's so the artificial intelligence will know how to create that custom learning, or that movie, just for you." And, you know, we see that on both sides, with learning engineering and on the learner side.

 

John Laverdure: [00:13:08] So, Michael, we have a lot of clients that have this very high-level, conceptual understanding of adaptive learning where, you know, they get that it's creating a unique experience, but they haven't thought through all the nuance around that, or really any way to qualify it for their organization. So that's my question for you: how would a learning professional know when it makes sense to use adaptive learning as part of their strategy?

 

Michael Noble: [00:13:36] Again, another great question. There are a couple of different ways I could answer this question. One of them I want to answer with is, "Hey, you don't have to," you know, because if a customer is working with us, I know where the market's at and what's going on, and we expect to do a fair amount of upskilling and customer readiness in terms of working with a customer. And we're ready to invest in that because we totally care about our solution, and because there is a little bit of a barrier to entry to getting here, it's worth having those longer conversations and making sure that it's really a good fit. In terms of the value prop, and this is maybe my second answer, right? The business answer to that is: large companies that, for example, have compliance needs. There's that particular angle, so it's, "Oh, we're a large financial organization and we need the data that shows we've prepared our people, that we're compliant." Right? And organizations working in regulated environments have always been on the front line of new learning approaches, right? Pharma, the life sciences, finance and insurance. They're kind of on the front line there because they have that critical need. And the other one, which everybody is talking about right now, is time to proficiency. And I know this is an overused term.

 

Michael Noble: [00:15:04] But with the Great Resignation, or kind of the weariness that we're feeling with our careers and COVID and everything else, we've got an onboarding challenge. How do we get people up to speed faster? If you think of a pie chart of your knowledge, right, there's a slice of pie of stuff I don't know, but the bigger piece, the whole rest of the pie, is stuff I already know. And if we've standardized that learning experience, which we always do because we're basing it on an average or typical learner, I'm going to go through, you know, 60, 70, 80 percent of stuff I already know. I'm bored. I'm disengaged. I'm spending an equal amount of time on stuff I know and stuff I don't know. Here, for time to proficiency, I can just focus on that slice of stuff I don't know and get the other stuff out of the way, right? And we usually see an average of half the time to proficiency, which is a big deal. We're seeing that in government, we're seeing that across the board in all kinds of industries, high tech especially, wherever there's a talent deficit, wherever we need to get people on board quickly.

 

John Laverdure: [00:16:19] So, Michael, that time to proficiency that you're mentioning, I feel like there's a lot to that, right? I mean, you've got the obvious corporate perspective on it: I've got more productivity because I have less time lost in the workforce. But I feel like there's more to it than that, you know, perhaps learner engagement. Can you talk us through some of those other components that make adaptive learning so valuable?

 

Michael Noble: [00:16:50] In addition to time to proficiency, because we have the data and because we do a really robust refresh strategy, the artificial intelligence is smart enough to know about when you're going to forget something, and which content you're going to forget, right? So, we can also make it stickier. So, it's not just, "Hey, you got through the content faster." It's really an apples-and-oranges comparison, because that standard e-learning course you went through probably didn't have reinforcement built in. It probably wasn't smart enough to know these are the key points that this particular individual might struggle to remember, based on how they're doing. And, you know, the other thing that I think time to proficiency masks: there's a book by Todd Rose called "The End of Average" which highlights this problem. OK, well, if we look at how long it takes and we take the average, that's about how much time we expect people to take in the course. Really, what I'm saying is that the average time is reduced, but some learners take four times as long as other learners to complete the training. And that's hidden behind something like time to proficiency. And so, when it's individualized,

 

Michael Noble: [00:18:06] Right, if you're a slow learner like I am, I'm going to take longer. And so that expectation is key to put in there, too: it really is individual. So, shorter time in general, although individuals may differ. I would also point to the stickiness of the content, right? And you get a higher level of engagement because people aren't bored. And the other thing I will say on the engagement front is that it takes effort. A big part of prepping people is, "Oh, you're going to be working." It's not: sit back in my chair, watch a video, click Next, read, read, answer a multiple-choice question. For someone like me that is easily distracted, I will admit that I click, click, click to get through a course, right? I can't do that with adaptive. It's measuring how I'm doing. Once I've gotten used to how the AI works, I'm looking for the answers. I'm coming back to the questions. It's effort. And if you combine high-quality effort and feedback, you get depth of learning and retention. There are so many other benefits that, like you say, we're only just scratching the surface with time to proficiency, and it's an easy, entry-level kind of value prop to bring. But you're right, you're definitely right.

 

John Laverdure: [00:19:27] So, Michael, revisiting the learning engineer role, can you tell me a little bit more about the types of things that they work on?

 

Michael Noble: [00:19:33] I think that is going to change and evolve. But right now, a lot of the work that a learning engineer does is actually converting legacy content and courses into an adaptive format, right? Which is different than new content creation. And that's because when you adopt a platform that is data-based, you want some data and you want the content in the system, and then you do data-driven design, which is: I look at the course that I've converted, right? Or I've put in some content, and I see where learners are struggling. We can actually create a heat map of, "Oh, which learning objectives are we struggling on? Which questions are good? Which questions are we getting challenged on? How much time are people spending?" And then we do our design work in a targeted way at what the data tells us is happening. And so, there's content conversion, bringing over content from legacy tools or from instructor-led learning. There's some of the typical instructional design work that you would expect in terms of matching content with learning objectives, but it's done at a very granular level, and then we're using data to tell us where we invest, right? Whereas in the old paradigm, I'm looking at it and I'm using my judgment to say, "I think this is a difficult concept. I think we should spend more time on this," or the subject matter experts tell us that. Now we can actually have data reinforce and inform our decisions on where we make our investments. And to me, that's the realization of what a learning engineer can do. An instructional designer today might struggle a little bit with that.
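As an illustration of that data-driven design loop, here is a small sketch of the kind of per-objective "heat map" a learning engineer might compute: aggregate response logs by learning objective, then rank objectives by error rate and time spent. The field names, objectives, and log format below are invented for the example, not a real platform schema.

```python
# A hedged sketch of the per-objective "heat map" idea: aggregate response
# logs by learning objective, then rank objectives by error rate and time
# spent so design effort goes where learners actually struggle.
# Field names and example data are hypothetical, not a vendor schema.
from collections import defaultdict

responses = [  # hypothetical event log
    {"objective": "LO-1 Recognize phishing", "correct": True,  "seconds": 40},
    {"objective": "LO-1 Recognize phishing", "correct": False, "seconds": 95},
    {"objective": "LO-2 Report an incident", "correct": False, "seconds": 120},
    {"objective": "LO-2 Report an incident", "correct": False, "seconds": 110},
    {"objective": "LO-3 Password policy",    "correct": True,  "seconds": 30},
]

totals = defaultdict(lambda: {"attempts": 0, "errors": 0, "seconds": 0})
for r in responses:
    t = totals[r["objective"]]
    t["attempts"] += 1
    t["errors"] += 0 if r["correct"] else 1
    t["seconds"] += r["seconds"]

# Sort objectives by error rate, highest first: the "hot" rows of the heat map.
for name, t in sorted(totals.items(),
                      key=lambda kv: kv[1]["errors"] / kv[1]["attempts"],
                      reverse=True):
    print(f'{name}: error rate {t["errors"]/t["attempts"]:.0%}, '
          f'avg {t["seconds"]/t["attempts"]:.0f}s per attempt')
```

The objectives that surface at the top of this ranking are where targeted redesign or additional content would be invested first.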

 

John Laverdure: [00:21:13] Well, it seems that the data is really necessary for a lot of those, you know, prescriptive solutions, or for digging in deeper in certain topical areas. And I don't know that a lot of organizations get that from their typical e-learning programs or their LMSs, or if they do, they get very high-level, large-chunk knowledge as opposed to more detailed knowledge.

 

Michael Noble: [00:21:36] We've learned not to even ask for it, right? We've kind of learned, "Oh, I know I can just get a score. So just give me a completion, and maybe a score on the test or something. And, you know, it's more data than I had before, so I'll be happy with that." But to get analytics on learners, to get analytics on content, and to be able to go really granular with that? I have not seen another way of accomplishing this without that kind of detailed learning engineering work.

 

Maria Melfa: [00:22:08] Speaking of data and analytics, what sort of insight can we get from adaptive learning analytics with respect to soft skills and hard skills?

 

Michael Noble: [00:22:18] There are a couple of things. Of course, you can get learner progress; you can monitor the learners, see exactly where they are, where they might be getting hung up. You can measure metacognition. Metacognition is how aware a learner is of what they know and what they don't know, right? Most of us are unconsciously incompetent on a few things, where we think we know the answer but we don't, right? An adaptive platform is really good at surfacing that, the stuff you thought you knew but didn't really know, and converting it to proficiency over time. And so, we measure what your initial level of proficiency was against where you end up. You can get data on other kinds of activities, whether that's inquiry-based activities or projects. We can even manage experiences in the platform. So, if you're looking at an in-line learning solution, or let's say learning in the flow of work, right? We talk about that; we can actually track as learners start to apply the skills on the job. We can track all of that data as well: maybe they need so many repetitions of a certain task, or maybe they need to shadow and then guide other people. Those are activities that we can structure in the system as well. They're not adaptive in the same way as the AI-driven, knowledge-based activities, but you have a lot of opportunities for data on skills and higher-level learning as well.
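One common way to operationalize the metacognition idea is to pair each answer with the learner's self-reported confidence and bucket the combinations; "confident but wrong" is the unconscious incompetence Michael describes. The sketch below is a generic illustration with made-up data, not the platform's actual measurement.

```python
# A minimal, hypothetical sketch of measuring metacognition: pair each answer
# with the learner's self-reported confidence and bucket the combinations.
# "Confident but wrong" is the unconscious-incompetence cell described above.
answers = [  # hypothetical data: (question_id, correct, learner_said_confident)
    ("q1", True,  True),   # knows it and knows they know it
    ("q2", False, True),   # thinks they know it but doesn't
    ("q3", False, False),  # knows they don't know it
    ("q4", True,  False),  # right, but unsure
]

buckets = {"proficient": 0, "unconsciously incompetent": 0,
           "consciously incompetent": 0, "unsure but correct": 0}
for _, correct, confident in answers:
    if correct and confident:
        buckets["proficient"] += 1
    elif not correct and confident:
        buckets["unconsciously incompetent"] += 1
    elif not correct and not confident:
        buckets["consciously incompetent"] += 1
    else:
        buckets["unsure but correct"] += 1

print(buckets)  # the "unconsciously incompetent" bucket is remediated first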

 

Jocelyn Allen: [00:23:53] So, back to something that you said a little bit earlier that kind of resonated with me, in regards to this adaptive and customized way of living, I guess, because your reference was that Netflix movie. I was like, "OK, I get this," because even Netflix right now is like, "Oh, here's what we recommend for you," or, "Here, we'll just start playing something we think you'll like if you're having trouble picking something out." But what is it really about customization? Why is that so important to adaptive learning? Or, on a bigger scale, what is it about that movie being made specifically for you that makes it that much more impactful or effective for the learners or the viewers?

 

Michael Noble: [00:24:38] I would say it's attention. We can keep your attention because you're not getting bored with stuff you already know. I mentioned metacognition; in some ways, we're teaching you to learn, with so much practice, right? If we think about the optimal way of learning, most of us learn better from one-on-one tutoring experiences. And it's not just most of us; it's practically all of us. The classroom model was only ever about efficiency. It was, "We're going to produce at scale," like a factory for learning, right? The reason tutoring hasn't spread is that it's not very scalable, right? Because it's one-to-one. And the ability to have technology scaffold that really does open a whole new paradigm for us in learning, I think. And that's what we're seeing with the early adopters that are excited about it: they're seeing that process. I think it's a bit of a tough sell if you're used to just skating through and treating your learning as, "Oh, I've got to do the ethics course." Click, click, click, check the box, done. You're going to be annoyed by a course that makes you learn that ethics policy, and you can't complete it until you're done, right? And that's where you have to pay attention. You have to put forth effort. But our promise is that it's not meaningless effort; it's specific to the things you don't know.

 

John Laverdure: [00:26:09] When we were speaking recently, I hadn't even thought of it, but you had mentioned that an adaptive learning platform can be placed in front of the LMS and can actually help steer the learning experience for learners. Maybe not at that super granular level, but it can actually help establish learning paths at the broader topic level. Can you tell us a little bit more about how that looks? Because we have clients regularly that are trying to determine learning paths for their employees, and it's sounding like AI might be a very viable solution to do that for them.

 

Michael Noble: [00:26:45] Certainly, the LMS has a job to do in terms of structuring learning and learning paths. It's something that they do relatively well. But the problem with adaptive learning is, if you plug it into an LMS, you're getting the benefit of the adaptive learning during the initial learning experience, but you're not getting the long-term benefit of the refresh, right, and the reinforcement, because it's behind that LMS doorway. If we put it before the LMS, and we just send data to the LMS, we can nudge the learners: "Hey, give us two minutes and we'll refresh your learning on this topic." Right? And it's going to compile that uniquely for you based on what you did in the course, right? You got to 100 percent proficiency when you were done, but you immediately started to forget everything you learned. Right. And with our artificial intelligence, we're going to try to predict what you're actually going to forget, and we're going to refresh on just those pieces. So, the advantage of putting it in front of the LMS is primarily one of refresh, right? And then we'll send the data to the LMS, as opposed to containerizing the course and putting it inside the LMS.
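To illustrate the refresh idea, here is a simple sketch of a forgetting-curve nudge: estimate per-topic retention with an exponential decay and flag topics predicted to fall below a threshold. The half-lives, threshold, and topics are invented for illustration; this is not Area9's actual model.

```python
# An illustrative sketch (not Area9's model) of the refresh idea: estimate
# retention with a simple exponential forgetting curve and nudge the learner
# on the topics predicted to drop below a threshold first.
from datetime import datetime, timedelta

def predicted_retention(days_since_review: float, half_life_days: float) -> float:
    """Exponential forgetting curve: retention halves every half_life_days."""
    return 0.5 ** (days_since_review / half_life_days)

# Hypothetical per-topic state: when it was last reviewed and how stable it is.
topics = {
    "Phishing red flags": {"last_review": datetime(2022, 3, 1), "half_life_days": 7},
    "Incident reporting": {"last_review": datetime(2022, 3, 10), "half_life_days": 21},
}

today = datetime(2022, 3, 14)
for name, state in topics.items():
    days = (today - state["last_review"]) / timedelta(days=1)
    r = predicted_retention(days, state["half_life_days"])
    if r < 0.6:  # illustrative threshold for sending a two-minute refresher
        print(f"Nudge learner: quick refresh on '{name}' (predicted retention {r:.0%})")
```

In a real deployment the decay parameters would be fit per learner and per item from response data, which is exactly the kind of modeling work that distinguishes learning engineering from conventional course authoring.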

 

Michael Noble: [00:28:03] Now, there are a lot of things in terms of paths, and I wanted to talk a little bit about that, because our next-generation platform, which we're piloting right now, allows us to do competency-based, skills-based pathing, right? Where we're creating more of that skills profile unique to that learner. So, it really does do the kind of thing you're talking about, where, "OK, I've got my skills profile. It knows exactly where I am. It can serve up all of that," and do the "Oh, here's how you connect to other libraries of content that may not be adaptive." Right? So, there are some advantages there as well. I think no one really wants to replace their LMS. They're behemoths; they're hard to replace, and that project can be expensive. So, our goal is to plug in where we can add the most value and advise our customers on how we become part of the stack.

 

Maria Melfa: [00:28:59] So, Dr. Noble, as we're coming to an end, how do we achieve long-term behavior change, and what is the role of both adaptive learning technology and the related targeted content development?

 

Michael Noble: [00:29:12] I think the goal that we've had is really human excellence, right? It's like we want to optimize for ourselves. It's not the factory model of "everybody needs to be the same and we need a standard level of proficiency." This is to give me, as an individual, the chance to grow and develop my expertise. And if we think of it from an expertise standpoint, and not an "OK, here's the standard, you've got to meet the standard and plug along," I think it gives us a slightly different goal that is a little bit more human. It connects a little bit more to how each one of us is doing. And that's ironic, because I'm talking about artificial intelligence. But it's leveraging it for what is most human at our core, what makes us different and unique. That, I think, is the role of adaptive learning. It's really hard to provide that transformative learning experience that we've had when we've worked with a coach, when we've worked with a piano tutor, when we had that opportunity to have a mentor, right? And I'm not saying that artificial intelligence meets that need completely, but it can scaffold that need and enable us to create some of those experiences at scale. And that, I think, is the long-term goal.

 

Jocelyn Allen: [00:30:35] Michael, it's always a pleasure reconnecting with you and talking about all the things that you're doing in this space. This has been incredibly informative, with a lot of relatable examples as to why this will work in a lot of organizations for the development of their employees and the way that they're utilizing their learning. So, I really appreciate your time and getting this information to our listeners. I think we're going to hear a lot about how we can help them in the future.

 

John Laverdure: [00:31:00] Yeah, I really want to thank you as well, Michael. Anything innovative and technology-based is near and dear to my heart, so I'm very excited to see how this progresses over time, and excited to start seeing some results with the time-to-proficiency reductions and everything that goes along with that.

 

Maria Melfa: [00:31:22] It's the way of 2022. 

 

Jocelyn Allen: [00:31:25] Mm hmm. 

 

Maria Melfa: [00:31:27] So, thank you very much, Michael. It was a true pleasure. Always love hearing you talk and always learning so much from you. So, thank you so much.

 

Michael Noble: [00:31:35] My pleasure. And you know, I love you guys. I look forward to future collaborations.

 

Jocelyn Allen: [00:31:40] For more information on today's podcast guests and how they can help your organization, please visit www.thetrainingassociates.com

 

Maria Melfa: [00:31:50] Bring Out The Talent is a MuddHouse Media Production.