Saturday, May 26, 2018

Prototyping through Conversations

I'm in Week 7 of a Working Out Loud Circle (my first). While I had some difficulty connecting this week's exercises to my goal, and have almost lost track of what my goal was in the first place, interesting insights always come out of the conversation with my circle buddy.

I have found the additional resources provided in each of the weekly guides to be a great source of useful insights, even when I'm not sure the rest of the activities did anything for me.

Here's an example: This week was about thinking about a long-term vision of oneself. I did a lot of work on that a year ago when I was transitioning from full-time work to consulting. My vision is still the same and I'm on track. In a sense, this entire year has been an experiment, prototyping a range of different activities.

One of the additional resources for Week 7 is a blog post by John Stepper titled "The simplest and easiest form of prototyping is a conversation."  I experienced this earlier this week. After a significant number of individual interviews to collect data on on-the-job learning for a client, I was finally starting to see where the work was going. I did a hand-drawn sketch of the framework that was emerging, and during the last two interviews of the week, I used my "prototype" framework to test some of the emerging ideas.  These conversations were some of the most satisfying I have had so far.  Listening, asking questions and absorbing information in an interview is great, but I find the conversations where I start to validate my own emerging understanding to be the most satisfying.

Saturday, May 19, 2018

Action Learning, Action Research & Experiential Learning - Lots of questions & no answers

I am currently doing work around on-the-job learning.  Trying to define what constitutes "on-the-job learning" has turned into an interesting exercise.  On the face of it, anything you might learn while doing your job is on-the-job learning. You might also define it by articulating what it's not: anything that requires you to leave your work to attend some "training" may involve learning, but it's not "on-the-job," even if it's sponsored by your workplace.

Some on-the-job learning is planned and intentional (mentoring, coaching, stretch assignments, etc.), but most of it happens in the flow of work and may be unconscious. In the past few weeks I've come across many variations without coming any closer (yet) to a definition.  How does on-the-job training (as opposed to learning) fit in?  Is it different?  It would seem that something like an apprenticeship would be closer to on-the-job training.

What about action learning? Is that a form of on-the-job learning?

According to the World Institute for Action Learning (WIAL), action learning can be defined as "a process that involves a small group working on real problems, taking action, and learning as individuals, as a team, and as an organization. It helps organizations develop creative, flexible and successful strategies to pressing problems."

The main difference between what is traditionally seen as on-the-job learning and action learning may be the team dimension of action learning. The same can be said of many knowledge management practices, including After-Action Reviews: they focus on team- or group-level learning.

What about action research?  Is that the same as action learning?  Is that a form of on-the-job learning?  Does the word "research" make it sound more scientific and rigorous?

This is where we need to define "on-the-job."  If "on-the-job" means in the process of one's daily job, then action learning and action research might not fit the bill.  While they are meant to address real, practical problems, they appear to be separate from the normal workflow.  But then, the same could be said of mentoring and coaching.

How about experiential learning?  That might get things even more mixed up because people delivering training might argue that they make it "experiential."

I'm quite lost at this point.  What if "on-the-job learning" is a useless label?

On May 25th, the Knowledge Management Community of DC is hosting Dr. Bea Carson for a session on Action Learning.  Dr. Carson is an author, speaker and expert in the field of Action Learning. 

I look forward to it since I am personally quite confused about some of what I perceive as artificial boundaries between disciplines and overlapping terminology.  To see how confused I am, visit my blog post on "Action Learning, Action Research and Experiential Learning" (https://www.fillipconsulting.com/2018/05/action-learning-action-research.html).

If anyone needs additional information about the May 25th meeting, please contact me (barbara@fillipconsulting.com) or visit the KMC-DC MeetUp.


Thursday, March 29, 2018

How to Raise an Iceberg

I've been thinking about learning. Nothing new here.  Usually my thought process goes to individual learning, group learning, organizational learning... how do we make them reinforce each other?  Today, I'm looking at it with a different lens.

I've often argued that knowledge management needs to be better embedded in the flow of work, or at least in existing processes. For example, organizations that have strong risk management processes could enhance their risk management with knowledge management, couple the two processes, and avoid duplicating efforts.  When we try to put new processes on top of already complex work routines, we unnecessarily decouple knowledge processes from work.

The same can be said about on-the-job learning more generally.  The goal should be to enhance learning in the flow of work.  To a large extent, it may already be happening without getting recognized.

Here is an extension of the knowledge management analogy using the famous knowledge iceberg.  The knowledge iceberg is typically used to illustrate that only a small portion of our knowledge (our explicit knowledge) is visible above the surface of the ocean, while most of our knowledge (tacit knowledge) is buried in our heads and difficult if not impossible to share.  My little hand-drawn version of the iceberg below suggests a similar challenge with learning.  Formal learning, above the water line, is visible, measurable, and can be managed by organizations.  Informal learning is much less visible and not well understood, yet it may contribute much more to overall learning within the organization.

Formal learning tends to be managed to support work and it can be aligned with the organization's strategic objectives.  Informal learning happens in the flow of work.  The challenge is 1) to better understand how informal learning can benefit the organization and support formal learning objectives; and 2) to raise the iceberg so that informal learning gets more visibility and the attention it deserves.

I've put Communities of Practice at the ocean's surface because I see them as a way of encouraging and shaping informal learning around topics relevant to the organization.




Friday, March 23, 2018

We Still Need Lessons Learned - (Part 3 of 3)

In the previous two posts, I've argued that 1) KM practitioners need to be prepared to sell lessons learned practices by looking at them differently and debunking myths about the futility of lessons learned databases, and 2) the value of lessons learned is in the transformation of tacit knowledge into explicit knowledge.  In this third and last part of the series, I will talk about the value of aggregation.

Part 3: The value of lessons learned activities is not in the individual lessons themselves sitting in a database but in the aggregation of lessons. 


As I reviewed existing lessons and then spent almost 10 years helping to identify "new" lessons, two interesting insights emerged:  1) Many so-called lessons were deemed unworthy of being pushed higher up for further dissemination; 2) Many interesting new lessons had to do with the need to change and adapt to new conditions.

Dealing with the little lessons
Many lessons did not necessarily rise to the level of "real" lessons that deserved to be pushed to a database. Many of these lessons were good practices that projects needed to be reminded of.  It's easy to get caught in a discussion about what's a real or worthy lesson (almost as easy as getting caught in a discussion of the definition of knowledge management).  The mistake is to dismiss small lessons as not worthy.  The "little lessons" or "motherhood and apple pie" lessons may not be sexy, and they may be a reminder that we often seem not to learn, but they are important in other ways.  Perhaps they're not lessons at all, but they tell you something about underlying problems that need to be addressed at the institutional level and not through a lessons learned database. They take on a different meaning when you look at them in aggregate. You can review all of them, identify recurring themes, then develop mini-case studies to incorporate into informal workshops as well as more formal training.  In addition, management could be encouraged to use these mini-case studies as story vignettes in their communications.

Another way to look at these less-than-stellar lessons is to say that they are "local" lessons.  They are valid and valuable as the project's lessons, but they don't need to be shared or go any further.  That can also become a problem.  What if the same little local lessons are being learned again and again across all the projects?  Unless someone is paying attention to these little lessons across projects, they will remain localized, soon forgotten, invisible, and will never come to the attention of management... until something goes terribly wrong, a thorough investigation is launched, and the findings are that the underlying issue was known to everyone and prevalent across the organization, but it didn't get anyone's attention because no red flags were attached to it.  Make the recurring little lessons more visible. Do something about them.
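To make the idea of spotting recurring local lessons concrete, here is a minimal sketch of the kind of cross-project aggregation described above. The project names, lesson themes, and the threshold of two projects are all hypothetical, invented purely for illustration; in practice the tagging of lessons by theme would be the hard part.

```python
from collections import Counter

# Hypothetical sample: each project's "local" lessons, tagged by theme.
project_lessons = {
    "Project A": ["stakeholder communication", "schedule padding"],
    "Project B": ["stakeholder communication", "vendor onboarding"],
    "Project C": ["stakeholder communication", "schedule padding"],
}

# Count how many distinct projects report each theme
# (set() so a theme counts once per project).
theme_counts = Counter(
    theme
    for lessons in project_lessons.values()
    for theme in set(lessons)
)

# Flag themes recurring across two or more projects -- the "little
# lessons" that deserve institutional attention, not just a database row.
recurring = [theme for theme, n in theme_counts.items() if n >= 2]
print(sorted(recurring))
```

Even a crude tally like this shifts the conversation from "is this lesson worthy of the database?" to "how often does this theme keep coming up?", which is where the aggregate value lies.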

Part of the role of the KM manager is to keep an eye on the recurring "motherhood and apple pie" lessons.  The idea is that these lessons are obvious, things that everyone should know and can't possibly disagree with.  Anything that sounds like motherhood and apple pie is dismissed because it is just common sense or it's a good practice.  It's the opposite of the powerful insight, the aha! moment.  It's not sexy at all.  It may even be embarrassing.  How could they not have known this before?  That's common sense.  It's just good project management practice.  That lesson is in the database already, in a dozen variations.  I would say, keep an eye on what everyone says is obvious yet keeps coming up as a challenge or a lesson. This is a little counter-intuitive. It's not what keeps management up at night. It could be pervasive, yet under the radar. No red flags.

In addition, just because something is obvious to a senior project manager and he/she doesn't see the value of putting it in the lessons learned database doesn't mean it's obvious to a junior member of the team who hasn't yet learned it first hand.

It is dangerous to think that it takes a major disaster like Challenger or Columbia to create case studies worthy of widespread dissemination and incorporation into courses and workshops.  People need to learn from the "little lessons" before they become big problems.  The underlying issues that led to or contributed to the Challenger and Columbia accidents were things that manifested themselves in small ways every day, and probably almost everywhere in the organization.  You can wait until they become a BIG lesson, or you can catch them when they're much less visible.  It's not the stuff of academic papers and conference presentations (though I might try that route too), it's not catchy, but it matters.

As a side note, I've often asked myself if our KM activities could actually prevent another big failure.  How could we ever know what failures we've prevented?  How could KM be held responsible for failing to prevent a failure big or small?  Obviously, KM isn't the only mechanism organizations leverage to prevent failures, especially in high-reliability organizations like NASA.  Safety and Mission Assurance as well as Risk Management also play key roles... which is also why they should be working in close coordination with KM.

Learning to Adapt to Change and Unlearning
Many of the lessons I would consider most interesting had to do with change and adaptation to change.  They may have been unique lessons, and definitely new lessons, but they fell into this bigger bucket of learning to adapt to change.  In my last presentation at NASA/Goddard, a key point I made about what I had learned during my time there was that we (this isn't specific to NASA) are a little slow to learn the lessons related to the need for change.  We have difficulty unlearning; we have trouble accepting that what had worked well in the past may not work well now.  Unlearning is key to adapting to change and continuous learning.  The theme of "unlearning" at NASA is addressed in Shared Voyage: Learning and Unlearning from Remarkable Projects.  This obviously creates another challenge for static lessons learned databases.  Ideally, lessons would be constantly flowing throughout the organization, aggregated and regularly analyzed for trends, reviewed and re-assessed through constant conversations.

Reference
  • Laufer, A., Post, T. & Hoffman, E. (2005). Shared Voyage: Learning and Unlearning from Remarkable Projects. NASA History Series. NASA SP-2005-4111.