Thursday, March 29, 2018

How to Raise an Iceberg

I've been thinking about learning. Nothing new here. Usually my thought process goes to individual learning, group learning, organizational learning... how do we make them reinforce each other? Today, I'm looking at it through a different lens.

I've often argued that knowledge management needs to be better embedded in the flow of work, or at least in existing processes. For example, organizations that have strong risk management processes could enhance their risk management with knowledge management, coupling the two processes and avoiding duplicated effort. When we put new processes on top of already complex work routines, we unnecessarily decouple knowledge processes from work. The same can be said about on-the-job learning more generally. The goal should be to enhance learning in the flow of work. To a large extent, it may already be happening without being recognized.

Here is an extension of the knowledge management analogy using the famous knowledge iceberg. The knowledge iceberg is typically used to illustrate that only a small portion of our knowledge (our explicit knowledge) is visible above the surface of the ocean, while most of our knowledge (tacit knowledge) is buried in our heads and difficult if not impossible to share. My little hand-drawn version of the iceberg below suggests a similar challenge with learning. Formal learning, above the water line, is visible, measurable, and can be managed by organizations. Informal learning is much less visible and not well understood, yet it may contribute much more to overall learning within the organization.

Formal learning tends to be managed to support work and it can be aligned with the organization's strategic objectives.  Informal learning happens in the flow of work.  The challenge is 1) to better understand how informal learning can benefit the organization and support formal learning objectives; and 2) to raise the iceberg so that informal learning gets more visibility and the attention it deserves.

I've put Communities of Practice at the ocean's surface because I see them as a way of encouraging and shaping informal learning around topics relevant to the organization.


Inspiration for this line of thinking often comes from:


Friday, March 23, 2018

We Still Need Lessons Learned - (Part 3 of 3)

In the previous two posts, I've argued that 1) KM practitioners need to be prepared to sell lessons learned practices by looking at them differently and debunking myths about the futility of lessons learned databases, and 2) the value of lessons learned is in the transformation of tacit knowledge into explicit knowledge.  In this third and last part of the series, I will talk about the value of aggregation.

Part 3: The value of lessons learned activities is not in the individual lessons themselves sitting in a database but in the aggregation of lessons. 


As I reviewed existing lessons and then spent almost 10 years helping to identify "new" lessons, I saw two interesting insights emerge: 1) many so-called lessons were deemed unworthy of being pushed higher up for further dissemination; 2) many of the interesting new lessons had to do with the need to change and adapt to new conditions.

Dealing with the little lessons
Many lessons did not necessarily rise to the level of "real" lessons that deserved to be pushed to a database. Many of these lessons were good practices that projects needed to be reminded of. It's easy to get caught in a discussion about what's a real or worthy lesson (almost as easy as getting caught in a discussion of the definition of knowledge management). The mistake is to dismiss small lessons as not worthy. The "little lessons" or "motherhood and apple pie" lessons may not be sexy, and they may be a reminder that we often seem not to learn, but they are important in other ways. Perhaps they're not lessons at all, but they tell you something about underlying problems that need to be addressed at the institutional level and not through a lessons learned database. They take on a different meaning when you look at them in aggregate. You can review all of them, identify recurring themes, then develop mini-case studies to incorporate into informal workshops as well as more formal training. In addition, management could be encouraged to use these mini-case studies as story vignettes in their communications.

Another way to look at these less-than-stellar lessons is to say that they are "local" lessons. They are valid and valuable as the project's lessons, but they don't need to be shared or go any further. That can also become a problem. What if the same little local lessons are being learned again and again across all the projects? Unless someone is paying attention to these little lessons across projects, they will remain localized, soon forgotten, and invisible, and they will never come to the attention of management... until something goes terribly wrong, a thorough investigation is launched, and the findings show that the underlying issue was known to everyone and prevalent across the organization, but it never got anyone's attention because no red flags were attached to it. Make the recurring little lessons more visible. Do something about them.
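Paying attention to the little lessons across projects is, at heart, an aggregation exercise. As a minimal sketch (the project names, themes, and the threshold of two projects below are invented for illustration), if each local lesson were tagged with a theme during review, a few lines of Python could surface the themes that recur across projects and deserve attention:

```python
# Hypothetical local lessons, each tagged with a theme during review.
lessons = [
    {"project": "A", "theme": "requirements creep"},
    {"project": "B", "theme": "requirements creep"},
    {"project": "B", "theme": "late vendor deliverables"},
    {"project": "C", "theme": "requirements creep"},
    {"project": "D", "theme": "handover documentation gaps"},
]

# Group lessons by theme, tracking which distinct projects reported each one.
theme_projects = {}
for lesson in lessons:
    theme_projects.setdefault(lesson["theme"], set()).add(lesson["project"])

# A theme reported independently by two or more projects is no longer
# "local" -- it's a recurring pattern worth raising with management.
recurring = sorted(t for t, p in theme_projects.items() if len(p) >= 2)
print(recurring)
```

The point isn't the tooling; it's that someone has to be looking across projects, with even a rudimentary tagging scheme, before the recurring pattern becomes visible.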

Part of the role of the KM manager is to keep an eye on the recurring "motherhood and apple pie" lessons.  The idea is that these lessons are obvious, things that everyone should know and can't possibly disagree with.  Anything that sounds like motherhood and apple pie is dismissed because it is just common sense or it's a good practice.  It's the opposite of the powerful insight, the aha! moment.  It's not sexy at all.  It may even be embarrassing.  How could they not have known this before?  That's common sense.  It's just good project management practice.  That lesson is in the database already, in a dozen variations.  I would say, keep an eye on what everyone says is obvious yet keeps coming up as a challenge or a lesson. This is a little counter-intuitive. It's not what keeps management up at night. It could be pervasive, yet under the radar. No red flags.

In addition, just because something is obvious to a senior project manager and he/she doesn't see the value of putting it in the lessons learned database doesn't mean it's obvious to a junior member of the team who hasn't yet learned it firsthand.

It is dangerous to think that it takes a major disaster like Challenger or Columbia to create case studies worthy of widespread dissemination and incorporation into courses and workshops. People need to learn from the "little lessons" before they become big problems. The underlying issues that led to or contributed to the Challenger and Columbia accidents manifested themselves in small ways every day and probably almost everywhere in the organization. You can wait until they become a BIG lesson or you can catch them when they're much less visible. It's not the stuff of academic papers and conference presentations (though I might try that route too), it's not catchy, but it matters.

As a side note, I've often asked myself if our KM activities could actually prevent another big failure.  How could we ever know what failures we've prevented?  How could KM be held responsible for failing to prevent a failure big or small?  Obviously, KM isn't the only mechanism organizations leverage to prevent failures, especially in high-reliability organizations like NASA.  Safety and Mission Assurance as well as Risk Management also play key roles... which is also why they should be working in close coordination with KM.

Learning to Adapt to Change and Unlearning
Many of the lessons I would consider most interesting had to do with change and adaptation to change. They may have been unique lessons, and definitely new lessons, but they fell into this bigger bucket of learning to adapt to change. In my last presentation within NASA/Goddard, a key point I made about what I had learned during my time there was that we are (this isn't specific to NASA) a little slow to learn the lessons related to the need for change. We have difficulty unlearning; we have trouble accepting that what worked well in the past may not work well now. Unlearning is key to adapting to change and to continuous learning. The theme of "unlearning" at NASA is addressed in Shared Voyage: Learning and Unlearning from Remarkable Projects. This obviously creates another challenge for static lessons learned databases. Ideally, lessons would be constantly flowing throughout the organization, aggregated and regularly analyzed for trends, reviewed and re-assessed through constant conversations.

Reference
  • Laufer, A., Post, T. & Hoffman, E. (2005). Shared Voyage: Learning and Unlearning from Remarkable Projects. NASA History Series. NASA SP-2005-4111.

Wednesday, March 21, 2018

We Still Need Lessons Learned (Part 2 of 3)

Part 2: The value of a lessons learned activity is in the transformation of tacit knowledge into explicit knowledge.

The value of such lessons learned activities is increased when they are the result of a group reflection process (as opposed to an accumulation of lessons gathered from individuals). The numbers in the visual (in the previous post about the benefits of Pausing to Learn) are made up, meant to convey a sense of the comparative magnitude of the benefits of documenting lessons. A great deal of learning happens in the process of writing lessons down, even if no one else ever reads them. The knowledge is now out of people's heads and transformed into explicit knowledge.

I have described this in an APQC presentation (Rogers & Fillip, 2017). Learning happens in the initial conversation that helps to articulate the context and the lessons themselves. Learning happens through knowledge synthesis as the conversation is documented (whether as a narrative, a conversation map or a video interview). Learning happens in the review and validation process, and finally, learning happens when others read or are exposed to the lesson. In other words, learning happens in conversations, but these are not random conversations. These are intentional learning conversations.

Even if no one ever reads those lessons, at least no one will be able to say that the employees retired or left and walked out the door with all their (tacit) knowledge. And yes, I know that most of it is still in their head and can't be captured, yet a good amount of learning or knowledge transfer can happen through conversations.

The real problem is that we often don't have very good mechanisms for utilizing that explicit knowledge, and we draw the wrong conclusions, which is the equivalent of failing to learn from a failure... or learning the wrong lesson from a failure. An example would be to conclude that people aren't using the lessons learned system because they haven't been trained to use it. Seriously? Well... let's not assume anything, but that wouldn't be my best guess as to why no one is using the database.

There could be better utilization of lessons learned through other project processes such as reviews and risk management (Fillip, 2015). In fact, many repeated "lessons" are the result of institutional challenges that cannot be addressed at the project level and could be tackled through organizational risk management. Expecting all the benefits of lessons learned processes to come from individuals going to the database to look for lessons is just wrong-headed. It's standard build-it-and-they-will-come wrong-headedness.

Reference


Monday, March 19, 2018

We Still Need Lessons Learned, and We Need to Look at Them Differently (Part 1 of 3)

In this first part of the series, I will talk about some of the challenges of leading a lessons learned practice and how I tried to address them, in other words, how I learned to sell lessons learned. 

Part 1: Selling Lessons Learned as a KM Practice

KM is a sales job. When I was working at NASA/Goddard helping project teams document their lessons, I was at times confronted with resistance in the form of remarks such as "no one uses the lessons learned databases, why should we bother contributing lessons to the system?" This had been exacerbated by successive reports criticizing the Agency-level lessons learned database, the LLIS. Regardless of the merits of the critique of that system (which isn't the point of this post series), I still had to do my job.

Since I needed to convince project teams to meet with me and document lessons, I developed an approach to overcome this resistance. It was based on a simple diagram (below) that allowed me to address the issue upfront and redirect the conversation: "you're right, the lessons are not being used the way they were expected to be used in the databases, but that's not the right way to look at lessons."

I would use this diagram as a handout during planning meetings with the project leaders to convince them to hold a Pause and Learn (PaL) session (a group reflection activity), and I would use it again (hand-drawn on a flip chart) at the very beginning of the PaL session to explain why we were all there and what the benefits were. For a valuable conversation to happen, I needed both the buy-in of the project management team and the buy-in of the rest of the team. Sometimes the project management team does a good job of bringing the team to the table with the right attitude. Sometimes it's the job of the facilitator to quickly elicit that attitude at the very front end of the meeting. There is nothing worse than a senior scientist showing up for a PaL session determined to wreck it with sarcasm about how useless these meetings seem to be since (in their opinion) we keep repeating the same mistakes, or a team showing up because they've been told to be there when they really think they should be back in their offices working on real, immediate and tangible problems.
Source:  Rogers, E. & Fillip, B. Mapping Lessons Learned to Improve Contextual Mapping at NASA. APQC Webinar, August 18, 2017.
Once a team had experienced a successful Pause and Learn session, it was typically much easier to do it again at a later stage in the project life cycle. If you've made a sale and you've delivered on the expectations, you can expect repeat business. KM may be everyone's job, but from the knowledge management professional's perspective, there is a significant amount of "selling" involved.

Related Resources

Sunday, March 11, 2018

Women in Knowledge Management (or any male-dominated field)


Stan Garfield posted a list of  KM Thought Leaders.  I don't think the list is new but it circulated recently on LinkedIn, which is where I saw it. The great majority of the thought leaders on that list are men.  I am re-posting here the names of women who were on that list.  The idea is 1) to give the women more visibility with a separate list; 2) to point out that perhaps there are many more women thought leaders in KM who need to be added to this list.

Ironically, posting the list on this blog will do little to increase visibility, but that's only a first step.

For those interested, there is a Women in KM LinkedIn Group.  It's "unlisted".  I assume there is a reason for keeping it unlisted but I don't know what it is. I shall inquire. I've been a member of the group and never contributed anything (my bad).

There was also a recent article in RealKM by Bruce Boyes about the issue of gender equity in KM (thank you to Boris Jaeger for pointing me in that direction). The article seemed to point to the fact that the field of KM isn't immune to broader societal inequalities. There is nothing surprising about that, and I can't disagree, having experienced some gender inequality frustrations of my own. As a woman, I have a good sense of how it has affected my career. I can't say I have good answers or solutions other than to become more vocal about it AND take responsibility for some of it as well.

While I happen to be more sensitive to gender bias, there are probably other biases embedded in that list.  How many are not US-based for example?


  • V. Mary Abraham
  • Patti Anklam
  • Stephanie Barnes
  • Irma Becerra-Fernandez
  • Madelyn Blair
  • Helen Blunden
  • Gloria Burke
  • Daniele Chauvel
  • Kimiz Dalkir
  • Vanessa DiMauro
  • Nancy Dixon
  • Lilia Efimova
  • Sue Feldman
  • Carol Kinsey-Goman
  • Susan Hanley
  • Rachel Happe
  • Heather Hedden
  • June Holley
  • Dorothy Leonard
  • Charlene Li
  • Alice MacGillivray
  • Carla O'Dell
  • Jean O'Grady
  • Nirmala Palaniappan
  • Kate Pugh
  • Celine Schillinger
  • Catherine Shinners
  • Carla Verwijs
  • Nancy White
  • Nilmini Wickramasinghe

Gone but not forgotten
  • Debra Amidon
  • Melissie Rumizen

Moved on to other focus areas or retired
  • Verna Allee
  • Kaye Vivian


3/16/2018 - Correction: Adding women I had omitted from the original list.

  • Alex Bennet
  • Jo Ann Girard
  • Joitske Hulsebosch
  • Bronwyn Stuckey
  • Beverly Wenger-Trayner


Do I want to be on that list?  
Am I upset because I'm not on the list? Yes and no. A year ago I could not have cared less, but now, as an independent consultant with my own business to worry about, yes, I do need to worry about not being on that list, and I may need to figure out what it takes to get the recognition and visibility. Perhaps it's not that specific list I need to worry about but rather what it represents (free advertising). Do I think I should be on that list right now? I am not able to answer with a resounding "YES", whereas most men would not hesitate. THAT is the problem!

What am I going to do about it?
  • Learn more about women thought leaders in KM: I don't know more than 1/3 of the women on that list.  I probably should.  I therefore commit to learning more about their thought leadership.
  • Identify other women leaders in KM: Reach out to the women on the list and women in the "Women in KM" LinkedIn group to expand the list.
  • Become more active in the "Women in KM" LinkedIn group
  • Increase my networking with women in KM in my local area

References
Boyes, B. (2018). Improving gender equality in knowledge management: Is there a gender imbalance in KM, and if there is, what should we do about it? RealKM.  URL: http://realkm.com/2018/03/08/improving-gender-equality-in-knowledge-management/ 

Garfield, S. KM Thought Leaders. URL: https://sites.google.com/site/stangarfield/kmthoughtleaders





Monday, March 05, 2018

Collaborating and Learning Across Organizations in the Context of Joint Projects



Learning in the context of collaborative efforts and/or complex organizational structures can be a challenge. Most of the literature, including what has been written by KM practitioners, focuses on the organization as the main unit of analysis. On the other hand, the Learning and Development (L&D) practitioners are primarily focused on individuals and sometimes teams. There is also a significant amount of work written about learning across teams, which addresses transferring knowledge from one team or one project to another. In fact, the theme of knowledge transfer comes up regularly. The focus below will be different.

Increasingly, individuals within organizations find themselves working in teams, projects or work groups made up of staff who belong not just to different divisions within the organization but to different organizations altogether. As collaborations, partnerships and contracting arrangements evolve to meet the needs of a constantly changing environment, individuals must learn to work effectively within these teams, which includes learning to learn collaboratively. This type of joint learning or co-learning is relevant in the two fields I am most familiar with, international development and aerospace. In both fields, the government entities (USAID & NASA) do not fulfill their mandates on their own. They work with and through a myriad of partnerships, contractual arrangements, and formal and informal collaboration agreements, each of which comes with a specific set of constraints and opportunities. For each of these ways of working together, co-learning opportunities need to be identified and their potential value assessed upfront. Co-learning isn't for everyone and every situation.

Co-learning across multiple organizations can be challenging. It’s one thing to conduct a lessons learned conversation within a team whose members all belong to the same organization. It’s another to conduct the same lessons learned conversation when members of the team belong to different organizations. The likelihood that walls (defensive mechanisms) will come up is increased.

For example:

  • The organizational cultures may be different. While one organization may be accustomed to open and honest conversations about what isn’t working, others may shy away from such transparency. 
  • Organizational processes may be different. One organization may have a very rigorous and structured process for documenting lessons while another doesn’t have a process at all. This may lead to many unspoken and unwritten assumptions. 
As with all aspects of collaboration (not just the joint learning aspect discussed here), one of the drivers of success is a thorough discussion of all assumptions about not just what will be done, but how it will be done. For example, if two organizations are going to implement a project together with specific roles and responsibilities clearly assigned and a component of that project is to discuss and document lessons to make regular adjustments in the project’s path forward, it will be important to address all relevant assumptions upfront and clarify as much as possible, including the following:
  • Clarify the overall goals and objectives of the joint learning activities. There should be a clear understanding of why the learning activities are being undertaken jointly, as opposed to each organization involved in the collaboration conducting its own independent lessons learned exercise. That kind of siloed learning can happen even within the same organization and is typically a sign of dysfunction. There are many circumstances where an internal lessons learned session is also needed, which might focus on questions such as "What did we (as organization X) learn from our collaboration with organization Y?" Joint lessons learned sessions should never be used to try to place blame on any member of the collaboration or to tell other members of the collaboration what they should have done differently. If a serious problem occurred (a partner isn't performing as expected, for example), there are avenues other than a lessons learned session to address it. The joint learning session helps identify what every party involved could have done differently from the start, not who is to blame for failures.
  • Clarify who will lead the joint learning effort. Perhaps not just which of the two organizations, but which staff position. This may be important in providing clues regarding how the activity is perceived by both organizations and whether there is a disconnect that needs to be addressed. Does the position/person have the right level of seniority, appropriate level of expertise, etc.? If it is important for all parties to be represented fairly and equally in the discussion, it may be a good idea to get the support of a third party facilitator to ensure neutrality in the conduct of the session.
  • Clarify the schedule of learning activities. Avoid using ambiguous wording. Phrasing such as “Lessons learned will be documented on a regular basis” can be interpreted in many different ways and could lead to disconnects between the collaborating partners.
  • Stipulate the methods to be used: A wide-open lessons learned conversation with 20-30 participants is very different from a series of one-on-one interviews, yet both could conceivably be used to elicit and document project lessons. In fact, a broad discussion of methods and their use in the specific context of the project would be valuable. Don’t assume that the way you’ve been doing it on other projects is necessarily the right approach in this context.
  • Be clear about outputs and utilization: Discuss how the lessons identified will be utilized to make adjustments to the path forward in terms of work plans and specific activities. For example, my preferred format for documenting lessons learned discussions is an insight map. However, I would never assume that it's always the best format or that it works for everyone. Beyond concerns about formats, which can be trivial, it's important to talk upfront about whether or not the session is meant to identify specific recommendations and actions. In some cases, the parties to the discussion have a wide-open conversation, go back to their respective offices, independently decide on specific actions they will take to address issues raised during the lessons learned meeting, and perhaps report on actions taken at a later date. In other cases, a second joint meeting may be held to focus on specific recommendations and actions. In these different variations of the process, it's also important to discuss decision making and validation. Who will have the final say on the final version of the lessons and/or recommendations? What kind of follow-up or accountability process is in place to ensure that lessons are indeed embedded into future activities (within the ongoing project, not just in a potential future collaboration)?
  • Be ready for disconnects: Since it is not possible to anticipate every inaccurate assumption or disconnect that might derail the collaboration, there should be a clear mechanism for quickly addressing disconnects when they arise. 
And a final point: don't assume that prior collaboration experience between two organizations means you can skip all these steps about clarifying assumptions. For all you know, one of the two organizations may have held an internal lessons learned process at the end of the last collaboration and decided that the approach didn't work well for them and that they would recommend a completely different one. Or you might just be dealing with a very different team and different leadership, making it a completely different collaborative environment. It never hurts to check that you are all on the same page.