
Tuesday, February 05, 2019

Bridging the Gap Between Individual Learning and Team Learning (in practice)

At the core of what I do is the concept of LEARNING. I work both with individuals in the context of formal and more informal classes, and I work with teams.

In the classes I teach, it doesn't matter how much I've taught the students.  What really matters is how much the students have learned.  It's the difference between measuring activities (my teaching) and keeping an eye on results (what they've learned) or even impacts (what are they doing with what they've learned).

There is nothing more rewarding than a student who tells me at some point in the course that they've done something different as a result of what they've learned in the course.  It makes all the grading headaches vanish. Perhaps they've had a conversation with a colleague or they suggested a new way to do things at work. Sometimes it's a simple impact in their personal / family life. They've realized that Knowledge Management principles can be applied to many aspects of their life.

Too often, our modes of teaching are geared towards making students "study" rather than "learn."  After years of studying in traditional educational environments, they are not prepared to learn in the workplace.  Professional development in a workplace environment is still primarily a matter of attending a conference or signing up for a class to get one more certification or some kind of electronic badge in recognition for participation in some form of training.

As a result, many professionals fail to fully leverage the power of learning to enhance their careers and their lives in general.  They fail to do so as individuals and they find it difficult to do so in a group or team context.

Traditional studying and classroom-based learning do not emphasize learning from our mistakes -- or even from our successes.  When the goal is to pass the test, students learn to pass the test, not to master the content that was on the test.  The equivalent in the workplace, in the absence of regular testing of acquired competencies, is the traditional annual performance review.  The goal, in this context, becomes making sure to impress one's supervisor with how well we've done in the past year to ensure that we do not get penalized with a lower raise than our peers (or no raise at all).  Where is the incentive there to reflect on what we did not do so well, what we struggled with?  Where is the incentive to learn and grow?


How can we get past these learning challenges at the individual level and ensure some enhanced learning within a team?

Here are some simple, practical ideas:
  • Stop treating individual learning and team learning as so separate and distinct that they are handled by different departments (HR, KM/Business units). 
    See "From Individual to Team Learning."
  • Help individuals reflect on their individual lessons and bring those individual lessons to the group in a safe environment.  Writing a lesson learned sounds like a simple exercise on the surface but writing a useful lesson requires more work.
  • Help the group think of group learning not just in terms of the aggregation of the lessons of individuals within the group, but learning that applies to the entire group.  There is always an "I" and a "WE" in groups. Neither should be ignored.
  • Discuss the issue of lesson ownership at the individual level and at the group level.
    See "The Ownership of Lessons."

________________________________________________________________________________

NEW COURSE:  KNOWLEDGE MANAGEMENT IN PROJECT ENVIRONMENTS
Ask about my newest course offered through George Mason University's Executive and Professional Education Program (and through corporate training programs): Knowledge Management in Project Environments.  The course touches on a lot of issues related to team learning.



Friday, March 23, 2018

We Still Need Lessons Learned - (Part 3 of 3)

In the previous two posts, I've argued that 1) KM practitioners need to be prepared to sell lessons learned practices by looking at them differently and debunking myths about the futility of lessons learned databases, and 2) the value of lessons learned is in the transformation of tacit knowledge into explicit knowledge.  In this third and last part of the series, I will talk about the value of aggregation.

Part 3: The value of lessons learned activities is not in the individual lessons themselves sitting in a database but in the aggregation of lessons. 


As I reviewed existing lessons and then spent almost 10 years helping to identify "new" lessons, two interesting insights emerged:  1) Many so-called lessons were deemed unworthy of being pushed higher up for further dissemination; 2) Many interesting new lessons had to do with the need to change and adapt to new conditions.

Dealing with the little lessons
Many lessons did not necessarily rise to the level of "real" lessons that deserved to be pushed to a database. Many were good practices that projects simply needed to be reminded of.  It's easy to get caught in a discussion about what counts as a real or worthy lesson (almost as easy as getting caught in a discussion of the definition of knowledge management).  The mistake is to dismiss small lessons as not worthy.  The "little lessons" or "motherhood and apple pie" lessons may not be sexy, and they may be a reminder that we often seem not to learn, but they are important in other ways.  Perhaps they're not lessons at all, but they tell you something about underlying problems that need to be addressed at the institutional level rather than through a lessons learned database. They take on a different meaning when you look at them in aggregate. You can review all of them, identify recurring themes, then develop mini-case studies to incorporate into informal workshops as well as more formal training.  In addition, management could be encouraged to use these mini-case studies as story vignettes in their communications.
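To make "look at them in aggregate" concrete, here is a minimal sketch of theme counting across documented lessons. The projects, theme tags, and threshold are invented for illustration; in practice, the tagging itself is the hard, human part of the curator's job.

```python
from collections import Counter

# Hypothetical set of documented lessons, each tagged with themes.
# In practice the tags would come from a curator's review, not automation.
lessons = [
    {"project": "A", "themes": ["schedule reserves", "communication"]},
    {"project": "B", "themes": ["communication", "requirements creep"]},
    {"project": "C", "themes": ["communication", "schedule reserves"]},
    {"project": "D", "themes": ["requirements creep"]},
]

# Count how often each theme recurs across projects.
theme_counts = Counter(t for lesson in lessons for t in lesson["themes"])

# Themes that recur across multiple projects are candidates for
# mini-case studies or institutional attention, even if each
# individual lesson looked like "motherhood and apple pie."
recurring = [theme for theme, n in theme_counts.most_common() if n >= 2]
print(recurring)
# → ['communication', 'schedule reserves', 'requirements creep']
```

The point of the sketch is only that recurrence is invisible at the level of any single lesson; it emerges from looking across projects.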

Another way to look at these less-than-stellar lessons is to say that they are "local" lessons.  They are valid and valuable as the project's lessons, but they don't need to be shared or go any further.  That can also become a problem.  What if the same little local lessons are being learned again and again across all the projects?  Unless someone is paying attention to these little lessons across projects, they will remain localized, soon forgotten, and invisible, and they will never come to the attention of management... until something goes terribly wrong, a thorough investigation is launched, and the findings are that the underlying issue was known to everyone and prevalent across the organization, but it never got anyone's attention because no red flags were attached to it.  Make the recurring little lessons more visible. Do something about them.

Part of the role of the KM manager is to keep an eye on the recurring "motherhood and apple pie" lessons.  The idea is that these lessons are obvious, things that everyone should know and can't possibly disagree with.  Anything that sounds like motherhood and apple pie is dismissed because it is just common sense or it's a good practice.  It's the opposite of the powerful insight, the aha! moment.  It's not sexy at all.  It may even be embarrassing.  How could they not have known this before?  That's common sense.  It's just good project management practice.  That lesson is in the database already, in a dozen variations.  I would say, keep an eye on what everyone says is obvious yet keeps coming up as a challenge or a lesson. This is a little counter-intuitive. It's not what keeps management up at night. It could be pervasive, yet under the radar. No red flags.

In addition, just because something is obvious to a senior project manager and he/she doesn't see the value of putting it in the lessons learned database doesn't mean it's obvious to a junior member of the team who hasn't yet learned it first hand.

It is dangerous to think that it takes a major disaster like a Challenger or Columbia to create case studies worthy of widespread dissemination and incorporation into courses and workshops.  People need to learn from the "little lessons" before they become big problems.  The underlying issues that led to or contributed to the Challenger and Columbia accidents were things that manifested themselves in small ways every day and probably almost everywhere in the organization.  You can wait until they become a BIG lesson or you can catch them when they're much less visible.  It's not the stuff of academic papers and conference presentations (though I might try that route too), it's not catchy, but it matters.

As a side note, I've often asked myself if our KM activities could actually prevent another big failure.  How could we ever know what failures we've prevented?  How could KM be held responsible for failing to prevent a failure big or small?  Obviously, KM isn't the only mechanism organizations leverage to prevent failures, especially in high-reliability organizations like NASA.  Safety and Mission Assurance as well as Risk Management also play key roles... which is also why they should be working in close coordination with KM.

Learning to Adapt to Change and Unlearning
Many of what I would consider the more interesting lessons had to do with change and adaptation to change.  They may have been unique lessons and definitely new lessons, but they fell into this bigger bucket of learning to adapt to change.  In my last presentation within NASA/Goddard, a key point I made about what I had learned during my time there was that we are (this isn't specific to NASA) a little slow to learn the lessons related to the need for change.  We have difficulty unlearning; we have trouble accepting that what worked well in the past may not work well now.  Unlearning is key to adapting to change and to continuous learning.   The theme of "unlearning" at NASA is addressed in Shared Voyage: Learning and Unlearning from Remarkable Projects.  This obviously creates another challenge for static lessons learned databases.  Ideally, lessons would be constantly flowing throughout the organization, aggregated and regularly analyzed for trends, reviewed and re-assessed through constant conversations.

Reference
  • Laufer, A., Post, T. & Hoffman, E. (2005). Shared Voyage: Learning and Unlearning from Remarkable Projects. NASA History Series. NASA SP-2005-4111.

Wednesday, March 21, 2018

We Still Need Lessons Learned (Part 2 of 3)

Part 2: The value of a lessons learned activity is in the transformation of tacit knowledge into explicit knowledge.

The value of such lessons learned activities is increased when they result from a group reflection process (as opposed to an accumulation of lessons gathered from individuals). The numbers in the visual (in the previous post about the benefits of Pausing to Learn) are made up, meant to convey a sense of the comparative magnitude of the benefits of documenting lessons.  A great deal of learning happens in the process of writing lessons down, even if no one else ever reads them. The knowledge is now out of people's heads and transformed into explicit knowledge.

I have described this in an APQC presentation (Rogers & Fillip, 2017). Learning happens in the initial conversation that helps to articulate the context and the lessons themselves.  Learning happens through knowledge synthesis as the conversation is documented (whether as a narrative, a conversation map, or a video interview).  Learning happens in the review and validation process, and finally, learning happens when others read or are exposed to the lesson.  In other words, learning happens in conversations, but these are not random conversations.  These are intentional learning conversations.

Even if no one ever reads those lessons, at least no one will be able to say that the employees retired or left and walked out the door with all their (tacit) knowledge. And yes, I know that most of it is still in their head and can't be captured, yet a good amount of learning or knowledge transfer can happen through conversations.

The real problem is that we often don't have very good mechanisms for utilizing that explicit knowledge and we draw the wrong conclusions, which is the equivalent of failing to learn from a failure... or learning the wrong lesson from a failure. An example would be to draw the conclusion that people aren't using the lessons learned system because they haven't been trained to use it.  Seriously?  Well... let's not assume anything but that wouldn't be my best guess as to why no one is using the database.  

There could be better utilization of lessons learned through other project processes such as reviews and risk management (Fillip, 2015).  In fact, many repeated "lessons" are the result of institutional challenges that cannot be addressed at the project level but could be tackled through organizational risk management. Expecting all the benefits of lessons learned processes to come from individuals going to the database to look for lessons is just wrong-headed. It's standard build-it-and-they-will-come wrong-headedness.


Monday, March 19, 2018

We Still Need Lessons Learned, and We Need to Look at Them Differently (Part 1 of 3)

In this first part of the series, I will talk about some of the challenges of leading a lessons learned practice and how I tried to address them, in other words, how I learned to sell lessons learned. 

Part 1: Selling Lessons Learned as a KM Practice

KM is a sales job.  When I was working at NASA/Goddard helping project teams document their lessons, I was at times confronted with resistance in the form of remarks such as "no one uses the lessons learned databases, why should we bother contributing lessons to the system?"  This had been exacerbated by successive reports criticizing the Agency level lessons database called the LLIS.  Regardless of the merits of the critique of that system (which isn't the point of this post series), I still had to do my job.

Since I needed to convince project teams to meet with me and document lessons, I developed an approach that would overcome this resistance.  It was based on a simple diagram (below) that allowed me to address the issue upfront by redirecting the conversation by saying "you're right, the lessons are not being used the way they were expected to in the databases, but that's not the right way to look at lessons."

I would use this diagram as a handout during planning meetings with the project leaders to convince them to hold a Pause and Learn (PaL) session (a group reflection activity), and I would use it again (hand drawn on a flip chart) at the very beginning of the PaL session to explain why we were all there and what the benefits were.  For a valuable conversation to happen, I needed both the buy-in from the project management team and the buy-in from the rest of the team.  Sometimes the project management team does a good job of bringing the team to the table with the right attitude. Sometimes it's the job of the facilitator to quickly elicit that attitude at the very front-end of the meeting.  There is nothing worse than a senior scientist showing up for a PaL session determined to wreck it with sarcasm about how useless these meetings seem to be since we keep repeating the same mistakes (in their opinion), or a team showing up because they've been told to be there but they really think they should be back in their office working on real, immediate and tangible problems.
Source:  Rogers, E. & Fillip, B. Mapping Lessons Learned to Improve Contextual Mapping at NASA. APQC Webinar, August 18, 2017.
Once a team had experienced a successful Pause and Learn session, it was typically much easier to do it again at a later stage in the project life cycle. If you've made a sale and you've delivered on the expectations, you can expect repeat business. KM may be everyone's job, but from the knowledge management professional's perspective, there is a significant amount of "selling" involved.


Sunday, October 15, 2017

Experience Capitalization, Another Approach to Lessons Learned

The vocabulary of knowledge management and organizational learning is a never ending source of learning, especially when practicing across industries.  While looking at United Nations activities around Knowledge Management, I came across the term "experience capitalization."  Intuitively, I knew what it was referring to but I couldn't remember ever encountering the term before.  My first instinct was to try to figure out how that might be similar to or different from variations of lessons learned activities.

Here's what I found:

Experience capitalization includes the identification of lessons learned and good practices, but it goes beyond identification to include a significant effort to create materials for disseminating the lessons and good practices.  This reflects the international development context, within which disseminating good practices and lessons learned through appropriate communication channels is paramount and perhaps more complex and challenging than dissemination in a corporate environment. The use of the term appears to be more prevalent in agricultural development (FAO, IFAD, etc.), which makes sense because the UN consulting request for proposals where I first encountered the term was related to an agriculture program.

In parallel, as I was preparing to facilitate some lessons learned conversations in French, I came across the term "retour d'expérience," which literally means "return on experience."  If I say "return on experience" in English, it brings up a possible association with "return on investment."  Perhaps each experience can be perceived as an investment (in time), and the return on that investment can be measured in part by the lessons learned in the process, as long as the lessons are indeed properly identified, captured and shared.

Monday, September 04, 2017

Systems Thinking and Organizational Learning - Initial Thoughts (Post 1)

This month of learning is going to be an experiment in Working Out Loud (WOL) or more specifically Learning Out Loud (LOL).  Systems Thinking is the theme and I'll write posts based on what I learn and wherever my thinking is going.

Here's a simplistic way of grasping the concept of systems thinking: Nothing operates in a vacuum. Everything is part of a larger system.  When we analyze things (whether objects or problems) as if they operated in a vacuum, we are missing the bigger picture.

Here is how it relates to some of my work.  I help projects document their lessons.  A key challenge I have as a facilitator is to get project team members to focus on what THEY (within the team) learned and could have done differently or will do differently in the future as a result of their experience and consequent learning.  Inevitably, the team will refer to challenges that were brought upon the team that were outside their control.  The project can be thought of as a system, but it is part of an organization, which is a larger system, and it is connected to outside stakeholders who are part of an industry, which is an even larger system.

Insight:  While it is essential to push the team to focus on THEIR lessons, it is equally important to articulate lessons at other levels, to adopt a systems thinking approach.   When I talk about individual, team and organizational learning, and then intra-organizational (or perhaps industry) learning, I may be talking about systems within larger systems.  How do we ensure appropriate lessons are captured at all levels?  The lessons are distinct at each level, yet interconnected.

Here is how systems thinking relates to some of my earlier work in international development:  Individual international development projects have little chance of having any significant impact unless they pay attention to the broader context.  In the old days, we talked a lot about donor coordination and supporting country policies so that the country environment was more conducive to specific development efforts and donor activities didn't overlap or conflict.  I think (hope) that nowadays, approaches based on systems thinking are more prevalent.  Coordination of donor activities and alignment of policies may be a good start but certainly not enough.

Question:  What's the connection between systems thinking and issues related to scaling development interventions to have a larger impact?

Question:  What's the relationship or connection between systems thinking and design thinking?
For reasons unclear to me at this point, the concepts of systems thinking and design thinking are co-mingled and confused in my mind as if I was meant to connect the dots between them and yet I don't grasp either of them well enough on their own to make the connections.

Resources


  • Harold Jarche, Working and Learning Out Loud, blog post, November 10, 2014.
  • An example from USAID's use of systems thinking to support efforts in the health sector: Complexity and Lessons Learned from the Health Sector for Country System Strengthening (2012)
    Monday, August 14, 2017

    Making New Mistakes - Learning @ NASA - August 16th Webinar Open to All




    I'll be joining NASA colleagues Michael Bell of Kennedy Space Center and Jennifer Stevens of Marshall Space Flight Center to talk about how NASA addresses lessons learned.  My focus at the Goddard Space Flight Center has been working with projects to institutionalize group reflection activities such as the Pause and Learn as a way of facilitating group learning and documenting lessons for dissemination, focusing on knowledge flows and learning rather than lessons in a database.

    This webinar is open to the public and there should be time for Q&A.
    ________________________________________

    What?  You missed it.  You can catch up here.

    Friday, August 04, 2017

    The Ownership of Lessons

    Earlier this week I attended a panel discussion on "The Role of Learning in Policymaking" organized by the Society for International Development's Policy and Learning Workgroup. I took a lot of notes because it was all very interesting but I'll focus here on one issue that hit a nerve for me:  Lessons learned ownership.

    There are many reasons why some lessons are not "learned": we don't believe them, we don't care enough, we forget them, etc.   I'm only going to focus here on one reason: lack of ownership.  In other words, the hypothesis is that the ownership of a lesson contributes significantly to its utilization.

    This lack of ownership comes in (at least) two flavors, two variations on the "not invented here" theme:

    1. We don't learn very well from other people; we learn better from our own experience -- and even then it's far from perfect because of personal biases and other issues.  Even if we understand and agree with someone else's lesson, we may not think it applies to us.  We don't own it.

    2. We don't like being told what we should learn, especially if someone else's conclusion doesn't match ours. Why would I care about someone else's idea of what I should learn?  Did I ask for this "feedback"?  Is it being offered in a way that's useful to me?  Sometimes we just don't want to own it.  We actively resist it because we didn't come up with it.

    Example:  A donor agency makes policy recommendations to a developing country government based on strong donor-collected "evidence."  Let's face it, we can't get our own government to always act upon strong "evidence," so why do we expect other countries to act upon donor-generated lessons?  Ownership needs to be built in from the beginning, not mandated at the end.  We might all know that, but does it always happen?  I don't think so.

    From Ownership to Action
    To say that lessons are not learned until something is changed (in policy, procedures, behavior, etc.) is perhaps cliché and misleading, or at least not very useful.  Over the past 9 years of helping project teams identify lessons from their experience, I have found that statement to be disconnected from reality.  If not totally disconnected from reality, I have found the one-to-one linear relationship between lesson and action to get to "learning" to be a gross oversimplification.  Some of this oversimplification has to do with the lack of discussion of lesson ownership.

    Having facilitated more than 100 lessons learned discussion sessions, I can now quickly identify ownership red flags in lessons learned conversations.  A lot has to do with the pronouns being used. I try to provide ground rules upfront encouraging the use of "I" and "we" and making sure the group is clear about who "we" refers to.  Blaming individuals or entities who are not in attendance and hinting at lessons intended for "them" ("They should do ________.") are both big red flags. It doesn't mean the conversation needs to stop, but it needs to be redirected to address ownership issues and ultimately increase the chances that some action will be taken.

    At that point, the facilitator's redirect can go into two different directions and sometimes both are needed:
    • "Assume THEY didn't hear you right now and they're going to keep doing it their way (i.e., they are not going to learn).  What can you do next time to avoid this or at least mitigate the problem?"
    • "Is there an avenue for giving them this feedback so that they might do something about it (i.e., they might learn) and this problem isn't repeated?"
    In the real world, where documented lessons don't automatically turn into actions, that's how I try to deal with ownership issues.  I primarily work with project teams, but their work requires interactions with many stakeholders external to the team.  Sometimes what is most needed is to hold separate lessons learned sessions with different sets of stakeholders and then to discuss the lessons across the different sets.  It's not necessary to look for perfect consensus across the different groups, just to optimize understanding of the different perspectives.
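As a purely illustrative aside, the pronoun red flag described above can be expressed as a toy text heuristic. The pattern and example sentences are invented, and no real facilitation tool is implied; spotting ownership issues remains a listening skill, not a text-processing problem.

```python
import re

# Hypothetical heuristic: flag lesson statements that assign the action
# to an absent "they/them" rather than to "I" or "we" (the ownership
# red flag described above).
BLAME_PATTERN = re.compile(
    r"\b(they|them|their)\s+(should|need to|must|failed)\b",
    re.IGNORECASE,
)

def ownership_red_flag(lesson_text: str) -> bool:
    """Return True if the lesson points the action at an absent third party."""
    return bool(BLAME_PATTERN.search(lesson_text))

print(ownership_red_flag("They should give us requirements earlier."))    # True
print(ownership_red_flag("We will build schedule reserve into phase B."))  # False
```

A flagged statement would not end the conversation; it would simply prompt the facilitator's redirect questions listed above.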

    It feels as if I'm only skimming the surface here.  More percolation needed.

    Wednesday, July 26, 2017

    The Lessons Learned Handbook: Practical Approaches to Learning from Experience (Book 26 of 30)

    Title: The Lessons Learned Handbook: Practical approaches to learning from experience
    Author: Nick Milton


    This is a very readable book in the Nick Milton/Patrick Lambe tradition (see KM Approaches, Methods and Tools, and The Knowledge Manager's Handbook) providing a menu of approaches, in this case focusing on knowledge capture methods and more specifically, lessons learned.

    Regardless of books and guidance in other forms, there is nothing like working with real projects and real teams to understand the complexity of lessons learned activities (and why they are often so maligned).

    Two key points about lessons learned:

    1. Lessons stored in a database have very little use (that's almost a cliché). No one uses them. There is some benefit to whoever documented the lesson (whether an individual or a team), but once it is in storage, it is almost certainly lost.  Therefore, why bother?  An exception would be a lesson that was so critical that it resulted in a process or policy change, at which point it can be removed from the lessons learned database.  The danger, even in that case, is that people will forget why the process or policy is the way it is and eventually revert to previous practice, thereby unlearning or forgetting.   Unlearning is not always a bad thing.  In fact, it can be necessary, but that would be the subject of another post.

    It's not that the databases of lessons are completely useless.  They are just not useful in the ways most people expect them to be. There are instances where lessons stored in a database can be useful. The database curator can and should do some regular data mining and analysis to identify possible trends, recurring lesson themes, etc., and advise management on possible actions.  At NASA/Goddard, I've used the database of lessons to help identify themes to be addressed in knowledge sharing workshops (aka Critical Knowledge Conversations). The database was never the only source of information I relied on for that purpose, but it contributed to decisions about what topics to address.  There are other ways lessons could and should be better integrated into the project life cycle, but that should be yet another post.

    2. Documenting lessons learned well is more difficult than most people imagine.  I can't stress that enough.  Individual lessons learned can be heavily biased.  Group lessons are less likely to be biased by any single individual perspective, but they will tend to have a group/team bias.  A project team's lessons are lessons from the team's perspective, not the organization's perspective.  The challenge is for an experienced facilitator to guide teams through the process of identifying and documenting valuable lessons without requiring the teams to take any kind of special lessons learned training.  This can be done over time, with lots of iterations of discussions around lessons.  Discussing what constitutes a valuable lesson in the abstract is not as useful as struggling with a real lesson and documenting it with some guidance.

    We need a broader vocabulary to discuss lessons.  In most cases, when I facilitate group discussions to discuss and document lessons, we end up with a lot of valuable observations and insights, lots of opinions, some whining or venting, and sometimes a lesson or two.
    ________

    I have more books left on my shelf than there are days to complete this 30-day challenge.  When appropriate, I will group them if they address a very similar topic within Knowledge Management.   Another book on my shelves addressing lessons learned is Post-Project Reviews to Gain Effective Lessons Learned, by Terry Williams.   This book was published by the Project Management Institute (PMI) and it has a strong project management angle.  PMI has done more recently to emphasize the knowledge dimension of project management, but PM and KM haven't yet really been fully integrated.

    A little further on the relatedness scale is Katrina Pugh's Sharing Hidden Know-How: How Managers Solve Thorny Problems with the Knowledge Jam.  The Knowledge Jam is a detailed, well thought-out methodology for engaging groups in purposeful facilitated conversations that have impacts in terms of integration or adaptation for use.  In other words, it's not a question of whether the lessons and insights will ever be used, but rather how to ensure they are used.  That part of the process isn't left to chance or to other knowledge management activities (like a separate workshop).

    TO DO:
    • Revisit Sharing Hidden Know-How.

    Friday, November 11, 2016

    Skills and Tools - Don't Put the Cart Before the Horse

    I just came upon a short article in the Harvard Business Review, "Until You Have Productivity Skills, Productivity Tools are Useless," which triggered the following insights.

    You could easily replace every instance of the word "productivity" in that article with "knowledge management."  The point is that skills must come before tools.

    Most efforts to build skills after the so-called KM platform has been deployed are really "training" to use that particular platform, failing to impart real knowledge management skills.

    We deploy a lessons learned platform and plan to train people on how to put lessons into the system, focusing entirely on the mechanics of uploading a document or filling in a web-based lessons learned form.

    A more valuable approach in the long term, which would also build buy-in from everyone involved, would start by asking how people have handled lessons learned without a sophisticated KM platform, then identify the strengths and weaknesses of existing approaches, along with the lessons and insights the employees themselves have about the lessons learned process.

    Before you can expect employees to actively participate in an effort to capture valuable lessons learned and post them in a KM system, a lot of groundwork needs to be done.  To be most effective, that groundwork, laying out the foundations for a learning organization, has nothing to do with technology and everything to do with how people think and behave around their own knowledge, that of their team, and that of the entire organization.  Then, more specifically, employees at all levels need to engage in productive high-level conversations around lessons (What constitutes a lesson?  When is a lesson really learned?  How do we share lessons?) as well as more practical conversations, perhaps in the form of hands-on workshops, around documenting specific lessons and working through the complexities involved in that process.

    Note that I am not advocating KM "training."  Employees do not need to be trained to become KM experts.  They need a little help thinking and doing their job with a more developed sense of what it means to be part of a learning organization.  None of this needs to be in the form of formal training.  Indeed, the more informal, conversational, hands-on, and embedded in the work flow, the better.

    Thursday, August 04, 2016

    Mapping Lessons Learned to Improve Contextual Learning at NASA - APQC's August 2016 Webinar


    A special invitation to join Dr. Rogers and me for a presentation on Mapping Lessons Learned at NASA.




    "If you missed APQC's 2016 KM Conference this past April, we've got a treat for you! Join us on Thursday, August 18 at 10:30 a.m. CDT for the August KM webinar, Mapping Lessons Learned to Improve Contextual Learning at NASA.
    NASA Goddard Space Flight Center’s Chief Knowledge Officer, Dr. Edward Rogers, and Barbara Fillip from Inuteq, will repeat their highly-rated session from the conference on how Goddard has designed a KM program to fit the needs of the organization, focusing on one of the most essential aspects of the program: the process for documenting lessons learned from projects using concept maps.

    This presentation will have a very brief intro to concept mapping, followed by an explanation of how and why it is used at NASA. Dr. Fillip and Dr. Rogers have worked on this together for seven years and will jointly address benefits of the approach as well as remaining challenges.
    Can't make the webinar? Register anyway and you will receive a copy of the slides and recording, regardless of attendance. "



    FOLLOW UP:  We had more than 400 live attendees and the webinar was very well received.  It worked well that we talked for 30 minutes and had 30 minutes of Q&A.


    Sunday, July 24, 2016

    Lessons Learned... again

    "Lessons learned" is a perennial topic within knowledge management, one that is typically misunderstood and maligned.  I suspect many who pontificate about lessons learned (either saying they know how to do it very well or saying it's a waste of time) haven't spent a huge amount of time working (hands-on) with lessons learned.  As always, I could be wrong.  I just find it difficult to reconcile either extreme with my own experience.

    I'll try to synthesize my views below:

    1. Lessons learned activities have the potential to provide a great deal of value to individuals, teams, and organizations, yet it's easy to completely miss the boat.

    2. We often look for benefits of lessons learned activities in the wrong places.  Those who argue that lessons learned are a waste of time will point out that lessons tend to end up in databases that are not used, where they languish and soon become obsolete.  I don't argue with that.  Instead, I think it's a mistake to assume that the primary benefit of lessons learned activities is to be derived from other people consulting a database.

    3. The primary beneficiaries of lessons learned activities -- assuming the activity is done as a group -- are the people involved in the activity themselves.  This ensures that a) they take the time to reflect on an experience to articulate lessons; b) they do this as a team to avoid individual biases and narrow points of view.

    4. Aggregating the lessons into some form of database has value even if not a single person ever consults it, as long as someone is responsible for analyzing the repository of lessons to identify trends, critical knowledge, issues to be addressed at the institutional level, etc.
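As a rough sketch of what that analyst role looks like in practice, the simplest version is counting recurring themes across the repository.  The projects and tags below are entirely made up for illustration:

```python
from collections import Counter

# Hypothetical lessons repository: each lesson is tagged with themes.
lessons = [
    {"project": "A", "tags": ["schedule", "vendor-delay"]},
    {"project": "B", "tags": ["schedule", "requirements"]},
    {"project": "C", "tags": ["vendor-delay", "testing"]},
    {"project": "D", "tags": ["schedule"]},
]

# Count how often each theme appears across the whole repository.
theme_counts = Counter(tag for lesson in lessons for tag in lesson["tags"])

# Themes that recur across projects may point to institutional issues,
# even if nobody ever browses the individual lessons.
recurring = [tag for tag, n in theme_counts.most_common() if n > 1]
print(recurring)
```

The point isn't the code; it's that the value comes from someone actively looking across lessons, not from the lessons sitting in storage.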

    5. Lessons learned activities are a source of valuable insights for other types of knowledge management activities, such as knowledge sharing workshops, where the issues emerging in lessons learned activities can be discussed in the context of panels of practitioners who can share their experience in a much more engaging way than a database of lessons learned will ever be able to achieve.

    Recommendations
    • Reframe the way you talk about lessons learned activities and their benefits.
    • Don't ditch the database idea.  Make sure the person responsible for the database isn't just uploading lessons.  You need an analyst, not a database administrator.
    • Continuously work to improve the way lessons learned are captured and to educate employees about what constitutes a valuable lesson.
    • Broaden your view of what a database of lessons learned might look like.  Hint:  It could be a collection of 100+ concept maps hyperlinked into a rich web of knowledge.
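To make that last hint concrete, here is a hypothetical sketch of such a collection, modeled as a simple directed graph where each concept map links to related maps (the map names are invented for illustration).  Following the hyperlinks from any one map traverses the whole web of related lessons:

```python
# Hypothetical: a lessons "database" as a web of hyperlinked concept maps,
# modeled as a directed graph (map name -> names of linked maps).
concept_maps = {
    "launch-readiness": ["vendor-management", "testing"],
    "vendor-management": ["contracting"],
    "testing": ["launch-readiness"],
    "contracting": [],
}

def reachable(start, maps):
    """Follow hyperlinks to find every map reachable from a starting map."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(maps[node])
    return seen

print(sorted(reachable("launch-readiness", concept_maps)))
```

A reader who lands on any one map can navigate outward through context, which is something a flat, searchable table of lessons can't offer.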

    Saturday, October 24, 2015

    Insights, Tips, Lessons and More

    This is a map I developed based on my experience at the Shenandoah Fall Foliage Bike Festival out of Staunton, Virginia. Beautiful countryside. It was my first experience with this type of group bicycling event and therefore a great learning opportunity.  As you'll see from the map, I started thinking about the difference between lessons, insights, recommendations, observations, and tips.

    [Concept map image]

    A tip is a piece of practical advice.  Tips are great for the novices.

    An observation is something worth noticing, a relevant piece of information, usually related to the context for key insights and recommendations. Observation is the first step in reflection.  It's the "what did I notice?"

    An insight is a kind of "aha moment", often triggered by reflection and supported by observations.  It's the answer to "what did it mean?"

    A lesson is what we tell ourselves we'll do differently next time.  Lessons occur to us most often when we realize we made a mistake that was avoidable and we could have done things differently. Note that even though riding mid-day instead of early morning to avoid the colder temperatures is common sense, I will most likely not learn that lesson.  I'm a morning person and waiting for mid-day to ride makes no sense to me whatsoever.   This is typical of lessons I'm afraid.  Just because we note the lesson does not guarantee we'll learn it and do things differently next time.

    I didn't include any "best practice" on the map.  Perhaps a best practice would be something that avid bicyclists do out of habit and novices like myself need to be reminded of, something like "stay hydrated, drink before, during and after the ride."  How is that different from a tip?  Perhaps a best practice is something that applies to everyone whereas a tip is simply advice to address challenges you might encounter as a rider.

    Sunday, August 02, 2009

    Organizational Learning and Fading Memories

    Warning: Learning doesn't last.

    Lessons learned can be slowly forgotten over time. Memories fade. When something is "learned", is it permanently imprinted in our memories? No. We become complacent again. We forget. We may not forget everything but we forget the details, the how and the why. Lessons may be institutionalized through new rules and processes as a result of an accident -- to ensure it doesn't happen again -- but with the passage of time, it's just another rule, soon disassociated from the original incident or accident. As soon as people no longer understand the "why" associated with a rule or process, it can be dismissed as bureaucratic red tape and soon ignored or frequently bypassed.

    Remember Chernobyl? Remember Bhopal? Remember the Tenerife double aircraft disaster?
    What do you remember about them?

    Everyone remembers the Titanic, but what exactly do we remember about it? Do we need to be reminded of the details of why and how it happened on a regular basis?

    We pay most attention to the why and how just after an accident happens because everyone is focused on "how could it possibly happen?" and "who is responsible?" What we really need is a process for reminding people of the why and how when they think they least need it, when everything is going well and they start thinking it could never happen to them.

    I'm also wondering about other factors:
    1) Proximity: What's the relationship between an individual's "proximity" or level of involvement with an accident or related lessons on the one hand, and the declining memory curve? Does first hand "learning" last longer?

    2) Intensity: What's the relationship between the intensity of the failure (i.e. human lives lost vs. a failed project that didn't achieve its objectives), the extent to which the causes of failure are investigated, and the speed with which memories of the failure fade and lessons are unlearned?

    3) Dynamic nature of Lessons: Lessons need to be "updated" regularly based on most recent history and discoveries. Even if you've learned something based on first hand experience, you still need to "update" that knowledge.

    Rules and mandated processes need to remain linked to their original rationale. When someone is told that they need to follow rule x, y, z, they should be able to ask "why" and to get a straight answer other than 1) that's how we've always done it, or 2) that's the rule. If you understand the why and the rationale makes sense, you're much more likely to follow the rule.

    Sunday, March 08, 2009

    Anatomy of an "Aha!" Moment

    Step 1: Routine
    I'm doing something routine, checking my office email from home through the webmail application. I'm starting to draft a response to a message, I'm thinking, editing, thinking, and by the time I'm ready to send, the connection has timed out and I've lost my message. It's not even saved as a draft.


    Step 2: Recurring, yet not constant error
    This is a recurring "error". It's entirely my mistake. It doesn't happen every time I deal with my office email from home but regularly enough to be an annoyance. It has happened before and I don't seem to learn anything from it. All I would need to do is remember to either save the message regularly (automated saving would be better!) or draft it outside the browser and cut-and-paste it when I'm ready to hit "send". The solution appears simple. Why am I not able to implement it?
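The "automated saving" fix is simple enough to sketch.  Here's a minimal, hypothetical version in Python (standing in for whatever the webmail client would actually need to implement): a timer that persists the draft at a fixed interval, so a dropped connection costs at most one interval's worth of typing.

```python
import threading
import time

class AutoSaver:
    """Hypothetical sketch of an autosave loop: call a save function
    on a fixed interval so nobody has to remember to save manually."""

    def __init__(self, save_fn, interval=30.0):
        self.save_fn = save_fn      # whatever persists the draft
        self.interval = interval    # seconds between saves
        self._timer = None

    def start(self):
        self._timer = threading.Timer(self.interval, self._tick)
        self._timer.daemon = True
        self._timer.start()

    def _tick(self):
        self.save_fn()   # persist the current draft
        self.start()     # reschedule the next save

    def stop(self):
        if self._timer:
            self._timer.cancel()

# Demo with a short interval and an in-memory "saved drafts" list.
saved = []
saver = AutoSaver(lambda: saved.append("draft"), interval=0.05)
saver.start()
time.sleep(0.2)
saver.stop()
```

The design lesson is the same one the blog tool got right: the system, not the user, should carry the burden of remembering.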

    Step 3: What is going on?
    Do we really learn something from our mistakes only when the annoyance or consequences reach a certain threshold? If I've recognized that I need to do something about it and I know what the solution is, why am I not able to implement the solution?

    Part of the answer is that when I start answering an email, I don't recognize the fact that it might take me a while to be ready to send. I underestimate the time it will take to answer. In truth, when I start responding to an email, my mind is entirely focused on the answer, not the mechanics and limitations of my webmail application. [Thankfully, this blog tool automatically saves my drafts and has a big "SAVE NOW" button just below what I am typing, just in case.]

    The webmail application is particularly annoying because it doesn't tell you that you've lost the connection. It lets you think you're still connected. It lets you write a nice long message and you don't realize you've lost it until you try to send it. Can you see smoke coming out of my ears?

    Step 4: What is really going on?
    I'm staring at the login screen again and by then I know I've lost my message. Interestingly, the login screen that appears after you've been logged out as a result of a connection timeout isn't the same as the login screen I routinely start from, the one that I have bookmarked for easy access. So, what's the harm in reading the fine print? I might actually figure something out, right?

    Step 5: Fine Print
    Some background first: There are four options on the login screen (private vs. public computer; and full vs. light version of the webmail application). I'm working on my private computer at home and I have a good connection so I don't need to select the "light" version. The login screen explains that the light version is "sometimes" faster if you have a slow connection.

    Here's the fine print: You need Internet Explorer 6 or higher to use the full version of the webmail application. Aha! I use Firefox. What does that mean? It means that even though I'm selecting the "full" version, I'm automatically transferred to the "light" version... and since I've never experienced the "full" version, I never noticed. The "light" version seems to time out after 15 minutes (even when I'm on a private computer setting which, according to the instructions, should give me 24 hours without being automatically logged out).

    Step 6: What else is wrong?
    It took me too long to get to this Aha! moment. How could the error have been avoided in the first place or what might have made it easier for me to catch it sooner?
    • In hindsight, I could have mentioned the problem to a colleague to get some insights into what might be wrong. However, since I was assuming the problem was my inability to remember to save, there was no obvious need to mention it to a colleague.
    • I could have contacted IT support, but again, I was assuming the problem was with me, not with the system or how I was using it.
    • ALL login screens should have the same fine print. It's possible that I would have noticed the "IE" fine print sooner if it had been on my routine login screen and not just on the login screen I was rerouted to after I timed out.
    I might send this to the IT support folks... they might find it useful as a case study in user stupidity. I'm now wondering if this is indeed the full explanation or if I'm still missing half of it.