Monday, August 28, 2017

Learning Plan for September 2017

September is just around the corner.  From a biking perspective, I can anticipate a few long bike rides in the cooler mornings.  From a learning perspective, it will be all about systems thinking, complex systems and visualization, combined with my ongoing interest in building bridges between individual learning, team learning and organizational learning.  This interest is based on the observation that individual learning is typically the purview of the Learning and Development (L&D) department within HR, while organizational learning may be in a completely different part of the organization, including under IT if it is perceived as part of an IT-based approach to knowledge management.  My gut tells me that part of the reason for the gap is that L&D tends to focus on formal learning approaches (aka training) while organizational learning is typically more experience-based.

Here's an initial half-baked insight/hypothesis:  The bridges to be built involve 1) reinforcing the informal, experience-based aspect of individual learning; and 2) strengthening corporate training based on experience-based organizational learning.

The question I will try to address is:  How can I apply systems thinking and related methodologies or tools to address complex systems and come up with a more integrated (systemic) approach to learning within organizations?

A couple of secondary questions (which might confuse everything and send me down big rabbit holes):

  • How can learning itself benefit from systems thinking?
  • Can insight mapping support a systems thinking approach?

Here are my starting points:
  • Visible Thinking: Unlocking Causal Mapping for Practical Business Results (a book I recently discussed in a blog post)
  • SPACES MERL: Systems and Complexity White Paper (USAID 2016) ... which is where I learned about...
  • Systemigrams (visual representation of complex systems) and another book.....
  • Systems Thinking: Coping with 21st Century Problems (2008)
  • My own insight mapping practice as well as....
  • Previously posted insights about systems thinking and....
  • A need to clarify the difference between design thinking and systems thinking (I think I confuse them)
  • I also signed up for Degreed and I'd like to test how much I can get out of that learning platform for a rather narrow learning exercise such as this one. 
Anticipated Outputs:
  • Extensive notes added to my Organizational Learning wiki (internal)
  • At least three blog posts and at least one integrative map (public website)
  • Draft presentation package for future use/adaptation, etc...
  • and if this all adds up to something of sufficient value, a post on LinkedIn.
How is this as a "learning plan" for September?
  • It's bounded in time and scope, though the scope could escape me as I dig deeper and a month might not be enough.
  • It has some intrinsic value for me in terms of learning.  Motivation to learn about this will NOT be a problem at all. I will need to schedule it as a core task to make sure sufficient time is allocated.
  • It spells out possible outputs, which will force me to wrap up my own thinking and write things down in useful formats, contributing to other objectives, such as populating the blog with fresh insights and developing materials for presentations, possible lecturing/teaching or other forms of training/capacity building.
I shall see at the end of September if I achieved all that and where my expectations were off track.

This is my YOL (Year of Learning) after all.  I might as well make the most of it and plan for it.  I think it's called Walking the Talk. :)

Thursday, August 24, 2017

Learn to Plan and Plan to Learn

Experience is inevitable. Learning is not.  Being intentional and planning to learn isn't such a bad idea.

I had an interesting conversation this week which triggered some additional reflection around learning plans and learning agendas and then I was asked a question about project learning plans during the NASA Virtual PM Challenge.

1. USAID is advocating the use of Learning Agendas at the Mission/Country level.  Those are linked to country-level assistance programming.

2. I've talked in the past about individual learning plans, which can be part of an individual professional development effort.

3. What about learning plans at the project or program level?  Would it be appropriate to have learning goals at that level?  Under what conditions?  If you're trying out something that involves an innovation, wouldn't you want to have a well thought-out learning agenda?

At the NASA/Goddard Space Flight Center where I've worked with projects for the past nine years, projects have to include a lessons learned plan in their project implementation plan.  It's typically a couple of pages long, though I've seen some 15-page documents that were more in line with an essay on project learning than a pragmatic plan of action. I like the effort and level of thinking put into the longer documents, but the key is to make those plans implementable with existing resources.  These plans have not, as far as I know, highlighted any specific learning agendas.  They spell out a number of key practices meant to facilitate team and organizational learning, but they are not tailored in terms of any thematic focus. Sometimes you can't really predict what you'll need to focus on.  In some cases, however, you know in advance that you're trying a new strategy or that there is something unique and interesting about a mission, and it might be useful to develop a tailored learning plan.  The science component of the mission is, by definition, a learning agenda.  Each mission has a specific scientific objective, a set of questions it is trying to answer about Earth, space, a planet or the universe.  I have always worked on the project management side of the mission, trying to help project teams learn how to better manage the development of the mission from a perspective of cost, schedule, scope, people, etc... Without good project management, the mission will not get off the ground and no science objective will be achieved.

4. What about learning plans at the organizational level?  How would an organizational learning plan sync with an organization's mission, strategic plans, etc...?

Tuesday, August 22, 2017

Going to College... Becoming a Learner

My youngest daughter is going off to college at the end of this week.  She has a reasonably good idea of what she wants to study, and she picked a school with a strong yet not overly narrow focus.  It's more specific than "liberal arts" yet not a narrow path towards a single profession either.

As my daughter prepares to go off to college, she received in the mail today a little book for one of her classes.  It's a reading requirement for a one-credit class that is associated with her housing arrangement.  She will be part of a living community on campus, spending a lot of time with fellow students studying related topics and engaging with faculty in and out of classrooms.

I must admit that my eyes lit up when I saw the title of this little book: Becoming a Learner: Realizing the Opportunity of Education, by Matthew L. Sanders.  [As a side note, I get excited just reading the reading list portion of syllabi].  This little blue book should be mandatory reading for anyone going to college. Unfortunately, I'm afraid it's full of bits of wisdom one only realizes are true 20 or 30 years later, when our professional and personal lives have taken us far away from where we probably thought we were going.
"The primary purpose of college isn't learning a specific set of professional skills; the primary purpose of college is to become a learner." (p. 2)
Yes, but this requires a more detailed explanation of what we mean by becoming a learner.  In high school, students succeed if they become proficient at studying.  If you take the highly rated MOOC called "Learning How to Learn," I would argue that you are primarily learning how to study, which still does not prepare you for lifelong learning.  College students who continue in that mode of studying may be successful in the short term, but if they do not evolve into learners, their success will be short-lived because they will not know how to continuously learn and grow throughout their professional and personal lives.
"Your ability to learn how to learn will be what takes you through the countless industry developments you will deal with in your work and in society.  By recognizing this, you can focus on your development as a learner, which will be more lasting and applicable in all your future endeavors." (p. 14). 
A student is taught by teachers.  Learners take responsibility for their own learning and decide what to learn and how to learn it. Faculty are there to guide the learning process in specific disciplines more than to teach.

This is really great reading as an introduction to college learning, and I hope it's fully embedded in the classroom practices.  If the faculty and entire curriculum design don't embrace this approach, it will be difficult for individual students (sorry, learners) to embrace it fully.  It will require constant reinforcement.

Two secondary insights:
The idea of putting on a broader set of lenses reminds me of a little mental reflex I've developed over the years.  When I feel pretty sure that I know exactly where I am going to be in my life in 5-10 years, I smile (internally) and I tell myself that's not where I'll be, but that's perfectly fine, because opportunities will emerge that I couldn't have imagined and if I'm able to keep an open mind and ditch the plan, I'll be able to capture those opportunities.  Have a plan, then ditch the plan!

The learning process is more important than the specific lesson.  That's very similar to what I said last week during the NASA Virtual PM Challenge on Lessons Learned.  I was asked what key lessons all project managers should know about.  Beyond general good project management practices, the key is to keep learning, not to know about any specific set of lessons hidden in a database.

Friday, August 18, 2017

Lifelong Learning... and Beyond

LinkedIn has become a regular source of leads for thought-provoking readings and conversations, especially for sources that I don't necessarily read on an ongoing basis.  I am not a regular reader of The Economist, but an article came to my attention through my LinkedIn feed:  "Lifelong Learning is Becoming an Economic Imperative." The Economist - January 2017 Special Report on Learning and Earning.

Below is a slightly more developed version of a comment I posted on LinkedIn.

While I applaud lifelong learning, I don't think the authors of the article go far enough.
Technological change demands stronger and more continuous connections between education and employment.
We need to move beyond "continuous connections between education and employment." We need much greater integration. Education and employment should not run in parallel.  Both notions are evolving, partly as a result of technological change.  Technological change is not just affecting the kinds of skills and jobs that are available.  Technological change is affecting how we gain new skills and how we think about work and employment.

We need to go beyond lifelong learning as currently described in the article.

First, the authors are still equating learning primarily with training and education programs.   The fact that these types of approaches are increasingly being integrated into the workplace with corporate universities and the like is probably a step forward, yet not enough.  More is needed in the form of support for workplace learning, that is, learning on the job, learning from experience.  I'm a big fan of the approach taken by Jane Hart and her work on workplace learning as well as Jay Cross's work on informal learning.

Second, the authors are still equating learning primarily with individual learning, which is great from an individual employability perspective but does not do enough to support organizational learning.  More is needed in the form of support for team and organizational learning so that efforts at the individual level are part of a broader approach.  I am currently carefully reading An Everyone Culture: Becoming a Deliberately Developmental Organization, for such an approach.

Question:  Is a deliberately developmental organization (DDO) better able to anticipate change rather than react to it?


One item on my "to do" list is to make better use of labels/tags on this blog. Here are three blog posts I previously tagged for "lifelong learning" (and no comment about their current value).

7/2017 - Leading the Learning Revolution
12/2015 - Lifelong Learning: Opportunities and Challenges for Learning Junkies
02/2009 - Autodidacts and Lifelong Learning

Question:  Should blogs go through some form of clean-up, or should they be left alone to reflect an evolution of thoughts not meant to form a coherent or consistently high-quality whole?

Monday, August 14, 2017

Making New Mistakes - Learning @ NASA - August 16th Webinar Open to All

I'll be joining NASA colleagues Michael Bell of Kennedy Space Center and Jennifer Stevens of Marshall Space Flight Center to talk about how NASA addresses lessons learned.  My focus at the Goddard Space Flight Center has been working with projects to institutionalize group reflection activities such as the Pause and Learn as a way of facilitating group learning and documenting lessons for dissemination, focusing on knowledge flows and learning rather than lessons in a database.

This webinar is open to the public and there should be time for Q&A.

What?  You missed it.  You can catch up here.

Saturday, August 12, 2017

Quantification Bias

"Give me one example of a time when a lesson learned was used effectively by a project."
You'd think one example wouldn't be too hard to find.  I'm not being asked "What's the percentage of lessons in the database that are actually applied?"

Then someone will also ask, "What's the ROI of lessons learned activities?  Does it save us any money?  How many failures have lessons learned ever prevented?"

This eternal conversation is one that I'll admit I've avoided at times, perhaps because it's just challenging.  It's challenging to provide an answer that will satisfy the person asking these types of questions.

I've addressed metrics in small bites throughout the years, most recently in a metrics anecdote post. Quantifying "learning from experience" is daunting.  Sometimes I almost want to say "I know it when I see or hear it."  In fact, it's more likely that I'll notice that a lesson has NOT been learned, when I'm having a déjà vu experience during a lessons learned session and I'm hearing something I've heard multiple times before. I could point Management to those lessons that keep coming back.  I've done that informally.  I have not kept quantitative data.  I can't tell you how many times it's happened in the past year.  I could, however, do a more thorough job of documenting specific instances AND perhaps even more importantly, figure out why it's happening again.

The answer to "why are we not learning this lesson" is never a simple one, and it's usually not a single-point failure with an easy fix.  Sometimes, as I've pointed out in the previous blog post, the root cause of the failure to learn is related to the ownership of lessons.  Making sure Management is aware of the repeated problems isn't the end of it.  In my experience, nothing I bring up to Management is completely new to their ears.  However, in the knowledge manager's role, I also facilitate dialogue between key stakeholders, including Management, through knowledge sharing workshops.  The topics selected for such workshops are typically based on recent themes emerging from lessons learned sessions.  And so we try to address the pain points as they emerge, but I'll confess that we don't quantify any of it.  Correction: we do the obvious and count how many people attend the workshops.

There is a general quantification bias in many aspects of work and decision-making.  Everyone wants to make decisions based on evidence.  In most cases, evidence is taken to mean hard data, which is understood to be quantitative data (as opposed to soft, qualitative fluff), as if hard data were always correct and therefore much more useful and reliable than anything else.  The words "evidence" and "data" have now been completely associated with quantitative measures.

When people say "where is your data?" they don't mean what are your two or three data points.  That's easy to dismiss, it's anecdotal.  The more data points you have (the bigger your dataset), the more accurate your conclusions must be.  Under certain conditions, perhaps, but certainly not if you're asking the wrong questions in the first place.

I recently came across Tricia Wang's TED Talk, "The Human Insights Missing from Big Data."

Given that Ms. Wang is a data ethnographer (very cool job!), her point of view isn't surprising, and given that I'm more of a qualitative methods person, the fact that I find it relevant and relate to it isn't surprising either.  That's just confirmation bias.   Ms. Wang brought up the quantification bias, which I have often been struggling against in my work.  It manifests itself in questions such as "how many hits do you get on the lessons learned database" or "how many new lessons were generated this past year?"  These (proxy measures of learning) are the simpler questions that have (meaningless) quantitative answers.  Is having a meaningless quantitative answer better or worse than saying that something can't be measured?  I should never say "that can't be measured."  It would be better to say "I don't know how to measure that.  Do you?"

I wouldn't suggest we should all turn to qualitative methods and neglect big data.  We should, however, do a better job of combining qualitative and quantitative approaches.  This isn't news.  It's just one of those lessons we learned in graduate school and then forgot.  We learn and forget just so that we can relearn.

My own bias and expertise stand squarely with qualitative approaches.  It could simply be that, with my first degree being in political science, I always have in the back of my mind that decision-making isn't simply a matter of having access to information/data to make the right decision.  It's part of what makes us human and not machines.

Friday, August 04, 2017

The Ownership of Lessons

Earlier this week I attended a panel discussion on "The Role of Learning in Policymaking" organized by the Society for International Development's Policy and Learning Workgroup. I took a lot of notes because it was all very interesting but I'll focus here on one issue that hit a nerve for me:  Lessons learned ownership.

There are many reasons why some lessons are not "learned":  We don't believe them, we don't care enough, we forget them, etc....   I'm only going to focus here on one reason: lack of ownership.  In other words, the hypothesis is that the ownership of a lesson contributes significantly to its utilization.

This lack of ownership comes in (at least) two flavors, two variations on the "not invented here" theme:

1. We don't learn very well from other people; we learn better from our own experience -- and even then it's far from perfect because of personal biases and other issues.  Even if we understand and agree with someone else's lesson, we may not think it applies to us.  We don't own it.

2. We don't like being told what we should learn, especially if someone else's conclusion doesn't match ours. Why would I care about someone else's idea of what I should learn?  Did I ask for this "feedback"?  Is it being offered in a way that's useful to me?  Sometimes we just don't want to own it.  We actively resist it because we didn't come up with it.

Example:  A donor agency makes policy recommendations to a developing country government based on strong donor-collected "evidence."  Let's face it, we can't get our own government to always act upon strong "evidence," so why do we expect other countries to act upon donor-generated lessons?  Ownership needs to be built in from the beginning, not mandated at the end.  We might all know that, but does it always happen?  I don't think so.

From Ownership to Action
To say that lessons are not learned until something is changed (in policy, procedures, behavior, etc...) is perhaps cliché and misleading, or at least not very useful.  Over the past 9 years of helping project teams identify lessons from their experience, I have found that statement to be disconnected from reality; at the very least, the one-to-one linear relationship between lesson and action to get to "learning" is a gross oversimplification.  Some of this oversimplification has to do with the lack of discussion of lesson ownership.

Having facilitated more than 100 lessons learned discussion sessions, I can now quickly identify ownership red flags in lessons learned conversations.  A lot has to do with the pronouns being used. I try to provide ground rules upfront encouraging the use of "I" and "we" and making sure the group is clear about who "we" refers to.  Blaming individuals or entities who are not in attendance and hinting at lessons intended for "them" ("They should do ________.") are both big red flags. It doesn't mean the conversation needs to stop, but it needs to be redirected to address ownership issues and ultimately increase the chances that some action will be taken.

At that point, the facilitator's redirect can go into two different directions and sometimes both are needed:
  • "Assume THEY didn't hear you right now and they're going to keep doing it their way (i.e., they are not going to learn).  What can you do next time to avoid this or at least mitigate the problem?"
  • "Is there an avenue for giving them this feedback so that they might do something about it (i.e., they might learn) and this problem isn't repeated?"
In the real world, where lessons that are documented don't automatically turn into actions, that's how I try to deal with ownership issues.  I primarily work with project teams, but their work requires interactions with many stakeholders external to the team.  Sometimes what is most needed is separate lessons learned sessions with different sets of stakeholders, followed by some discussion of lessons across the different sets.  It's not necessary to look for perfect consensus across the different groups, just to optimize understanding of the different perspectives.

It feels as if I'm only skimming the surface here.  More percolation needed.