Showing posts with label Knowledge Management. Show all posts

Friday, March 23, 2018

We Still Need Lessons Learned - (Part 3 of 3)

In the previous two posts, I've argued that 1) KM practitioners need to be prepared to sell lessons learned practices by looking at them differently and debunking myths about the futility of lessons learned databases, and 2) the value of lessons learned is in the transformation of tacit knowledge into explicit knowledge.  In this third and last part of the series, I will talk about the value of aggregation.

Part 3: The value of lessons learned activities is not in the individual lessons themselves sitting in a database but in the aggregation of lessons. 


As I reviewed existing lessons and then spent almost 10 years helping to identify "new" lessons, two interesting insights emerged:  1) Many so-called lessons were deemed unworthy of being pushed higher up for further dissemination; 2) Many interesting new lessons had to do with the need to change and adapt to new conditions.

Dealing with the little lessons
Many lessons did not necessarily rise to the level of "real" lessons that deserved to be pushed to a database. Many of these lessons were good practices that projects needed to be reminded of. It's easy to get caught in a discussion about what's a real or worthy lesson (almost as easy as getting caught in a discussion of the definition of knowledge management). The mistake is to dismiss small lessons as not worthy. The "little lessons" or "motherhood and apple pie" lessons may not be sexy, and they may be a reminder that we often seem not to learn, but they are important in other ways. Perhaps they're not lessons at all, but they tell you something about underlying problems that need to be addressed at the institutional level and not through a lessons learned database. They take on a different meaning when you look at them in aggregate. You can review all of them, identify recurring themes, then develop mini-case studies to incorporate into informal workshops as well as more formal training. In addition, management could be encouraged to use these mini-case studies as story vignettes in their communications.

Another way to look at these less than stellar lessons is to say that they are "local" lessons. They are valid and valuable as the project's lessons, but they don't need to be shared or go any further. That can also become a problem. What if the same little local lessons are being learned again and again across all the projects? Unless someone is paying attention to these little lessons across the projects, they will remain localized, soon forgotten and invisible, and will never come to the attention of management... until something goes terribly wrong, a thorough investigation is launched, and the findings are that the underlying issue was known to everyone and prevalent across the organization, but it didn't get anyone's attention because no red flags were attached to it. Make the recurring little lessons more visible. Do something about them.
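As a rough illustration of what "paying attention across the projects" could look like in practice, here is a minimal sketch that aggregates lesson records and flags recurring themes. The data, field names and threshold are entirely hypothetical stand-ins for whatever metadata a real lessons learned repository captures.

```python
from collections import Counter

# Hypothetical lesson records: (project, theme) pairs standing in
# for the metadata a real lessons learned repository might hold.
lessons = [
    ("Project A", "requirements creep"),
    ("Project B", "requirements creep"),
    ("Project C", "late vendor deliveries"),
    ("Project D", "requirements creep"),
    ("Project E", "late vendor deliveries"),
]

# Count how often each theme recurs across projects.
theme_counts = Counter(theme for _, theme in lessons)

# Flag any theme seen in more than two projects: a "little lesson"
# that may point to an institutional problem, not a project one.
recurring = [t for t, n in theme_counts.items() if n > 2]
print(recurring)  # ['requirements creep']
```

The point of the sketch is simply that recurrence only becomes visible when someone aggregates, which no individual project team is positioned to do.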

Part of the role of the KM manager is to keep an eye on the recurring "motherhood and apple pie" lessons.  The idea is that these lessons are obvious, things that everyone should know and can't possibly disagree with.  Anything that sounds like motherhood and apple pie is dismissed because it is just common sense or it's a good practice.  It's the opposite of the powerful insight, the aha! moment.  It's not sexy at all.  It may even be embarrassing.  How could they not have known this before?  That's common sense.  It's just good project management practice.  That lesson is in the database already, in a dozen variations.  I would say, keep an eye on what everyone says is obvious yet keeps coming up as a challenge or a lesson. This is a little counter-intuitive. It's not what keeps management up at night. It could be pervasive, yet under the radar. No red flags.

In addition, just because something is obvious to a senior project manager and he/she doesn't see the value of putting it in the lessons learned database doesn't mean it's obvious to a junior member of the team who hasn't yet learned it first hand.

It is dangerous to think that it takes a major disaster like a Challenger or Columbia to create case studies worthy of widespread dissemination and incorporation into courses and workshops.  People need to learn from the "little lessons" before they become big problems.  The underlying issues that led to or contributed to the Challenger and Columbia accidents were things that manifested themselves in small ways every day and probably almost everywhere in the organization.  You can wait until they become a BIG lesson or you can catch them when they're much less visible.  It's not the stuff of academic papers and conference presentations (though I might try that route too), it's not catchy, but it matters.

As a side note, I've often asked myself if our KM activities could actually prevent another big failure.  How could we ever know what failures we've prevented?  How could KM be held responsible for failing to prevent a failure big or small?  Obviously, KM isn't the only mechanism organizations leverage to prevent failures, especially in high-reliability organizations like NASA.  Safety and Mission Assurance as well as Risk Management also play key roles... which is also why they should be working in close coordination with KM.

Learning to Adapt to Change and Unlearning
Many of what I would consider the more interesting lessons had to do with change and adaptation to change. They may have been unique lessons and definitely new lessons, but they fell into this bigger bucket of learning to adapt to change. In my last presentation at NASA/Goddard, a key point I made about what I had learned during my time there was that we are (this isn't specific to NASA) a little slow to learn the lessons related to the need for change. We have difficulty unlearning; we have trouble accepting that what had worked well in the past may not work well now. Unlearning is key to adapting to change and continuous learning. The theme of "unlearning" at NASA is addressed in Shared Voyage: Learning and Unlearning from Remarkable Projects. This obviously creates another challenge for static lessons learned databases. Ideally, lessons would be constantly flowing throughout the organization, aggregated and regularly analyzed for trends, reviewed and re-assessed through constant conversations.

Reference
  • Laufer, A., Post, T. & Hoffman, E. (2005). Shared Voyage: Learning and Unlearning from Remarkable Projects. NASA History Series. NASA SP-2005-4111.

Wednesday, March 21, 2018

We Still Need Lessons Learned (Part 2 of 3)

Part 2: The value of a lessons learned activity is in the transformation of tacit knowledge into explicit knowledge.

The value of such lessons learned activities is increased when they are the result of a group reflection process (as opposed to an accumulation of lessons gathered from individuals). The numbers in the visual (in the previous post about the benefits of Pausing to Learn) are made up, meant to convey a sense of the comparative magnitude of the benefits of documenting lessons. A great deal of learning happens in the process of writing lessons down, even if no one else is reading them. The knowledge is now out of people's heads and transformed into explicit knowledge.

I have described this in an APQC presentation (Rogers & Fillip, 2017). Learning happens in the initial conversation that helps to articulate the context and the lessons themselves. Learning happens through knowledge synthesis as the conversation is documented (whether it's a narrative, a conversation map or a video interview). Learning happens in the review and validation process, and finally learning happens when others read or are exposed to the lesson. In other words, learning happens in conversations, but these are not random conversations. These are intentional learning conversations.

Even if no one ever reads those lessons, at least no one will be able to say that the employees retired or left and walked out the door with all their (tacit) knowledge. And yes, I know that most of it is still in their head and can't be captured, yet a good amount of learning or knowledge transfer can happen through conversations.

The real problem is that we often don't have very good mechanisms for utilizing that explicit knowledge and we draw the wrong conclusions, which is the equivalent of failing to learn from a failure... or learning the wrong lesson from a failure. An example would be to draw the conclusion that people aren't using the lessons learned system because they haven't been trained to use it.  Seriously?  Well... let's not assume anything but that wouldn't be my best guess as to why no one is using the database.  

There could be better utilization of lessons learned through other project processes such as reviews and risk management (Fillip, 2015).  In fact, many repeated "lessons" are the result of institutional challenges that cannot be addressed at the project level and could be tackled through organizational risk management. Expecting all the benefits of lessons learned processes to come from individuals going to the database to look for lessons is just wrong headed. It's standard build-it-and-they-will-come wrong-headedness. 



Monday, March 19, 2018

We Still Need Lessons Learned, and We Need to Look at Them Differently (Part 1 of 3)

In this first part of the series, I will talk about some of the challenges of leading a lessons learned practice and how I tried to address them, in other words, how I learned to sell lessons learned. 

Part 1: Selling Lessons Learned as a KM Practice

KM is a sales job.  When I was working at NASA/Goddard helping project teams document their lessons, I was at times confronted with resistance in the form of remarks such as "no one uses the lessons learned databases, why should we bother contributing lessons to the system?"  This had been exacerbated by successive reports criticizing the Agency level lessons database called the LLIS.  Regardless of the merits of the critique of that system (which isn't the point of this post series), I still had to do my job.

Since I needed to convince project teams to meet with me and document lessons, I developed an approach that would overcome this resistance.  It was based on a simple diagram (below) that allowed me to address the issue upfront by redirecting the conversation by saying "you're right, the lessons are not being used the way they were expected to in the databases, but that's not the right way to look at lessons."

I would use this diagram as a handout during planning meetings with the project leaders to convince them to hold a Pause and Learn (PaL) session (a group reflection activity), and I would use it again (hand drawn on a flip chart) at the very beginning of the PaL session to explain why we were all there and what the benefits were. For a valuable conversation to happen, I needed both the buy-in from the project management team and the buy-in from the rest of the team. Sometimes the project management team does a good job of bringing the team to the table with the right attitude. Sometimes it's the job of the facilitator to quickly elicit that attitude at the very front-end of the meeting. There is nothing worse than a senior scientist showing up for a PaL session determined to wreck it with sarcasm about how useless these meetings seem to be since we keep repeating the same mistakes (in their opinion), or a team showing up because they've been told to be there when they really think they should be back in their offices working on real, immediate and tangible problems.
Source:  Rogers, E. & Fillip, B. Mapping Lessons Learned to Improve Contextual Mapping at NASA. APQC Webinar, August 18, 2017.
Once a team had experienced a successful Pause and Learn session, it was typically much easier to do it again at a later stage in the project life cycle. If you've made a sale and you've delivered on the expectations, you can expect repeat business. KM may be everyone's job, but from the knowledge management professional's perspective, there is a significant amount of "selling" involved.


Sunday, March 11, 2018

Women in Knowledge Management (or any male-dominated field)


Stan Garfield posted a list of  KM Thought Leaders.  I don't think the list is new but it circulated recently on LinkedIn, which is where I saw it. The great majority of the thought leaders on that list are men.  I am re-posting here the names of women who were on that list.  The idea is 1) to give the women more visibility with a separate list; 2) to point out that perhaps there are many more women thought leaders in KM who need to be added to this list.

Ironically, posting the list on this blog will do little to increase visibility, but that's only a first step.

For those interested, there is a Women in KM LinkedIn Group.  It's "unlisted".  I assume there is a reason for keeping it unlisted but I don't know what it is. I shall inquire. I've been a member of the group and never contributed anything (my bad).

There was also a recent article in RealKM by Bruce Boyes about the issue of gender equity in KM (thank you to Boris Jaeger for pointing me in that direction). The article seemed to point to the fact that the field of KM isn't immune to broader societal inequalities. There is nothing surprising about that and I can't disagree, having experienced some gender inequality frustrations of my own. As a woman, I have a good sense of how it has affected my career. I can't say I have good answers or solutions other than to become more vocal about it AND take responsibility for some of it as well.

While I happen to be more sensitive to gender bias, there are probably other biases embedded in that list.  How many are not US-based for example?


  • V. Mary Abraham
  • Patti Anklam
  • Stephanie Barnes
  • Irma Becerra-Fernandez
  • Madelyn Blair
  • Helen Blunden
  • Gloria Burke
  • Daniele Chauvel
  • Kimiz Dalkir
  • Vanessa DiMauro
  • Nancy Dixon
  • Lilia Efimova
  • Sue Feldman
  • Carol Kinsey-Goman
  • Susan Hanley
  • Rachel Happe
  • Heather Hedden
  • June Holley
  • Dorothy Leonard
  • Charlene Li
  • Alice MacGillivray
  • Carla O'Dell
  • Jean O'Grady
  • Nirmala Palaniappan
  • Kate Pugh
  • Celine Schillinger
  • Catherine Shinners
  • Carla Verwijs
  • Nancy White
  • Nilmini Wickramasinghe

Gone but not forgotten
  • Debra Amidon
  • Melissie Rumizen

Moved on to other focus areas or retired
  • Verna Allee
  • Kaye Vivian


3/16/2018 - Correction: Adding women I had omitted from the original list.

  • Alex Bennet
  • Jo Ann Girard
  • Joitske Hulsebosch
  • Bronwyn Stuckey
  • Beverly Wenger-Trayner


Do I want to be on that list?  
Am I upset because I'm not on the list?  Yes and no.  A year ago I could not have cared less but now as an independent consultant with my own business to worry about, yes, I do need to worry about not being on that list and I may need to figure out what it takes to get the recognition and visibility.  Perhaps it's not that specific list I need to worry about but more what it represents (free advertising).  Do I think I should be on that list right now? I am not able to answer with a resounding "YES", whereas most men would not hesitate.  THAT is the problem!

What am I going to do about it?
  • Learn more about women thought leaders in KM: I don't know more than 1/3 of the women on that list.  I probably should.  I therefore commit to learning more about their thought leadership.
  • Identify other women leaders in KM: Reach out to the women on the list and women in the "Women in KM" LinkedIn group to expand the list.
  • Become more active in the "Women in KM" LinkedIn group
  • Increase my networking with women in KM in my local area

References
Boyes, B. (2018). Improving gender equality in knowledge management: Is there a gender imbalance in KM, and if there is, what should we do about it? RealKM.  URL: http://realkm.com/2018/03/08/improving-gender-equality-in-knowledge-management/ 

Garfield, S. KM Thought Leaders. URL: https://sites.google.com/site/stangarfield/kmthoughtleaders





Monday, December 18, 2017

Lifelong Learning, Learning & Development, Organization Development, Organizational Learning & Knowledge Management

Lifelong learning has been mostly "sold" as the responsibility of the individual: a pursuit of learning beyond formal education, throughout life, sometimes as a means of strengthening one's employability in changing economic contexts, as a way of keeping up with new advances in one's professional field, as a way of staying engaged, even as a way to build cognitive reserves and ward off Alzheimer's in old age.

Beyond personal responsibility, lifelong learning is often integrated into the vocabulary of Learning & Development (L&D) specialists.  In more traditional L&D approaches, lifelong learning may be advocated as a way to encourage employees to keep building their credentials by signing up for and attending courses, especially when the organization has invested in the development of corporate training.

Increasingly (and it's a good thing), learning has been broadened to cover much more than formal courses, whether in formal educational institutions or corporate environments (see the work of Jane Hart and Harold Jarche in particular). As technology has evolved and penetrated learning and training departments, formats have evolved as well. For example, the recognition that time is often a constraint has led to the development of micro-learning, which can happen at any time using ubiquitous mobile devices. The alternative explanation for the development of micro-learning is that our attention spans are decreasing and we need bite-size learning moments to accommodate shrinking brain power. That's scary.

This expansion of opportunities for learning has been accompanied by the recognition that most of our learning comes from first-hand experiences. When we learn from experience, we essentially teach ourselves what no one else could possibly have taught us. This learning by doing (and learning by reflecting on our experience) is still not well integrated in most models or frameworks of workplace learning. Harold Jarche's work is probably the notable exception.

From an organizational learning perspective, we often emphasize group/team learning and the learning organization. Yet group and organizational learning cannot truly happen unless the individuals within that organization are themselves learning. Organization Development (OD) as a field of study does pay much more attention to the different levels of analysis (individual, group, organization). Knowledge Management, on the other hand, tends to neglect individual learning and focuses on leveraging knowledge for the organization, with a strong focus on benefits to the organization's mission.

In the end, it is unfortunate that professional disciplines end up digging deep tracks and creating their own vocabulary when it's all connected and challenges would be much more effectively addressed with a broad systems approach.  I don't even want to say cross-disciplinary because it reinforces the fact that there are disciplines with artificial boundaries that need to be crossed.  In a systems perspective, the boundaries disappear.

We need a framework that speaks to the connections between individual learning, team learning, and organizational learning and addresses the dynamics of such a system in the context of a rapidly changing world where individuals AND organizations need to increase their learning agility in order to keep up but also to stay ahead and innovate.

Saturday, October 01, 2016

USAID and NASA - A Tentative Comparison of Industry Trends and Current Knowledge Management Challenges


The table below doesn't claim to be a thorough comparison of USAID and NASA.  It's a quick glimpse at key characteristics that impact current knowledge management challenges, inspired by the SID - Future of AID session earlier this week and about 10 years of practical experience in both of these worlds.

This deserves much more reflection and more than a blog post and table.   It could be a full book, but I can't answer the "SO WHAT?" question.  I keep coming up with new mini-insights that need to be connected somehow to build the bigger puzzle. All I'm really saying is that the two agencies are not that different and key knowledge management challenges are common across industries even if NASA is perceived as being well ahead of USAID from a Knowledge Management perspective.


US Government Agency / Industry: USAID / International Development vs. NASA / Aerospace

Goal
  • USAID: Global economic development, poverty reduction
  • NASA: Science and exploration

Programs and activities implemented to achieve the goal
  • USAID: Broad commitment to the SDGs; country strategies, sector-specific programs, individual projects
  • NASA: High-level strategies in each key space science domain (astrophysics, heliophysics, earth science, etc.); programs and individual missions

Implementation models
  • USAID: Public-private partnerships; contracts and grants with implementing non-profit and for-profit private sector organizations. International collaboration: working within the United Nations system.
  • NASA: Increased emphasis on private sector involvement; continued partnerships with industry as contractors and academia as partners/contractors; partnerships with other countries' space programs. International collaboration: the Space Station.

Changes in the industry
  • USAID: New entrants: countries like China and India, operating under different models and different rules; private sector investors; large individual donors and corporate donors.
  • NASA: New entrants: countries with new space ambitions; the private sector taking over roles previously owned by government (transport to the Space Station, launch services, etc.).

Challenges
  • USAID: Rapidly changing global economic and political environment; need to explore new implementation models. NEED TO ADAPT FASTER, THEREFORE LEARN FASTER.
  • NASA: Rapidly changing technological innovation and implementation models. NEED TO ADAPT FAST, THEREFORE LEARN FASTER.

Key differences
  • USAID: Measuring success ("IMPACT") is a perennial challenge. Scaling and replicability become difficult because there isn't enough attention paid to "HOW" the activity was made successful. Little emphasis on understanding the complex set of factors leading to success (see previous post). Very little rigor in program and project implementation (a subjective judgment, based on personal experience/perception). What's needed: adaptive management, CRITICAL THINKING.
  • NASA: Measuring success has never been an issue; success and failure are very clear and visible. Identifying technical failures is a challenge when they happen on orbit, but the biggest challenge is identifying AND CORRECTING organizational failures. High degree of rigor in project management (increasing rigor on cost and schedule dimensions), sometimes to the point of being a serious burden and impeding innovation. What's needed: tailored application of project management "requirements", CRITICAL THINKING.

Knowledge Management challenges
  • USAID: High turnover, shuffling around the same top contractors and the same group of consultants (small world); high barriers to entry (perhaps that's changing with the emergence of new actors); generalists vs. specialists and the need for a holistic, multi-disciplinary approach to problem solving; the North-South discourse and the reinforcing impact of information technology; absorptive capacity and the perceived weakness of local knowledge capture/knowledge transfer; confusion around M&E, knowledge management and communications/PR resulting from the incentives structure (see previous blog post). DIFFICULTY IDENTIFYING REAL LESSONS, SPECIFYING "SUCCESS FACTORS", INCLUDING CONTEXTUAL FACTORS. NEED TO LEARN TO ADAPT AND INNOVATE. Learning from flawed data on impact studies is... flawed. Need to come up with something much more forward-looking, agile and adaptive.
  • NASA: A retiring, aging workforce with critical experience-based knowledge is leaving; new entrants/partners are not using tested/proven approaches (a steep learning curve), yet that's how they can take risks and innovate; need for insights from other fields, increased openness to insights from non-technical fields; perennial challenge of cross-project knowledge transfer (the "we are unique" mentality) and knowledge exchange across organizational boundaries. FINDING THE BALANCE BETWEEN LESSONS LEARNED (OLD KNOWLEDGE) AND LEARNING TO ADAPT AND INNOVATE (NEW KNOWLEDGE).
This was a case where an insight map didn't seem to fit the purpose, yet I bet it would help me to connect the dots a little better. 

I had previously written about the two organizations: Foreign Assistance Revitalization and Accountability Act of 2009, August 11, 2009. A great deal of USAID's current focus on Monitoring, Evaluation, Knowledge Management and the CLA (Collaborating, Learning and Adapting) model emerged out of that 2009 legislation.

See also "Defining Success and Failure: Managing Risk", July 29, 2009.

______________________________________________
12/17/2016 - Addendum - There are many interesting and related insights in Matthew Syed's Black Box Thinking, which investigates how certain industries are much better (more thorough) at learning from their mistakes than others.

Saturday, August 27, 2016

A Knowledge Management Puzzle



What does this picture have to do with knowledge management?

It's a picture of the box cover for a puzzle I'm working on.  I am now going to attempt to use that picture to talk about knowledge management.  I hope you're smiling.  This isn't too serious.

First, knowledge management is a puzzle.  It may not have 1500 pieces like the puzzle in the picture but it has a number of interlocking pieces and like a 1500-piece puzzle, it may seem overwhelming at first to try to tackle it all at once.

Second, if you're like me and you've worked on such puzzles before, you start with the edges and you frantically search in particular for the four corners. I'm not sure there is a strong advantage to the approach, but it ensures some quick wins because the edges are easier to find and then place, so that within an hour or less you've accomplished something. You need the positive feedback, the feeling that you CAN do this. The same can be said of knowledge management initiatives. Fixing the big picture may seem intimidating, but there are quick wins that can be found.

Third, the puzzle represented in the picture is two-dimensional but if you step back, you can pay attention to the picture, what it represents.  It's colorful but it's silent.  What's missing to give you a good sense of that environment, the context for that small village cobblestone street? This is just one angle and it's not even complete.  Our knowledge is never complete.  If we read a lesson learned without the appropriate context, we might miss the bigger point it's trying to make.  In working on a puzzle, if you focus on the mechanics (finding pieces of the same color for example), you might completely fail to pay attention to the picture that is emerging.  Don't lose sight of the big picture, the larger culture change that may need to happen for the organization to become a learning organization.

Fourth, there is a great deal of culture embedded in that piece of technology in the picture; the car.  It's an old "deux-chevaux".  Perhaps its knowledge management equivalent is the continued practice of using email (and attachment) to transfer knowledge.  It's part of the culture.  Don't ignore it.  Of course, that car is now a classic.

Fifth, you can't have a French street without a café. You need a café for conversation and some benches to take time to pause, think, reflect and talk with colleagues.

Sixth, you have flowers and plants flourishing here and there.  They need watering and nurturing on a regular basis.  These are perhaps your communities of practice.  I know it's a stretch but we're almost done.

As a first step, for a very quick win, I would recommend fixing the grammatical error on the puzzle's title:  Rue Français... No, it would have to be "Rue Française."

Monday, July 18, 2016

Future of Knowledge Management

In a recent LinkedIn post, Chris Collison addressed "the future of knowledge management - births, deaths or marriages?," in which he noted the following, among other things:
Some things are 'evergreen'. People will always need to talk, learn, reflect, network, collaborate and interact.
I couldn't agree more but I would go even further.  I suspect reminding people of "the need to talk, learn, reflect, network, collaborate and interact" will only become more important as technology continues to change how we interact, communicate and learn. 

I see my role in the next 15-20 years as precisely that: focusing on helping individuals, teams and organizations accelerate learning (ironically, by pausing to reflect) while dealing with rapid changes. Sounds like a good challenge to me!

Friday, February 26, 2016

KM and Happiness

There are two types of meeting notes:  1) Notes meant to reflect what was said, so that anyone not in attendance would get a sense of what the meeting covered in terms of content, and 2) notes that reflect what you, as a participant in the meeting, thought was relevant for you, what you connected to and what it meant for you.

The following notes are of the second category... in no way meant to be construed as an accurate summary of the meeting.  They also feel very much like a work in progress.  They connect to a pattern of ideas I've encountered in a variety of settings in the past year or so and to a range of interests that I did not previously see as very connected.  For example, my interest in neuroscience started around a research focus on synesthesia for a personal fiction project which at the time I did not connect at all to my work around Knowledge Management.  Now, the cognitive psychology and neuroscience aspects of Knowledge Management are surfacing while AI and cognitive computing still remain a great mystery.



A great "thank you" to Michael Lennon of George Washington University, who presented great material that prompted insights and further questions.

Friday, January 29, 2016

Doodling about KM Roles

Here's some doodling about KM roles inspired by a KMA-DC conversation this morning.


I currently play the role of KM specialist/facilitator of KM processes such as lessons learned and knowledge sharing workshops (within the NASA/Goddard Space Flight Center). In the knowledge management structure, I'm somewhere between the Chief Knowledge Officer, who sets the overall strategy and knowledge management practices, and the project teams, whom I support and who actually have the knowledge that needs to be "managed."

I am NOT a subject matter expert in anything other than KM. We only do a small subset of what KM can do/mean in an organization because that's what makes sense for the organization. 

Saturday, September 17, 2011

KM Research Questions

I recently participated in a Twitter Chat focused on Knowledge Management Research (#KMers chat archive).

The conversation prompted me to think about key research questions I'd like to answer within the context of my own work.
  • What is different about knowledge management in the public sector (compared to the private sector)?
    Related Hypothesis: KM in the public sector is strongly impacted by contractor/government relationships and other types of "partnerships," creating organizational boundaries that inhibit knowledge sharing.
    Impact:  Barriers need to be recognized and addressed upfront.
  • What is the role of project managers and project management in knowledge management within the context of project-based organizations?
    Hypothesis:  1) Project-level knowledge management efforts need to be embedded in standard project management practices (not an add-on supported by the KM office); 2) In the absence of (and in addition to) formal KM requirements or embedding of KM practices in project management practices, the project manager plays a key role in ensuring that KM is taken seriously and not treated as just a "check the box" activity.
    Impact:  We need to better understand how to embed KM practices and principles within established project management processes and we need to bring project managers on board.
  • What knowledge should knowledge management efforts focus on?
    Hypothesis:  KM efforts are often vague about the knowledge domain they will focus on or address, as if it were obvious, or as if KM efforts were expected to cover all knowledge domains relevant to the organization.
    Impact: Be clear about what your KM efforts are covering or not covering.  Be strategic and focused. Don't try to do it all.
 Looks like I've skipped some steps and tried to answer my questions as I was asking them.
Enhanced by Zemanta

Sunday, September 11, 2011

Government Contracting & Knowledge Management

What is the impact of government contracting (both in terms of volume and types of contracts) on core competencies, intellectual capital, and knowledge management practices?

Insufficient attention is being paid to the government / contractor relationship in current KM strategies and practices.

HYPOTHESES:
1) Current KM strategies and practices ignore the organizational barriers, refusing to acknowledge that they exist;
2) Current KM strategies and practices focus on "government" knowledge and ignore the contractor perspective, while contractors may have their separate KM approaches focusing on their intellectual capital. 

Is the ultimate goal to merge government and contractor KM approaches?
Is the ultimate goal to establish knowledge sharing practices building bridges between government and contractors without necessarily merging KM strategies?  If so, at what level should this happen?  Project-level or higher?

I haven't scoped the issue very thoroughly yet but I think it's worth exploring.




Sunday, July 31, 2011

Signs of KM Maturation

More than three years ago (May 2008), I joined the Office of the Chief Knowledge Officer at NASA's Goddard Space Flight Center.  As a contractor rather than a civil servant, I was (and still am) working with a Task Order and slowly assimilating how on-site contractors are supposed to work.  I had worked on Government contracts before, but in a very different context and not on-site.  At first, I thought I was responsible for expanding the reach of the center's KM practices so that KM wasn't an ad hoc affair but a set of practices embedded into the projects' life-cycle.  Ideally, projects would complete a set of KM activities on a regular basis, just as they go through key reviews and reach critical milestones.  It would be part of what they do.  In a perfect world, they would be doing it because they see value in it rather than because it's a requirement.  A lot of groundwork had already been laid by the Chief Knowledge Officer, so it made sense, and at the time it didn't look overly ambitious. I was naive.  The most important thing I have learned over the past three years is that establishing a KM program takes time, even when you have a dedicated staff.  KM staff need to be resilient, persistent, and willing to constantly engage in small experiments to refine and adapt their approach, take advantage of opportunities that present themselves, and avoid the traps of KM.

If everything works well, as of October, I will finally get to work more directly with the projects to embed some KM practices in their life-cycle.  This is happening now not just as the result of a fortuitous coincidence of budget issues; it was made possible by the fact that, in the intervening years, our office has worked very hard to make KM practices work in a critical strategic area of the project organization.  Having demonstrated a successful approach in one small yet critical office, we have been offered an entry into the big guys' world, the mission projects.

When KM is funded as an overhead function, it is at risk of de-funding.  When the project office is willing to pay not just for an annual KM event but for a full-time KM position, you know you're doing something right.  I'm not sure this is an indicator that features prominently in KM maturation models. Is it possible that the source of funding is a better indicator of success than the overall size of a KM office? I feel that I have just been given this opportunity, and I don't want to miss the boat.
Of course, a lot could go wrong between now and October.  It is still very much a contractor position, therefore subject to a lot of budget uncertainty in the medium to long term.  If this opportunity moves forward as planned (I'm optimistic about it), there are no guarantees that we will succeed. There are no guarantees that what we did with that one small office can be a blueprint for other efforts, yet we have learned a lot with that effort and with three years under my belt in the organization, I am now much better equipped to assess the environment and admit that it is ambitious.

Working directly with the projects, rather than being perceived as a separate office, is an important step forward.  It has a lot to do with ownership of the KM activities.  When KM is something that the KM office does, it is typically an overhead, disposable activity.  When KM is embedded in projects, it becomes part of what they do, a way of doing work.

Tuesday, May 10, 2011

Technology Adoption: The Importance of the Second Trial

I seem to follow a pattern with most web 2.0 tools. There's an initiation phase where I try out the tool and I use it for a short amount of time. In that initiation phase, I figure out how it works, but I'm only scratching the surface of what the tool can do and I'm already noticing some of the drawbacks.

Then my attention span drops off and I barely visit that tool for months at a time. My suspicion is that this is where most people give up on a tool and decide it's not for them. I've done that recently with Quora.

At some point, I come across something on the web that reminds me that I have an account on that tool, and I revisit it. Very often, the tool has evolved and added new functionality between my first and second trials. I'm usually happy with the improvements and likely to pick it up again. The second trial tends to be more focused on getting something specific out of the tool... a more focused project. It doesn't imply that I'm going to use the tool on an ongoing basis, just that I've thought about what the tool can be useful for and when I might need it, not necessarily on a daily basis. This pattern played out with Pearltrees. I played with it more than a year ago and found it somewhat interesting but limited in the way it structures links between pearls (much less flexible than a mindmap, for example, yet much easier to create than a mindmap consisting only of URLs).

One of the drawbacks of Pearltrees is that a Pearl has to be a URL. As far as I can tell, you can't add a "concept" pearl without a link. It's not meant to build concept maps or mind maps. If I'm organizing links, after the first ten links, I'm automatically starting to think about how to group them around key concepts. Pearltrees doesn't allow you to do that easily. To address that challenge, I've used links to Wikipedia as a way of organizing around key concepts in the pearl map below.

SOCIAL BOOKMARKING & Related Concepts in Barbara Fillip (bfillip)


Saturday, May 07, 2011

Knowledge Areas for KM Professionals

This visual has nothing to do with a rigorous analysis of what a KM curriculum would need to address; it has a lot to do with areas/fields/topics I've encountered while DOING knowledge management. I've highlighted Personal Knowledge Management in bold because it ended up in the middle of my arrangement of bubbles somehow, and because it is often neglected and ignored by KM programs. I've used a loose coloring scheme to differentiate things that are strictly KM (ugly green) from IT-related items (light purple), individual-focused items (red line), and items with an organization-level focus.



Knowledge Management Education & the Job Market for KM Professionals

Kent State University and George Washington University have been collaborating on an initiative to strengthen KM education (mostly at the Master's and Ph.D. level), with what I understood to be a long-term goal of strengthening the KM profession as a whole by turning KM into a "discipline" with a standardized set of core qualifications, etc. Essentially, it tries to balance or counter the emergence of competing commercial outfits that deliver KM certifications and, in the process, make a few claims about the strength of their training.

I don't have sufficient first-hand knowledge of the entrails of all this to pass judgment on any of it. From a personal perspective, as a knowledge management professional, I've had to ask myself whether I needed to get a KM certificate of some kind, and after looking into it, I decided that it did not make sense. For one thing, I've been disappointed with classroom settings (even in workshop format), and I seem to do better with ongoing social learning opportunities using a wide variety of sources and methods online. Another aspect of this is that I know enough about knowledge management to be skeptical about the ability of any training out there to really help me with the ongoing KM challenges I face in the workplace, or to give me something I can't find on my own with a little diligence on the web. I could see myself taking one "class" a year on a specific topic, but the certificate approach isn't appealing to me.

The initiative driven by Kent State University (KM Education Forum Community Wiki) includes a series of webinars (completed) and a two-day onsite event at George Washington University (completed this past week, May 5-6). I listened in on some of the webinars and attended the onsite event in D.C. The back and forth between the academics and practitioners was interesting. Not surprisingly, there's a significant gap between the concerns of the academics ("we need to create a true discipline with a rigorous curriculum") and the concerns of the practitioners ("can you please send us people who know what they're talking about and can DO knowledge management"). Part of the problem is that there isn't a standard explanation of what "doing knowledge management" really is because 1) KM is highly contextual; and 2) KM draws from a wide range of other disciplines.

As an aside, I'm not convinced KM is a discipline or needs to be a discipline from an academic perspective. I see it as a cross-disciplinary field. I'm not sure the marketplace is asking for KM professionals coming out of schools with a KM degree. I think the marketplace would be satisfied with KM professionals who have had cross-disciplinary training, whether their degree comes from the school of computer science, human resources, business, library sciences, etc.

At times it seems as if a few individuals ("strong personalities," I should say) are positioning themselves to be able to say "I created KM as a discipline." A small dose of humility might do the field of KM as a whole some good (see a related short post by Nick Milton: "It's Wrong to be Right").  Within organizations, without a dose of humility and the ability to collaborate with other departments, KM can't go very far. The same probably applies to KM in academia. I find the attempt to establish KM as an independent discipline to go against the nature of KM. KM is not going to get more recognition in organizations by becoming an academic discipline.  KM will get the recognition it deserves where and when it is able to demonstrate value to the bottom line and/or the organization's goals.

That being said, I do find a great deal of value in some of the work being done in the context of the initiative to define roles and responsibilities, as well as the competencies required of KM professionals. Again, as a KM professional (in a contractor position), I have to think in terms of career path. I consider myself a KM generalist. Where do I go from here? What competencies do I need to acquire to get to the higher levels of the KM career hierarchy? What KM specialist competencies would have the most value if I wanted to become a specialist? What would I really be good at, and what would be too much of a stretch? What existing skills and competencies should I build upon? If I'm currently somewhere between KM specialist and KM team leader (a leader without a team, but below CKO), what are my options, both within my existing organizational setting and within other organizations?

Which brings me to the next set of issues: the marketplace. In trying to identify specific competencies and skills that I had/didn't have that were in demand in the marketplace, I've collected job advertisements for the past few months. I've focused on positions that had the words "knowledge management" in them, but also paid attention to jobs with titles and descriptions involving "organizational learning" and "chief learning officer." A few observations:
  1. A significant proportion of the jobs labeled as "Knowledge Management" are 99% IT. Some of them are webmaster jobs under a KM label. I suppose that may happen when web content management falls under the responsibility of a KM office.
  2. Some organizations with a significant number of KM jobs have well defined KM job descriptions and qualification requirements with a good degree of consistency across the board. You can tell from the job descriptions that they have a strong KM program (the World Bank comes to mind).
  3. Federal Government KM jobs are often described in Federalese and alphabet soup. Even when they're open to the general public, you'd need a translator and an insider to explain the terminology in order to have a chance of competing with people already on the inside. If you don't already know the systems the agency has in place, I don't know how you can expect to get through the first level of screening: you can't target your responses properly if you don't understand their language.
  4. A lot of KM jobs are very specific about the systems and tools you need to know to apply. They don't ask for specialists in communities of practice, they ask for Sharepoint or system XYZ specialists. As an applicant, that tells you something about where the organization is in terms of KM maturity. They've already made a lot of decisions in terms of approaches and systems.  As a KM generalist, I know how to use a dozen different systems but would I call myself a Sharepoint specialist? Probably not.  Give me a couple of months and I can probably become a Sharepoint specialist.  I don't think being a specialist in a specific tool or system would be a smart career move.
This is a personal blog and these are personal opinions.  Obviously, this isn't meant as a summary of the KM Education Forum webinars and onsite event.  See links below for the official information.

Related Links (not advertising or recommending any particular academic or commercial training)

