Timing is everything. Had I read "Made to Stick" six months ago, I would probably not be reacting to it the way I am now. "Made to Stick" is about what distinguishes a sticky idea from an unsticky one. The concept of stickiness had already been discussed by Malcolm Gladwell in "The Tipping Point," and I found the two books to be highly related and equally insightful.
There are lots of interesting insights in "Made to Stick," but I am going to focus here on what the authors refer to as the "curse of knowledge." The idea is that once you know something, it is nearly impossible to imagine not knowing it. Good teachers know how to bridge that gap, but most of us have difficulty explaining what we know to others simply because we tend to assume that the other person knows or understands certain things when they in fact do not.
[This is clearly going to be an ongoing issue, very related to my last post.]
I took a position a month ago with NASA. It's a knowledge management position and I'm not expected to turn into a scientist or engineer any time soon. I expect, however, that to be effective in my position, I do need to have some understanding of the core concepts, methodologies and processes that define how NASA does business.
Yet I am confronted almost daily with highly skilled technical professionals who speak an alien language. When they use terms like risk management, reliability, systems engineering, and safety -- not to mention the countless acronyms thrown around -- there is a strong assumption that everyone in the room has a common understanding of what these terms refer to.
Let's take the word "sustainability" as an example. Coming from a "development background" -- a term that, incidentally, would mean nothing to someone working at NASA, or would be misunderstood as having something to do with engineering design or software development -- the term "sustainability" means a whole bunch of things to me and can't easily be separated from the developing country / development project context I have always associated it with. Of course, for an environmentalist, the term "sustainability" conjures a whole different set of images. So, when a NASA engineer talks about sustainability, how can I unlearn what it has always meant to me?
I'm told that risk management and knowledge management are closely related. Obviously I'm going to have to figure out what risk management is all about, because it's not a prevalent concept in international development work. I had come across it when exploring project management methodology, especially around IT projects, but the international development community doesn't talk about risk.
I was a monitoring and evaluation specialist in my previous position. That title means nothing to a NASA scientist or engineer, so I have to figure out the closest equivalent in their world. It turns out that it might be the risk manager. The risk manager identifies and analyzes risks, transforms risk information into planning decisions, tracks and controls risks, and communicates and documents them. The monitoring and evaluation specialist translates project objectives and activities into indicators, identifies approaches for collecting data on those indicators, plans activities to monitor progress towards project goals, and, at the end of the project, helps determine the extent to which the objectives were achieved. Both the risk manager and the monitoring and evaluation specialist follow the project life cycle to ensure that the project stays on track, avoids possible obstacles, and achieves its objectives. One focuses on identifying and avoiding obstacles; the other focuses on documenting the extent to which the project is making progress. One is proactive, the other reactive. Is the glass half full or half empty? When you are dealing with very high-risk, high-cost projects, perhaps it makes sense to focus on minimizing risks.