Monday, June 30, 2008

Accelerated Learning

I wrote about rapid onboarding a couple of months ago. Rather than seeing it as the organization's responsibility, I was looking at it from the perspective of the employee: as an employee, what can you do to shorten the time it takes to become fully integrated within your team and to add value to the organization?

Based on my experience of the past few months, I'm tempted to express some caution about this approach and to articulate a few hypotheses.

Unless the organization is equally in a hurry to get you on board and fully committed to helping you out, don't rush. Whether a rapid onboarding approach is appropriate (and successful) may depend on factors entirely out of your control as a new employee, such as 1) the extent to which you have regular access to your supervisor or management for feedback and guidance; and 2) the extent of disagreement or conflict around your position.

Becoming part of a team or organizational unit has a lot to do with establishing relationships. That can't be rushed too much, and if there is disagreement or conflict around your position, it may take even more time. If the environment is less than ideal, it should still be possible to focus on learning the culture, the jargon, and the processes, while staying away from actually "doing" anything (especially anything that might be perceived as stepping into other people's territory), at least until things settle, you've had a chance to see more clearly what is going on, and you're better able to interpret subtle cues and signals.

I'm assuming this is less of an issue when someone is hired with a clear mandate to lead a team. In that case you probably can't afford to wait and see: if you're in charge, learn quickly and act like a leader. But if there is confusion about who is in charge and about roles and responsibilities, then rushing to do things and to add value is probably not where the emphasis should be.

Saturday, June 21, 2008

Propinquity... among other influence factors

Propinquity -- my new word of the day -- has to do with proximity. The "propinquity effect" refers to the impact of proximity -- or physical distance -- on human relationships. This has a clear impact on team building and efforts at fostering collaboration within organizations. How do geographically dispersed, mobile and sometimes entirely virtual teams and organizations succeed when the propinquity effect would suggest that they are bound to fail?

I came across the word "propinquity" in Influencer: The Power to Change Anything, by Kerry Patterson, Joseph Grenny, David Maxfield, Ron McMillan and Al Switzler. Interestingly, I picked it up thinking it would be a typical business/management book with lots of anecdotes and stories from the U.S. corporate sector, but the reality was quite different. It is full of examples from the social sector, and all of the international examples (Grameen Bank, Soul City, Guinea worm) were familiar to me. The book doesn't mention the term social marketing, but many of the examples are clearly related to social marketing concepts. It does refer to social capital.

After reading both Made to Stick and Influencer, I realized that most business books published these days -- or perhaps only the successful ones -- have a very similar structure. They're built around six key concepts or success strategies, case studies (examples) are used throughout, and each core case is revisited multiple times across chapters.

Wednesday, June 11, 2008

The Curse of Knowledge

Timing is everything. Had I been reading "Made to Stick" six months ago, I would probably not be reacting to it the way I am now. "Made to Stick" is about what distinguishes a sticky idea from an unsticky one. The concept of stickiness had already been discussed by Malcolm Gladwell in "The Tipping Point," and I found the two books to be highly related and equally insightful.

There are lots of interesting insights in "Made to Stick," but I am going to focus here on what the authors call the "curse of knowledge." The idea is that once you know something, it is quite impossible to imagine not knowing it. Good teachers know how to bridge that gap, but most of us have difficulty explaining what we know to others simply because we tend to assume that the other person knows or understands certain things when in fact they do not.

[This is clearly going to be an ongoing issue, closely related to my last post.]

I took a position a month ago with NASA. It's a knowledge management position and I'm not expected to turn into a scientist or engineer any time soon. I expect, however, that to be effective in my position, I do need to have some understanding of the core concepts, methodologies and processes that define how NASA does business.

Yet I am confronted almost daily with highly skilled technical professionals who speak an alien language. When they use words like risk management, reliability, systems engineering, and safety -- not to mention the millions of acronyms thrown around -- there is a strong assumption that everyone in the room has a common understanding of what these terms refer to.

Let's take the word "sustainability" as an example. Coming from a "development background" -- a term which incidentally would mean nothing to someone working at NASA or would be misunderstood as having something to do with engineering design or software development -- the term "sustainability" means a whole bunch of things and can't easily be separated from the developing country / development project context I have always associated it with. Of course, for an environmentalist, the term "sustainability" conjures a whole different set of images. So, when a NASA engineer talks about sustainability, how can I unlearn what it has always meant to me?

I'm told that risk management and knowledge management are closely related. Obviously I'm going to have to figure out what risk management is all about, because it's not a prevalent concept in international development work. I had come across it when exploring project management methodology, especially around IT projects, but the international development community doesn't really talk about risks.

I was a monitoring and evaluation specialist in my previous position. That title doesn't mean anything to a NASA scientist or engineer, so I have to figure out what the closest equivalent is in their world. It turns out that it might be the risk manager.

The risk manager identifies and analyzes risks, transforms risk information into planning decisions, tracks and controls risks, and communicates and documents them. The monitoring and evaluation specialist translates project objectives and activities into indicators, identifies approaches for collecting data on those indicators, plans activities to monitor progress toward project goals, and, at the end of the project, helps determine the extent to which its objectives were achieved.

Both roles follow the project life cycle to ensure that the project stays on track, avoids possible obstacles, and achieves its objectives. One focuses on identifying and avoiding obstacles; the other focuses on documenting the extent to which the project is making progress. One is proactive, the other reactive. Is the glass half full or half empty? When you are dealing with very high-risk, high-cost projects, it probably makes sense to focus on minimizing risks.
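Just to make the parallel concrete for myself, here is a rough sketch in Python -- purely illustrative, with class names, fields, and scoring that are my own invention rather than any actual NASA or M&E tool -- of how the two roles look at the same project from opposite directions: one ranks what could go wrong, the other reports on what has actually been achieved.

```python
# Purely illustrative sketch: hypothetical names and scoring, not an actual NASA or M&E tool.
from dataclasses import dataclass


@dataclass
class Risk:
    """Proactive view: something that could knock the project off course."""
    description: str
    likelihood: float   # rough probability, 0.0 to 1.0
    consequence: int    # severity, 1 (minor) to 5 (severe)
    mitigation: str

    def exposure(self) -> float:
        # Simple likelihood-times-consequence score used to rank risks.
        return self.likelihood * self.consequence


@dataclass
class Indicator:
    """Reactive view: evidence of whether the project is making progress."""
    name: str
    target: float
    actual: float

    def progress(self) -> float:
        # Fraction of the target achieved so far.
        return self.actual / self.target if self.target else 0.0


# Both roles walk the same project life cycle, just facing in opposite directions.
risks = [Risk("Key component delivery slips", 0.3, 4, "Qualify a backup supplier")]
indicators = [Indicator("Milestones completed", target=12, actual=7)]

for r in sorted(risks, key=lambda r: r.exposure(), reverse=True):
    print(f"Watch: {r.description} (exposure {r.exposure():.1f}) -> {r.mitigation}")
for i in indicators:
    print(f"Report: {i.name} at {i.progress():.0%} of target")
```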