Monday, June 09, 2008

Using Narrative to Evaluate Knowledge Programs/Activities

A few posts ago I wrote briefly about Measuring the Impact of Knowledge Management and a few things I'd heard at the APQC conference in May. I had a short but provocative conversation with Kirby Wright about this challenge. After speaking quite succinctly about the futility of using traditional measures and biased surveys to gauge KM impact in complex environments, he suggested using Sensemaker and the narrative elicitation, capture and analysis methods from Cognitive Edge / Dave Snowden's work to uncover the real impact of KM work.

A very interesting idea. After all, do we not look very carefully at the comments fields in traditional surveys for richer information? Do we not wish more people would fill them out? Are we careful not to act on one comment alone, but to look for trends across multiple comments - and do we sometimes have difficulty doing that?
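For what it's worth, here is a minimal sketch of what "looking for trends across multiple comments" could mean in practice. It is only an illustration of the idea, not part of Sensemaker or any Cognitive Edge method, and the sample comments and stopword list are invented for the example:

    # Illustrative sketch: count how many different survey comments mention
    # the same term, so that no single comment drives the conclusion.
    from collections import Counter
    import re

    comments = [
        "The new wiki made it much easier to find the project templates.",
        "Hard to find anything; search on the wiki rarely returns what I need.",
        "Communities of practice helped me find an expert quickly.",
        "Search needs work, but the templates save me hours every week.",
    ]

    STOPWORDS = {"the", "to", "a", "an", "of", "on", "and", "but", "it",
                 "me", "my", "i", "what", "new", "much", "made", "every"}

    def terms(text):
        # Lowercase, keep only words, drop common filler words.
        return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]

    # Document frequency: how many comments mention each term at least once.
    doc_freq = Counter()
    for c in comments:
        doc_freq.update(set(terms(c)))

    # Terms appearing in more than one comment are candidate trends to probe
    # further (with interviews or narrative work), not conclusions in themselves.
    for term, n in doc_freq.most_common():
        if n > 1:
            print(f"{term}: mentioned in {n} of {len(comments)} comments")

Counting how many different comments mention a term, rather than raw word counts, is one simple way to stop a single long comment from looking like a trend.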

I wonder how one would go about convincing decision makers to embrace the foundational principles and try a pilot.

2 comments:

Patrick Lambe said...

Hi Dale

We have used narrative techniques to help evaluate a KM programme's impact (using anecdote circles, archetype extraction and also a Most Significant Change exercise). We have found it's easier to get acceptance if (a) we keep the language in the "focus groups" / "qualitative input" range upfront and (b) we combine this approach with some of the more traditional methods looking at metrics for activity and involvement levels, etc. (the two approaches together actually illuminate each other in interesting ways).

Once they've been through the process, they seem to get the value of the narrative approach instinctively, and it's never an issue again. The risk is in trying to over-explain upfront.

Anonymous said...

Thanks Patrick. I'm thinking along the same lines, and somehow tapping into external expertise in some of the areas you mention.

In a recent conversation with Suzanne Zyngier (Monash University) she mentioned " ... back to qualitative + quantitative ... that's where I thought that a mixed methodology might be useful. Combining success (and possibly failure) stories would strengthen the quantitative data to be collected from a survey of staff. This fits with evidence based practice in our strongly positivist tradition of Western society with the overlay of speaking to [analytic types.]"
