posted 9 Aug 2001 in Volume 5 Issue 1
An intellectual infrastructure for KM
Conventional thinking dictates that organisations must choose between a codification and a personalisation approach to knowledge management, or risk losing customers and/or profits. But Tom Reamy refutes this idea, arguing instead that the real issue is how to create a system that integrates the two in a flexible and systematic way.
Beginnings are always delicate times. Decisions made at the start of a project take on an exaggerated importance. In knowledge management, the strategy phase with which KM initiatives normally begin typically consists of such activities as conducting a knowledge audit, performing a cultural readiness review and carrying out a knowledge opportunity analysis. It is also during this strategy phase that you are advised to align your KM strategy with your business strategy, and part of this alignment process is deciding whether to emphasise a codification or a personalisation implementation of KM.
A second activity often undertaken at the beginning of a KM project, sometimes as part of a knowledge audit and sometimes as part of a requirements phase, is to create a knowledge map. This knowledge map is usually seen as something that needs to be done, but once it is completed it usually slips into maintenance mode, where it becomes just one of many tools available for the KM effort.
However, I am going to argue that for companies in fields like finance and stock brokerage, and indeed for most companies outside the world of consultancy, the decision between codification and personalisation is not an essential strategic one, and that if the creation of a knowledge map is seen as one of the core elements in KM, the result will be not only better knowledge management but a more integrated relationship between codification and personalisation.
Building, evolving and utilising a knowledge map should be central to your KM effort, and if it is, a much more interesting and fruitful relationship between codification and personalisation can safely be developed.
Codification and personalisation
An article in Harvard Business Review (March/April 1999) has become a standard text for the KM strategy-phase decision of whether to emphasise a codification or a personalisation approach.
Briefly, the difference between the two is whether you focus on capturing, codifying and reusing information – connecting people with information – or emphasise connecting people with people and providing in-depth expertise as your product.
One of the essential points in the article is that it is at best dangerous and at worst catastrophic to try to straddle the two strategies. The authors recommend that you pick one and devote 80 per cent of your efforts to that approach while keeping the other at about 20 per cent. The danger for personalisation companies that try to add self-service knowledge bases, for example, is the alienation of customers who expect personal attention and don’t want to be told to ‘look it up’. Codification companies that try to add expertise-based solutions or in-depth personal attention risk losing their profit margin.
In addition, the two approaches call for different incentive structures, which can cause confusion if both are in place. If your normal reward structure, or basis for performance reviews, is how much material someone contributes to the company database, how do you capture and evaluate their informal tacit knowledge sharing?
The article uses consulting companies like Ernst & Young and McKinsey as raw material for defining the different approaches and for illuminating the dangers of trying to straddle the two strategies. For consulting companies the arguments are quite compelling, but when extended to other industries they become a lot less powerful.
For example, in a company like Schwab two factors argue against a strategic either/or. First, the company consists of multiple enterprises with at least four major KM constituencies: IT, corporate administration, retail and institutional. Each of these four areas has different KM needs.
The second factor weighing against an either/or decision is that Schwab has multiple products and services requiring different mixtures of personal expertise. For example, the authors of the Harvard Business Review article ask three questions to help determine whether you should pursue a codification or a personalisation strategy. The first: do you offer standardised or customised products? Our answer is yes. Second: do you have a mature or an innovative product? Again, the answer is yes. The third: do your people rely on explicit or tacit knowledge to solve problems? The answer is both.
Schwab has been known as a discount brokerage and a brokerage technology leader, and we are now expanding the help and advice area of the company. Even if the decision between codification and personalisation were such an essential strategic one, we would be left with more than one answer for different enterprises. In that case the decision would become whether to have a single unified KM initiative or department, or to create two or more KM departments with different answers to the codification/personalisation question.
In this instance I would argue that the situation at Schwab is a better model for industries in the financial sector to follow than that of consulting companies like Ernst & Young and McKinsey. Our early research indicates that the real issue is not which path to emphasise but rather how to create a system that integrates the two in a flexible and systematic way. The relationship between codification and personalisation has two primary intersection points: creating a support context around when to make the transition from codified information to personal and tacit knowledge, and enriching your information storage solution with more and more knowledge contexts.
The first intersection involves the question of when you need to (or when it makes sense to) make the transition from information to knowledge.
As part of a knowledge audit we did an initial survey in which we interviewed employees who work in branch offices, where customers come in for face-to-face interaction with a representative. Many of the answers that these representatives provide are straightforward and can be given quickly and easily, drawing either on the knowledge in the representative’s head or on a quick look-up in an online reference application.
The majority of these transactions are information exchanges and rely on a codified information management system. However, there can and does come a time when the representative cannot answer the needs of the customer, at which point the essential question becomes whether and when to escalate the situation to someone with more expert knowledge.
What KM brings to the situation is help in supporting people as they decide whether to escalate, training them on how and when to escalate and, ultimately, enabling this escalation point to be shifted so that more queries can be handled without escalation. In other words, it incorporates more knowledge into the information-system layer and thus delays the need for escalation.
A knowledge map allows you to codify more knowledge elements and at the same time supports the decision by human workers of when to escalate from information to knowledge, from a simple document to a human expert. An integrated KM system based on a knowledge map also provides a framework that enables the transition to go more smoothly, as the knowledge base, the initial contact representative and the expert are all using the same vocabulary and categorisation schema.
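The escalation logic described above can be sketched in a few lines. This is a minimal illustration only, assuming a shared category schema: the knowledge base, the expert directory and the function names here are all invented, not Schwab's actual system.

```python
# Codified answers, keyed by the shared categorisation schema.
# (Categories, answers and names below are hypothetical examples.)
KB = {
    ("accounts", "wire-transfer"): "Wire transfers settle in 1-2 business days.",
}

# Experts registered under the same schema, so a hand-off keeps its context.
EXPERTS = {
    ("accounts", "margin-loan"): "j.smith (margin specialist)",
    ("accounts", "wire-transfer"): "a.jones (operations)",
}

def answer_or_escalate(category):
    """Return a codified answer if one exists; otherwise escalate to
    the expert registered for the same category."""
    if category in KB:
        return ("answer", KB[category])
    expert = EXPERTS.get(category, "general help desk")
    return ("escalate", expert)

print(answer_or_escalate(("accounts", "wire-transfer")))
print(answer_or_escalate(("accounts", "margin-loan")))
```

Because both dictionaries are keyed by the same schema, codifying a new answer under a category automatically shifts the escalation point for that category: the query is answered rather than handed off.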
What I am going to show in the next section is that it is not only possible but also strategically sound to build an integrated codification/personalisation approach to KM if you make use of the right knowledge architecture, a powerful and dynamic knowledge map and a flexible KM team of internal consultants.
Knowledge architecture: enriching codification and codifying personalisation
Knowledge is a very difficult thing to define, but without some sense of the difference between information and knowledge you run the risk of confusing the two and developing a hazy approach to KM that falls prey to the hazards noted in the Harvard Business Review article. The danger doesn’t lie in trying to approach KM with both a personalisation and a codification strategy, but rather in creating a system that does not allow for the smooth integration of information and knowledge, codification and personalisation.
Regardless of the definitional difficulties, we have an intuitive sense that knowledge is broader, deeper and richer than data or information. Knowledge is information plus something: meaning, action, organisation, patterns or whatever. Rather than add to the list, I’m just going to say that the ‘something extra’ is what we’ll call context. Knowledge is information plus context of a variety of types.
There are two major kinds of context: intellectual and personal. The latter includes not just individuals but also collections of individuals in social communities. It is the defining characteristic of knowledge management to model, organise and support these additional contexts. Adding in these contexts is what differentiates knowledge management from information management.
In the area of intellectual contexts, when you go from information to knowledge you are going from a relatively static and one-dimensional area to a land that is dynamic and multi-dimensional. It is somewhat analogous to a flatland progression from points (data) to lines (information) to an escape from flatland into the world of three dimensions (knowledge).
Because of these extra dimensions, the normal KM approach is to extract these contexts from information, store the information and then focus on the process of a human brain converting the information back into knowledge. KM can improve the conversion process by providing knowledge facilitators to work with employees, both during the conversion of knowledge into information for storage and on the subsequent conversion back. The dangers of loss of context, distortion of meaning and other losses attendant on any conversion process are particularly acute in this case, since it is the contexts being converted twice that actually create knowledge out of information.
By adding facilitators at both steps, some of the loss and distortion can be alleviated, and more value can be added through another major component of any good KM system: the alliance between KM and learning. Training can empower the process of converting information in a database into knowledge not only by teaching good technique but also by ‘normalising’ the results; that is, ensuring that the human side of the conversion is knowledgeable as to what the company-wide or community-of-practice-wide consensus is.
However, no matter how good the conversion process gets, the less conversion you need to do the better. This is where a powerful knowledge architecture effort can provide enormous benefit. By adding the ability to store more elements of the contexts that transform information into knowledge, we make the systems smarter, minimise the loss or distortion of knowledge during conversion and can integrate more fully with training efforts.
Let’s take an example. We have a document that is a procedure statement. It contains pieces of information like ‘when you finish X the next step is Y’. There are a number of additional contexts through which this piece of information becomes knowledge:
- History – we used to do Y before X but it led to all sorts of problems;
- Applicability – this procedure only applies to customers with an account over $100 000;
- Personal – you have never read this procedure document before;
- Value – following this procedure is important because if you don’t the company will be liable for a $10m fine;
- Relatedness – this procedure is similar to procedure P in the branch enterprise but differs in the following ways: A B C.
Some of these contexts can be captured and stored in the knowledge map of the enterprise. In some cases the critical knowledge can’t be stored, but pointers to other documents, processes and people can. For example, if you have a smart knowledge base that knows whether or not an individual has ever seen a document (and for how long they have read it), and knows that other documents in the knowledge base provide background or training for a novice reader of procedure X, then some of the contexts can be stored along with the information.
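The procedure example above can be made concrete as a data-structure sketch. All field names, document identifiers and the reading-history mechanism here are invented for illustration; they show one plausible shape for storing contexts alongside information, not an actual schema.

```python
# A procedure document carrying its knowledge contexts as structured fields.
procedure = {
    "id": "proc-42",
    "text": "When you finish X the next step is Y.",
    "contexts": {
        "history": "We used to do Y before X, but it led to problems.",
        "applicability": "Accounts over $100,000 only.",
        "value": "Skipping this step risks a $10m fine.",
        "related": ["proc-17"],                    # pointers, not the knowledge itself
        "background_for_novices": ["doc-7", "doc-9"],
    },
}

# Per-user reading history lets the system supply the personal context.
reading_history = {"user-1": {"proc-42"}}

def personal_context(user, doc):
    """Has this user seen the document before? If not, point them at
    the stored background material for novice readers."""
    seen = doc["id"] in reading_history.get(user, set())
    return [] if seen else doc["contexts"]["background_for_novices"]

print(personal_context("user-2", procedure))   # novice: background docs suggested
print(personal_context("user-1", procedure))   # already read: nothing extra
```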
What is essential for an integrated approach to KM is that the intellectual infrastructure be consistent across topic or subject areas and across the personal and social tacit knowledge components. In other words, a knowledge map must deal with the integration of all types of contexts – intellectual, personal and social. The nature of these latter two contexts can be seen in three example areas: personalisation, collaboration and knowledge retrieval.
Personalisation has been touted for a number of years as a high-value proposition. Unfortunately, the payoff has often not materialised. One reason for this is a poor understanding of the need for a rich knowledge architecture to support personalisation and place it in the overall context of communities.
This architecture must account for a variety of roles and functions and membership in a variety of communities, and be open to mapping a categorisation of tasks and processes. It must include a temporal and historical dimension as well. For example, having a person categorised as a client service representative is useful, but having a system that knows the difference between an experienced client representative and a novice is even more useful.
Communities can be created around a variety of activities, interests and channels of communication. They can have an established lifespan or can be created on the fly for a single meeting or the life of a project. KM support for collaboration must model and support all these communities and their interactions, and must also integrate with the intellectual categorisation schema.
A knowledge retrieval system for communities should include both the retrieval of information and its surrounding knowledge contexts on the one hand and, on the other, the location of the people who hold the knowledge the individual is searching for. The latter capability, an expertise locator, is one of the more exciting areas of knowledge management. However, without a proper foundation (a knowledge map), expertise locators could end up like many personalisation efforts – where’s the benefit?
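The point about a shared foundation can be sketched as a single retrieval call that returns both documents and experts for the same category, with the experience dimension noted earlier included. The data, names and experience levels below are hypothetical examples, not a real directory.

```python
# Content and people indexed under the SAME taxonomy, as a knowledge
# map makes possible. All entries here are invented for illustration.
documents = {
    ("trading", "options"): ["Options settlement guide"],
}

people = [
    {"name": "c.lee", "category": ("trading", "options"), "experience": "expert"},
    {"name": "d.kim", "category": ("trading", "options"), "experience": "novice"},
]

def retrieve(category):
    """One query, two result sets: codified content plus the human
    experts registered under the same category (experienced only)."""
    docs = documents.get(category, [])
    experts = [p["name"] for p in people
               if p["category"] == category and p["experience"] == "expert"]
    return {"documents": docs, "experts": experts}

print(retrieve(("trading", "options")))
```

Because both result sets come from one categorisation schema, the expertise locator is not a separate project with its own set of categories; it is another view over the same map.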
Too often, projects like developing an expertise locator, an enhanced knowledge retrieval system or a collaboration platform are approached as ‘just another technology project’; or, even if they are recognised as belonging to knowledge management, they are seen as largely separate projects, with one set of categories for experts, one for information or documents and one for collaborative communities.
None of the three signature knowledge management projects will achieve their full potential until they are all integrated and rest squarely on the essential component of knowledge management: the knowledge map.
The strategic role of the knowledge map
The battle cry ‘It’s the culture, stupid!’ represented a major advance in KM. Instead of pouring millions of dollars into technology infrastructure and wondering why there was little payoff, companies began realising that culture was an essential ingredient. Some analysts claim that success in KM is 80 per cent culture-dependent.
While I agree the culture battle cry represented an advance, it is still not enough. Currently companies are being told to find the right balance between technology and culture, between the technical infrastructure and the organisational infrastructure. But this overlooks the most important infrastructure of all: the intellectual infrastructure. In other words, ‘It’s the knowledge, stupid!’
During our strategic planning phase we developed a three-tiered infrastructural model. The technical infrastructure includes the actual network with everything from central servers to the desktop. It also includes enterprise infrastructure elements such as a central database of employees. Most of this component was already in place and is needed whether or not a company is going to invest in KM.
The technical infrastructure also includes a powerful and customisable search technology which goes some way to developing and supporting a people-based search (expertise locators) and a collaboration platform.
The organisational infrastructure includes the cultural elements of KM as well as such practical questions as how to staff a KM department, where to locate the department in the hierarchy, which roles to bring into the KM team and how to plan for their integration into the other enterprises. These are the questions that have become the focus of much of the strategic thinking around KM, and we are learning how to adapt that thinking to our own situation.
Where we are having to develop new ideas and strategic plans is in the third tier of the infrastructure tripod: the intellectual infrastructure, which consists largely of the creation and maintenance of a knowledge map.
What is a knowledge map?
A knowledge map is the intellectual infrastructure for KM initiatives. The basis for one consists of multiple taxonomies for content repositories; dynamic categorisation of people, their expertise and the communities they belong to; and finally a set of taxonomies for the variety of tasks that are performed within and by the company’s communities.
These taxonomies of content, people and tasks then have to be mapped across the three tiers in order to provide a foundation for the integration of such KM enterprise projects as knowledge retrieval, covering both document-based knowledge and the tacit knowledge located within the minds of the company’s experts. The map is also the foundation for collaboration, for capturing the knowledge that is generated in collaborative communities, and for providing the framework within which knowledge facilitators or knowledge managers will operate as they provide services for those communities.
Perhaps the best way to describe a knowledge map is to describe how we are building ours.
The place to start is with a metadata standard. We started with the Dublin Core and then added some additional tags, such as content-type and audience. For both of these tags we are developing controlled vocabularies. The audience tag will be the basis for our personal and community taxonomies.
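A metadata record of the kind described might look like the sketch below: a subset of Dublin Core elements plus the two added tags, validated against controlled vocabularies. The vocabulary values and field contents are invented examples, not the actual standard in use.

```python
# Hypothetical controlled vocabularies for the two added tags.
CONTENT_TYPES = {"procedure", "policy", "faq", "training"}
AUDIENCES = {"retail", "institutional", "corp-admin", "it"}

record = {
    # A subset of standard Dublin Core elements.
    "dc:title": "Wire transfer procedure",
    "dc:creator": "Operations",
    "dc:subject": "accounts; wire-transfer",
    "dc:date": "2001-08-09",
    # The added tags, restricted to controlled-vocabulary values.
    "content-type": "procedure",
    "audience": "retail",
}

def validate(rec):
    """Reject values outside the controlled vocabularies, so the
    audience tag can reliably drive personal and community taxonomies."""
    assert rec["content-type"] in CONTENT_TYPES, "unknown content-type"
    assert rec["audience"] in AUDIENCES, "unknown audience"
    return True

print(validate(record))
```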
We are looking to expand our metadata standards in two ways. The first is by expanding the controlled vocabularies for audience and for subject or keywords. Once you have a rich enough set of values in your controlled vocabularies, the more difficult but much more rewarding task of creating semantic networks to capture the rich mosaic of relationships among those values can begin.
The second is to look at the new metadata approach offered by the Resource Description Framework (RDF). This technique can be described as an XML layer added to standard metadata. It is still an open question how much effort to put into implementing this standard. The alternative is to rely on the tacit knowledge of a KM team to enrich the current metadata standard.
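At its core, RDF describes resources as subject-predicate-object triples. The plain-Python sketch below illustrates just that model; a real deployment would use an RDF toolkit and proper URIs, and the identifiers here are invented.

```python
# Minimal illustration of the RDF triple model: every statement is a
# (subject, predicate, object) triple about a resource.
triples = [
    ("doc:proc-42", "dc:title", "Wire transfer procedure"),
    ("doc:proc-42", "schwab:audience", "retail"),      # hypothetical namespace
    ("doc:proc-42", "schwab:supersedes", "doc:proc-17"),
]

def describe(subject):
    """Collect everything asserted about one resource."""
    return {p: o for s, p, o in triples if s == subject}

print(describe("doc:proc-42"))
```

The attraction over flat metadata is that relationships between resources (such as the supersedes link above) become first-class statements that can themselves be queried.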
With the metadata standard in place, the next stage is building a browseable taxonomy of content on the corporate intranet. We call ours the Yellow Pages; it will not only be a project with an immediate payback in saving users’ and web developers’ time and effort, it is also the first step toward building a knowledge map.
To grow our Yellow Pages into a full-scale knowledge map we will need a variety of approaches. First, we need deeper and more dynamic categorisation of all our content. Then we have to develop the audience meta-tags into more complete personalisation and community descriptions.
To do this we are planning a three-pronged approach. The first is the hard-method track, which will rely on expanded search technology. There has been a recent flurry of automatic categorisation tools offered by search vendors and smaller specialty vendors. We are currently evaluating this product space with the idea that what is really needed is neither automatic categorisation (which is still too low quality) nor human categorisation (which is too costly), but rather what we are calling cyborg-categorisation.
Cyborg-categorisation is the fusion of automatic and human categorisation. It means that the automatic tools will be used by human categorisers to speed up their efforts and allow them to systematically and creatively explore the rich content and varied communities within our company.
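The division of labour in cyborg-categorisation can be sketched as follows: an automatic pass proposes categories and a human reviewer confirms, corrects or rejects each one. The keyword rules, taxonomy entries and reviewer callback here are invented stand-ins; real tools would use far more sophisticated classifiers.

```python
# A crude automatic pass: keyword matching against the taxonomy.
# (Rules and categories are hypothetical examples.)
RULES = {
    "margin": ("trading", "margin"),
    "wire": ("accounts", "wire-transfer"),
}

def auto_suggest(text):
    """Machine step: propose candidate categories for a document."""
    return [cat for kw, cat in RULES.items() if kw in text.lower()]

def cyborg_categorise(doc, human_review):
    """Machine proposes, human disposes: the reviewer callback returns
    the confirmed or corrected category, or None to reject it."""
    confirmed = []
    for suggestion in auto_suggest(doc):
        decision = human_review(doc, suggestion)
        if decision:
            confirmed.append(decision)
    return confirmed

# A stand-in reviewer that accepts every suggestion as-is.
accept_all = lambda doc, suggestion: suggestion

print(cyborg_categorise("Procedure for wire transfers", accept_all))
```

The point of the design is that the human effort is spent on judgment (the review callback) rather than on reading every document from scratch, which is where the speed-up comes from.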
The second method is to send knowledge engineers or analysts out into the field to interview knowledge workers and incorporate their findings into the knowledge map. These KEs will also begin the process of exploring how best to use the knowledge map within the various target groups.
The third method is to incorporate the results of the first two tracks into an enterprise content management and document management system. A content management platform allows a dynamic and distributed publishing procedure and provides for the transition from a website or department publishing model to a taxonomic publishing model. This means we can work with content providers to continue building the knowledge map that supports their content, by creating workflows that integrate metadata capture and the routing of content to both subject matter experts and knowledge analysts.
The role of content management and of a team of knowledge analysts and/or managers points to the last and most important aspect of a knowledge map, and to why it needs to be viewed as the third infrastructure leg rather than, as many authors treat it, as a project that is done at the beginning of KM and then slips into an easy maintenance mode.
A knowledge map is a shark: like a shark (or a relationship in a Woody Allen movie), it has to keep moving or it dies. It is an evergreen project that is never finished, and its continuing development and use is perhaps as significant a deciding factor in the success of KM as the cultural readiness of a corporation. It is not just that new information, people and products are constantly being developed and added to the system; the map also forms the framework for how knowledge is incorporated into all employees’ work.
Workers and content providers are ultimately the experts when it comes to the subject matter of their areas, but they are not skilled knowledge analysts and do not normally think of their content in terms of the category of knowledge it represents, the relationships between knowledge chunks or objects, or other KM components. Therefore, in order to keep the system moving and alive, the knowledge map and its utilisation in content management, collaboration and knowledge retrieval must be an essential part of your KM system.
Finally, the constant development and utilisation of your knowledge map by both employees and knowledge facilitators is an important part of the feedback loop that any good knowledge management system needs. For example, since KM is a self-referential initiative, knowledge managers need to be aware of the map and use it to capture the knowledge generated in their own projects, refining the knowledge map on the basis of actual experience.
In summary, for industries in the financial sector, as well as for most large companies outside the consulting sector, I would recommend, based on our experience and analysis, that they do not limit themselves to the choice between codification and personalisation. What they should do instead is devote resources to creating the third infrastructure leg of KM, the intellectual infrastructure, which will become the means for an integrated balance of codification and personalisation components. This intellectual infrastructure rests primarily on a living, breathing, evolving knowledge map that consists of dynamically categorised repositories of content, people and communities, as well as on the knowledge managers and facilitators who use the knowledge map and contribute to its continual evolution.
Tom Reamy is director of intellectual capital for Charles Schwab. He can be contacted at: email@example.com