posted 10 Oct 2001 in Volume 5 Issue 3
It’s not rocket science
Why knowledge management is so much harder
Initiatives that rely to any extent on technology usually raise a host of unexpected difficulties. Pat Shafer presents a study of the cultural, behavioural and political complexities of knowledge management, drawing on empirical research, case studies and lessons learned from benchmarking and client engagements.
Rocket science is exactly that – a science. Predictable and repeatable, albeit complex, things go more or less as planned with admirable regularity. Not so with technology initiatives. Things usually don’t go as planned.
Depending on the study you read, IT initiatives fail to deliver expected returns to the business somewhere between 65 and 80 per cent of the time. A recent review of technology initiatives caused one company to pull the plug on 35 per cent of its then-current initiatives, implying zero ROI for those investments. While these studies often fail to point out why IT initiatives fail, our experience indicates that it’s usually not the technology. Techies blame user ignorance and users blame poor usability, but failure occurs partially because KM as a technology-enabled practice is in its infancy, and partially because it is more complex than many initially assume. Mostly, however, it is because knowledge management is not about technology but about human behaviour.
What laws govern human behaviour? Precious few. While we do not approach knowledge management as behavioural scientists, we do observe what works and what doesn’t as part of our benchmarking and performance measurement efforts. We can safely assume that incentives will direct behaviour. Yet many KM initiatives assume that employees will share knowledge for the good of the company. One example we frequently run into is in the field of R&D, where physicians and scientists prefer to hoard information, expecting to publish in the Journal of the American Medical Association or be credited with a patent rather than share information with colleagues. Also, adding to and leveraging knowledge is often seen as time-consuming and in conflict with productivity. Workers often question the time saved by searching for and re-using information, and are rarely willing to go the extra mile to make contributions.
Lack of a coherent strategy is altogether too common. When pressed to recite the strategy for an intranet or KM initiative, its stewards are quick to point out that there is no formal strategy other than to increase usage. And of course, unless usage is tied to some form of value creation, increased usage may equate to increased distraction from higher-value activities. One of my colleagues has often referred to one popular measure – hits – as “how idiots track success”.
Among the strategies we have observed some have stood out as best practices simply because they tie directly into building shareholder value:
- Grow revenues;
- Position the firm as a thought leader;
- Reduce the costs of poor quality;
- Improve quality and timeliness of delivery;
- Capture the experience of a retiring workforce.
The big picture
Technology initiatives focus on technology. But knowledge management, like many technology initiatives, is about empowering humans. When we look at the layers of a technology initiative, we see that most requirements documents focus on the lower layers of the application spectrum: the legacy environment; infrastructure; the applications themselves; and, increasingly, workflow. We won’t address these dimensions in this article. What most requirements documents fail to address are the equally critical and more elusive elements of governance, support and culture. These will be the focus of this paper.
Perhaps the single most critical element of success lies in understanding the value propositions of those who are expected to contribute to, manage or apply knowledge. It is often assumed that if someone is a ‘team player’ then they will submit reports and work product to a KM database. However, productivity is steadily increasing, and workers generally report having less time to devote to anything but mission-critical activities. This means that workers are far more likely to contribute to a client deliverable or a proposal before they will work to build the knowledge base or participate in other activities with a less direct or less immediate benefit. In almost all cases, workers will default to addressing needs that have a direct, short-term impact on profitability. In many cases, indirect activities will adversely impact compensation.
So how do companies align KM with users’ value propositions? Most would do far better if they would simply ask (and listen). Often companies can position KM as a critical part of job descriptions, align contributions to the knowledge base with reward and recognition, or simply promote it as a proven way to reduce sales and delivery cycle times – which translates into more productivity and revenue.
Virtually all of the successful initiatives we’ve tracked include a proactive approach to ‘incentivising’ KM programmes, some being more proactive than others. These incentives range from including KM activities as a critical part of performance reviews to direct cash rewards. Some examples include:
- An accounting firm’s tax practice developed a knowledge base to share ideas that would open doors and lead to new business. The ideas were well researched and tied to specific industries or tax situations. The firm published a few key ideas each month, with sizeable, predictable bonuses awarded to those whose submitted ideas were published;
- A technology consultancy has evolved a financial reward system based on monthly top performers in terms of KM-related value creation. The original idea was to write a bonus cheque for $3,000 to the month’s top contributor. Unfortunately, in a land of techies, several wrote macros the day the programme was announced and downloaded volumes of information from the internet, crashing the company’s network. Now a sophisticated approach to asset management measures the value of an asset when it is submitted, in terms of economic value-added (EVA). Value is also measured and attributed when an asset is ‘checked out’ and applied to a deliverable or sales effort, reducing the cost or improving the quality of the deliverable;
- Another benchmarked company, Siemens, demonstrated a ‘frequent flyer’ approach to incentives. Contributors to the knowledge base earn ‘mileage’ simply for submitting an asset. However, they stand to gain far more miles from the informal peer review process. Users can award points to the originator, in theory based on the quality and relevance of the contribution. Points are also awarded for more empirical performance, such as frequency of access. Participants can turn in their points for rewards such as merchandise or travel;
- Recognition is subtle. We have seen very few instances of praise or direct mention within knowledge management initiatives (this is usually the province of employee newsletters). Instead, contributors settle for the byline attached to assets, or perhaps the increased frequency with which they turn up in the results of skills or knowledge search queries;
- One of the more mature approaches (and one that takes a long time to foster within an enterprise) is IBM’s approach of weaving knowledge management into its performance management process. Sharing and leveraging knowledge is a key competency, and employees are evaluated on their participation both in periodic performance evaluations and in post-engagement assessments. While there is no direct cash bonus, one’s track record of sharing and applying institutional knowledge can impact pay and promotions.
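The points-based schemes described above can be sketched in a few lines of code. The following is a minimal, hypothetical ledger in the spirit of the ‘frequent flyer’ model; the event names, point values and redemption threshold are invented for illustration, not taken from the benchmarked firms.

```python
from collections import defaultdict

# Illustrative point values (assumptions, not the benchmarked firm's rules).
POINTS = {
    "submit": 10,       # flat award for contributing an asset
    "peer_review": 25,  # awarded by peers for quality and relevance
    "access": 1,        # empirical award per retrieval of the asset
}

class IncentiveLedger:
    def __init__(self):
        self.balances = defaultdict(int)

    def record(self, contributor, event, count=1):
        """Credit a contributor for a KM event, scaled by occurrence count."""
        if event not in POINTS:
            raise ValueError(f"unknown event: {event}")
        self.balances[contributor] += POINTS[event] * count

    def redeemable(self, contributor, threshold=100):
        """Contributors above the threshold can trade points for rewards."""
        return self.balances[contributor] >= threshold

ledger = IncentiveLedger()
ledger.record("j.smith", "submit")           # 10 points for the contribution
ledger.record("j.smith", "peer_review", 2)   # +50 from two peer awards
ledger.record("j.smith", "access", 45)       # +45 from 45 retrievals
print(ledger.balances["j.smith"])            # 105
print(ledger.redeemable("j.smith"))          # True
```

Note that most of the balance comes from peer review and access rather than the submission itself, mirroring the observation that quality and re-use earn ‘far more miles’ than mere contribution.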
Many KM initiatives are marked by a failure to understand how audiences prefer to access knowledge bases. Leading KM practitioners would all agree that it’s important to place KM in the critical work path. This means that contributors should not have to perform extra work or use ‘swivel chair’ technology, turning from one system to another to execute and then record work product. Some leading professional services firms have developed workflow applications that not only facilitate the thorough and accurate development of products, but also capture all input to build a comprehensive record of the work for audit or for reapplication, as appropriate, in another engagement.
At an extreme, companies are succeeding or failing by their ability to understand who their key audiences are and how they might connect to a knowledge system. Certainly most initiatives are web-based, which works well when users are connected, preferably via a high-speed line. But many professional services firms deploy their practitioners in the field, where they may enjoy at best periodic connection via dial-up. This has led many companies to maintain systems such as Lotus Notes that allow asynchronous use through replication of databases to local drives.
As mentioned in one benchmarking event, SAP’s competitive intelligence group recognises that knowledge presents itself at inopportune moments, such as when traversing a trade show floor or leaving a customer’s office. In response, it has created telephone hotlines – voice mailboxes that salespeople and practitioners can call to leave bits of information. While at the time the organisation had not made use of voice-recognition software, the currency and relevance of the information made the cost of manual transcription worthwhile. Other companies have established a pool of knowledge associates who use web-based KM systems to answer queries phoned in by professionals in the field.
Many companies understand that leadership needs to support KM initiatives if these are to succeed. But leadership support has varying impact and will not compensate for deficiencies in other areas of supporting the initiative. Leaders will be inclined to include messages about the importance of knowledge management in newsletters or in daily intranet columns. They are far less likely to be involved in a hands-on sense, consumed as they are by issues that face outward to the board, shareholders, analysts and the public. Besides, people’s behaviour is more strongly influenced by those who have a direct impact on their pay cheque.
Another approach to aligning knowledge management and leadership is to invert the relationship and position KM with key leadership initiatives. One professional services CEO announces three strategic themes each year. The knowledge management group aligns its initiative and its categorisation along those themes. By developing a matrixed approach to categorisation that features thematic as well as traditional navigational access to information, the group is able to support the leadership messages, optimise the relevance of information and provide ongoing access to information even when the leadership mandates change with the next year’s foci.
Political realities often seem insurmountable, though there is a lot that can be done to navigate them. Two client situations illustrate how the creative application of ‘expert locator’ technologies has raised red flags.
The first is a law firm that is exploring a software solution that reads e-mail and, based upon the frequency of keywords, identifies individuals as experts in relevant competencies within the enterprise. The technology is quite clever and elegant. Business rules can narrow the scope and refine interpretation to address privacy and search quality issues. But before agreeing to implement the solution, the partners need assurance that their privacy won’t be compromised and that there are no unintended ramifications. One approach to navigating such a situation is to clearly define policies and guidelines, limit the scope to future e-mails, and publish the business rules so that employees understand how the technology is applied before their messages are subject to review.
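To make the keyword-frequency idea concrete, here is a rough sketch of such an expert locator. It is not the vendor’s actual algorithm: the competency terms, the minimum-mention threshold, and the data shape are all assumptions made for illustration.

```python
import re
from collections import Counter, defaultdict

# Hypothetical competency lexicons; a real deployment would publish these
# as part of the business rules so employees know what is being matched.
COMPETENCIES = {
    "securitisation": {"securitisation", "tranche", "spv"},
    "antitrust": {"antitrust", "merger", "hhi"},
}

def score_authors(emails, min_mentions=3):
    """Map each author to the competencies their e-mails mention often.

    `emails` is a list of (author, body) pairs. A business rule could
    narrow scope further, e.g. to messages sent after a policy cut-off.
    """
    counts = defaultdict(Counter)
    for author, body in emails:
        words = re.findall(r"[a-z]+", body.lower())
        for topic, terms in COMPETENCIES.items():
            counts[author][topic] += sum(1 for w in words if w in terms)
    # Only surface authors as 'experts' above the mention threshold.
    return {
        author: [t for t, n in topics.items() if n >= min_mentions]
        for author, topics in counts.items()
    }

emails = [
    ("partner_a", "The SPV holds the senior tranche of the securitisation."),
    ("partner_a", "Revised tranche pricing for the securitisation deal."),
    ("partner_b", "Lunch on Friday?"),
]
print(score_authors(emails))
```

Even this toy version shows why publishing the rules matters: the threshold and term lists entirely determine who is labelled an expert, so employees can only judge the fairness of the system if those parameters are visible.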
Similarly, at a major manufacturer, the quality group had planned a robust application that would combine curriculum management with online training, career development and a skill locator driven by certifications and educational credits. While the training and career development raised few concerns, the skill locator was construed by the workers’ representatives as a threat. To overcome the privacy issues, some companies have moved to opt-in systems that allow individual employees to choose to participate in this type of expert locator system.
The practice of applying usability concepts to support technical initiatives is coming of age, perhaps because users so often protest that new applications are a step backwards. Sometimes these complaints can simply be dismissed as resistance to change; too often, they are justified.
Usability testing evaluates applications and interfaces against several criteria including:
- Alignment with business goals and processes;
- User control, freedom and satisfaction;
- Consistency with standards;
- Information and navigation accuracy and reliability;
- Efficiency and flexibility;
- User recognition and content categorisation.
The approaches to testing include laboratory evaluation, field observation, and quantitative measurement and analysis. Lab-based expert evaluation affords evaluation against standards and common criteria, task-oriented evaluation and comparison with industry best practices. Field observation allows testers to score usability within the context of work processes and the broader work environment. In addition to ‘over-the-shoulder’ evaluation, testers can use interviews and focus groups, as well as questionnaires and online surveys.
Quantitative measurement and analysis leverages systems-based testing, which makes it easier to evaluate enterprise applications and establish a performance benchmark. Such systems can also provide continuous tracking to inform future usability improvements and address broader workflow issues.
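As a sketch of what such systems-based measurement might compute, the snippet below derives two common usability benchmarks, task completion rate and mean time-on-task, from instrumented task logs. The log format and task names are hypothetical.

```python
from statistics import mean

# Hypothetical instrumented task logs: (task, completed, seconds_taken).
logs = [
    ("find_policy", True, 42.0),
    ("find_policy", False, 180.0),   # user gave up after three minutes
    ("submit_asset", True, 65.0),
    ("find_policy", True, 38.0),
]

def benchmark(task_logs):
    """Completion rate across all attempts, and mean time for successes."""
    done = [t for t in task_logs if t[1]]
    rate = len(done) / len(task_logs)
    avg_time = mean(t[2] for t in done)
    return rate, avg_time

rate, avg_time = benchmark(logs)
print(f"completion {rate:.0%}, mean time {avg_time:.0f}s")
```

Tracked continuously, a falling completion rate or rising time-on-task for a given task can flag the design or taxonomy problems discussed above before users resort to the help desk.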
Before web-based systems, integrated workflow and the application of usability principles, training was critical to the success of knowledge management systems. Instruction ranged from presenting the leadership message and value propositions down to how to enter or find information. One company launched its KM initiative only to achieve 13 per cent participation – a huge hit on ROI. After implementing classroom and online multimedia training, participation rose to a then-acceptable 68 per cent.
These days, workers are familiar with technology, and KM systems are leveraging improved workflow tools and user interfaces. Training on how to use a system is less relevant. However, the knowledge systems can themselves extend to subject matter training. As integration and bandwidth improve, KM will play a broader role in workforce readiness, bringing not only data and information to the user, but also the tacit knowledge previously reserved for more formal training environments.
Most companies have technical help desks to assist users with purely technical and, in some cases, application issues. A help desk for KM is used in a different way and requires different skills. KM help desks are often not part of the technology organisation, but are associated with lines of business or the enterprise knowledge organisation.
Questions to the KM help desk are rarely technical. They more often centre on how to find information (a result of insufficient training or, more likely, poor design or taxonomy). Leading organisations use help desk staff to support workers who have limited access to the KM application, or who have more complex research needs that require pursuing multiple sources and complex analysis. In some cases, the help desk will undertake complete research projects on behalf of the business units.
One reason that users fail to embrace applications is that they simply don’t know enough (sometimes anything) about them. As such, knowledge management programmes should be supported by a communications strategy that defines appropriate, unique value propositions, core messages and preferred venues for all stakeholders and communities of interest.
Information architecture and taxonomy
There is little agreement on the definition of a taxonomy, yet most practitioners agree that failure to address taxonomy needs undermines the ability to effectively categorise (and therefore retrieve) data and information.
The role of a taxonomy within an enterprise is to:
- Enable navigation throughout all enterprise and external information stores;
- Enable search throughout the above venues;
- Allow producers to efficiently categorise assets;
- Define an enterprise-wide lexicon (where possible).
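The roles listed above can be illustrated with a minimal taxonomy sketch: a tree of categories where each node carries synonyms for the shared lexicon, so that both navigation and search resolve to the same category. The category names and synonyms here are hypothetical.

```python
# Minimal sketch of an enterprise taxonomy node; category labels and
# synonyms are invented for illustration.
class TaxonomyNode:
    def __init__(self, label, synonyms=()):
        self.label = label
        self.synonyms = set(synonyms)   # enterprise lexicon: alternate terms
        self.children = []              # sub-categories for navigation

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, term):
        """Resolve a search term to a category via label or lexicon."""
        term = term.lower()
        if term == self.label.lower() or term in self.synonyms:
            return self
        for child in self.children:
            hit = child.find(term)
            if hit:
                return hit
        return None

root = TaxonomyNode("Knowledge Base")
tax = root.add(TaxonomyNode("Tax", synonyms={"taxation"}))
tax.add(TaxonomyNode("Transfer Pricing", synonyms={"tp"}))
print(root.find("taxation").label)   # Tax
print(root.find("tp").label)         # Transfer Pricing
```

Producers categorise an asset by attaching it to a node; because search terms and navigation paths resolve to the same nodes, contributors and searchers share one lexicon rather than guessing at each other’s vocabulary.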
Developing a taxonomy is challenging because it must adapt to the changing environments presented by mergers and acquisitions, new management initiatives, new lexicons and the introduction of new products and services. Developing an enterprise information architecture is a broader and more complex issue: it includes the development of the taxonomy, but also addresses the business and technical dimensions that are necessary to enable knowledge applications.
Governance describes the oversight activities that measure and inform the business performance of technology initiatives. Until now, businesses have been spending aggressively on technology to remain competitive. In the post-dotcom era, CFOs and other leaders will demand a new level of accountability to performance promises. Three key approaches to optimising performance include:
- Establishing policies and guidelines;
- Developing sound organisational partnerships;
- Implementing a performance measurement programme.
Policies and guidelines
The development of policies and guidelines increases overall productivity and reduces cost and time-to-market by eliminating the non-value-added decision making attributable to much application development. By electing to define standards for issues such as data structures, technologies, authorisation and security, and usability and navigation, the enterprise allows its communities to focus on high-value content that drives business performance. Standards also allow simplified maintenance and support, as well as the promise that stakeholders can develop an enterprise-wide perspective using consistent data and information.
An example of best practice is PricewaterhouseCoopers’ KnowledgeCurve Roadmap, which provides standards for navigation, technology and design, as well as templates for developing, launching and maintaining sites.
Organisation and content management
Content management describes the software or functionality that allows organisations to automate content publishing and maintenance workflow. But before there can be content management, there must be a definition of the ownership, responsibility and workflow upon which the content management rules are based.
Organisation and alliances define the optimal role and make-up of the core application team, as well as its relationships with functional groups such as sales, marketing, legal and HR. This eliminates the delay, conflict and opportunity for error introduced by sharing responsibilities for content and functionality without clear definition of ownership and obligation.
Typically, knowledge management initiatives must have representation from multiple groups, regardless of whether or not the application is accessible by employees and other constituencies outside the firewall. The usual participants include:
- Lines of business;
- Legacy companies (in a merger environment);
- Corporate communications/IR;
- Sales and marketing.
Perhaps the most important but least implemented aspect of any KM application, performance measurement identifies the measures and ‘sensors’ used to evaluate strategic, financial and operational success and drive continuous improvement. How do you know if your initiative is contributing as expected if you are not measuring success against the criteria originally used to evaluate and approve the expenditure? The answer is: you can’t.
Performance measurement provides the framework for qualitatively and quantitatively evaluating the success of the programme and each of its components and sub-components. This includes financial measures such as ROI. Performance measurement is the key to accountability and ongoing success.
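Measuring against the original approval criteria can be as simple as the sketch below, which compares the ROI promised in a (hypothetical) business case with the ROI actually realised; all figures are invented for illustration.

```python
def roi(benefit, cost):
    """Return on investment as a fraction of cost."""
    return (benefit - cost) / cost

# Criteria used to approve the expenditure (hypothetical business case).
approved_case = {"cost": 400_000, "expected_benefit": 700_000}

# Outcomes fed back by the performance measurement programme.
measured = {"cost": 450_000, "realised_benefit": 520_000}

expected = roi(approved_case["expected_benefit"], approved_case["cost"])
actual = roi(measured["realised_benefit"], measured["cost"])
print(f"expected ROI {expected:.0%}, actual ROI {actual:.0%}")
```

The gap between the two numbers is the accountability signal: a programme tracking only usage would never see it, while one tracking cost and realised benefit can act on it.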
An example of the successful implementation of performance measurement is the pharmaceutical example mentioned at the beginning of this article. The clinical trials extranet was initially focused on providing a place for doctors to publish their findings. When the metrics indicated little interest in publishing, and a survey confirmed a desire to review past studies rather than publish new content, the site was redirected. Once the site was aligned with physicians’ needs, usage and the desired business results took a turn for the better.
Pat Shafer is practice leader, Strategic Services, at Sage Information Consultants. He can be contacted at: firstname.lastname@example.org