posted 1 Sep 1999 in Volume 3 Issue 1
Chaos: ...the forbidden fruit?
This article addresses the issue of complexity in knowledge management. We, as human
beings, cannot give in to the temptation of adding fuel to the fire of chaos in
the technology revolution. The driving force behind KM activities to date has
been quality and productivity management. It is Thomas Stough's opinion that the
driving force of KM should be a simplified form of communication using available
PC and networking technologies.
Does anyone remember Hypertext? 'Been there... done that....'
I imagine that is what James Watt would say to me today if I could ask him what he thought of knowledge management (KM). Kevin Kelly wrote in his book 'Out of Control'1 that the information age began before the industrial age. Watt, to paraphrase Kelly, couldn't have done what he did without the systematic collection of tacit and explicit information. This refers, of course, to the blueprints, plans, lead-pencil-and-paper designs, human know-how, etc. which would govern the making of what is considered the first steam engine.
Before asking James Watt about knowledge management, I'm sure I would have to (somehow) define it, a task which is perhaps better left to others, whose varying responses can be confusing. For the purposes of this article, though, allow me to define KM thus:
Data is everywhere; information is everywhere and we know where it is; knowledge is making the two not just available but also useful. In short, allow me to refer to data-info-know when talking about KM. As we wade through the hype of KM and try to reconcile the funds being poured into it with results that fall short of expectations, perhaps the time has come to take a step back and consider what is going on. For some time now, corporate leadership has been interested in any phenomenon that will improve productivity. As if quality managers, productivity managers, etc. haven't tried this before! (Does anyone remember TQM?) But the expectation problem is two-fold: knowledge workers either promise too much or management believes the hype. Whatever the case, a lot is being spent and results to date are questionable. In considering this subject, and trying to figure out how far we've come, I have come up with a great scapegoat for our troubles: it's the technology.
The good news is, the PC has been born, exiting the confines of mother technology and her unwillingness to release the potential of what has been conceived. And just as parents wonder about their children's future, KM workers are still undecided about what will become of the PC, of technology, and of their relevance to our efforts. The bad news is, we have limited insight when considering where networking advances will allow the PC to go. Without lots of networking it's fairly safe to assume that KM will be, at best, another idea that had lots of potential. Of course, one good thing about technology is that it has a life of its own, which is why it continues to exist beyond all its questionable implementations. In fact, I would go out on a limb and even assume that we've become so advanced in projecting information on a screen and powering up the inner workings of CPUs that we've missed the boat on utilising it. Consider that the technology in an F1 racing car can be scaled down to improve my tyres or the suspension of my car (they may even someday put the anti-skid technology of aircraft brakes into cars); there must likewise be a better way to utilise some of the high-end technology that PCs put in our faces every day. Imagine: we're all driving around in four-zillion-megahertz desktop dumb terminals that, if you could put wheels on 'em, would blow the doors off a race car.
I'll try to return to earth. More specifically, let's consider a technology concept that was conceived many years ago and is still viable today, especially for KM. Does anyone remember hypertext? Think beyond the 'HT' in the HTTP we click every day. In short, it was a concept coined in the 1960s by Ted Nelson, a champion of alternative uses of technology, and it eventually led to the invention of the World Wide Web. Even twenty years before him, the scientist Vannevar Bush said: 'A record, if it is to be useful in science, must be continuously extended, it must be stored, and above all it must be consulted.'2
Without the foresight of men like Vannevar Bush and Ted Nelson, the PC might have remained the glorified typewriter and counting device we've all come to know so well.
Hypertext, as a concept, allows data-info-know not just to be accessed from all directions but also to be created from all directions. Further, the hypertext concept goes beyond linear, Newtonian thinking, which is itself a hurdle in the data-info-know process. Of course the best example of hypertext today is the internet. The internet is to communication what the automobile was to mobility, without which the world would not be what it is today. And if you're a believer, as I am, in what the internet allows individuals to do, then it's hard to understand why every organisation hasn't yet implemented it unconditionally as a corporate communication medium. Hypertext is a facilitator of communication; PCs and networking are our tools. Somehow it shouldn't be as complicated as it is today. I always liked Sun's marketing slogan 'the network is the computer', which is more a philosophy than a slogan. Without the networking capability of PCs it is hard to imagine even attempting to manage the vast amounts of data-info-know being created. This can be seen in the results of most internetworked organisations, which are limited to the data-info part of my definition, albeit behind the façade of managed knowledge. Consider ERP and intelligent terminals enslaved within the confines of struggling information users and producers. The backbone of KM is the ability of users and producers to make what they know available and distributed. This can only be achieved in an environment that makes the network and the PCs readily available, not in a top-down, terminal-driven process. And with all the technology tools available to us today, there is nothing out there to allow the struggling users to 'publish' what they know or have produced, even if the proper environment existed.
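The hypertext idea described above can be sketched as a tiny data structure. This is only an illustrative sketch (the Node class, method names and example titles are my own invention, not anything Nelson or the article specifies): content nodes joined by named links that can be followed and created in any direction, including cycles that linear text cannot express.

```python
# A minimal sketch of hypertext as a data structure: nodes of content
# joined by named links. Any node can link to any other, and links can
# be added from any direction -- there is no fixed linear order.

class Node:
    def __init__(self, title, body):
        self.title = title
        self.body = body
        self.links = {}          # anchor text -> target Node

    def link(self, anchor, target):
        """Create a link from this node to another, at any time."""
        self.links[anchor] = target

watt = Node("James Watt", "Built the first steam engine from collected know-how.")
km = Node("Knowledge Management", "Making data-info available and useful.")
hypertext = Node("Hypertext", "Non-linear text, a concept coined by Ted Nelson.")

# Links can be created from all directions, not just top-down:
km.link("pioneer", watt)
km.link("enabling concept", hypertext)
hypertext.link("applied in", km)     # a cycle -- impossible in purely linear text

# Reading is just traversal along whichever links interest you:
assert km.links["enabling concept"].links["applied in"] is km
```

The point of the sketch is that publishing is symmetric with reading: any user or producer holding a node can add a link outward, which is exactly the distributed, non-top-down environment the paragraph above calls for.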
When considering ERP (and the products sold in this environment), I can't help but see a similarity with Microsoft, which amounts to: big, restricting, steep learning curve. This isn't necessarily bad, if you're willing or obliged to overcome the learning curve. It's simply not efficient. And with all the advances in technology, it is astonishing how things like learning curves and restrictions have become the priorities in solutions. Many could argue that complacency, or the misdirection of IT efforts in general up to now, has cost us not just as KM workers but also as implementers of ERP, BPR, whatever. In fact, where are the significant advances in word processors, databases, R/3, etc. over the years? And if there were advances, where are the results? No one can really understand all of these products anymore anyway, let alone figure out how to utilise them to capacity. At least they're all fancy and require lots of hardware and updates, which in turn keeps things going, I guess. Complexity is exactly what keeps KM from taking that next step.
When you talk about products or solutions for business, you are also forced to consider security, the issue of protecting highly sensitive information, a subject which is scarier than criticising ERP or Microsoft. This is because makers of software and solutions don't have consistent processes to look back on, as was the case in manufacturing mass mobility. There is no way to know what is right or wrong, and information processing has yet to cause great bodily harm anywhere (as far as I know). And because every company implementing an IT solution believes it has the most sensitive and world-moving information locked up in its closets and desks, the makers of solutions (as service-oriented as they all are!) offer highly complex products to meet confused demands. Don't misunderstand: I'm not promoting free access to internal corporate information. To me the issue of security is simple: if your organisation has information so sensitive that it cannot be distributed, then it shouldn't be in a distributed, shared knowledge environment. The value of knowledge is in its distribution and accessibility. It must be obvious why so many efforts up to now have failed.
Perhaps the only difference between today and when Watt was making his steam engine is 'time to market.' The time between conceptualising mobility and realising mass mobility is ages compared to the time it has taken to internetwork the (corporate) world. But the key factor here is that mobility, relatively speaking, works well. Does this mean the corporate world has to wait 50 or 100 years to get internetworking right, as was the case (if you consider quality, security, services, etc. as metrics) in making automobiles? Or is the question: is the success of mass mobility a result of proven processes in manufacturing? Consider the fairly smooth transfer from human manufacturing to robotic manufacturing. How easy it must have been, since the robots were only required to emulate what humans had always done. Systematising the results of so much internetworking, though, has not been a success, because such processes have yet to be created. In the meantime the corporate world is trying to catch up by continuing to implement ERP, or the like. In a way, selling ERP is like trying to sell James Watt a CAD system before there were monitors.
Computing has freed us from miscellaneous tasks and brought colour and moving pictures to telephony. But where is the value of high-end, pumped-up terminals if they cannot facilitate knowledge? And if they could facilitate knowledge, what about the environment required to disseminate it? Many KM scholars think the answer lies in other complicated things like change management. I disagree. If you approach KM within the confines of an established Newtonian organisation, then change management is, in the worst case, necessary, and in the best case, hard but manageable (and always expensive).
Not every organisation has the resources, and so they settle for the easy compromise: buy an application, implement it, collect knowledge and hope for the best. More importantly, it has to be understood that in today's competitive environment, distributed information has more value than non-distributed information. And value, at this point in our efforts to implement KM, is key. The concept behind hypertext is the link that we in KM are seeking, because it is a simple facilitator of networking. As we approach the millennium, the buzzwords are globalisation, openness, distribution, etc. Secretive information, in a world of distributed networking, has no value.
Thomas Stough is a Knowledge Manager within Software AG in Germany. He can be contacted at: firstname.lastname@example.org
1 Kevin Kelly (1994), Out of Control: The New Biology of Machines, Social Systems, and the Economic World, Fourth Estate.
2 Bush, V. (1945), 'As We May Think', The Atlantic Monthly (reprinted at http://www.ps.uni-sb.de/~duchier/pub/vbush/vbush.shtml).