posted 16 Apr 2002 in Volume 5 Issue 7
The ESD toolkit
A potlatch approach to knowledge sharing
Faced with the target of delivering all of their services online by 2005, local councils in the UK very quickly came to realise the enormity of the task ahead of them. Tim Anderson describes how the ESD toolkit, a system based on ‘potlatch’ economics, provided local government organisations with a means to pool their efforts and benefit from the experiences of their peers.
There are (at least) two things you need in order to effectively share knowledge: a structure and a process. Over the past year, and through working with the Improvement and Development Agency’s (IDeA) electronic service delivery (ESD) toolkit group, we have tackled both issues.
The process was something we developed right at the start. The driver for setting up the toolkit was the sudden panic that spread among local councils when we were first faced with national e-government targets. We were presented with the aim of 100 per cent electronic service delivery by 2005, without us having any clear idea of what was meant by ‘electronic’, ‘service’ or indeed ‘delivery’. As further guidelines emerged from the Department of Transport, Local Government and the Regions (DTLR), our initial questions were (sort of) answered, but were quickly replaced by new ones.
I should explain that because local government is funded by local taxes and central government grants for core services, we tend to manage budgets on a ‘last year’s budget, plus or minus’ basis, rather than looking at the performance of individual ‘products’. We do measure performance, but either in isolation from cost or using a very high-level overview of costs. Targets tend to be ‘spending on X per head of population’ not ‘cost of X per unit’. By and large, we also only measure those things we have to report to central government. Because of this, and because the range of services we provide has been built up by a hundred-odd years of individual bits of legislation, there exists no definitive list of services as provided by the different types of authority, let alone one against which we are able to measure volume and cost on a regular basis.
Yet government directives meant we were now faced with deciding which services could be delivered electronically, before writing our IEG statements (all local authorities were asked to submit an ‘implementing electronic government’ statement to the DTLR) and then saying how we were doing against the targets that had been set using Best Value Performance Indicator (BVPI) 157 (the indicator against which local authorities were to report progress on e-government compliance). In circumstances like this, the first thing we would normally do is phone up the Local Government Association and the IDeA and ask for help. But as there are over 600 councils all wrestling with the same problems, a more structured way of dealing with these questions was obviously necessary. Our second response would be to say: “Council Y has been leading on e-government – let’s ask them how they did it.” As a result, the e-government department at council Y would be swamped with phone calls; again, far from an ideal situation.
The solution was the ESD toolkit. The toolkit works by encouraging councils that have made a start on the provision of electronic services to share their methods and results with other councils via the IDeA website and through a series of seminars, CDs, and so on. Payback for participants comes, first of all, from getting your hands on things others have done but you haven’t, and from exchanging views on how to tackle the problems no-one has yet solved.
This is where the potlatch economy comes in. Potlatch is a system that will be familiar to all students of social anthropology. It is essentially an inversion of the normal economic status model: instead of being given prestige according to how much you accumulate, you get it for how much you give away. This does, of course, amount to the same thing, as while you give things away to others, they in turn give things to you. Like the traditional western economic model, it keeps goods in circulation and reinforces existing status – those with the most can afford to give away the most, creating a sense of indebtedness in others, who therefore feel obliged to give things back, thus maintaining the status quo. The key difference lies in the culture that rewards giving things away rather than holding them privately. If we are to encourage corporate or inter-agency knowledge management, this is exactly the type of culture we need to foster.
We are all aware of the idea that knowledge is power, but a newer spin, promoted by network economy gurus like Kevin Kelly, is that information is the only asset you can give away and still have; you can have your cake and eat it, so to speak. The argument is that the more knowledge there is around, the richer we all are. Each of us currently only has a section of the picture, but if we see the parts held by everyone else, then we all own the whole picture. In the socio-political sphere, this is the concept of reciprocity: there is a constant exchange and negotiation of the definition and ‘value’ of elements of civic life that we all own. The fewer the people who are excluded from that exchange, the stronger the society is, and the less likely it is to suffer civic disturbance at the hands of those who are excluded. None of this sits well in local government (and even less in central government), where structures are still often hierarchical and based on ‘expertise’: the more you know, the higher in the structure you sit (at least in theory).
There is also an accepted wisdom that you cannot grow the size of your market in government – after all, it already includes everybody. Yet this is a fallacy, as our one-stop shop in Downham Market, Norfolk demonstrates. One new building brought together a (county council) library, (district council) housing and planning departments, a (voluntary sector) Citizens Advice Bureau and a (further education college) learning centre. Footfall went up dramatically in each department. People who were on benefit had not tended to go to libraries or learning centres but now saw new opportunities open up to them, and many dropped in out of curiosity after a benefits query. Older people often didn’t want to go to benefit offices, primarily because of the stigma commonly attached to them, and as such often missed out on help that was rightfully theirs. The one-stop shop meant they could more easily discover the various allowances that were available to them. Similarly, people who used the free internet terminals in the library were often tempted into moving onto more formal training courses to enhance their job and leisure opportunities. Each agency thus increased their market penetration.
But back to the ESD toolkit. We had a starting point from which we could develop our list of services. A project started by the IDeA’s predecessor, the Local Government Management Board, had created the so-called ‘Go with the flow’ charts. These identified a hierarchical list of generic services and processes, together with the data that supported them and the client groups they were aimed at. The problem was that these were generic (and each council calls the same thing by different names, as well as having different ways of structuring which departments do what) and extremely complex. They were also not in a format many councils could easily use, which meant they could not easily customise the charts to suit their own purposes.
Some toolkit members, however, took the list of services and turned it into a spreadsheet. Thameside is widely acknowledged to have come up with the most useful early version, customised to their area but also including categories relating to e-government compliance. Different authorities also conducted various data collection exercises to see where they stood against BVPI 157. Others started early in sorting out e-government policies and structures.
All of this information was subsequently compiled and offered to the entire local government community via the web and regional road shows. The core group – largely consisting of those councils that had done the work that was shared – also identified other bits of work that were needed and farmed that out among themselves or to other willing volunteers, to assist other councils that were racing to write their IEG statements. The database of services has also been linked to process mapping that was done under the Life Episode Access Project (Leap) and is now being examined by the Audit Commission and the Best Value Inspectorate to see if it can form a definitive set of lists of services for unitaries, counties and districts. A web-based version is also being considered. The reason all of this has worked is that people can see the value in giving information away, because they ultimately get back even more: the potlatch culture.
So the process seemed to work, but what about the framework? One of the most common complaints about the internet revolution is that people are left having to cope with too much information. What this actually means is that they cannot filter the information they want from all of the information they don’t want (yet may want later). The problem is not quantity or quality of information, but its location. The solution lies in a combination of improved structures and better search tools.
We could use that much-favoured analogy, Borges’ Library of Babel. Borges’ library had not only every book that existed, but every book that could exist, including one that was entirely blank pages and one that was just the letter X written over and over again. If Borges’ library were real, we would probably use two tools to locate the books we wanted: the good old Dewey Decimal System and the expertise of the librarians at hand. Dewey is the structure and the librarian is the search tool.
Push the metaphor a bit further and two key issues become clearer. The first is that a search tool is useless without a structure. The librarian needs to know how the books are organised on the shelf in order to find them. A search engine can yield the right results only because it has pre-indexed the documents and catalogued them according to whatever system it uses: key words, Boolean logic, and so on. The second is that the actual structure is not as important as the fact that one exists at all. Dewey is an arbitrary classification based on the way one man’s mind worked. It is more useful than the Two Ronnies’ method of stacking books according to size and colour (a system that was, interestingly, used for a brief period in Norwich’s new Millennium Library) but there are many other equally good systems that could be devised.
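The point that a search tool depends on a pre-built structure can be made concrete with a minimal inverted index sketch. The document names and contents below are invented for illustration; a real search engine does the same thing at vastly greater scale.

```python
# A minimal inverted index: the "structure" that a search tool relies on.
# Document names and contents are invented for illustration.
from collections import defaultdict

documents = {
    "iegstatement": "electronic service delivery targets for 2005",
    "bvpi157": "performance indicator for electronic service delivery",
    "leap": "process mapping for life episode access",
}

def build_index(docs):
    """Map each word to the set of documents containing it."""
    index = defaultdict(set)
    for name, text in docs.items():
        for word in text.split():
            index[word].add(name)
    return index

def search(index, *words):
    """Boolean AND search: documents containing every query word."""
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

index = build_index(documents)
print(sorted(search(index, "electronic", "delivery")))
# prints ['bvpi157', 'iegstatement']
```

Without the index (the Dewey), the search function (the librarian) would have nothing to consult; with it, any number of query strategies can be layered on top.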
In the e-government context, the structure was the ‘Go with the flow’ list of services, as amended by Thameside’s channel icons, the DTLR’s list of transaction types plus Leap’s process mapping methodology. This provided a spreadsheet that could be used to capture e-government compliance, once the generic service list had been customised.
In Norfolk we have taken this initial spreadsheet and used it as the basis for a data collection and knowledge management exercise, which has recently got underway. The spreadsheet was identified as the starting point, as it provided a point of reference. Although other frames of reference could have been used, we decided the spreadsheet was probably the best as it was both reasonably generic (and therefore easier to benchmark against) and would stay the same even if our organisational structure were changed.
The first step for us now is to identify which service unit provides each service, and which department or division that unit fits into. The next important element to capture is volume – not just how many units of service, but how many units of different types of transactions via different channels for that service are delivered. The different transaction types tell us what we need to ‘e-enable’ for each service, while the volume per channel helps us decide our priorities for action.
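The volume-per-channel capture described above can be sketched as a simple data structure and prioritisation rule. The services, channel names and figures here are hypothetical; the principle is that services with the most transactions still on non-electronic channels are the strongest candidates for e-enablement.

```python
# Hypothetical volume figures per service, broken down by delivery channel.
volumes = {
    "planning application": {"face to face": 1200, "phone": 3400, "post": 2600, "internet": 150},
    "library renewal":      {"face to face": 9000, "phone": 5000, "post": 100,  "internet": 2400},
}

def channel_priorities(volumes):
    """Rank services by the volume still handled on non-electronic channels,
    i.e. where e-enablement would shift the most transactions."""
    def non_electronic(by_channel):
        return sum(v for channel, v in by_channel.items() if channel != "internet")
    return sorted(volumes, key=lambda s: non_electronic(volumes[s]), reverse=True)

print(channel_priorities(volumes))
# prints ['library renewal', 'planning application']
```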
The next element to capture is cost. A unit cost can be derived by working out how much time a typical transaction needs depending on each channel by which it is delivered, and the cost per minute of maintaining those channels (face to face, phone, internet, post, etc). This allows us to identify how much can be saved by moving transactions to different channels. Linked to this is the number of full-time workers who deliver that service. Savings in time, realised by moving a transaction to a different channel, also mean savings in bodies, which is a major factor in our operational costs. Property locations also need to be mapped for the same reason.
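The unit-cost arithmetic described above can be illustrated with a short sketch. All of the per-minute costs and handling times below are invented figures, not data from the Norfolk exercise; the calculation simply multiplies handling time by channel running cost, then compares channels to estimate the saving from shifting a given transaction volume.

```python
# Illustrative figures only: cost per minute of maintaining each channel,
# and typical handling time per transaction on that channel.
cost_per_minute = {"face to face": 1.20, "phone": 0.60, "internet": 0.05}
minutes_per_transaction = {"face to face": 10, "phone": 6, "internet": 2}

def unit_cost(channel):
    """Unit cost = handling time x cost per minute of running the channel."""
    return minutes_per_transaction[channel] * cost_per_minute[channel]

def channel_shift_saving(volume, from_channel, to_channel):
    """Saving from moving a given transaction volume between channels."""
    return volume * (unit_cost(from_channel) - unit_cost(to_channel))

print(round(unit_cost("face to face"), 2))                      # prints 12.0
print(round(channel_shift_saving(5000, "phone", "internet"), 2))  # prints 17500.0
```

The same figures, multiplied out across staff time, also indicate the savings in bodies that the article mentions as the major operational cost factor.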
Another element to consider is the information attached to a service. This includes information about the service – websites, leaflets, FAQs, etc – and information generated by the service – application forms, letters, database entries, orders, invoices and receipts. This needs to be identified, partly so we know what it is we must e-enable, and partly to help us manage the information itself more effectively. How much storage space are we using, and what does it cost compared to electronic storage? How easily can we track documents if we are presented with a Freedom of Information request? Are there any unnecessary steps in the paper (or electronic) chase that have to be taken before a service request or transaction can be fulfilled?
At the start of March, Norfolk began a major data collection exercise, which will eventually be used to feed the spreadsheet and allow us to start extracting information from the data, and knowledge from the information. We have not one but three questionnaires, as well as a series of meetings, designed to unearth qualitative information.
First, we aim to establish the metadata linked to individual services: confirm the list of services; agree which transaction types and channels are possible/desirable; capture what information attaches to the data; and identify which staff members, where and at what levels, are responsible for delivering that service to the public. Second, we intend to survey around 50 per cent of all those staff members to identify the volume, duration, transaction type and channel of their customer contacts, as well as details concerning ‘routing’ information (ie, were people passed on to them from somewhere else, and did they in turn pass them on or deal with the request themselves?). Third, we will survey 10,000 residents to see how they currently access different services provided by us and other public sector bodies, and ask how they would like to access them.
This quantitative information will then be tested with staff and external focus and stakeholder groups. The internal group will help us ‘quality assure’ the data (was it subject to seasonal or other variations, for instance?) and both will explore the issues that stem from the data in terms of new ways of delivering services. The information will also help the implementation of any changes to service delivery by identifying which bits of information need to be e-enabled and the metadata systems that need to be put in place to assist with information and knowledge management. The spreadsheet itself can then be developed further by including reference to these metadata systems. And in parallel with the internal and customer data collection, cross-sector analysis will be carried out in two market towns to look at inter-agency e-government issues.
We will, of course, be feeding the work we have done locally back through the ESD toolkit. If other councils use similar ways of mapping what they do, it will be easier for us to benchmark against them and identify best practice. It will also enable us to identify where the key priorities for national standard setting or product development work lie. By giving away the things we already have, we could get back the things we need.
Tim Anderson is the e-government officer for Norfolk County Council. He can be contacted at: email@example.com