BFM2013_1_00_Introduction – Improving the use and control

This first issue focuses on methods and solutions that can improve the way an organization accumulates knowledge, as a way to increase control over core processes while limiting the use of external experts to where they genuinely add value.

It is a simple statement, but a complex predicament. How do you convert this concept into something practical, without first dropping endless amounts of money into “knowledge management”?

Interestingly, some organizations that misused Knowledge Management tools and methodologies found themselves increasing their use of external consultancies, which some organizational units used as a way to circumvent rules that applied only to internal knowledge production.

A caveat: what follows is a set of guidelines that could help you define your own roadmap, or even the blueprint of a "transformation programme". Its feasibility, however, is a function of two key elements: an understanding of your own corporate culture, and the clarity of the mandate assigned to those involved. No amount of resources will deliver sustainable results without both of them.

BFM2013_1_01_Knowledge management vs. knowledge retention

Knowledge Management is quite often confused with the tools and methodologies used to support it.

In any case, almost any system works according to the "Garbage In-Garbage Out" (GIGO) principle.

GIGO? A transformation process usually does not improve the quality of its inputs: at best, it can minimize the additional "background noise" that the transformation itself introduces.

Any tool that tries to build a general description of activities across the functional divide assumes a specific reference set of accepted processes, results, and ways to manage change: we will call this set of assumptions the "Embedded Corporate Identity" (eCI).

ERP, CRM, and Knowledge Management tools share the same pitfall: unless you already know where you are, you risk that the tools will insert into your organization their own "Embedded Corporate Identity".

The reality will then be discovered "on the job", and the work of changing both the tools and your own Corporate Identity will begin; a few million dollars later, you will discover what happened (at an official ERP conference sponsored by a supplier, a customer reported that the overall cost of a botched ERP transition had been in the region of 40 million EUR or more).

We suggest introducing Knowledge Management only once you already have a working policy for collecting, structuring, distributing, and maintaining knowledge: what we call a "Knowledge Retention" (KR) policy.

Knowledge Management cannot work without Knowledge Retention, and it is greatly enhanced by the use of tools to manage the processing of knowledge.

BFM2013_1_02_Introducing knowledge retention

As a first step, we suggest that you identify what is the current behaviour in your organization when it comes to knowledge retention: do not be surprised to find wide differences between organizational units, or within each organizational unit.

The early twentieth-century "Scientific Management" à la Taylor was originally based on a sound knowledge of management practices, meant to ease the training of managers without requiring them to start from the shop floor and rise through the ranks: training was supposed to replace experience.

With the spread of PCs, computer tools such as the spreadsheet made it possible to replace the transfer of "fuzzy" knowledge (people skills, etc.) with a more structured and quantitative approach, easing the replication and communication of knowledge.

Or so it seemed.

In reality, the 1990s saw the sharp rise of spreadsheet-toting consultants, focused on quantifying everything, quite often discarding the unquantifiable as irrelevant, or coercing reality into a convenient set of values that allowed classifying everything.

Admittedly, successive generations of quantitative-focused managers and consultants increasingly drifted away from business common sense.

While assessing the current status of your "Knowledge Retention Policies" (KRP), we usually suggest identifying a set of qualitative parameters and some levels of compliance, used to benchmark each organizational unit.

Using these parameters, a simple “radar” chart will become the basis for a brainstorming on possible initiatives to improve the status of each organizational unit.
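As a minimal sketch of that benchmarking step, the snippet below scores each organizational unit against a set of qualitative parameters on a small compliance scale; the parameter names, units, and scores are hypothetical, and the output rows are exactly the data series a "radar" chart would plot, one polygon per unit.

```python
# Hypothetical KRP assessment: qualitative parameters scored on a
# 0-4 compliance scale, per organizational unit. Names and scores
# below are illustrative, not prescriptive.
PARAMETERS = ["collection", "structuring", "distribution", "maintenance", "reuse"]

assessments = {
    "Sales":      {"collection": 3, "structuring": 1, "distribution": 2,
                   "maintenance": 1, "reuse": 0},
    "Production": {"collection": 4, "structuring": 3, "distribution": 1,
                   "maintenance": 2, "reuse": 2},
}

def radar_rows(assessments, parameters=PARAMETERS):
    """Return (unit, [scores in parameter order]) rows: the data series
    behind a radar chart, one polygon per organizational unit."""
    return [(unit, [scores[p] for p in parameters])
            for unit, scores in sorted(assessments.items())]

for unit, scores in radar_rows(assessments):
    print(unit, scores)
```

The gaps between polygons (e.g. a unit strong on collection but weak on reuse) are what feeds the brainstorming on possible initiatives.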

BFM2013_1_03_ISO9000 and knowledge management

ISO9000 is just an example of a "standard" that can have significant impacts on your way of doing business, and it is therefore worth spending a few words on it.

In the 1990s, ISO9000 certification became a "condicio sine qua non", a cost of staying in business, like having a telephone.

Eventually, customers practically merged ISO9000 and Knowledge Management, but while the latter has the purpose of reducing the cost of increasing, distributing, and updating knowledge, the former is focused on different premises.

ISO9000 originally simply stated that you were going to be assessed against the quality level you yourself declared, not against some "common" or "standard" quality level.

Often knowledge producers became the “bottleneck”, as they were required to produce variants of the same set of information for different destinations – ISO9000, Knowledge Management, etc.

If your organization already has ISO9000 and Knowledge Management initiatives in place, we are not suggesting that you trash what you already have: our approach is to restructure knowledge so that each item can be reused and "published" for a different public.

Public: yet another concept that is quite often discussed along with Knowledge Management, and quite often distorted into a simple format conversion.

Tailoring your content to a different public is not just a matter of changing layout or graphical format: often, you also have to consider the different knowledge and experience background of each segment of your audience.

BFM2013_1_04_Public and knowledge production

While a technical approach to knowledge management could sometimes benefit from a "big bang", this usually produces just a technological implementation.

But who will maintain the processed knowledge stored inside the knowledge management system? Certainly not the knowledge producers, as it will be completely different from the source.

Each document has (or should have) a destination public, i.e. an intended audience.

If you cannot define the audience, chances are that your document will not achieve the intended results.

But do you really need to write the same item time and again? The risk is that your knowledge producers will spend seven hours reporting in slightly different formats what they spent one hour doing.

While knowledge production obviously should stay with those who own the knowledge and are able to make it evolve, knowledge retention and communication might involve a differentiation of roles based on your own specific organizational culture and structure.

As an example, consider the differentiation between "knowledge producers" and "knowledge consumers": what the former produce might require an intermediary, somebody belonging to the latter's organization and able to "convey the message".

This is probably already part of your practices, as e.g. you would not expect to communicate with the CEO as if (s)he were a production engineer: but data management tools have often taken over from common sense, even though there is an alternative, i.e. to use both tools and common sense.

BFM2013_1_05_Knowledge snippets and traceability

Two simple actions could simplify both the interaction with the “thesaurisation and compliance organizations” (e.g. methodology, quality assurance, knowledge management) and those charged with knowledge production: writing “knowledge snippets” along with each item, and defining “building maps” for the basic documents to be generated by the knowledge producers.

Traceability is a simple idea, more a matter of common sense than a completely new concept.

Simply, partition your document into “atomic” (i.e. not further divisible) items, and whenever producing an update, take note of which existing item you are updating.

Managing traceability is slightly more complex, but it is built around this same basic rule.
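The basic rule can be sketched in a few lines: a document is partitioned into "atomic" items, and every update records which existing item it modifies, so each item carries its own audit trail. The item identifiers and text below are hypothetical.

```python
# Minimal traceability sketch: a document partitioned into "atomic"
# knowledge items, each keeping a per-item version history.
class KnowledgeItem:
    def __init__(self, item_id, text):
        self.item_id = item_id
        self.history = [(1, text)]  # list of (version, text) pairs

    def update(self, new_text):
        """Record an update against THIS item: that is the whole rule."""
        version = self.history[-1][0] + 1
        self.history.append((version, new_text))
        return version

# A document is just a collection of addressable atomic items.
document = {
    "PROC-001": KnowledgeItem("PROC-001", "Order intake procedure, step 1."),
}

# Whenever producing an update, you name the item you are touching.
document["PROC-001"].update("Order intake procedure, step 1 (revised).")
```

Nothing more is needed to start: the discipline of always naming the item being updated is the traceability, and tooling can be layered on later.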

Not everyone is keen on documenting in a traceable way, or interested in restructuring documents according to different sets of requirements.

The knowledge producers should retain control of the knowledge they originated, as they are the only ones able to update it in a meaningful way.

Obviously, this implies that the knowledge producers operate according to your own rules, and this applies both to your own staff and to any external supplier: as you pay, they have to use your methodology, not their own.

As it will be discussed in a later issue, you can outsource/delegate the execution, but not the responsibility.

While outsourcing, you focus on the results and not the process: in that case, the knowledge ownership balance might be altered.

If you define clear “roadmaps” to assemble and store knowledge items, then the thesaurisation is just a by-product of your normal activities.

Then, the knowledge producers can delegate the actual collection and formalization to resources focused on those tasks, usually belonging to the "thesaurisation and compliance organizations".

Knowledge producers usually need to get through some trial-and-error before they can identify the proper size of the “knowledge snippets” for their own processes, but there is a low-cost option.

Ask them to define “knowledge snippets” so that they can enforce traceability.

They should start by looking at their own business processes, and check what the minimal traceable "knowledge snippet" is: probably the actual size and configuration of a meaningful knowledge snippet will differ across the organization.

It is critical that this “traceability identification” is defined by knowledge producers, to ensure that it is consistent with their actual “real” processes, as often processes contain both a visible layer, operating through the formal organization, and an invisible layer, managed through the informal organization.

And this awareness becomes even more critical in case of knowledge thesaurisation activities that are a step toward restructuring (e.g. downsizing, merger, acquisition, asset/business line disposal) or outsourcing activities.

BFM2013_1_06_Downsizing and knowledge configuration

A common problem in any organization is that, through downsizing, BPR, re-organization, and outsourcing, the organization has probably removed whatever organizational redundancy was previously available, leaving scarce cross-functional expertise (if any at all) in-house.

This implies that quite often "knowledge configuration" (KC), where existing in-house knowledge components are re-used across the organization, is increasingly difficult, or is delegated to external resources, who happen to be the only ones still working across the functional divide (with further side-effects on business continuity).

Using “knowledge snippets” along with traceability reduces the cost of knowledge production and retention (including its maintenance).

Also, this allows your organization to introduce “knowledge configuration management” (KCM), ensuring consistency of behaviour across the organization.

KCM is linked to another concept: "versioning" of knowledge. The basic concept is that any item of knowledge depends on one or more other items, produced in the same or other organizational units.
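That dependency structure is what makes versioning actionable: when an item changes, its transitive dependents are the candidates for a version review. A minimal sketch, with hypothetical item names:

```python
# "Knowledge configuration" sketch: each item lists the items it
# depends on, possibly owned by other organizational units.
depends_on = {
    "sales_manual":  ["pricing_rules", "product_sheet"],
    "pricing_rules": ["cost_model"],
    "product_sheet": ["cost_model"],
    "cost_model":    [],
}

def affected_by(changed, graph=depends_on):
    """Items that should be re-reviewed when `changed` is updated:
    the transitive closure of its dependents."""
    dependents = {item for item, deps in graph.items() if changed in deps}
    for d in set(dependents):
        dependents |= affected_by(d, graph)
    return dependents

# Updating the cost model ripples up to everything built on it.
print(sorted(affected_by("cost_model")))
```

The same graph, read in the other direction, tells each unit which upstream items its own knowledge is versioned against.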

Knowledge management tools can obviously become useful, but you first need to have in place a culture able to process knowledge.

The next step is, obviously, seeing how you can monitor your knowledge stock and benefit from it.

Incidentally, "knowledgebase" and "knowledge base" are avoided in this book, as too often they are used to describe both "knowledge collections" and "knowledge thesauruses": basically, "static" vs. "dynamic" knowledge.

BFM2013_1_07_Knowledge Configuration Management & Knowledge Costing

When “Knowledge Configuration Management” is in place, quantitative analysis of your “knowledge stock” becomes again possible.

Obviously, the parameters must be tailored to the specific categorisation of “knowledge snippets” that will have been negotiated with knowledge producers.

Why reintroduce quantitative analysis, after criticizing it in an earlier section?

Consider this a possible side effect, not a mandatory element of a "knowledge retention" policy.

Introducing quantitative analysis after adopting the new approach could be useful to actually share the costs of producing and maintaining a specific “knowledge snippet” with all its users.

And it is not only useful for “Internal Transfer Pricing” (e.g. to identify how other parts of your organization should contribute to the investment on knowledge production), but also when spinning out a unit or negotiating a joint venture, and in IPR management.

Obviously, even if the demonstrated value or cost of an item is X, corporate policy realities could result in a lower transfer price, with the balance treated as "overheads", paid by the organisation as a whole, e.g. for processes that, while being managed by a specific business unit, are actually the reason for the organization to exist.
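The mechanics can be sketched in a few lines: the cost of maintaining a snippet is split among the units that use it, after a corporate overhead share is retained, producing the lower effective transfer price discussed above. The cost figure, overhead percentage, and unit names are all hypothetical.

```python
# Hypothetical "knowledge costing": split the yearly maintenance cost
# of a knowledge snippet among its users, retaining part of it as a
# corporate overhead paid by the organisation as a whole.
def split_cost(total_cost, users, overhead_share=0.25):
    """Return (overhead, {unit: charge}) for one knowledge snippet."""
    overhead = total_cost * overhead_share
    charge = (total_cost - overhead) / len(users)
    return overhead, {unit: charge for unit in users}

overhead, charges = split_cost(12_000.0, ["Sales", "Production", "R&D"])
print(overhead, charges)
```

The same split works as a starting point for internal transfer pricing, for valuing a unit being spun out, or for apportioning IPR-related costs.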

The risk? If you forget what is "core" to your business, some excessively entrepreneurial elements of your organization might lobby to have their support processes considered "core business", and ask others to contribute to their costs.

BFM2013_1_08_Knowledge management and participation

If you introduce Knowledge Management after Knowledge Retention and Traceability, the actual knowledge producers will be more active partners in setting up and negotiating the separation of roles and structuring of the knowledge infrastructure.

This increased participation will result in greater control over what is inside your "knowledge store", as the knowledge management organizational unit will not be the only organizational structure in charge of all tasks related to knowledge management.

If the Knowledge Management organization is in place before the rest of the organization is able to provide usable knowledge, the net result is usually an “ivory tower” approach.

Should this be the case, the Knowledge Management organization, for lack of contacts with sources that understand its “lingo”, starts generating knowledge and assumptions, instead of structuring knowledge produced by the actual Knowledge Providers.

In turn, this will give less incentive to the other organizational units to become part of a process that adds overhead but whose value (i.e. knowledge that can be extracted and reused) is questionable.

Of course, a better alternative is to clearly "phase in" such an organization, by assigning it an advisory role to business units during the first phase of its activity.

Knowledge Management based on Knowledge Retention also simplifies the identification of “core” knowledge items, whose understanding should be kept inside your organization.

These “core” items are those relevant to ensure that any “outsourcing” activity does not affect your business continuity.

BFM2013_1_09_Knowledge and embedded security

Once the knowledge production is structured, it becomes easier to define a knowledge control/security policy, by assigning a different level of access and responsibility to different organizational units.

A positive side-effect of this definition: identifying the "knowledge boundaries" of each organizational unit limits the need for cross-functional meetings to those where the subject is new or clearly spans "knowledge boundaries".

As each item is classified while being defined, it becomes possible to delegate without losing control: this will reduce the number of resources needed to cope with a larger number of projects, using external resources only when, and for as long as, really needed, and without any loss of knowledge.
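Classifying each item at definition time makes the access decision mechanical, which is what allows delegation without loss of control. A minimal sketch, with hypothetical classification levels, item names, and roles:

```python
# Illustrative knowledge security profile: every item carries a
# classification assigned when it is defined; every role (internal
# unit or external resource) holds a clearance. Access checks then
# come for free as a by-product of classification.
CLEARANCE = {"public": 0, "internal": 1, "core": 2}

item_classification = {
    "marketing_brochure": "public",
    "process_map":        "internal",
    "pricing_algorithm":  "core",      # "core" items stay in-house
}

role_clearance = {
    "external_consultant": "public",
    "business_unit":       "internal",
    "board":               "core",
}

def can_access(role, item):
    """True if the role's clearance covers the item's classification."""
    return CLEARANCE[role_clearance[role]] >= CLEARANCE[item_classification[item]]

print(can_access("external_consultant", "pricing_algorithm"))  # core stays inside
print(can_access("business_unit", "process_map"))
```

The point is not the three-level scale, which is arbitrary here, but that the check is derived from the item's own classification rather than bolted on afterwards.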

As described above, adopting a sound Knowledge Management policy based on Knowledge Retention makes investment on knowledge and knowledge costing possible.

Why does the title of this issue link "knowledge retention" to "embedded security"?

Knowing which items of your knowledge thesaurus are "core" and should be maintained inside your organization ensures that you can improve your business continuity capabilities, even when delegating one or more processes to third parties.

What is “embedded security”? Security management is quite often considered an additional set of processes, almost an afterthought.

But this externalisation of security implies that you try to build up walls (virtual or real), without actually involving those who produce the knowledge and therefore should know its operational sensitivity.

It is true that those above them know how, within the "formal organization", that knowledge impacts other areas of the business: but even if those managers were promoted from the rank and file, they will eventually lose knowledge of the current, real "informal organization" and its informal communication flows (more appropriately, "back channels").

Security (both physical and logical) does not come cheaply, and 100% security is simply impossible.

Our concept of "embedded security" is quite simple: instead of adding security on top of your processes after the fact, focus on identifying who is responsible for each specific knowledge subset, and involve them in the definition of the related security profile, with the support of your security experts or "internal audit".

Maybe you will discover that some of the security can be “embedded” in the actual processes involved in producing the knowledge, minimizing your security overhead.

Knowledge-based security profiling increases the accountability of knowledge producers, while ensuring compliance with your own internal policies.

Additional layers of security would just (expensively) increase perceived security, while impeding the knowledge distribution that is needed to actually generate value, and creating a false sense of security in knowledge producers.