
7

Putting it together

If a theory explains something, then it can be applied to stop it from going wrong. In this section, I employ a kind of ‘muscular philosophy’ to characterise a strategy of using the constructivist and critical theories we have discussed previously to design implementation methodologies to make Web 2.0 work. These perspectives would normally be thought to be too ‘academic’ and unwieldy for practical purposes, but if technology projects continue to disappoint, then perhaps, as Plato suggests, we need to find alternative ways of looking at things. So this section is a step-by-step approach for assessing and preparing the ground for the planting of the Web 2.0 seed.

Where previously much of technology implementation has been about the technology, training and precise fit to business purpose, the familiarity, ease of use and open-endedness of Web 2.0 tools shift almost the entire emphasis of implementation to conceptual and motivational activity. And this is about understanding and restructuring social institutions – we are not just moving the furniture. Even though it is possible to write or adapt custom-built programs and interfaces for Web 2.0, the bulk of the business benefit will come from the use of basic functions; technology, on the whole, is not the key issue and should not be allowed to set the agenda. It is no surprise that McKinsey found a strong correlation between successful Web 2.0 projects and business initiation and implementation of those projects.1

It is also critical to recall one of the key attributes of Web 2.0 here: the perpetual beta. Unlike the chimerical aspiration of conventional systems development to deliver a complete package of functions, the idea of continual development and delivery is built into Web 2.0. Traditionally there has been a focus in information systems projects on requirements gathering, software development or acquisition and then implementation as a set of activities directed towards embedding the computer functions into business processes. The focus has been on short-term measures to stimulate ‘adoption’: training, motivation, rewards, user support and so on. But the knowledge work activities which are the usual candidates for Web 2.0 tools are (usually) far less structured and routine than order entry or inventory management so are (often) less well understood and more difficult to specify in advance. And Web 2.0 is far more forgiving than ERP or CRM. As new possibilities for use emerge, as people learn about the software features, as underlying cognitive, regulative and normative institutional structures catch up with the technological possibilities, so those possibilities can be taken up because their readiness at hand can be appreciated.

As usual, the reason to implement technology should be driven by some form of business requirement which reflects a new business function or a need to overcome a problem of some sort. However, requirements can also be driven by strategy: operations may be smooth and efficient, but relentless competition and innovation may force management to introduce changes such as outsourcing, business process redesign or workforce distribution. Outsourcing the design function will increase the need for rapid, rich interaction with partners; business process redesign may require greater information sharing with a larger number of participants in a hub rather than linear configuration; workforce distribution will mean that corporate knowledge may need to be available to (and from) staff in remote locations. Web 2.0 provides solutions to these kinds of changes.

You may notice that in the methodology below there is no step for ‘Tool Selection and Installation’. This reflects my own experience and belief that these tools should be part of the standard operating environment of an enterprise, much like e-mail, word processing and a virus checker. There should be no need for any particular business unit to go through the budgetary and product selection processes: the software should be made available to all, and training and advice on how to use it placed within the software (such as wikis and blogs) itself. The step from recognition of need by any part of the business to actual use should be possible with immediate effect. This decision to use can then be taken by management or triggered by the situational need of a project, a group or an individual. The alternative is that business units or groups will select their own wikis, blogs or tagging methods, leading to fragmentation and further islands of information and knowledge. Under some circumstances, this may be their only way of making progress, but it is preferable not to do so. The point of Web 2.0 is the sharing of knowledge as part of workflow, often with the unanticipated future use of that organisational memory by others beyond the immediate business unit. This places a demand on senior management and technology managers to take a strategic and architected view of Web 2.0 for their company.

So once the business purpose is understood and Web 2.0 tools are recognised to be the appropriate type of knowledge tool then the steps to be taken by the business unit are:

1. Define the space and its purpose.

2. Develop the business case for the space and make a decision to proceed.

3. Define the knowledge types which will exist in the space.

4. Define the knowledge transformation processes for the knowledge types in each space.

5. Define the transactive memory processes to ensure knowledge can be easily found.

6. Analyse the social groups who may be involved for adoption readiness and develop adoption strategies.

7. Identify the inhibiting and facilitating social institutions for each space and develop adoption strategies.

8. Understand power relationships and how to facilitate adoption.

Define the space

Spaces and flows are the defining characteristic of Web 2.0 use for organisations. Web 2.0 tools are the Swiss army knife of knowledge management technologies – but the impulse for specialisation in enterprises generally leads them to buy cleavers, chisels or scalpels, tools which fit a specific task and achieve a unique purpose. Defining the space in Web 2.0 means establishing the purpose of the space and then the nature of the game to be played within it in order to guide subsequent design and implementation decisions and give an overall coherence to the information to be created and used. What kind of thing are we building here – is it for collaboration, to become an encyclopaedia or to communicate with partners? Where are the boundaries and when can a manager or participant say ‘stop, enough, this doesn’t belong here!’?

Deciding upon the type of space will lead to clearer decisions regarding the purpose and application of Web 2.0 tools and clarify subsequent decisions like access rights, knowledge type, rates of update, information distribution, administration, appropriate behaviour and the meaningfulness of statistics. It will also give a clearer idea of the social institutions (facilitators and inhibitors) one is likely to encounter.

A key decision regards the scope of access to the space. Should a wiki be corporate, enterprise-wide and available to everyone, or restricted to a group? If it is corporate, should every user be allowed to edit, change and upload content or should that be the domain of moderators, nominated editors or privileged experts? Every organisation needs to examine its objectives and assess its mindset, organisational maturity and capabilities when making this decision. Many organisations prefer to keep things local for reasons of comfort or control. But I vividly recall a meeting I conducted as a KM facilitator in a government organisation charged with natural resource management. The director of scientific research announced to the meeting that he was not interested in participating: ‘We have no knowledge management problems’. This was greeted with horror by the administrators, policy-makers and field staff present who found it impossible to get information, reports and advice – about scientific issues. The consequences of restricting access and contribution rights to the group level must be very carefully considered. Restricting wiki, blog or tagging access to a sub-group is not true ‘Web 2.0’ and will lead to:

• a reduction in the exploitation of organisational memory through the exclusion of potentially interested parties and particularly unanticipated potentially interested parties;

• an increased likelihood of organisational memory loss when the group or project disbands and the content is simply archived or not accessible;

• a decrease in the potency of weak ties and the transactive memory system of the organisation as the authors and groups interested in the information are hidden;

• alternative sense-making and knowledge transformation processes which produce deviating concepts and ideas which are not strongly integrated with other organisational knowledge;

• perpetuating alternative power structures and competing institutions;

• social identity construction which is self-referential, elitist or hostile.

Therefore it is far preferable to create organisation-wide wikis and blogs. If absolutely necessary, one might restrict writing to certain pages or areas and allow comments and discussion by other organisation members who are not in the particular group. This may restrict knowledge creation but at least aids sharing and externalisation. A closed wiki or blog with a restricted readership is at most ‘Web 1.1’ – a limping, mewling effort of no particular interest.
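As a concrete illustration of this compromise, here is a minimal sketch in Python of the kind of page-level policy it implies: writing restricted to a nominated group, commenting left open to the whole organisation. The class, group names and methods are hypothetical, invented for illustration rather than drawn from any particular wiki product.

from dataclasses import dataclass, field

# Hypothetical page-level policy: edit rights limited to named groups,
# comments open to everyone in the organisation by default.
@dataclass
class PagePolicy:
    page: str
    writers: set = field(default_factory=set)
    commenters: set = field(default_factory=lambda: {"all-staff"})

    def can_edit(self, user_groups: set) -> bool:
        # Only members of a nominated writer group may change the page body.
        return bool(self.writers & user_groups)

    def can_comment(self, user_groups: set) -> bool:
        # Comments stay open to the whole organisation unless narrowed.
        return "all-staff" in self.commenters or bool(self.commenters & user_groups)

# The design team edits its standards page; anyone else may still comment.
policy = PagePolicy(page="Design/Standards", writers={"design-team"})
print(policy.can_edit({"finance"}))     # False
print(policy.can_comment({"finance"}))  # True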

Define the business case for the space

The business purpose of the Web 2.0 system must be clearly understood and a business case at an appropriate level of sophistication must be made. The attention deficit in most modern organisations means that a presentation about oxygen will only get someone’s attention when they’re drowning. Immediate usefulness is probably the single most important factor in spontaneous, non-coerced adoption of technology solutions – although this is neither a sufficient nor a necessary condition. But it does create that sense of Heidegger’s breakdown, where existing tools no longer appear ready at hand. It is even better when the pressing informational problem can be addressed with little or no latency to becoming productive, as is the case with universally available, easy-to-use wikis or blogs.

Nonetheless, an AIIM survey concluded that 40 per cent of respondents required a definitive business case for Enterprise 2.0 and 30 per cent an indicative one.2 Of these, 77 per cent were unable to find an acceptable level of return. Of the remaining 23 per cent, 45 per cent anticipated a return after two to three years. These results are surprising, given the cheapness of the software, the low cost of infrastructure and the negligible need for technical support. Further, 54 per cent of respondents stated that their organisations did not measure the success of Enterprise 2.0 the same way they did themselves, suggesting that there is a wide discrepancy between tangible cost-benefit analysis and the perceptions of working staff that the technology is immediately useful.

Another significant correlation in the AIIM survey is that organisations not requiring a business case (that is, seeing the technology as just a cost of doing business) were generally classed as knowledge-oriented. These organisations are twice as likely to be actively using Web 2.0 technologies, leveraging their knowledge assets to greater effect and advancing themselves to higher levels of performance. On the other hand, a similar German survey of 156 senior managers in ‘knowledge-intensive’ companies with more than 100 employees found several key reasons for them not introducing these tools into their companies: 62 per cent cited unclear business benefits, 48 per cent cited ‘lack of openness of employees’ and 30 per cent cited ‘lack of openness of management’. Only 2–6 per cent currently use these tools in a company-wide context while 29 per cent agree completely and 21 per cent agree mostly that ‘Web 2.0 applications will be part of the company business in a few years’.3

Defining the ex ante (before implementation) value proposition of technology is often difficult, and the ex post (after implementation) results are frequently scarcely inspiring. Yet although the short-term results are often disappointing (or inadequate to justify investment), the longer-term effects are often considerable. The short-term challenge in justifying the investment cost in Web 2.0 comes largely, with the notable exception of marketing and customer interaction, from its lack of immediate linkage to a specific business process. There is no shortening of cycle time or reduction in headcount. So what is the value proposition?

The overarching value of these tools to an enterprise is that of organisational memory: every interaction through these tools builds and structures corporate knowledge for reuse, refinement and improvement. But this is often too vague for investment decision-makers. The notions of space and flow provide a framework for the articulation of this higher-level value proposition as a specific business project. Space forces proponents of a project to articulate a system boundary, a purpose and the participants. These translate into a business context, a business objective and the business beneficiary. Flows within the space are the informational actions of the knowledge worker, group or organisation in which improvement can be gained and in terms of which the contribution of the flow to a business objective can be understood and measured.

There are many ways to define a business case, but the received wisdom is to ensure that the system ‘pays its freight’. This should not be difficult for software that is generally free or very low cost and which requires little technical or administrative support and almost no training costs. The key is to identify specific spaces for specific activities: the distribution of information by an internal programme space, the capture of expert knowledge for an encyclopaedia space and the generation of process improvements through a collaboration space, for example. Table 7.1 enumerates three examples. These examples of tangible and intangible benefits can be enhanced with concrete numbers for return on investment calculations if required.

Table 7.1: Developing a business case

Define the knowledge types for the space

After having established the effectiveness and viability of the use of tools to support spaces, we move to the detail of the knowledge content to be kept there and how to design the information. The taxonomy for knowledge objects is fourfold: prescriptive, descriptive, distinctive and emerging. While these categories are a useful starting point for deciding which types of knowledge to place or develop in Web 2.0 tools, there may be some variations in how organisations decide to deal with them. Organisations will vary in their need for certainty, speed, innovation focus, audit trails and so on. They will also vary in the starting point of their current infrastructure and strategic plans for technology acquisition: some will have content management systems, others will use shared network drives. So the type of knowledge to be created within a particular Web 2.0 space will be a decision contingent upon these factors.

 Prescriptive knowledge – how do we wish to create, manage and protect the information that people should conform to?

 Descriptive knowledge – how do we wish to create, manage and protect the everyday information which describes the generally accepted way of (or reasons for) approaching tasks at the moment?

 Distinctive knowledge – how do we wish to create, manage and protect the deep knowledge that is held by our experts?

 Emerging knowledge – how do we wish to conduct those conversations and interactions which lead to new knowledge or which develop our understanding of our work?

Deciding the type of knowledge is only a first step in the information design process, however. After establishing the general knowledge taxonomy and who will engage with that knowledge, it is necessary to create the part of the space infrastructure which deals with actual information objects and their structure. For example, an encyclopaedia space will contain proprietary descriptive knowledge which will be stored in ‘knowledge pages’. These knowledge pages should have a consistent structure and layout where it is clear what type of content is expected. If all the pages in a wiki for example have an ad hoc structure, then it will be hard to ascertain the levels of quality and content which are expected. On the other hand, a page layout which demands that information be entered into specific fields with prescribed possible values may be too constraining and alienate users. Those who are creating knowledge pages need to be presented with a standard ‘template’ of some kind, with an appropriate level of structure to assist with orientation and give a clear indication of what the ‘desired’ information object will look like.

We might define three levels of template discipline within blog or wiki pages. Each level of template increases the degree of conformance required of users and decreases the level of autonomy users will enjoy in defining the shape of information (a brief sketch of the middle option follows the list):

• No template. This will cover completely free-form pages where the author can create a page design according to their desires. At most there might be a consistent look and feel using standard fonts and corporate colours and logos. This kind of freedom will work well within personal, social or collaborative spaces where knowledge should emerge within a more or less spontaneous process.

• Loose template. This template might create a page layout with suggested headings and descriptions of the kind of input and authoring style required. The template might insert a number of services, such as links to key pages or to a main page, a standard tag of the category to which the page type will belong, and perhaps even standard macros written by IT specialists.

• Tight template. This template will create more structured and prescriptive layouts with data fields, pull-down menus of permitted values, possible validity checking of inputs and so on.
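To make the middle option more tangible, here is a minimal sketch in Python of a ‘loose template’ generator for a new knowledge page. The section headings and the category tag are illustrative assumptions, not prescriptions of any particular wiki engine.

# Sketch of a 'loose template': suggested headings and a category tag,
# with the author left free to reshape the page as needed.
LOOSE_TEMPLATE = """\
= {title} =
[Category: {category}]

== Purpose ==
(One or two sentences: what question does this page answer?)

== Current practice ==
(Describe the generally accepted way of doing this today.)

== Known exceptions and pitfalls ==
(Bullet points are fine here.)

== Contacts and related pages ==
(Link to the people and pages a reader should go to next.)
"""

def new_knowledge_page(title: str, category: str) -> str:
    """Return starting text for a new page; authors may adapt or extend it."""
    return LOOSE_TEMPLATE.format(title=title, category=category)

print(new_knowledge_page("Commissioning a drilling rig", "Production"))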

Table 7.2 might help in generalising some answers to questions of rules of participation and the structure of knowledge in this step.

Table 7.2: Matching knowledge types to spaces

Define the flows within the space

The knowledge transformation processes which externalise and internalise knowledge as symbolic behaviour and speech take place in wikis and blogs as reading and writing. Concept creation (objectivation) and innovation (knowledge creation) are products of this interaction. Authorisation (legitimation) of new ideas, concepts and ways of doing things can also occur in the technology when the appropriate managers are involved or group consensus is reached. And the institutionalisation and socialisation of staff into the knowledge of the firm also occur here. If we take more straightforward expressions as rough correlates of the academic names, Table 7.3 emerges.

Table 7.3: Defining the knowledge transformation processes within spaces

Having understood the types of knowledge transformation we can then proceed to identify and design the types of flows involved (see Table 7.4). These are not decisions made once during project initiation: they should be considered whenever new flows or information types are to be added.

Table 7.4: Identify flows and functions, knowledge type and storage

Define the transactive memory processes

Transactive memory is a key method in couples, groups and organisations of finding what you need to know, when you need to know it. Without current, informative transactive directories and the effective transactive processes, supported by technology, to maintain and use these directories, the strength of weak ties will be diminished, and network effects and knowledge retrieval from organisational memory will be far less effective. For the various spaces in your Web 2.0 tools, you will need to ensure that key metadata about knowledge and information in the organisation, such as creator, holder, interested party, knowledge repository, locations and other attributes, are recorded, understood, linked and made searchable by the Web 2.0 system. This will generally be the role of wiki administrators supported by technical staff.

Generally the identity of participants is recorded automatically and usually this can be linked to a personal page or contact details in a corporate address list. The mode of searching and using the transactive memory metadata must be simple, be well publicised and become an acceptable means of taking up contact with knowledgeable colleagues or finding the right documentation or database. Anonymity and distance make casual, conversational transactive directory maintenance difficult in large organisations, so building a functioning transactive memory from Web 2.0 tools requires, first of all (a minimal sketch follows the list):

• a glossary of concepts which provides consistent terms and meanings across the organisation: for example, In the world, there are machines, raw materials, workers, tools;

• a description of the relationships between concepts describing ‘facts’ about the organisational world: for example, A machine needs raw materials, a worker uses tools, and tools can be drills, hammers or pliers;

• a description of the collections of facts which constitute domains of knowledge or key business processes: for example, Production is about how to set up machines, re-tool and use tools;

• a set of role and group definitions that defines which departments and repositories are responsible for domains and business processes: for example, The Operations and Production Planning Department is responsible for production, and their machinists should know about drills and how to mount them on a work machine. Inventory management is about how to order, store and move goods.
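A minimal sketch in Python, using the examples above, of how such a normative map might be represented so that a topic can be followed to the department responsible for it. The structure itself, and the ‘Logistics Department’ entry, are illustrative assumptions rather than any standard schema.

# Sketch of the normative part of a transactive memory map: concepts,
# facts relating them, domains, and the roles responsible for them.
glossary = {
    "machine": "A powered production asset on the shop floor",
    "raw material": "Input stock consumed by machines",
    "tool": "A hand-held or mounted implement (drill, hammer, pliers)",
}

facts = [
    ("machine", "needs", "raw material"),
    ("worker", "uses", "tool"),
    ("tool", "is one of", ["drill", "hammer", "pliers"]),
]

domains = {
    "Production": ["how to set up machines", "re-tooling", "using tools"],
    "Inventory management": ["ordering goods", "storing goods", "moving goods"],
}

responsibilities = {
    "Production": "Operations and Production Planning Department",
    "Inventory management": "Logistics Department",   # assumed for illustration
}

def who_knows_about(topic: str) -> str:
    """Follow the normative map from a topic to the responsible department."""
    for domain, topics in domains.items():
        if any(topic in t for t in topics):
            return responsibilities[domain]
    return "unknown - fall back to search or asking around"

print(who_knows_about("re-tooling"))   # Operations and Production Planning Department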

This provides a normative, meaning-based map of organisational memory, which shows where you should be able to find an appropriate repository of organisational memory by following directory entries (or searching for) words, domains, departments and roles. But this does not account for instance knowledge, which is specific knowledge about an event, area or project possessed by a particular person, database or group; for example, a particular machinist has done maintenance often on (the troublesome) machine 4711 and knows it well, or was a drilling specialist in Indonesia in 1998 on Project Pisces.

Instance knowledge will be found by specifically searching for a term (by search engine, or by asking those holding the closest-matching role knowledge for pointers), tracking down the instance repository and then approaching the source linked to the instance (for example, the author of a report or the project manager at the time).

This normative part of transactive memory is the ontological backbone of a company and the instance part is what accrues as people within organisations learn. Without an ontological backbone which fulfils the role of a transactive map, the power of the information and knowledge embedded in your Web 2.0 tools is unlikely to gain full traction.

The normative part of transactive maps needs to be established through standard glossaries, thesauri and ontologies which define the key concepts of organisational language. These concepts should be combined to provide adequate descriptions of domains of competency which are linked to organisational charts, descriptions of the role of departments and descriptions of job roles within the departments. Information repositories, be they databases, people or documents, need to be locatable via classification schemes, search engines or tagging. Information needs to be linked to other information in order to provide context and trails to follow. The Apollo 11 Project should be able to use standard tags like ‘Spaceship’ or ‘Lunar Module’ to mark up its web pages, documents and data.

The instance part of transactive maps needs to be established by giving users or administrators the opportunity to create and apply social tags describing the events, projects or undertakings they feel are necessary to classify information. The Apollo 11 Project should be able to create linked tags for its own instance information, when standard tags are insufficient or when they are simply breaking new ground. People need to be able to have a home page describing their own ‘instance’ knowledge using ‘instance’ keywords (‘I was a member of the Apollo 11 programming team …’). Résumés should be available online and be searchable to track down specific instances of knowledge.
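A short sketch in Python of the same idea: standard taxonomy tags and project-created instance tags applied to pages, with one search covering both. The page names and the extra tags are invented for illustration; only ‘Spaceship’, ‘Lunar Module’ and ‘Apollo 11’ come from the text.

# Standard (corporate taxonomy) tags versus instance (project-created) tags.
STANDARD_TAGS = {"Spaceship", "Lunar Module"}

pages = {
    "LM descent checklist": {
        "standard": {"Lunar Module"},          # drawn from the corporate taxonomy
        "instance": {"Apollo 11", "Eagle"},    # created by the project itself
    },
    "Personal page: J. Smith": {
        "standard": {"Spaceship"},
        "instance": {"Apollo 11", "programming team"},
    },
}

def find_pages(tag: str) -> list:
    """Search standard and instance tags alike so instance knowledge stays findable."""
    return [name for name, tags in pages.items()
            if tag in tags["standard"] or tag in tags["instance"]]

print(find_pages("Apollo 11"))      # both pages
print(find_pages("Lunar Module"))   # ['LM descent checklist']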

Understand and address the institutions which influence adoption

One of the fascinating aspects of Web 2.0 is the disjunction between the cheapness, easiness and usefulness of the tools and the elephantine efforts companies can make at implementing them. Whether it’s corporate procedures, the not-invented-here syndrome, IT methodologies for testing and implementation using gold-plated consultants or glacial approval processes, it is clear that it is the institutional structures of business, not the tools, which define the initial playing field and the rules. Two surveys, from the Aberdeen Group and McKinsey, relate the intentions to use Web 2.0 tools to the performance of the organisation, and highlight the importance of this point of departure. Although these examples are from external-facing use of Web 2.0 (and we are focusing on the use of Web 2.0 for supporting knowledge work), they serve to illustrate how the institutional launching pad influences the trajectory of use.

The Aberdeen Group survey first establishes the relative strength of the marketing function of the organisation using five dimensions: process, organisation, technology, knowledge management and performance management.4 It then finds that, although all companies seek to use Web 2.0 tools to increase brand awareness, there are significant differences in how they apply them. Best-in-class performers seek to use Web 2.0 technologies to interact with customers and collect their observations regarding brand and marketing to develop new products. The average and laggard performers seek simply to ‘improve the online experience’, which is simplistic Web 1.0 thinking dressed in Web 2.0 capability. The survey concludes:

Best in class companies realize that metrics such as customer satisfaction cannot easily be increased simply by providing Web 2.0 portals on a company website; there must be an internal process where the insights exchanged in these forums are used to grow the business.

The McKinsey survey divides its respondents into those who are satisfied with Web 2.0 and those that are dissatisfied.5 Satisfied respondents confirm:

• higher levels of change in communications with customers and suppliers (26 per cent of satisfied to 12 per cent of dissatisfied) and in talent hiring (27 per cent to 13 per cent);

• the creation of new roles (33 per cent to 9 per cent);

• movement to a flatter hierarchy (33 per cent to 13 per cent).

Of the companies that were dissatisfied, 46 per cent say Web 2.0 has not changed the way their companies operate. Only 8 per cent of satisfied companies say the same.

Clearly, the application of the tools varies according to the capability of the tool users to conceptualise improvements and implement them. Every organisation has inertia. This inertia is a vector property with force and direction. The results of these surveys suggest that Web 2.0 is a tool which will amplify inertia. It allows well-functioning organisations to move to a significantly higher level of performance: product co-creation, organisational restructuring and new job roles are generative structures which enhance performance. So this technology will increase the capabilities of the already capable – but to move into this realm of capability, tools are not the answer. Giving average organisations Web 2.0 tools is as little the answer as giving a power saw and router to an apprentice for a month. A good tool must be ready at hand, but the hand must also be ready.

How might one assess the capability of a firm’s institutions to catalyse new approaches? Looking at history is a good start: if similar tools have been tried in the past, if communities of practice have stuttered, if formal networks have languished and bulletin boards lament ‘is anyone there?’, then Web 2.0 tools probably won’t work either. As Peter Senge says, a lag in achieving success will generally lead to the abandonment of a project: this has nothing to do with the underlying capability of the tools. Therefore it may be better to make a realistic assessment of a company’s ability to change its practices to make a success of the use of new tools. If the company’s capabilities and institutions are not conducive to adoption, then the norms which will influence adoption should be worked on before the introduction of the tool.

The institutions that guide behaviour and social action must be identified and analysed to examine the extent to which they constitute barriers or facilitators of adoption. Then, as necessary, the social processes which change institutions need to be initiated. It is beyond the scope of this book to define the appropriate change management processes, but in the face of so much adoption failure, we can use some of the ideas of this book to explore promising approaches for Web 2.0.

A simple way is to identify and articulate the postulates which influence people in their decision to use the tools to improve business outcomes. Postulates which might lead to rational non-adoption of Web 2.0 systems include:

• Managers can be deceitful: watch what they do, not what they say.

• It’s best to keep doing things the way we have before.

• Keep things to yourself.

• Making mistakes leads to punishment.

• Only the group needs to know what the group knows.

• Only experts should express an opinion.

These norms and institutions can be changed over time: fear of speaking in public can be reduced when there is a culture that encourages asking questions and holds that having a go at something and failing is better than not trying, for example. These norms can be changed in organisations by leaders displaying these characteristics and enacting them in micro-interactions with staff over time. But it is a fragile process which can be easily destroyed by all-too-common leadership change, reluctant middle management, an unfortunate event or two where a question is ridiculed by a leader or a pack, and a general national culture which discourages ‘stepping out’ (also known as the tall poppy syndrome).

An institutional perspective helps. If the normative structure contradicts public contribution through norms of modesty and knowing one’s place, vilification of mistakes or even vengefulness, and this norm is massive, reified and unspoken, then the scope of Web 2.0 should be tailored appropriately. Contribution might be better served by accepting the status quo and becoming expert and authority driven, with review and legitimation processes prior to publication, protected wiki pages, wikis of restricted scope and access and so on. This is unfortunate, and will suppress the wide participation, weak ties and network effects so crucial to Web 2.0 advantage, but it is likely that some Web 2.0 utilisation is better than none.

The institution of leadership

It is a cliché, but senior management commitment is very important to the success of Web 2.0 undertakings, as it is for most deliberate transformation initiatives. This is largely due to their direct, legitimate authority and their span of influence. Management will generally look to a business case for a set of tangible benefits, as they themselves are generally judged on such results. This kind of business case is often hard to find for Web 2.0, but the use of spaces and flows helps to nail it down. A strong business case will also help to maintain commitment if progress is not as exalted as anticipated.

Even where the amount of money (if any) to be spent on wikis, RSS or a blog system pales into insignificance against that spent on e-mail management, document control or ERP upgrades, someone will have to write a cheque at some stage, particularly in organisations where IT services are outsourced. Leaders must be convinced that the technology offers functional and productivity benefits or some other reason (such as staff attraction or workforce modernisation). Unfortunately, few managers devote the time to finding out what these benefits are.

But setting an example of using these systems and exposing one’s opinion to criticism and feedback is a vital aspect of convincing the tentative to start using them. What therefore influences the adoption of a technology by leaders themselves, other than financial commitment? Here are some thoughts to be considered:

• What constitutes leadership in the organisation: a dark suit, a serious manner, always being right, being paid significantly more or being part of a special society? If the in-group, institutional prototype for managers has characteristics which militate against the openness and level field of Web 2.0, it is unlikely that managers themselves will participate and inspire by active leadership.

• Managers may be less willing to expose themselves in a public forum. Managers can be very sensitive to the need to manage signals to their own workforce and their own image. They may be unwilling to participate in forums which are dynamic, interactive and broadcast universally.

• In organisations where managers move through roles in a matter of a few years, there is little incentive to implement strategic initiatives which bear fruit over the longer term. The building of organisational memory is an investment that is for the whole organisation and for the future, but management reward is often expressed through other forms of measurement.

• Managers themselves look upwards to their leadership. If the managers’ manager is risk-averse, hostile to fluidity and dynamics, top-down and deterministic, then the manager will scarcely be inclined to take risk.

• Tolerance of dissent – the ideal collaborative environment is anti-hierarchical with little sense of role-based hierarchy and where knowledge rather than position determines authority. At 38 metres, the Hall of Supreme Harmony in the Forbidden City within Beijing was the tallest wooden structure in China for hundreds of years – no one was allowed to build a higher building than the Emperor. Does your organisation allow workers to have better ideas than the boss?

The locus of decision-making – business ownership

For many reasons, business involvement in technology projects is a critical success factor. This will be no exception in Web 2.0 projects – indeed it may be even more pronounced. The McKinsey July 2008 Report on Web 2.0 concludes with a critical and unambiguous indicator of Web 2.0 satisfaction: those companies with the lowest levels of satisfaction with Web 2.0 technologies were overwhelmingly provided with them on the initiative of the IT department (36 per cent of those dissatisfied compared with 11 per cent of those satisfied). Conversely, 25 per cent of those satisfied with Web 2.0 use it in a context where ‘the business identified new technologies and brings them into the company without IT support’. While business involvement in IT projects is now received wisdom, this observation perhaps raises the stakes: lack of involvement by the IT department as a critical success factor! In all likelihood this reflects a strong readiness, leadership and feeling of perceived control over the technology in user departments.

The drumbeat of continuous improvement

There is no contradiction in a motivated or compliant workforce not adopting a useful tool. Even if the tool is understood by the protagonists to be useful and easily apprehended and learned, that workforce, although motivated to perform well, is not necessarily motivated to improve. An incentive to improve working methods is a different social object to the one which drives performance as measured by existing management criteria. In the absence of a salient norm which aspires to continuous improvement, the adoption of tools must be driven by some constraint which makes higher levels of performance clearly desirable – such as measured targets for performance improvement, regular increases in targets or decreases in resources to achieve the same targets. This makes non-improvement untenable.

Harvesting the benefits of improvement

In organisations where managerial positions change every year or two, the motivation of managers to introduce change (and ‘rock the boat’) which produces benefits which only materialise in the mid to long term may be low. Increasing levels of organisational memory using Web 2.0 tools is just such an improvement: it may only be useful to my managerial successor, so why expend the effort? A fully internalised institution that motivates managers towards the greater good is required.

Privacy

Like the phobia of public speaking, the fear of writing in an open environment is a basic one. In spite of the repeated observation that the younger generation places information of hitherto inconceivable privacy in public spaces, does this mean they will rush into corporate publication? Of course it doesn’t! As observed in the Facebook News Feed fiasco, the otherwise unworried users resent being lifted out of the crowd and they do not like their data being used for purposes they did not themselves mandate. There would scarcely be a corporate environment in which users would be permitted to be anonymous, so every contribution to a wiki, blog or corporate Facebook will be associated with an identity. Secondly, most corporate environments consist of tribes who know each other well and have a passing acquaintance with other related tribes. Being revealed and exposed to ridicule by people you know or with whom you are acquainted because of an ill-considered contribution is a few keystrokes away, with the kind of consequences that may pursue a contributor for the rest of their career, should there be one.

It would seem that because of the scope and permanence of contributions to Web 2.0 software, public contribution might be a significant risk to one’s identity and status in a corporate context. What ameliorates this risk perception? We need to understand this. Being a recognised expert would be one factor – in this instance what one writes is almost taken as gospel anyway. Being confident in one’s knowledge would be another – as would be the case with well-educated scientists or engineers, for example. Being used to putting one’s ideas in public would be another advantage – consider university academics who send in contributions to journals for ritual degradation in reviews, for example. Then there are those who just don’t care what others think – quite possibly an inter-generational genetic defect we would all like to possess.

Power and participation

How is control created and exercised in your organisation? For Web 2.0, this is by no means a trivial question as the exercise of power is crucial in being successful with a technology which, to a large degree, is based upon the presumption of free and open expression and a volunteer ethos. It requires a form of work which does not necessarily measurably improve the performance of specific tasks, at least not in a way which can be mandated in advance. Further, it requires the changing of habits (i.e. from e-mail to wiki, from face to face to blog, from creating information to reusing information). Add to this the management fear of ‘letting go’ and allowing open use of Web 2.0 tools in the enterprise. But the systems will only become truly productive when power has shifted from the direct to the indirect and then to the institutionalised: when staff compel themselves to contribute their knowledge to wikis and blogs to conform to inner value constructs, in which case management control has increased substantially.

As we have seen, there are different dimensions to the issue of power. Let us first consider direct control, or the exercise of explicit authority and sanction for non-conformance. This has been tried with conventional knowledge management by providing rewards or promotion for measures such as the number of contributions to a forum. This has generally not been very successful and is unlikely to be successful with Web 2.0. However, the converse scenario looks different: if a leader instructs their group to use a wiki or a blog where there is a latent demand for such a tool, then this use of power to remove other forms of inhibition and legitimate participation can be very successful. This legitimation is even stronger where the leader leads by example and contributes actively to wikis or blogs or supports them with material or time.

Under these circumstances it would seem that the imprimatur of leadership and explicit power is a necessary but not a sufficient condition. So let us consider indirect control, which is the creation of an environment which has the secondary effects of facilitating or promoting contribution to and participation in open collaboration. The objective is to motivate participants to contribute through the power of social constructions and institutions to guide perception, thought and behaviour in a direction which is favourable to the development of a common good. These constructs are generally reified, having a life of their own, independent of the good sense or perception of the protagonists.

The power of the tall poppy

Many countries have an institution known as the ‘tall poppy syndrome’, essentially a behavioural norm that one should not get too far above one’s station or ahead of others and thereby risk being cut down to size.6 This ethos can be introduced into otherwise high-performing groups as part of the baggage of the wider society. And indeed, in open environments such as an enterprise wiki, I have encountered several instances of benign colleagues telling their peers (in a very nice way) who had been video-recorded for podcasts that they were ‘glory seekers’ or ‘star struck’: a cutting remark. The tall poppy syndrome is a power institution which will block adoption and use of Web 2.0 technologies, effectively slapping down people trying to ‘do the right thing’ (another power institution of course).

The cult of expertise

The institution of expertise enables holders of knowledge in discipline areas to wield influence within those areas, define what counts as knowledge and how one gets there and indeed how we should feel about ourselves vis-a-vis the knowledge we absorb. An examination of the learning process, levels of interactivity and proceduralisation will reveal how knowledge is conceptualised. Are relationships for future collaboration built by moving new people through the organisation or are there siloed walls, secret languages and rites of passage? The rituals that guide the protocols for knowledge exchange and creation vary and depend upon the institutionalised view of expertise. A cult of expertise will potentially scuttle attempts at participative and collaborative knowledge development but may lead to encyclopaedia articles of great depth, for example.

Repressive politeness

The language and rules of engagement on wiki sites generally are revealing. Reminders to stay polite are ubiquitous, and politeness is the generally accepted rule. Such formats constitute fertile ground for collaboration, co-creation and knowledge sharing. In the case of impoliteness there is generally immediate group sanction or even withdrawal of others’ entries. The publicity of the forum, and its permanence, enforce a public protocol with the highest common denominator of politeness and gravitas. Nevertheless, the power of institutions of politeness to create restraint may lead to a loss in signal richness and an anodyne uniformity.

The power of special interests

The provision of software services within organisations is intimately bound up with money: the staggering rates of pay given to SAP module consultants and programmers and Oracle database administrators are a testament to the laws of supply and demand. The implementation of self-service freeware places service providers in a conundrum: how to justify the use of specialists who need to be charged out at $250 per hour to provide support for free software which users can adapt and configure themselves? It is not unknown to meet with downright hostility under these circumstances, with resentment and a palpable sense of letting the genie of self-service out of the bottle of technological servitude. I have experienced multinational service companies deliberately ‘going slow’ on installing a freeware wiki until they ‘had developed a policy’. So it is important to understand who will win and who will lose from the introduction of cheap or free ubiquitous software which requires no training or specialist support. In the case of Web 2.0 technologies, the Gartner Group states that many sceptical IT departments push back against Web 2.0, fearing loss of control, security issues and user empowerment.7

Power and influence – a summary

It may sound paradoxical, but the successful adoption of these democratic and anti-hierarchical technologies revolves around the exercise of power. To move people to behave in a new way which suits the institution of management, which suits the profit-making or governing organisation, requires the exertion of a force which overcomes natural momentum and inertia. This power to change can be exerted through direct control – by fiat, by order, by monitoring and by sanction, or by aligning the outcomes of the job with the products of the tool. It can be exerted by indirect control – incentives, performance measurement, the provision of a facilitating and motivating environment, and inspirational and exemplary leadership. Or the control can be exerted by the development and presence of a system of beliefs which make potential resistors unable to see the alternative: where the conceptual status quo becomes participation and where the matrix of actionable possibilities is determined by the internalised belief structures of the brave new way – a way which requires the collective intelligence and memory of the organisation.

Analyse the social groups

Organisations approaching Web 2.0 implementations strategically will often plan a ‘rollout’, a programme to spread awareness, provide usage scenarios, establish cells and allies, and argue the business opportunities to supervisors and managers. Given the highly variable propensities in groups to adopt the technology, an analysis of likely adoption based upon applicability, payback and the group prototypes is called for. Table 7.5 provides such a strategic approach.

Table 7.5: Analysing groups as potential adopters of Web 2.0


1. There is of course still a substantial amount of knowledge required to implement Web 2.0 in a robust, user-friendly manner, but in general this will be the province of a few people within organisations. And in contrast to application and database technology, this knowledge is largely not ‘technical’: the people might be more accurately described as ‘power users’ or business analysts. This is not to say those with a particular vested view and interest may not insist on large teams of technical support … Excellent books on Web 2.0 implementation which focus on technical and implementation aspects are Casarez et al. (2009) and Newman and Thomas (2009).

2. AIIM – The ECM Association (2008).

3. Dufft (2008).

4. Aberdeen Group (June 2008).

5. Bughin et al. (2008).

6. The term is sometimes attributed to the Roman historian Livy, who described how Tarquinius Superbus, a tyrant of Rome, sent a silent message to his son to eliminate the leaders of a conspiracy by cutting off the tops of poppies in a garden and telling the envoy to report to his son what he had seen: ‘[Death] cropt the heads of nations, as Tarquin struck off the Poppy-heads.’ Institutionalising this as the resentment of peers is probably a far more powerful means of suppression than were the leader to exercise this power overtly.

7. Phifer et al. (2007).