3

The modern business environment

‘To someone with a hammer, the whole world looks like a nail.’ This is a proverb one might readily apply to information management and communication tools. To IT specialists, salesmen and web evangelists, the world is pieces of information waiting to be processed, tagged and stored; such is the hype that has surrounded successive waves of technology product announcements over the past 30 years. An evidence-based database juxtaposing claims and disappointments for the IT industry would make interesting reading. Are we standing before a similar mountain of hype with Web 2.0?

There does seem to be a lot happening. A young woman of about 25, an administrative assistant at a Chinese university I visited, writes a blog for women, about their lives, their pain and their joys. She writes short stories into the blog of things that women tell her, in particular about the sadness of their marriages and the disappointment they begin to feel the day after the ceremony. She has 500 readers, people who she says return regularly because she catches their sadness and frustration. It is impossible to read a newspaper online without seeing the social tagging links to websites like Digg and Delicious. Stories are rated, linked, blogged about and classified. I read about Richard Branson twittering on his Virgin flight to Orange County: ‘Arianna Huffington and I chatting on Virgin America’s inaugural flight to OC. Have put my trousers back on. 2:11 PM Apr 29th.’ And of course there is the use of Twitter.com in the disputed Iranian presidential elections of 2009: ‘RT @hughdeburgh RT @iran09 To world press in Tehran: People have died tonight, B a witness at least. Don’t let them die in the dark #iranelection.’

‘Web 2.0’ was deliberately so baptised to contrast with ‘Web 1.0’: it encapsulates an evolution from client–server relations (where there is an implied master–slave relationship) to peer–peer (where partners stand on an equal footing), from one-directional broadcasting of information to conversations, from formal specification of products and objectives to iterative collaboration, from planning to evolving. The Amazon.com of the ‘Web 1.0’ era allowed a person to search the Amazon books and music database catalogue and buy their products. The ‘Web 2.0’ Amazon.com allows a person to write a review of the products, rate a book, discuss their views with others and even merge the Amazon product catalogue into their own product database using Amazon functions. Web 2.0 is a platform of possibilities, allowing people to construct and arrange information according to their needs and tastes rather than predefined rules for interaction. Indeed, we probably shouldn’t talk of ‘users’ of Web 2.0: one is a partner, collaborator, participant or co-creator.

This language resonates strongly with modern business rhetoric, which emphasises speed, dynamism, unpredictability and flexibility – ‘virtual, adaptive and contingent’. IBM calls this the ‘flex-pon-sive’ corporation:

… it is the description of a company that responds with lightning speed and agility to rapidly changing business needs. This company must have a focus on processes that are enabled for change through IT.1

Corporations have changed – and continue to change – their underlying organisational models to cope with the unpredictability that has accompanied technological, economic and structural change. They constantly reconfigure themselves, their partners and their processes for new projects, adaptive services and demanding customers. Castells describes this as the networked enterprise, a ‘… lean agency of economic activity, built around specific business projects, which are enacted by networks of various composition and origin: the network is the enterprise.’2 These ‘networked enterprises’ succeed because of their ability to create competitive, desirable commodities by sharing information signals with customers and suppliers.

The tools of the Web 2.0 suite, with their emphasis on equality of participation, openness and ease of use, appear to offer compelling arguments for corporate application, within and beyond the firewall. Indeed, most knowledge workers expect to be able to use the tools they use at home on the World Wide Web at the office.3 But in contrast to the personal sphere, what counts is getting work done or adding some value for some future time. By work, I mean the production of the main outputs of any organisation, the primary value chain of marketing, production, logistics and sales, and, in support of this, the secondary value chain of human resource management, finance or IT services. Information systems are intended to support these functions, but often do so inadequately and sometimes hinder and constrain them. Active adoption of systems that appear to offer strategic, tactical or operational benefits is driven by human institutions and perceptions of value, not just of value to production, but of corporate acceptability, personal taste, trust and managerial control, compatibility with existing production norms, authority, fear and egoism.

We need to ask of Web 2.0, therefore, not only how it can be used in enterprises, but whether it will really be adopted and useful or just of marginal interest. Indeed, will it be real ‘Web 2.0’ or some conventional shadow of it? Will it satisfy the ‘fast return at minimum cost’ principle which dominates IT decision-making in many corporations today? If Web 2.0 technology is potentially useful, are there other more useful tools? In whose interest is it that it be used or not used – management or operations – and which are the stronger institutions in an organisation – hierarchy and authority, regulation or professional dedication? Who will use it more, who will use it less? A 2008 survey by Gartner of Japanese firms’ use of Web 2.0 Internet facilities found a decline in use, which Gartner attributed to increased concerns about compliance with Japanese Sarbanes-Oxley legislation and the security of intellectual property.4 So, in short, not everything in the business world is a nail. Indeed, it isn’t about the hammer, it’s about the carpenter, the house, building codes and even the weather.

In this chapter we look at the factors which are often cited as compelling evidence for the impending success of Web 2.0 in enterprises: information overload, knowledge management, network effects, generational change, globalisation, the ‘strength of weak ties’ and the ‘wisdom of crowds’, for example.5 Generally these appear to be strong macro-level arguments favouring a broad-based take-up of the technologies. In the main, these theories argue why Web 2.0 tools are the right solution for our time. However, this is quite a different thing to making an adequate business case for investment and then making it work (let us keep encainide and knee arthroscopies in mind here). In a later chapter we present theories of knowledge and human behaviour which we think provide a more appropriate framework for understanding the power of these systems and the impediments to their implementation and adoption. We will review the research to the present moment in these areas and examine what factors may influence the adoption of Web 2.0 in business.

We also need to consider the timeframe for the adoption of these tools – the impact of technology has often been overestimated in the short term and underestimated in the long term. Immediate feedback may show that there is little or no benefit, but over time (like telephones and electricity), as tools become systemic, pervasive and familiar and work processes adapt to and integrate the capabilities, a more typical response may be ‘how did we ever work without this?’ So there may be several different time dimensions at work: the structures and patterns which influence short-term adoption of technology, such as immediate productivity gain or unique functionality, may in the end be less significant than the momentum generated by evolving social habits and new mindsets. Patience may be needed.

So this is the key message of this chapter: there are many systemic and environmental arguments to consider in the adoption and use of Web 2.0 technologies. But although these factors should attract the attention of senior management, they are usually not arguments which suffice to make a specific business case for investment or give a clear idea of concrete applications. Further, these general arguments do not apply equally to all companies in all types of industry. It is not a fait accompli that the acquisition and implementation of these tools leads to the successful adoption of new ways of working and appreciable returns to the business.6

Mobilising knowledge assets

The primary purpose of business is to make a profit – and stay in business. Unlike the personal sphere, where these malleable technologies can be adapted to a multitude of private uses, Web 2.0 tools in business must offer a more productive, compelling alternative to incumbent methods of accomplishing a business task, or they must facilitate the introduction of new value-adding activities and services. Designing products, taking orders, planning maintenance, resolving complaints and filling in holiday applications are repeated processes which contribute to the success of a streamlined, running business. In general, new information tools need to support these kinds of activities.

The fundamental challenge in business is of course to be competitive and to generate returns from the assets of the firm, creating profits for shareholders each quarter. Productivity is the cornerstone of this: the ability to generate optimal returns from all assets at your disposal. Improvements to productivity matter insofar as they generate profit. Many of those generative assets are intangible and reside in the capacity of knowledge workers to generate new products, new services and better ways to manage the value chain from conception to delivery.

Therefore optimising the capacity and the motivation of those knowledge workers becomes a crucial strategic requirement of competing firms. What hinders this? Much of this can be traced back to complexity. Of 7,900 global executives surveyed by McKinsey in 2005:7

- 64 per cent noted a significant increase in interactions compared to five years ago (e-mail, meetings, voice-mail);

- 25 per cent stated that communication was unmanageable;

- 40 per cent stated that their company does not manage information and knowledge well;

- 35 per cent found it difficult to find knowledge and information to make decisions.

As interaction costs have decreased through digital networks and communications software, great opportunities have been created, but the ensuing complexity has generated problems of a different kind. Reduced interaction costs have made global outsourcing and inter-firm collaboration cheaper and more effective, allowing work to move to other locations and organisations, but the volumes of e-mail and mobile phone use have exploded.8 Although useful for specific interactions, these channels are highly individual, fragmented, non-persistent and unavailable to the enterprise: the only records left are scattered throughout individual memory traces and massive personal e-mail in-boxes.9 This makes it extremely difficult to learn from previous interactions and decisions, and further means that interactions adopt individual options with their own nuances, leading to greater ‘ad-hocery’ and complexity. Sixty per cent of the surveyed global executives say that managing this has become much more difficult.

Digital technologies have also increased the volume of what is called ‘virtual work’, which is the distribution of work across barriers of time, space and the firm. This brings many advantages: access to the best resources and talent, use of all 24 hours in a day, close proximity to customers and the ability to ‘punch above your weight’ by finding business partners with key skills that complement yours. Virtual work takes many forms: telework, mobile virtual work, customer frontline work, virtual teaming and the virtual enterprise. But it also involves risks: miscommunication, loss of trust, loss of managerial control over performance and productivity, a longer working day and impaired coordination and cohesion. How is a unified and positive organisational mindset to be maintained in the face of such workforce fragmentation?

The fundamental proposition is therefore that tools are needed for knowledge workers which reduce information complexity, which minimise unproductive e-mail and one-to-one interactions, and which build up corporate knowledge for others to use in the real-time, digital environment of the modern workplace. Where possible, these tools need to maintain a sense of cohesion and work against the risks and disadvantages of isolation and distance. Properly managed, the Web 2.0 tools we saw in the previous chapter, such as wikis and blogs, are excellent vehicles for the capture of knowledge within the context of workflow rather than as an additional (and therefore ultimately doomed) data entry step in routine business processes. Add to this the classification of that information through an organisational semantic web augmented by social tagging, and the problems of information management begin to ease. Corporate social networking tools (exemplified by the functionality of Facebook or MySpace) offer the ability to disclose information about the self, facilitate group relationships, enhance affective relationships and strengthen cohesion. A coherent and consistent ‘single version of the truth’ may begin to emerge, not because of rigid controls by the few, but because of the attention of and contribution by the many.

Generational change

Web 2.0 tools are often said to be the implements of the ‘net generation’, who have grown up surrounded by the Internet, mobile phones and computer games. Web 2.0 capabilities first appeared on the Internet and were quickly adopted by young people in particular for social, special interest and non-commercial interactions. It is anticipated that this generation will expect and demand similar tools at the workplace and will indeed be able to use them to improve organisational performance by working at Internet speed.10 But the world does not fall neatly into ‘digital natives’ and ‘digital immigrants’. Depending upon the psychosocial attributes of individuals, there are always people who have the characteristics of both generations: some digital natives are disengaged and some digital immigrants are ‘naturalised’ and fluent participants. As the factors that cause changes in attitudes and behaviour evolve and spread, so there will be people who absorb these attitudes and behaviours at different rates and in different proportions. There are also intergenerational flows from younger to older, as children influence their parents and teach their grandparents how to watch YouTube (and even vice versa).

But even with these caveats in mind, we can nevertheless generalise about shifts in attitudes and behaviour in a generation that is hitting the workforce now, and whose characteristics will become more pronounced and dominant. Enterprises will need to implement tools which match their expectations and patterns of interaction, and enterprise technology departments will be judged on their success in achieving adoption of new and evolving enterprise information tools. The net generation, so it seems, are the people for whom we need to start designing and implementing technologies – and Web 2.0 tools are natural candidates for a range of knowledge work activities.

Increasingly, young people emerging from universities and higher education institutes do so having had ubiquitous access to computing, to smart mobile devices and to broadband Internet. The provision of these facilities has given them a level of expectation about what constitutes an acceptable level of service and connectivity in their education and in their leisure time. But of course it’s not only about the technology: technology must be seen as one element embedded within a set of social norms, values and expectations.

In anticipating the demands on workplace computing, we therefore need to look at the social and historic context of the emerging generation of workers and what has shaped its expectations. A comparison with a previous generation, the baby boomers, will help us here. The boomers were born into the postwar optimism of the Anglo-Saxon and European countries. Birth rates increased until the mid-1960s, when they declined to a low point in about 1975 (the Germans affectionately called this the Pillenknick, the bend in the population graph due to the contraceptive pill). The baby boomers delayed parenting and prolonged their youth. As the biological clocks continued to tick down, however, the birth rates took off again, producing the ‘boomer echo’ generation, from which we are deriving the current and future crop of knowledge workers.

Baby boomers were brought up in a time when the dominant information technology was television and the mode of transmission was a unidirectional broadcast style. The formation of public opinion and personal taste tended to be along broad lines. There were left and right, mods and rockers: there was little or no interactivity with technology, diversity was restricted and collaborative co-creation of knowledge and ideas was low. This is not to say that people passively accepted content or didn’t have choices: indeed, there was an explosion in the 1990s of television and radio providers catering to a wide variety of market segments. But as Neil Postman points out, ownership of the means of transmission gives the ability to set the thematic spaces and agendas for discussion for society at large.11

At the same time, however, we cannot assume the attitudes and capabilities of baby boomers are cast in stone. A survey of baby boomers by McKinsey showed that 40 per cent of them are ready to ‘change my life as I age’, that 77 per cent are on the Internet and that ‘this generation’s experiences … have generated a real openness to change’.12 The Gartner Group argue that more differentiated criteria than age are required to account for propensities to use or not use Web 2.0.13

But certainly in contrast to baby boomers, the children and grandchildren of the boomers in the USA were confronted with the personal computer and, from the early 1990s, with the Internet. They have inherited outsourcing, downsizing and a cooling of affections between employers and employees. They are more inclined to use ‘I’ than ‘we’ and their social role profiles tend to be personal rather than group oriented. They are generally wealthier than previous generations, well educated and, compared with previous times, are raised in environments which encourage curiosity and openness. Since the widespread affordability of broadband, features of the Internet such as interactive chat, video streaming, search engines, infinite numbers of websites, news, culture and sports subscription have become commonplace and changed information behaviour. They seek information rather than passively watch a limited number of TV channels, they observe diversity every day, and they engage regularly with a wide range of previously inaccessible opinions and viewpoints in communities that they actively choose. Practical objectives are achieved online: banking, booking holidays, submitting university assignments or observing surfing conditions. In a nutshell, the social context and the technological capabilities have changed radically since the boomer days and co-evolved to create the mindset of what Don Tapscott characterises as ‘The Net Generation’.14

There are some grand dangers in simplifying the characteristics of generations, but let us persist, as these generalisations are widely used to justify the need for new workplace information tools. Tapscott orders these into ten themes: they are open, inclusive and independent. They are used to open expression, they innovate freely and naturally, they are sceptical, require evidence and reasoning, and expect immediate gratification. They are sensitive to the image of the corporations they work in and have a need to be taken seriously. These characteristics are generally echoed by Rigby, who, writing about the potential for Web 2.0 tools to engage young people in political and civic activities, describes them as politically involved, critically active, technology savvy and influential.15

Utrecht observes the potential of Web 2.0 in education and describes current students as the ‘customisation generation’.16 At a basic level, students customise their Windows desktop, settings and menus. In a more sophisticated process, students customise their approach to learning, taking what they perceive they need to succeed according to the rewards and constraints of the system. This mirrors the logic behind the introduction of IBM’s Common User Access architecture for Graphical User Interfaces in the 1980s: instead of being driven by rigid menus and transaction codes, users would multitask and themselves drive the sequence of work on the computer.

John Seely Brown describes several dimensions of current schoolchildren whom he observed as part of work at the Xerox Palo Alto Research Center.17 Their literacy is in navigation, multitasking and sorting multiple informational media (not just text); they learn by finding and sorting information from a vast, available array, not by being spoon-fed by figures in authority; they put this information together in purposive yet original ways to achieve specific goals (‘bricolage’), rather than using analytical, deductive methods; their learning is largely action-based rather than being derived from an internalised theory or body of knowledge.

If these are plausible generational characteristics, the outcomes of social, ethical, economic and technological currents, it is clear that expectations at the workplace will also change. Instead of accepting a hierarchical, command-and-control workplace, routine and constrained, segregated, with workers clinging to their jobs, this generation will bring entirely new characteristics to the workplace and therefore to their demands for tools. These tools are not only what this generation demands; they are a natural fit between its characteristics and the self-organising, emergent behaviour required in the workplace.

But there are reasons to qualify the need for new information tools in the workplace based upon the needs of the incoming generation. Firstly, as demonstrated, it is not only Generations X, Y and the Millennials who are responsive to systemic social and technological change. Baby boomers are also capable of appropriating and using new techniques and tools.

Secondly, if information tools are to be successful they need to satisfy a range of criteria, not just resonate with the feelings of the current younger generation. They need to be ready at hand, providing accurate, precise, timely information relevant and applicable to a task. Customisation and informality are fine, but much of work is uniform, requiring coordination and coherence.

Thirdly, in a workplace characterised by outsourcing, high unemployment, short-term work and contracting, career change and individualism, attitudes at home or to society may be quite different to those of the workplace. If the objective of information tools is to renew or generate organisational capital, it may be that the new generation has even less interest in contributing to the organisation’s intellectual assets than the previous one.

Finally, young people follow developmental trajectories and develop preferences which are only partially influenced by global factors such as materialism and technology. National, cultural and economic factors are also determinants of the expectations of generations. Where young Anglo-Saxons use their mobile phones to talk, Catalans use them to arrange to meet face to face.18 Such cultural preferences co-determine the use of technology; technology is not adopted uniformly in all contexts.

The loss of baby boomer knowledge

As baby boomers head for retirement, there are fears that firms are losing a key generative asset, limiting their capability to make effective decisions, solve problems and continue the relentless cycle of competition, innovation and adaptation. Wheels will be reinvented, lessons will be forgotten and mistakes repeated at great cost. Relationships with employers for baby boomers have tended to be longer term, containing relatively high levels of mutual commitment and a personal identification with the firm, although job tenure and security varies widely from country to country. The boomer generation was highly educated and has learned much on the job, and as this generation moves into retirement, the tacit knowledge gathered over decades of work experience will be lost. This is already posing a problem as some of those boomers with adequate wealth seek to retire. The effects of this knowledge loss vary between industries and the type of role played. It is expected that oil and gas and healthcare, for example, will be more affected than manufacturing and technology. There is greatest concern at the senior management level, but firms are also worried about losing middle-manager expertise as well as specialist knowledge. There are a number of strategies to ameliorate this: phased retirement, job redefinition, mixed-age teams, mentoring and flexible ‘at call’ working arrangements. Indeed, the concept of retirement will undergo change as a nexus of corporate knowledge loss, personal financial need and the boomers’ refusal to ‘grow old gracefully’ coalesces.

But although a genuine and significant challenge for many western economies, the disappearance of the baby boomers may not be as abrupt as is expected and the ‘lockstep’ retirement of entire cohorts is probably a myth. A McKinsey survey revealed that 62 per cent of boomers ‘worry that I have not planned sufficiently for retirement’.19 This is confirmed by a Bucks Consulting survey of 480 American enterprises, in which 86 per cent of mature workers said they would prefer to continue work for financial reasons.20 Another McKinsey study prior to the financial crisis of 2008 showed that baby boomers, although having earned more than twice that of the previous generation at the same age, have saved poorly, with only 2 per cent of income being put aside (in 1985 10 per cent of income was saved). The ratio of debt to net worth is 50 per cent higher than the previous generation at the same age and 69 per cent of baby boomers are unprepared for retirement (about 50 per cent of baby boomers are ignorant of this exposure). Thirty-eight per cent say it is extremely likely that they will have to work longer than the normal retirement age and 85 per cent say it is at least somewhat likely. The reasons to postpone retirement are mostly to meet expenses (35 per cent) and maintain their lifestyle (24 per cent).

But the financial crisis of 2008 has led to devastating losses in retirement portfolios. Hits of 25 per cent are not uncommon and two trillion dollars was wiped off the value of US retirement savings in 2008–9. Retirement ages are being pushed out even further still.21

Taken from this perspective, the imminent disappearance of baby boomers from the workplace has perhaps been exaggerated. They will have to work longer, reduce their draw-down on savings and therefore continue to have managerial control over policies, budgets, strategy and attitudes for some time to come. A 2009 Gartner report states that as recession bites, baby boomer CIOs (according to HR reports more risk-averse than their business peers) will first lay off Generation Y IT employees on the last-in first-out principle, valuing experience rather than potential. In this process, IT service providers will lose ‘access to Gen Y’s strengths … These digital natives often help more experienced peers understand and appreciate new ways of using and applying new technologies.’22

In the USA, between 1977 and 2007, the number of people employed over 65 years of age (these are not baby boomers) rose by 101 per cent, compared with 59 per cent for the total employed population. Since 2001, in a trend starting in 1995, most of these older workers are working full time rather than part time. As mass layoffs take place as a consequence of the global financial crisis, Figure 3.1 suggests older workers, far from being disproportionately retrenched, are being retained.

Figure 3.1 US labour force participation by age. Source: US Bureau of Labor Statistics (http://www.bls.gov/spotlight/2008/olderworkers/).

But in a significant survey of 4,000 firms in Germany entitled ‘Farewell to the Obsession with Youth’, Commerzbank found that 85 per cent of the participants focused on educating younger employees as a method of managing demographic change and only 44 per cent educated their older employees.23 Indeed, one-third saw no possibility of keeping their employees until the age of 67, and 12 per cent saw it as possible only after extensive adjustments to work processes. Companies with more than 250 employees in particular focused on winning over younger hires in the war for talent rather than educating mid-career and older employees. But at the same time, they recognised that the aging of a wealthy and choosy society meant that they needed older employees in order to remain responsive to the market’s changing demands for products and services. In spite of their current strategies, 77 per cent of the respondents said a move away from the obsession with youth was necessary and over three-quarters demanded a longer working and learning life. This would only work if all stakeholders – government, states, industry confederations and unions – worked together.

So there is a varied and complex picture of the impending retirement of baby boomers. In the US, with voluntary retirement being delayed and older workers prepared to remain, the boot is now decisively on the other foot. Under the more social European model of greater state-funded retirement, the picture is different but in a state of change. Ultimately, corporations will decide which workers go or stay, depending upon regulatory and institutional criteria and the national-cultural models of capitalism in place. They will make decisions based upon what they deem important for their business at that time.

Network dynamics

Web 2.0 is a phenomenon based upon networks: networks of computers overlaid with software capabilities which facilitate the creation of linkages between the people who use that software. This introduces the peculiar mathematical properties and powers of networks, which are increasingly cited as foundational arguments for the use of these technologies in commercial contexts.24 The development and usefulness of links between nodes in networks follow certain patterns which underpin how the networking technologies of Web 2.0 create value.

If each node in a network has on average at least one link to another node, a large connected cluster forms which exhibits particular properties, especially in the transmission of information. The more links we add, the less probable it is that any part of the cluster remains isolated: once in a network, the probability of missing information falls as the links subsequently develop, and this reduction is not simply linear.
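This threshold effect can be illustrated with a small simulation. The sketch below is a hypothetical example, not taken from the text: it adds random links between a set of nodes and reports what fraction of them end up in the largest connected cluster. Once the average number of links per node passes one, most of the network abruptly joins a single cluster.

```python
# Minimal sketch (illustrative only, not from the text): how a large connected
# cluster emerges in a randomly linked network once the average number of
# links per node passes one.
import random

def largest_cluster_fraction(n_nodes, avg_links_per_node):
    """Fraction of nodes in the biggest cluster after adding random links."""
    parent = list(range(n_nodes))

    def find(x):                       # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    n_links = int(n_nodes * avg_links_per_node / 2)   # each link touches two nodes
    for _ in range(n_links):
        union(random.randrange(n_nodes), random.randrange(n_nodes))

    sizes = {}
    for node in range(n_nodes):
        root = find(node)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n_nodes

if __name__ == "__main__":
    for avg in (0.5, 1.0, 1.5, 2.0, 3.0):
        share = largest_cluster_fraction(10000, avg)
        print(f"average links per node {avg:.1f}: largest cluster holds {share:.0%} of nodes")
```

Running the sketch shows the largest cluster jumping from a small minority of nodes to the great majority as the average degree crosses one, which is the non-linear reduction in isolation described above.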

In networks where links do not form in a random manner (which describes most networks) and where links between nodes can be consciously navigated, the number of links needed to traverse the network to any other node, without knowing the path in advance, becomes quite small quite quickly. This is reflected in the concept of ‘six degrees of separation’, which stems from an experiment performed in 1967.25 Stanley Milgram asked residents of Wichita and Omaha to send a card to a particular person in Boston. If they knew the person they should send the card directly, but if not, they should send it to a person who was more likely than themselves to know that person. The average length of the resulting chains turned out to be about 5.5 links, or ‘six degrees’. In 2001, Duncan Watts performed a similar experiment with e-mail, again finding the average number of intermediaries to be 6.26 The implication is that the distance between any two people between whom some link can be imagined is actually far less than we might expect. It is a matter of providing a means to identify the best next step towards the target.

Far from being uniform, networks have asymmetric properties. Some parts of networks are denser than others, having strong interconnections between all or most nodes in that particular ‘sub-network’ (you and a group of close friends for example).27 But the other less common links between your dense network of friends and other dense clusters can be very useful; indeed they can be more useful and more wide-ranging than the strong ties within your own dense network of close friends. Granovetter’s famous work on ‘The strength of weak ties’ found that links to acquaintances, rather than close friends, are more likely to lead to a wider range of social advantages.28 For example, they play a critical role in getting jobs. The general structure of social networks is therefore of strongly tied clusters with links to other groups facilitated through weaker links. It takes only a relatively small increase in weak links to radically increase the average closeness of nodes to each other in a network. Strong ties, between committed friends for example, are not used nearly as much, often because close friends, having a similar profile, do not offer each other capabilities which the other does not already have. Simply put, this reflects the fact that much social activity, of which business interaction is a type, occurs by finding and drawing on people through a ‘weak tie’ (a friend of a friend or someone who went to the same school).

It seems to be a common occurrence in networks of all types that some nodes have a considerably higher number of links than others: these nodes are known as hubs, and most systems appear to fall into a pattern in which a small number of nodes have a disproportionately high number of links to other nodes. Some web pages within websites (or even over the entire World Wide Web) have a huge number of links while the others drop away to ten or fewer; some people are tremendously popular and facilitate relationships between many others; a few airports have a huge number of incoming routes, the next tier of airports far fewer; profitability is often concentrated in a few customers or customer types, and a few products often account for most sales. Power, wealth, attention, complaining customers or productive workers are concentrated in ‘hubs’ rather than being evenly distributed across populations.

Networks are not static: they grow, and as they grow they demonstrate certain behaviours, in particular preferential attachment, by which new nodes tend to link to the most popular existing nodes. Clusters form as nodes become linked, and they lead to hubs, which have a larger number of links than most nodes.
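A minimal sketch of preferential attachment (again hypothetical, not from the text) shows how hubs emerge: each new node links to an existing node chosen in proportion to the links that node already holds, and after a few thousand additions a handful of nodes own a disproportionate share of the links while the typical node has very few.

```python
# Minimal sketch (illustrative only): preferential attachment. Each new node
# links to an existing node chosen in proportion to how many links that node
# already has, so a few nodes end up as heavily linked hubs.
import random
from collections import Counter

def grow_network(n_nodes):
    """Grow a network one node at a time; return the number of links per node."""
    degree = Counter({0: 1, 1: 1})      # start with two nodes joined by one link
    attachment_pool = [0, 1]            # node ids repeated once per link endpoint
    for new_node in range(2, n_nodes):
        target = random.choice(attachment_pool)   # popular nodes are picked more often
        degree[new_node] += 1
        degree[target] += 1
        attachment_pool.extend([new_node, target])
    return degree

if __name__ == "__main__":
    degree = grow_network(10000)
    print("five most-linked nodes (hubs):", degree.most_common(5))
    print("median links per node:", sorted(degree.values())[len(degree) // 2])
```

The contrast between the hub degrees and the median illustrates the power-law pattern referred to above: growth multiplies the importance of already well-connected nodes rather than spreading links evenly.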

This information about networks is relevant to the adoption of Web 2.0, because Web 2.0 technologies link people and ideas across the entire Internet or corporate intranet. A node in a computer network is not just a computer with an IP address. It might be a person, an event, a discipline or area of knowledge or just a web page. The links between nodes are expressed as hyperlinks and signposts to those nodes, which are retained and in many cases automatically managed in wikis, blogs, social networking sites and so on. Figure 3.2 illustrates how the network effects influence communication and access to knowledge resources within the linked enterprise.

Figure 3.2 Network effects and their impact on resource access

So, because there are several levels at which we can specify node granularity, there are many forms of interconnectedness which can be supported by Web 2.0 tools. Through simple hyperlinks in web pages, the Internet is already a network of linked page-nodes. Then there are people-nodes, individuals who may have previously been isolated experts, who are made visible and connected as authors of wiki pages or members of social networking sites: they become informational hubs, their knowledge elevated and leveraged, and they become members of networks with common interests. At the group level, people work together and collaborate based upon a common interest or objective and will have strong ties to each other, interacting on a regular basis. On occasion they go outside of the group, using the weak ties, and sometimes the generative expertise of their group will be needed by others. Web 2.0, through making nodes and their competencies more visible, enhances ‘weak ties’ and the subsequent network traversal. Web 2.0 facilitates the automatic addition of network pathways across cluster boundaries, thereby multiplying the capability of the whole according to power laws rather than linear laws of growth.

At a business level, the number of nodes-as-organisations is increasing as value chain disaggregation and outsourcing occur. A firm may analyse its value chain and find it can do better by outsourcing product design and support, or information technology and manufacturing. Firms emerge and specialise to provide these services: the average size of corporations decreases as the vertically integrated concern is abandoned for the network.29 This means more nodes exist, but some nodes will continue to grow at a faster rate, as they have more links into others. The growth of nodes is preferential, favouring some nodes (i.e. firms) over others. The connections into and out of your node, and the path you must navigate to reach a specific node or node type, cannot be defined in advance, because the network is volatile and you cannot anticipate which node you will require. Therefore firms require systems which can take advantage of the laws of networks to locate nodes (such as other people who might offer specific and complementary skills) both within the firm (if it is large enough) and beyond: the strength of weak ties, the exploitation of hubs, the power laws which lead to the small degrees of separation between you and your target, even though you do not know your target in advance.

The rapid establishment of transparent information exchanges with transient partners in the modern, semi-anonymous digital marketplace requires shared network capabilities and is itself a network creator. The Internet has provided a simple, universal, robust platform for connecting many buyers to many sellers. E-marketplaces and hubs which provide shared repositories to exchange enterprise resource planning data have proliferated in both vertical and horizontal industries. But not all information can be structured and inserted into databases. Where abstract thought and unstructured discussion dominate work activities, Web 2.0 capabilities allow network interactions to happen earlier in the value chain: requirements definition, designs, quality standards and collaboration around marketing strategies can be supported through Web 2.0 tools, where the downstream ERP systems support structured, repetitive and defined transactions.30

The power of the crowd

Where groups of people are physically or formally separated, the development of knowledge and solutions to everyday organisational problems becomes fragmented and localised. Islands of information develop and local experts evolve who embody and legitimate these local solutions. As groups make local decisions and assessments, fragmentation and deviation occur within organisations, even when desired methods are initially codified and distributed as policies and procedures. Furthermore, local optimisations are not shared with other groups. Web 2.0 tools allow an incremental pooling of knowledge which facilitates input and review by others. Information can be scrutinised in open public forums and its validity and relevance assessed.

There are many such instances on the Internet of dispersed groups with a common interest in developing a body of knowledge, a product or accomplishing a job of work: Wikipedia and the open source software movement are examples of contributions of knowledge, brainpower and critical capability to a common cause. There are sites where professional groups of scientists, educators or librarians make opportunities available to develop their discipline. And there are project sites where people within organisations construct common knowledge.31 But what is the quality of this information? How reliable, accurate or even ‘true’ is it? What level of remuneration or recognition is required to stimulate worthwhile contributions? In this section we will first examine the process of information gathering in networks before moving to decision-making processes regarding or based upon this information.

Networked collective intelligence

Many hands make light work and many people engaged in doing a little bit can add up to a lot. The connectivity of the Internet and the fact that it is an information-based network means that a vast number of people can contribute a part of a whole, can look at the contributions of others and improve what others have created without having to commit substantial personal time or resources.

A powerful example of this is open source software, which is software made available for free and for extension by others on the Internet. Linux, for example, is a computer operating system created by Linus Torvalds, a young Finnish programmer. In 1991, he shared the source code of a simple operating system he had written with other programmers via an Internet bulletin board. He invited them to improve the system, placing it under a general public licence: there were no fees to pay or licences to acquire, but anyone who distributed modified versions had to make their improvements available to others under the same terms. Now Linux is a major product for managing large computers, co-developed (for free) by IBM at a cost to them of around $200 million per year, an investment which saves them money overall through the contributions of others and frees capacity to offer higher-value, less imitable consulting services. Other open source software products are the Firefox web browser, Open Office, Joomla and Apache (an open source product which runs the majority of the world’s web servers). Mediawiki, the leading free wiki product, is constantly augmented and enhanced by programmers contributing extensions they have created for their own use. The SourceForge site lists thousands of open source products which can be downloaded for free.

The motivations to contribute to such product development are varied, although perhaps they all boil down to money, love or glory. Certainly, motivations other than the financial are at play – self-actualisation, reputation enhancement, recognition and genuine pleasure are some of them. Free software can be an inducement to use associated (but not free) products and services. But the bottom line is that for large-scale open source products, no single participant has to do a lot – thousands of interconnected programmers, each writing a few lines of code, can result in a large, sophisticated product. The overall design, architecture and coordination of systems (as opposed to snippets of code) require dedication and should not be taken for granted. But these are driven either by ‘true believers’, indirect state funding (i.e. research institutes and universities) or discretionary donations which cover some basic costs.

The Internet allows access to the vast intellectual and physical resources around the globe. Of the billions of people connected, there will always be enough who are willing to participate. For example, the Galaxyzoo project is a collaboration between researchers at Oxford University and Portsmouth University in the UK and Johns Hopkins University in the US which began in 2007.32 The aim is to recruit Internet volunteers to visually classify large volumes of images of galaxies taken by the Sloan Digital Sky Survey II. It now has over 100,000 active volunteers who not only classify the images, but who also engage actively with astronomy, sharing particularly beautiful or quirky photos. As Daniel Thomas of Portsmouth University says: ‘We now have the world’s largest computer working for us, through the combined power of all these human brains.’ A similar project is NASA’s ‘clickworkers’ project, which uses volunteers to visually classify Mars craters. The website Patientslikeme33 allows sufferers from around the world to exchange stories about their ailments, adverse reactions to drugs and feelings (see Figure 3.3), leading to what the think-tank the California HealthCare Foundation calls a body of knowledge which ‘may rival the body of information that any single medical school or pharmaceutical company has assembled in this field’.34

Figure 3.3 The Patientslikeme portal. Source: http://www.patientslikeme.com/

This harnessing of public capability and enthusiasm has extended into application development in the form of mashups. Some governments, for example, encourage the public to develop mashups using existing government datafeeds. One might combine maps, school matriculation data and crime statistics and make this web page available to the public so that, for example, young couples can decide where to buy a home. The productivity benefits for government agencies can be substantial: the public is better informed and the degree of democratic participation is increased. The office of the chief technology officer of Washington, DC announced a public competition in October 2008 which solicited the creation of web pages using the datafeeds of the District of Columbia.35 The entries by the public included mashups bringing data and maps together for car-pooling, historic tours of the city and building permits for certain parts of town.

Mass contribution is not restricted to information creation: it extends to the creation and application of tagging metadata as well, the descriptive data which helps make sense of the information that is out there. The more people tag pages with whatever tags they find important, the greater the scope for other users to find a page and also for intelligent software to make automatic links between tags based upon the coincidence of labels and the sorts of words in the content. Networks of concepts and ideas evolve that can present the underlying information from many angles and for many purposes. Networked collective intelligence applies to the classifying and ordering of the mass of information on the Internet as well as the building of that information content in the first place.
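As a concrete illustration (with hypothetical page names and tags, not drawn from the text), a few lines of code can infer links between tags simply from the fact that users apply them to the same pages; this co-occurrence is the raw material from which tag networks and automatic suggestions can be built.

```python
# Minimal sketch (hypothetical data): inferring links between tags from the
# fact that users apply them to the same pages.
from collections import defaultdict
from itertools import combinations

# Hypothetical folksonomy: page -> tags applied by various users
page_tags = {
    "intranet/sales-forecast":  {"forecast", "sales", "Q3"},
    "intranet/pipeline-review": {"sales", "pipeline", "Q3"},
    "intranet/budget-2009":     {"finance", "forecast", "budget"},
}

co_occurrence = defaultdict(int)
for tags in page_tags.values():
    for a, b in combinations(sorted(tags), 2):
        co_occurrence[(a, b)] += 1        # count tags applied to the same page

def related_tags(tag):
    """Tags most often applied alongside the given tag."""
    related = defaultdict(int)
    for (a, b), count in co_occurrence.items():
        if tag == a:
            related[b] += count
        elif tag == b:
            related[a] += count
    return sorted(related.items(), key=lambda item: -item[1])

print(related_tags("forecast"))   # tags that co-occur with 'forecast', most frequent first
```

The same counting principle, applied across thousands of users and pages, is what allows tagging systems to surface related concepts without any predefined taxonomy.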

Web 2.0 technologies are in a position to harness this ‘wisdom of crowds’ through providing easy-to-access, wide-scope tools like wikis, blogs, social tagging sites, quality ratings and widgets. Much of the Web 2.0 literature suggests that the ability to network and reach the minds of the many will lead to new forms of extended collaboration, which will threaten the ‘traditional’ integrated, control-oriented development/production business models, in particular product design and co-creation with customers.36

But the proportion of commercial activity which will be subject to volunteer input is not yet clear and the degree to which it happens depends upon the motivations and capabilities of contributors and the willingness of firms to open up their product development processes to outsiders. It will be interesting to observe the sustainability of volunteer models in the immediate future, as competing models of pay-for-small-service evolve and hard times begin to bite due, for example, to the global financial crisis of 2008. In July 2008 Google announced Knol, an online encyclopaedia in which users can write articles on any subject, much as in Wikipedia, but gain income from the resulting traffic generated to the page. It remains to be seen how this will disrupt the volunteer basis of competing knowledge bases like Wikipedia, which was in fact not founded as an experiment in mass democracy but to be a world-class encyclopaedia. Nevertheless, with or without direct remuneration, networked collective intelligence appears to be an area of substantial and growing activity.

In particular, it remains to be seen how influential the phenomenon of collective intelligence will be within the firm. Firstly, there is the question of numbers: commercial firms usually try to minimise redundancy and overlap in knowledge – will there be sufficient potential contributors with the right kinds of knowledge to achieve the desired effects? Secondly, there is the question of motivation: whereas contributors on the Internet seem to be mostly (but not always) driven by enthusiasm for the subject matter and the desire for recognition, workers within the corporation may be more driven by financial incentives, which are largely associated with how well they do their own job, rather than whether they contribute to the firm’s general stock of knowledge. Indeed, there may even be norms in the firm which militate against contribution. Thirdly, the efficiency gains of collective intelligence are not clear: the distribution of work over an unmanaged network may lead to remarkable outcomes on the Internet, but it may also lead to great waste. Personnel (possibly unqualified) may spend company time following blind alleys and creating unnecessary work products – anathema to the philosophy of management which has encouraged specialisation and efficiency. Finally, even where networked collective intelligence is required within the firm, it may well be that conventional methods and technologies suffice. Perhaps e-mail, project e-rooms, Intranets – even meetings and conversations (!) – are enough.

Networked collective decision-making

So collecting information from many participants at low cost is all very well, but what about its quality and reliability? How do we assess this? How do we arrive at good decisions and evaluations? In the previous section we discussed descriptive metadata, which gives an idea of the meaning of web material. There is also metadata which is evaluative. One type of tag, the rating, provides the participant with an opportunity to express an evaluation of web content or products generally. This is usually visible as a set of stars which express a consolidated assessment by readers of the material. There are now an increasing number of opportunities to review and vent an opinion about material and products, either provided by the website owner or placed by people with an opinion onto a general-purpose subject website. So Amazon.com and youtube.com, for example, allow customers to review and rate products and content, and tripadvisor.com allows anyone to share experiences about hotels, travel destinations and airlines. In spite of the potential for information warfare and sabotage, this information generally proves to be a useful indicator, as long as one realises it comes from a source whose reliability is not known in advance. The sheer number of people who have watched a YouTube video, for example, is some kind of indicator of its quality and attractiveness.

There are several ways to develop assessments of content, decide upon courses of action and arrive at conclusions about quality. There can be authoritarian and hierarchical approaches, by which a legitimated leader or expert, informed by subordinates or colleagues, decides upon the right course of action, or mandates a value or a process to be followed. This leader or expert can be selected by various means – a strong (or overbearing) personality, ‘superior’ experience or intellect or simply the prevailing social norms – but once anointed, the right to make decisions is also assumed.

When searching for a correct answer, one might elicit information from a number of people and take the statistical mean of the responses in the belief that the average will be closest to the truth. This kind of information gathering and assessment works well for information and decisions where there are definitive answers based upon estimates and facts. Research shows that judgments such as guessing the number of jelly beans in a jar or the weight of an ox at a county fair are best arrived at using this method.37
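A short simulation (with made-up numbers, not from the cited research) shows why this works when individual errors are independent and unbiased: the average of many noisy guesses lands much closer to the true value than a typical individual guess does.

```python
# Minimal sketch (simulated numbers): averaging many independent, unbiased
# guesses usually lands closer to the truth than a typical individual guess,
# the effect behind the jelly-bean and ox-weighing examples.
import random
import statistics

TRUE_COUNT = 850                       # hypothetical number of jelly beans in the jar
guesses = [random.gauss(TRUE_COUNT, 200) for _ in range(500)]   # noisy, unbiased guesses

crowd_estimate = statistics.mean(guesses)
typical_individual_error = statistics.mean(abs(g - TRUE_COUNT) for g in guesses)

print(f"crowd estimate: {crowd_estimate:.0f} (error {abs(crowd_estimate - TRUE_COUNT):.0f})")
print(f"typical individual error: {typical_individual_error:.0f}")
```

The key assumption is that errors are independent and not systematically biased; where everyone is misled in the same direction, averaging simply reproduces the shared bias.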

One might elicit information, discuss ideas and develop options and decisions cooperatively in a group. The consensual, deliberative approach (of which brainstorming is one variety) is intended to take the best ideas and perspectives and meld them into an optimal solution. This is the mode of idea generation which appears to be most favoured in modern organisations, as it implies a contemporary democratic orientation towards the best knowledge in the group, an aggregation of the best partial knowledge towards a more complete knowledge, and the availability of critical review to weed out bad ideas and develop good ones. However, there are multiple potential distortions in this mode of decision-making.

Firstly, in groups people read information signals from others. If the prevailing views contradict their own, they are more likely to doubt themselves than the group, leading to a failure to dissent.38 Furthermore, homogeneous groups tend to become more pronounced in their views, leading to more extreme positions than those held by the individual participants.39 This is compounded when others are known to be authorities and is widely known as the ‘groupthink’ phenomenon.40 Famous examples are the disastrous assessment of Iraq’s weapons capability,41 the failure to prevent the ill-fated Columbia space shuttle flight42 and the surreal Bay of Pigs fiasco, in which 1,200 American-trained insurgents were to invade and overthrow the well-trained army loyal to the popular Fidel Castro. Indeed, most of us have probably participated in meetings where something like this has happened, albeit with less dramatic consequences.

Secondly, people in groups are not only information processors, they are social agents. If they feel that their statements will be ridiculed or lead to sanction, they will not speak out. If they feel they are of lower social status, they may not volunteer information. Conversely, other research shows that some people routinely overestimate their own knowledge, a phenomenon known as ‘illusory superiority’: if they are considered an ‘expert’, they may well believe that they know more than they actually do, meaning that the social certainty and authority they project will exert a disproportionate degree of influence on a group’s decision.43

Thirdly, arriving at the best decision is simply not always in everybody’s best interest. A decision in the best interest of the firm may mean more work or less resource for a set of individual departments who, in the service of rational self-interest, withhold information or ideas which may lead to this decision.

So how does one reduce the distortion in decision-making? It may be possible through leadership and culture management to create standards and behaviours which protect dissenters and the vulnerable, and which construct ‘the ideal speech situation’ in which people are socialised into institutions which create the best conditions for discourse and decision-making.44 Further, Surowiecki demonstrates with several case studies and experiments that group deliberation can be efficient and more effective than the smartest individual within the group, but only if there is adequate diversity and some method to aggregate the knowledge of group members.45 But the norm in organisations is that informational, social and power pressures may distort collective decision-making in deliberative groups.

Finally, one method of decision-making which tries to gather opinions from many while minimising the distortions of social presence is the anonymous vote (of which ratings are a basic form). The veracity of knowledge or the best course of action can be decided upon statistically, for example when a group is asked to rank ideas or to vote for the best ideas and does so anonymously. This can now happen over distance and include vast populations, increasing the statistical probability of arriving at the truth. As long as voting members of a group have a probability greater than 0.5 of being right, then the greater the number of voters, the higher the probability of the decision being correct or the knowledge ‘true’. This removes the distortion of face-to-face situations while gaining the feedback and assessment of participants. The number of these so-called ‘prediction market’ scenarios on the Internet is increasing and their accuracy appears to mirror real-world events.
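This is the Condorcet jury effect, and it can be made concrete with a few lines of arithmetic (a hypothetical illustration, not from the text): if each voter is independently right 55 per cent of the time, the probability that a simple majority is right climbs rapidly as the group grows.

```python
# Minimal sketch (illustrative only): the Condorcet jury effect behind
# anonymous voting. If each voter is independently right with probability
# just over 0.5, the chance that the majority is right rises with group size.
from math import comb

def majority_correct(n_voters, p_correct):
    """Probability that a simple majority of independent voters is right."""
    needed = n_voters // 2 + 1
    return sum(comb(n_voters, k) * p_correct**k * (1 - p_correct)**(n_voters - k)
               for k in range(needed, n_voters + 1))

if __name__ == "__main__":
    for n in (1, 11, 101, 1001):      # odd group sizes, so a majority always exists
        print(f"{n:>5} voters, each 55% reliable -> "
              f"majority right {majority_correct(n, 0.55):.1%} of the time")
```

The calculation assumes voters judge independently; the informational and social pressures described earlier are precisely what erodes that independence in deliberative groups.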

An example of this in action is Google’s internal ‘market system’. Google gives its staff ‘money’ to invest. The price of an event traded as a ‘stock’ reflects the probability of it coming true (i.e. 10 cents implies a 10 per cent probability). The prices of events in Google’s internal markets have predicted actual events with good accuracy: events which were priced at a ‘10 per cent’ chance of occurring actually occurred around 10 per cent of the time and, further, the decisiveness of predictions increased as an event approached. While this is a useful confirmation of the prediction market approach, research into the internal Google market revealed some biases. There is a tendency to optimism, for example when Google stock happens to increase, and there are distortions due to close social and physical proximity, where people sitting close together share opinions. This suggests that people will be influenced by incidental factors even when they are anonymous.46 A nice operational example of such an anonymous market system is in project management, a discipline which is often subject to the informational distortions of pressure from above to deliver and restricted information flow from below: a Microsoft software development project was deemed by its project managers to be on time and on budget. When ‘floated’ on the internal market, the project shares fell to a value implying a 1 per cent probability of success – the project came in three months late.
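The calibration claim – that contracts priced at around 10 cents pay out around 10 per cent of the time – can be checked mechanically. A minimal sketch, using invented contract data purely for illustration (not the Google or Microsoft figures), groups settled contracts into price bands and compares each band’s implied probability with the observed frequency of the event:

    from collections import defaultdict

    # Hypothetical settled contracts: (price in cents, outcome),
    # where outcome is 1 if the event occurred and 0 if it did not.
    contracts = [(10, 0), (12, 0), (9, 1), (11, 0),
                 (55, 1), (60, 0), (58, 1),
                 (88, 1), (90, 1), (92, 0)]

    bands = defaultdict(list)
    for price, outcome in contracts:
        bands[(price // 10) * 10].append(outcome)   # 10-cent price bands

    # In a well-calibrated market, the band's implied probability should
    # roughly match the observed frequency of occurrence.
    for band in sorted(bands):
        outcomes = bands[band]
        freq = sum(outcomes) / len(outcomes)
        print(f"{band}-{band + 9} cents: {freq:.0%} occurred "
              f"({len(outcomes)} contracts)")

Run over real settlement data, a check of this kind is what lies behind calibration findings such as those reported for the internal Google markets.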

But some kinds of judgment are even more delicate than this. Collective judgments about commercial entities, institutions, people or products may lead to severe difficulties which challenge and erode traditional notions of quality, truth or depth. Websites such as tripadvisor.com for hotels and travel destinations, docinsider.de for the quality of medical practitioners and meinprof.de for university lecturers give Internet users the ability to express an assessment as well as describe personal experiences. The objectivity and accuracy of these judgments do not undergo any quality assurance process other than removing obvious outliers. Many sites require some form of authentication to verify the reliability of the source or their right to make a judgment, but institutional clashes occur as data protection laws and rights to privacy confront these websites. A case in point is spickmich, literally ‘my cheat note’.

The spickmich.de web portal, which allows German school students to evaluate their teachers using the same grading scales as are applied to the students, was sued by a teacher who appeared on the site.47 The most telling argument under German law was that, as data processing (i.e. the collection, calculation and presentation of data) is the core activity of the site, the site operators must prove that users of the data have a valid and justifiable interest in it. The strict application of German personal data protection laws means that every site user must be identified and every query then related to that user, so that their ‘justifiable interest’ can, at least in theory, be assessed. This is an enormous obligation for website operators to enter into, and the loss of anonymity for users would be a probable death knell for such sites. In business organisations, on the other hand, where anonymity would be the exception rather than the rule, being identified would generally be a disincentive to be critical. However, in June 2009 the German High Court found that the freedom of expression of school students is of higher value than the privacy of teachers. While conservative politicians and journalists talk darkly of the bad old days of denunciation, 56 per cent of Germans agreed with the judgment, with 76 per cent of those between 14 and 29 being in favour of it. The editorial of the Frankfurter Allgemeine Zeitung asks how the (objective) marks a teacher gives a student can be compared to the (subjective) opinions of students, and quotes Ortega y Gasset: ‘When the masses act autonomously, they can only do it in one way: they lynch.’48

So technologies such as those in the Web 2.0 suite, which gather knowledge (beginning with small but useful increments) from a distributed and often large set of contributors, and which allow those contributors to assess (often anonymously) the validity of that knowledge and make decisions based upon it, are generally powerful, though not infallible, methods of information building and legitimation. Web 2.0 content creation and collaboration tools such as wikis, blogs and social networking software provide functions for social tagging, feedback and review, and information rating and voting. These are generally built into the standard software products, requiring no further enhancement. The functions can be made available to people within organisations to build information and also to evaluate it. If a sufficient number of diverse people are available and motivated to use these features, then the value of the information will be greatly increased, at relatively little cost.

Globalisation

In his bestseller about globalisation, Thomas Friedman describes ten ‘flatteners’ which have interconnected the world and allow instantaneous communication, awareness and sensing without barriers.49 Most obvious among these is the technology revolution, which has introduced universal, mobile and (almost) ubiquitous broadband access, overlaid with communications, search and inter-application standards which support immediate, low-cost information sharing. Certain events have spurred on the use of these capabilities: geopolitical changes, the end of the Cold War, the rise of China and the Y2K ‘Millennium Bug’ challenge, which moved programming work to India, paving the way initially for massive code remediation and programming and then for further IT services outsourcing such as call centre provision and systems programming.

Outsourcing and offshoring via digital communications have enabled massive fragmentation of supply chains in the pursuit of cost savings. This has also introduced new models for collaboration and development which transcend the usual pay-for-service stereotypes. Through what Friedman calls the triple convergence (of web technologies, analytical and managerial know-how, and the economic-political liberation of China, India and Russia), we are now confronted with a ‘flattened’ playing field, open to all and heralding the arrival of true competition for all aspects of work at a global level.

Already much work, both high and low value, has been outsourced and offshored, and the flow of work from high-wage to low-wage economies will almost certainly continue. For example, the US is expected to send one in four IT jobs offshore by 2010.50 At least 3.3 million white-collar jobs will shift from the US to India, China and Russia by 2015.51 Somewhat less apocalyptically, the US Department of Labor’s Occupational Outlook Handbook, 2008–9 edition, states that employment of computer programmers will decline slowly, by 4 per cent, between 2006 and 2016:

… Because they can transmit their programs digitally, computer programmers can perform their job function from anywhere in the world, allowing companies to employ workers in countries that have lower prevailing wages. Computer programmers are at a much higher risk of having their jobs outsourced abroad than are workers involved in more complex and sophisticated information technology functions, such as software engineering.

Little effort is needed to apply the same criteria to other forms of knowledge work to appreciate the potential scale of offshoring, even for jobs with very high skill levels and years of training. Radiological diagnosis, call centres, programming, engineering and design and even legal analysis require information inputs and produce information outputs, and so lend themselves well to outsourcing and offshoring.

However, there are nuances and even outright contradictions in how countries interpret and respond to the demands of the information age, and the US should not be seen as somehow merely a more ‘advanced’ example of a nation progressing towards a knowledge economy. While the nature of work in G7 countries, for example, is of course transformed through digitisation and networks, the statistics describing structural changes and occupational segmentation vary. While the UK and the US experienced rapid declines in manufacturing between 1970 and 1990 (from 38.7 to 22.5 per cent and from 25.9 to 17.5 per cent respectively), Japan and Germany reduced theirs only moderately (from 26.0 to 23.6 per cent and from 38.6 to 32.2 per cent respectively). Yet these two countries were the most competitive economies at that time, with the lowest service-to-industry ratios and the lowest rates of information employment throughout the century. While China has become the world’s leading exporter, Germany was until recently the largest by volume and, interestingly, is also the world’s major exporter of intra-logistical solutions and technologies. Intra-logistical activity is a major source of costs in manufacturing. This confirms that an organisation or economy can make an efficient shift from manufacturing to manufacturing services (or from ‘doing’ to ‘advising’) and that the two can coexist. Value chain fragmentation within a national economy can lead to a competitive advantage on the global stage.

While agriculture diminishes, manufacturing declines and producer services such as health and education grow, and while retail and service jobs increase the ranks of low-skilled, low-paid activities, there does seem to be a hollowing out – an increase at the top and the bottom of the scales, with the relative prosperity of the upper range increasing. So there are two models: the industrial production model (which may grow in the area of manufacturing services rather than actual manufacturing) and the industrialisation of services.52 As Castells says:

… the new information paradigm of work and labor is not a neat model but a messy quilt woven from the historical interaction between technological change, industrial relations policy and conflictive social action.53

Put simply, globalisation does not mean that technologies and solutions to working problems will be implemented globally in a uniform way. We only need to look at Cole’s comparative international studies of the adoption of small group activities, such as quality circles. He shows that differing national emphases, institutions and infrastructures created quite different rates of diffusion and persistence, with Japan adopting and retaining more successfully than Sweden and Sweden more than the United States.54

But the global economic system is of course a massively complicated beast, with multiple significant factors affecting its size, shape and nature. Credit squeezes, oil prices, the rise of China, India and Russia as significant producers and consumers, tariffs, carbon emissions and conflict, whether through terrorism or between nations, are a few of the imponderables that shape the trajectory of global trading and commerce. For example, the introduction of global carbon trading schemes and emission restrictions will force manufacturers and retailers to become more tightly integrated with their supply chains. For manufacturers, 40–60 per cent of emissions occur upstream in the supply chain (80 per cent for retailers), and reducing emissions will require greater collaboration between suppliers and manufacturers in resource management, processing, packaging and transport.55 Greater collaboration (rather than mere data exchange) implies potential use of Web 2.0 tools. The global economic crisis also has a massive impact, as political pressures mount to restrict trade and increase protection of local industries – and indeed to reduce the rate of adoption of carbon trading schemes.

In a globalised economy, tools which facilitate the establishment of trading relationships and the exchange of information are in demand. There is a natural tension between the requirements for ease of use, ubiquity and speed of establishment (which for example favours e-mail) and control, security and solid information management. Web 2.0 tools are advantageous because they are universal, cheap, flexible and available globally, they are simple to install, learn and use, and are suitable for the iterative and interactive development and exchange of knowledge. Being technologically simple and robust, they are natural candidates for inter-firm communication and collaboration within a globalised world economy.

The information economy

The knowledge economy, the information age, the digital economy: these are expressions which seek to capture the defining characteristics of commercial activity in our time. In simple statistical terms, in industrial countries there has been a clear demographic shift from productive activities involving physical labour to non-physical occupations. There is a marked, but not uniform, decline in agriculture and manufacturing and a corresponding growth in services. Those remaining in agriculture and manufacturing have become more skilled and more productive, particularly in the use of technology and labour-saving devices, to the extent that much of the activity could be characterised as ‘knowledge work’. The fragmentation of value chains through outsourcing has led to the growth of service companies which supply informational inputs powered by knowledge: marketing concepts, product designs, call centres – the opportunities are endless for the production of informational rather than physical goods. Where a company might once have manufactured something itself, it now increasingly organises the inputs and services provided by others to its brand.

In response to this, knowledge management has become one of the signature disciplines in management science since the early 1990s and there has been a substantial focus by technology vendors, consultants and academics upon framing their various products in terms of knowledge and the macro-economic forces which have elevated its status as a key productive asset.

This is a key setting and assumption for Web 2.0 technologies. As a suite of digital tools, they transmit messages from one brain to another. Such tools are a natural fit for accessing and generating knowledge through collaboration, above all in a world that has become increasingly flexible and dynamic. The need to generate information grows undiminished, but the networks in which this occurs are more transient and volatile and the processes which are executed are less prescriptive. You may not know who your collaboration partner will be tomorrow, there may be no methodology agreed in advance, and you may not even know what information you will exchange.

Electronic marketplaces or automated inventory management systems for widgets, gearboxes and paper exchange structured data based upon explicitly defined business processes and the pertinent data. But when it is knowledge being developed and exchanged, and not structured data, the precise formats and flows remain unknown in advance. And the speed of partner contact, work ramp-up, information exchange and information development must be rapid, implying that the supporting tools must be simple to establish, simple to use, conform to open standards and be themselves open and simple to integrate into any number of back-end systems. Further, they must permit users of the system to self-organise and establish their own rules of engagement, administration and control. These are the underlying capabilities of Web 2.0 technologies.

Innovation

Innovation – the creation of new products, new services, new business models and improvements to business processes throughout the value chain – is seen by senior management as a key driver for growth and competitiveness. But at the same time, most business leaders are pessimistic about their own ability to achieve the desired levels of innovation.56 The life of products has fallen over the past 50 years from 20 years to five and now to two.57 Attali expects the cycle from ‘creation to production and commercialization’ to fall for automobiles and household products to six months (from two years currently) and for medicines from seven years to four.58 Competitive advantage lies in the capacity of organisations to facilitate and mobilise their intangible assets, such as employee knowledge, motivation and customer goodwill, and turn these into innovative products and services.

Accelerating innovation by opening the requirements gathering and product design processes to inputs, suggestions, ideas and enhancements by customers or partners using Web 2.0 technologies and attitudes is gaining momentum. Atizo (Swiss) and Jovoto (German) are two websites created expressly to allow firms to solicit creative ideas and move them through a business case and prototyping phase. The Atizo community was responsible for the idea of a transparent, safety-check conformant bag for cabin luggage, which was manufactured by the textile firm Blacksocks.com. Mammut took the idea of using deep-freeze bag zippers for mountain sports clothing from the same site.59 Figure 3.4 shows ideas contributed to a next-generation BMW motorbike and how the tags within the cloud associated with the ideas increase in size according to their usage.

Figure 3.4 The Atizo innovation site showing ideas for next-generation BMW motorbikes Source: https://www.atizon.com/projects/ideaproject/24/ideas/

There are many critical success factors for innovation and its diffusion through groups, but there are no recipes to guarantee success. The fundamental challenges are not of a technological nature, but many factors can be influenced by the use of Web 2.0 tools:

 Leadership – organisational leaders need to send strong, consistent signals to employees about the importance of innovation. They should design and publicise measurable performance targets for innovation. But conventional methods of management communication are often clinical, formulaic and broadcast; employees will often see them as yet another management fad or discount them as meaningless exhortations. Blogs and wikis, however, provide excellent opportunities to formulate and project these messages in ‘real language’, in an interactive form, without their simply becoming part of organisational spam.

 Culture – 94 per cent of managers in one survey said that people and culture are the most important components in stimulating innovation.60 Trust, risk-taking and norms which value innovation all need to be created to stimulate innovative behaviour. Leadership and legitimation of the desired behaviours are clearly critical for the development of new norms; Web 2.0 technologies offer managers the possibility of formulating and repeating these messages and of underpinning them with action by participating in innovation forums, discussions, contributions of ideas and so on. When managers take risks, expose themselves to criticism and spend time on an innovation wiki page, then this is clearly not only desirable behaviour, it is legitimate.

 Networks – the creation of networks is critical to the development of innovation, more important even than individual creativity.61 There are two key dimensions to organisational networks which support innovation. The first is letting people find each other (or bringing them together) and providing spaces for them to interact. This is directly supported by Web 2.0 tools like wikis and social networking software. The second is facilitating the use of alternative network paths in order to prevent non-supportive nodes, such as negative middle managers, from blocking this process. Web 2.0 tools reduce the impact of such blockages and make it easier to gather innovations and change attitudes in the desired direction.

Fragmentation of business processes

A key characteristic of modern business is the movement of work to the location where it is most cost-effective or where some differentiating competency is present. Design might be done in Italy, engineering in Germany, manufacturing in China and support from India for example. The intellectual and managerial capability to decompose work into its constituent parts, move that work to an appropriate supplier or location, and yet retain coherence and control is one of the major accomplishments of industrial science. It is this that is at the heart of outsourcing and offshoring – not technology or networks. Figure 3.5 shows the fragmentation of the value chain. Although an organisation may outsource a process, it retains the need to manage the informational inputs to and outputs from that process as executed by a subcontractor. There may be a requirement for collaboration and cooperation in this process to transfer knowledge and details of designs and even co-create design solutions.

Figure 3.5 The fragmentation of value chains and outsourcing

This decomposition has also highlighted a key aspect of work: that most of it is knowledge work. Fragmentation of work has led to the knowledge economy, not because people don’t buy fridges or stereos, but because many individual companies provide the necessary services in the production chain before a piece of steel is ever cut or welded. And the inputs and the outputs of this knowledge work are information. While it is easy for organisations to think they are mining companies or manufacturing companies when massive trucks are leaving the pit laden with ore or shiny new cars are rolling out of their factories, the degree of outsourcing in many such companies suggests that they are actually management companies – knowledge management companies.

If we invert the previous figure and, instead of looking from the production chain downward, we look from the service provider outward, we see that each service provider deals with many firms. In Figure 3.6, each service provider has a network of customers (all of whom might build roads or fridges or mines). We see that the business environment indeed consists of interconnected enterprises which function as nodes: some nodes provide services and some integrate services into a value chain. The Internet has reduced most components of the transaction cost: search, bargaining and policing.62 In the case of information deliverables, the cost of delivery is also reduced. Further, tools which support collaboration between any of these nodes (which hitherto were separated from each other) will enhance innovation and the rate of idea generation.

Figure 3.6 Enterprise networks arising from work fragmentation

So, tools which manage information exchange easily, cheaply and quickly, according to standards, facilitate the flow of work from place to place. The first generation of web applications simply provided a product catalogue and the ability to buy goods, effectively replacing a shop assistant. Then came electronic marketplaces, which allowed customers to place orders and specifications in electronic format at a site for ‘reverse auction’, and allowed sellers to place commodities for sale in advance of production. ERP products began to provide interfaces which automated the flow of information such as inventory reorder or payments to suppliers: it became a matter of mere configuration of software to automate inter-organisational transactions.

Then these supply chains again began to change: instead of linear supply chains, with bullwhip effects and sudden shortages, large suppliers moved to hub structures, where supply chain vendors could not only see in advance the types of product in the pipeline, they could even participate and co-design products for improved manufacturability and cost-effectiveness. Collaboration, a higher-value interaction than mere supply, has now become a critical component of high-performing supply chains. Collaboration – the exchange of information in conversational mode and at the speed of business to achieve a common objective or an intersection of goals – requires non-transactional yet cheap and easy tools: exactly the characteristics of technologies in the Web 2.0 suite.

The average size of the US corporation has declined, according to one survey, from 60 employees in 1960 to 34 in 1990.63 The fragmentation of corporate value chains and the proliferation of offshoring and outsourcing have led to leaner manufacturing companies, and to mining companies that no longer produce anything or dig any ore out of the ground: they focus on branding and mineral assets. These companies might more accurately be described as knowledge management companies: they manage expertise and the information required to ensure that others do work for them. However, if a company does not make the institutional transition to seeing itself as a knowledge management company, it is unlikely to appreciate that tools like Web 2.0 are in fact support for its core business.

The increase in individual contracting

The already mature trend towards shorter-term working relationships is increasing, although the rate and extent vary from country to country. This allows a firm to reduce its costs quickly and without legal or public relations difficulties when orders are reduced or it experiences hardships. Similarly it can ramp up for spikes in demand without committing to longer-term employment relationships. Every country is wrestling to find the appropriate balance between the rights of labour and capital. The Danish model allows firms to fire workers easily, but this is balanced by the high commitment of the state to find work for its citizens and the obligation of citizens to take that work. This seems to be very successful, although it cannot be assumed that the same model would work everywhere. In neighbouring Germany, work relationships are still contractually laid out for the long term, which reduces flexibility and the desire of employers to take on staff.

For some forms of work, individual contracting is not particularly problematic (retail sales assistants, drafting or labouring for example) but in others it is a source of loss of efficiency at the induction phase and subsequently of knowledge leakage and outright loss at separation. While individual subcontractors may bring particular skills into an organisation, they must first absorb much firm-specific knowledge before they can generate their particular outputs: they then take that knowledge with them when they leave. For example, a business systems analyst or solutions designer will often gain a picture of a business area which is original, systematic and insightful: only a small portion of that knowledge might be captured in an explicit report or model. When they leave, the tacit knowledge is taken with them out the door.

The baby boomers notice and generally regret the passing of permanency and loss of belonging, but younger generations know nothing else – indeed, they are more prone to move on, having recognised that employment is based upon transient mutual self-interest. They understand that shorter-term contracts allow firms greater flexibility in adapting resource commitments to economic circumstance: hiring in boom times, firing when the order books dip. They also understand that capital and not compassion is the determining factor.

There are some significant implications for the knowledge of the firm, however. Greater turnover of staff means an increase in the amount of learning and orientation, and smart approaches to induction and productivity acceleration are required. At the end of the working relationship, in-flight, current knowledge walks out the door at short notice, as do the lessons learned by contractors during their tenure.

Web 2.0 tools such as wikis provide excellent ‘incidental’ records of the information generated during such tenure. If one uses a wiki for project communication, then interactions and decisions are captured centrally around the specific theme, making them easier to find – and rely upon. As part of their contract, specialists can be required to ‘blog’ their findings and experiences. Social networking capabilities enable communication with departed contractors to be maintained and reopened when their knowledge is needed.

Consumerisation

The pervasive availability and general use of the Internet and its tools has led to those tools being part of a general suite of consumer skills and expectations. Consumerisation refers to the adaptation of those expectations to enterprise environments.64 At its simplest level, users ask why their organisation doesn’t supply a Google search or a Facebook for employees. But if at home, in your lounge, you can rate a book in Amazon, post a question about the best wine to drink with duck ragout and have input to a Microsoft design blog, the question is justified: why can’t I do this at work? And that is the tip of the iceberg: why can’t I interact with my customers using a Second Life avatar and why can’t I chat with the engineers while playing Warcraft 3?

The Gartner Group expects consumerisation to be a high-impact trend in driving enterprise product definition and take-up, and says that without recognition by CIOs of the acceptability of consumer-style software in the enterprise, Web 2.0 take-up and exploitation will remain stunted.65

Dynamic business models

Analytical capability, combined with rapid changes in technology, has led to a proliferation of ways of organising and structuring work. Whether it is full outsourcing, creative partnering, offshoring or product co-creation with active consumers, modern managers must consistently push the boundaries of possibility in searching for business models which enhance competitive advantage. Technologies which are inherently inflexible, based upon proprietary standards, which require very constrained and controlled data entry or which only allow a narrow corridor of usage do not permit rapid change in business models. Web 2.0 technologies have high degrees of openness and allow local adaptation and situated responsiveness. They are highly malleable forms of technology with substantial local negotiability and open-endedness.

Changes in managerial style

The fluidity and dynamism of the business world places new demands upon managers: orchestration rather than direction, coordination rather than control. In particular, the rapid construction of dispersed teams, many of whom do not report to a single manager, means authoritarian management will not work. Greater consultation, collaboration and adaptiveness are required, leading to new manager profiles, with an emphasis on softer skills and greater visibility of decision logic.

These are attributes served well by the open, conversational tools of Web 2.0. The underlying social processes of knowledge creation, sharing and collaboration are made transparent by tools like wikis and blogs, creating a platform for leaders to engage in this new kind of dialectic with dispersed, qualified knowledge workers and to communicate, legitimate and objectify vision, objectives, means and ends.66 In such environments, leaders can project these attributes with greater immediacy and interactivity and participate in collaborations which simultaneously project their authority, legitimating certain types of approach, solution and language.

Of course, just because the tools are available does not mean that leaders or middle managers will engage with them or see them as worthwhile. Managers are themselves the objects of key performance indicators and targets. Where Web 2.0 does not directly contribute to these, or does so only in ways whose benefits will be realised after a manager departs, it is unlikely that managers will initiate such programmes. This style of management is risky and difficult, foreign to many and almost certainly perceived as faddish and unnecessary in many organisations. Further, the privileges of management rest, in many cases, on the maintenance of distance and apartness. As Jeffrey Pfeffer says of companies in the USA, cutting health benefits and salaries, spying on employees, denying employees a say in decision-making and increasing work pressure are still common management practices, but they reduce rather than increase productivity, because managers are ‘holding and therefore acting on naïve, simplistic, and inaccurate theories of human behaviour and organizational performance’.67

Regulation and governance

Since the Enron and WorldCom collapses, greater transparency and oversight have been demanded in the accounting affairs of corporations. Directors are liable for omissions and inaccuracies in reporting and face jail in cases of misrepresentation of the financial health of their organisations – ignorance is no excuse. The Sarbanes-Oxley legislation in the United States, which affects any company listed on a US stock exchange, is being replicated in other countries trading with the US. It has made legal discovery a major concern of directors, with e-mail retention now fundamental to records management. The content of e-mail is accessible to prosecution lawyers, yet management cannot actually control what goes into e-mail in the first place: how many hostages to fortune are kept in e-mail archives?

Web 2.0 technologies offer a partial solution, in that blogs and wikis are interactions conducted in a public forum. This naturally leads people to be careful about what they say: sales managers would probably not conduct cartel-type behaviour or price fixing on the enterprise wiki. Such openness restricts the recording of sensitive information and conversely gives managers the opportunity to identify and act upon information that may compromise the organisation.

Conclusion

Many of the points discussed in this chapter are encountered in the business and research literature, magazines and journals as being compelling arguments for Web 2.0 in enterprises. In summary, the logic is that the implementation of Web 2.0 tools in the workplace constitutes a successful response to:

 the flood of information and proliferation of e-mails;

 the need to generate greater returns on knowledge as a key production asset;

 the expectations and inclinations of the next generation of workers;

 the loss of deep production knowledge that is imminent as the baby boomers retire;

 the connectivity opportunities offered by network dynamics;

 the need to improve decision-making by the application of many minds;

 the capability to create large, sophisticated knowledge products by allowing many small contributions;

 the fragmentation of value chain work to multiple parties;

 the distribution of work across vast distances;

 the creation and distribution of new knowledge by allowing rapid yet persistent conversations;

 the proliferation of outsourcing of work to contractors and transient staff;

 the consumerisation of software, which moulds workers’ expectations of corporate software;

 the need for corporations to manage communications for regulatory purposes.

But the devil does indeed lie in the detail. There is a substantial gap between these assertions and the reality of business decision-making. While there is insufficient published research to verify the claims, there is sufficient reason to be sceptical. Above all, as with most social phenomena, enterprises are complex open systems where there are many factors at play, and predictions or ‘case studies’ of success and appropriateness for Web 2.0 need to be regarded carefully. There are counter- and attenuating arguments which at least indicate that the touted need and the anticipated uptake might not be as dramatic or self-evident as expected:

 The baby boomers are leaving, but not as quickly as you might think.

 To achieve a network effect, there need to be sufficient nodes – this is not the case in many organisations.

 The wisdom of crowds needs a (diverse) crowd – in organisations experts are often alone.

 The net generation uses the new technologies naturally and adroitly, but they adapt to the inertia of the organisation, where sharing and interactivity are not to be taken for granted.

 New managerial styles are needed, but control (and often the protection of privileges) is still the priority of many leaders.

 Flexibility is needed, but over-complexity and product extension lead to losses in efficiency: the line must be drawn somewhere.

Many types of information systems – enterprise resource systems, data warehouses and knowledge management systems for example – are logically excellent solutions to common problems but have substantial failure rates, often more than 50 per cent. We need to understand the underlying social, technological and market logic and the factors which cause failure in order to assist good decisions to be made in the acquisition and implementation of technology solutions. Success with Web 2.0 will not simply happen.

Furthermore, if the logic of networks and the wisdom of crowds are to gain traction, we need to understand how to use Web 2.0 tools in a Web 2.0 way: simply using wikis as glorified file servers or intranet content managers is no great leap forward. Creating private wiki islands where a project team can share their project information is nice, but e-rooms and intranet content management systems do this already. And the information is consigned to invisibility when the project ceases. The power of Web 2.0 comes to bear when the project information is stored in a public place and where those outside the direct team can contribute to or learn from the team’s information. And this is the difficult bit to get working.

Logic told us that knee arthroscopies and encainide were the right treatments – but they weren’t. Even heart-bypass surgery, which works, can often be delayed without harm and the coronary problems treated with drugs instead. Web 2.0 implementation can likewise be delayed, adopted according to a firm-specific logic, or other means used to address the underlying problems. We just need to understand these so that good choices and better implementations can be achieved.


1.Carter (2007: 14).

2.Castells (2001: 67).

3.A 2009 AIIM international survey found that 47 per cent of 18–30s and 31 per cent of over-45s expect to use the same type of networking tools with business colleagues as with friends and family.

4.Sarbanes-Oxley is legislation passed in the United States to enforce greater accountability and auditability of company finances in the wake of the Enron and WorldCom corporate collapses.

5.For example, Kroski (2008), Shuen (2008), Solomon and Schrum (2007).

6.An indication of the confusion is an AIIM 2008 survey, in which 44 per cent of respondents said Enterprise 2.0 (i.e. defined in that survey as the use of Web 2.0 tools in the enterprise) was imperative or of significant importance to their organisation, but 74 per cent had only a vague familiarity with it.

7.Bryan and Joyce (2007).

8.Charman (2006) writes that the average US knowledge worker receives 94 e-mails per day, 34 of them occupational spam.

9.In an interview with CIO Insight, famous blogger Robert Scoble said: ‘When I left my old job at NEC, I left behind a gig and a half of e-mail: I couldn’t look at it, and they erased it. So my former co-workers couldn’t use that knowledge. A collaborative toolset helps to get information out of e-mail into the shared social space. You see productivity benefits. People can see where you’re going and make suggestions on who to call there’ (http://www.cioinsight.com/c/a/Foreward/Robert-Scoble-on-Corporate-Blogging/).

10.For example, see Basso (2008).

11.Postman (1985).

12.Court et al. (2007).

13.Prentice and Sarner (2008).

14.Tapscott (1998).

15.Rigby (2008).

16.See: http://www.thethinkingstick.com/customization-generation.

17.Seely Brown (2002).

18.Lecture by Manuel Castells at the International Conference on Information Systems (ICIS), Barcelona, 2003.

19.Court et al. (2007).

20.Bucks Consulting (2007).

21.For example, see ‘Rethinking Retirement’, Business Week, 13 and 20 July 2009.

22.Walker and Bittinger (2009).

23.See Commerzbank (2009) and the comments by the German Federation for Medium-Sized Enterprises (Anonymous, 2009b).

24.The network as a model for mathematical analysis was invented by the Swiss mathematician Euler in his analysis of the problem of the Königsberg Bridges, a puzzle in which one has to cross seven bridges joining four pieces of land only once each in a single round trip. Euler invented graph theory, the precursor of network analysis, to prove that any network (the route around the bridges) which has more than two nodes (the land areas) with an odd number of links (bridges) cannot possibly be negotiated without retracing steps somewhere. A node with an odd number of links must be a starting point or an end point, and there can be only two of these in a network (or route) in which no path is retraced. From this development of graph theory, important insights have been gained about the behaviour of networks consisting of interconnected nodes of pretty much anything – people, animals, atoms, biological cells and computers. For more on network science, see Barabasi (2002) and Watts (2003).

25.Milgram (1967).

26.Watts (2004).

27.See Barabási (2002).

28.See Granovetter (1973).

29.Scott (2004: 11) writes that the average size of the US corporation has declined from 60 employees in 1960 to 34 in 1990.

30.Furthermore, there are multiple-sided markets which amplify network effects even further. A multiple or ‘n-sided’ market is one in which there are many buyers and many sellers: the relationships are many to many. Instead of connecting individual buyers and sellers, these markets connect different sets of partners, where an increase in mass on one side directly improves the profitability or utility to the other side. For example, the Visa Company, as a service, connects a community of providers with a community of purchasers. The utility to purchasers rises greatly the more providers there are using the Visa service, and vice versa.

31.For example, the Collaboration Project is a resource established by the Obama administration in the USA to foster inter-agency and citizen collaboration. It cites many cases of the use of blogs, wikis, ratings and other Web 2.0 tools to achieve its objectives. (See: http://www.collaborationproject.org/display/home/Home: ‘The Collaboration Project is an independent forum of leaders committed to leveraging the interactive web and the benefits of collaborative technology to solve government’s complex problems. Powered by the National Academy of Public Administration, this “wikified” space is designed to share ideas, examples and insights on the adoption of Web 2.0 technologies in the field of public governance.’)

32.http://www.galaxyzoo.org/

33.http://www.patientslikeme.com

34.The Economist, 18–24 April 2009.

35.http://www.appsfordemocracy.org/

36.Leadbeater (2009) calls this ‘We Think’. He gives a nice historical example of how strict patent enforcement by Boulton and Watt restricted innovation in pumping engines in English coal mines in the eighteenth century: an alternative engine, designed by Woolf and Trevithick but without patents, rapidly became three times more efficient as a result of collaboration between Cornish mine owners and engineers. Cornwall had the fastest rate of steam-engine innovation and the lowest rate of patents in Britain at the time – and Watt and Boulton didn’t sell another engine in Cornwall after 1790. Woolf and Trevithick made their money installing and adapting engines (p. 54). Li and Bernoff (2008) and Tapscott and Williams (2006) are two examples of books describing modern commercial use of this phenomenon.

37.Surowiecki (2004).

38.Asch (1952).

39.Sunstein (2006: 45).

40.Janis (1982).

41.Senate Select Committee on Intelligence (2004).

42.Columbia Accident Investigation Board (2005).

43.See: http://en.wikipedia.org/wiki/Illusory_superiority.

44.Habermas (1996).

45.Surowiecki (2004).

46.http://bocowgill.com/GooglePredictionMarketPaper.pdf

47.Hipp (2009).

48.Zastrow (2009).

49.Friedman (2005).

50.Solomon and Schrum (2007).

51.Pink (2006).

52.Attali (2009).

53.Castells (2000).

54.Cole (1989).

55.Brickmann and Ungerman (2008).

56.About 65 per cent of the senior executives surveyed by McKinsey were only ‘somewhat’, ‘a little’ or ‘not at all’ confident about the decisions they make in this area.

57.Gray and Larson (2003).

58.Attali (2009: 119).

59.Stillich (2009: 20).

60.Barsh et al. (2008).

61.Fleming and Marx (2006).

62.Coase (1937).

63.Scott (2004: 11).

64.Smith et al. (2006).

65.Smith (2008a).

66.Bell (2004).

67.Pfeffer (2007: 6).