Wiki as a KM Tool [web version]


A disruptive innovation has rocked the business world. A new IT tool is fast being adopted by many organizations, in many different ways. This new tool is called a wiki and is part of a new suite of tools being dubbed ‘Web 2.0’ or ‘Enterprise 2.0.’ These new tools, and wikis in particular, are being trumpeted as revolutionary instruments for knowledge management.

Most knowledge management practitioners have heard of wikis, but many still aren’t sure what they really are or how they fit into the knowledge management tool belt. This analysis looks at wikis in the context of knowledge management and in the context of these new tools. Specifically, it will determine how wikis are a knowledge management tool. Doing so first requires a definition of knowledge management and of a knowledge management tool. This is followed by a definition and explanation of the new tools and the wiki in particular. Next, it shows how companies are using wikis. This information is combined to show how wikis can be considered a tool for knowledge management. The analysis will continue with an example of a company that could use wikis for collaboration among distributed groups. It will show the company’s organizational structure and processes in depth and how they could benefit from use of a wiki.

The overall goal of this work is to provide a comprehensive overview of this new technology and to contextualize it in the field of knowledge management. To do this, secondary sources are used. Because wikis are so new, there is not yet much research available. Resources cited for data on wikis were mostly news articles (and in some cases blogs) or published technical and statistical characteristics of particular wikis. The background information on knowledge management is found in more traditional formats: books, journals and papers. For the case example, qualitative interviews were conducted with employees of the company in question.


2.1 What is knowledge: philosophy vs. management theory

The seminal definition of knowledge, given in Plato’s dialogues, is justified true belief. This elegant phrase has prompted endless discussion of how to justify knowledge and what constitutes truth, discussion that grew into the whole body of work under the umbrella of epistemology.

The goal of epistemology is to get at the definition of knowledge on a fundamental level, to know how we know even the most basic things, like the fact that I exist or the color blue. Epistemology asks how I know what I know, and how I know it is true, at the most fundamental level.

Knowledge management in an organization is about managing knowledge like a resource in order to increase competitiveness. For this purpose, an exegesis on the foundation of knowledge is not necessary. Organizations take it for granted that they and all the objects around them exist. Instead, knowledge is justified in the marketplace. What is considered knowledge worth saving and managing like an asset is that which contributes to increasing the competitiveness of the organization.

To clarify further exactly what it is that contributes to increasing competitiveness, knowledge is distinguished from data and information. The idea is to show how information is derived from data and knowledge is derived from information. Table 2.1 shows a thorough and yet practical definition of terms (Wiig, 1995, p. 22). Here knowledge is defined on a sliding scale of usefulness against pathways, signals, data, information and wisdom.

Pathways Air; coaxial cables; glass fiber; ether – media that transmit signals
Signals Sound waves; electrical pulses signifying “0s” and “1s;” light reflected from objects to observers; strokes of pen on paper
Data Sequences of numbers and letters; spoken words; pictures; even physical objects when presented without a context
Information Organized data: tables of annual sales statistics; a magazine article; a well presented talk; a picture or an object when presented in a particular context
Knowledge Organized information: understanding what the sales statistics mean and how to interpret them
Wisdom Ability to provide judgment on sales statistics; hypothesize and learn what is happening within the company and which forces may be at work in the market place; propose valid and valuable ways to deal with new situations

Table 2.1 Distinction between data, information and knowledge (Wiig, 1995, p.22)
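The data-to-information step in Table 2.1 can be sketched in a few lines of code. This is only a toy illustration; the sales figures and month labels below are invented, not drawn from Wiig.

```python
# Data: bare numbers with no context attached.
raw_data = [120, 95, 143, 110, 160, 152]

# Information: the same numbers organized into a labeled annual sales table.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sales_table = dict(zip(months, raw_data))

# A first step toward knowledge: interpreting what the organized information means.
trend = "rising" if raw_data[-1] > raw_data[0] else "falling"
summary = f"H1 sales ranged from {min(raw_data)} to {max(raw_data)} and are {trend}."
```

The program can organize data into information, but the judgment of what a rising trend implies for the business remains with the human reader, which is exactly the knowledge/wisdom end of the scale.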

This distinction says nothing about how knowledge is created. Here philosophy has had an unexpected say. The knowledge management movement has tapped a work of philosopher Michael Polanyi. Polanyi introduced the idea of tacit knowledge, an idea now at the core of knowledge management (Polanyi, 1967). Tacit knowledge is knowledge that is hidden; it explains how we can know things without knowing that we know them. An example is knowing how to ride a bike. No amount of explanation or documentation can convey how to ride one; only the experience of riding a bike produces that knowledge.

In a seminal work on knowledge management, this idea of tacit knowledge is made a central part of knowledge creation (Nonaka and Takeuchi, 1995). The four modes of knowledge conversion (Socialization, Externalization, Combination and Internalization) explain how the interplay between tacit and explicit knowledge forms the basis of knowledge creation. This theory stresses the importance of sharing knowledge through shared experience. This basic need to share tacit knowledge is at the heart of the theory and of almost all knowledge management initiatives.

2.2 Academic view of knowledge management: the main ideas

Nearly everything ever written on knowledge management begins with a section defining what knowledge management is. This can only mean that there is not overwhelming agreement on that definition. The next section explains this in terms of the evolution of knowledge management as an academic field of research. But first, here are some of the key elements that appear in descriptions of knowledge management as a field of management research.

The most basic idea driving knowledge management is that knowledge is a strategic asset that must be managed. It should be managed as an asset or resource, just like land, capital and labor. This is a shift away from how to simply obtain knowledge to how to use it productively. Knowledge is to be seen as an activity as well as an object; it is both a product and a process. It is something that must be created and shared.

As mentioned, knowledge management is essentially about tacit knowledge. It is aimed at making tacit knowledge explicit and then sharing that knowledge for reuse across an organization. This can be expressed as a sort of knowledge management lifecycle:

Knowledge generation —> Knowledge codification —> Knowledge transfer

Knowledge management initiatives are often expressions of part of this process. Some are concerned with the first part: knowledge creation, innovation or organizational learning. Others are concerned with capturing tacit knowledge for codification. This can mean recording videos or feeding data into a database. Knowledge transfer is sharing knowledge. This can mean a database of information with access methods. It can mean fostering networks of people for sharing knowledge or creating knowledge maps showing who has what expertise.

The sharing of knowledge and expertise among individuals can be especially challenging in some organizations. Some experts are reluctant to share their knowledge because it gives them power. Getting these individuals to share their knowledge can involve cultural change.

Part of the impetus for knowledge management comes from a realization that work is more and more knowledge based. Management needs ways to deal effectively with knowledge work and knowledge workers. There is an individual element of knowledge management that tries to find ways of effectively managing individual knowledge and increasing the productivity of knowledge workers.

Knowledge management initiatives, though difficult to justify quantitatively, usually involve some kind of performance measures. Any knowledge management effort should be checked against the overall goal of the organization. Many definitions of knowledge management stress the importance of action. Knowledge is useful insofar as it leads to action, or good decision-making. As Nonaka and Takeuchi point out (Nonaka and Takeuchi, 1995), knowledge is tied to strategy, or the organizational intent. As such, knowledge management efforts must always be designed with respect to corporate strategy.

Knowledge management can also be seen as a logical successor to the process reengineering movement (Brown and Duguid, 2000). The process reengineering movement focused on improving linear processes in organizations and enjoyed most of its success in areas like procurement, warehousing and billing. It was not as successful in knowledge-rich areas like management or research and development. Knowledge-rich processes tend to be fuzzy, non-linear and not as well defined. This is where knowledge management aims to improve the organization: the fuzzy, knowledge-rich areas. It does this by looking at how people do their jobs, the information they need and how they get it. That is, it looks at the practice instead of the process.

Knowledge management focuses on the practice – how people do their jobs within the overall process. Usually the kind of information people need is subjective and can’t be provided by a set of guidelines or a procedural guide. They need to know how to make judgments. For this they need the help of other people, through communication, collaboration and storytelling. Knowledge management has been most successful in this area by applying the use of lessons learned, best practices and communities of practice.

Communities of practice are groups of people who meet on a regular basis to share knowledge, information, insight and advice about a certain topic (Wenger, McDermott and Snyder, 2002). They discuss their problems and learn from each other. In an organization, communities of practice are informal structures that normally span divisional boundaries and often spring up spontaneously out of a need for peer advice. In short, a community of practice covers a certain domain of topics, and it builds a community of individuals involved in that topic who meet regularly to discuss specific experiences. An example would be a group of software programmers who use an obscure technology. They might meet regularly to exchange ideas, discuss what they’ve done and talk through problems they have.

2.3 Multidisciplinary roots of knowledge management

The concepts underlying knowledge management can be traced back to the 13th century with the use of craft guilds and the apprentice-journeyman-master systems of training and knowledge transfer.

Ideas related to knowledge management appeared in academic literature as early as the late 1960’s. In the 1970’s and 1980’s, articles by Peter Drucker and Paul Strassmann stressed the importance of information and explicit knowledge, while those by Peter Senge discussed the learning organization (Barclay and Murray, 1997). Other related articles by Chris Argyris, Christopher Bartlett and Dorothy Leonard-Barton appeared, as well as work on knowledge diffusion by Everett Rogers and Thomas Allen.

Knowledge management systems even appeared as early as the 1980’s, predating the World Wide Web. In the 1980’s, a great deal of knowledge-based work was being done in the field of artificial intelligence, though little of it lived up to expectations.

In the 1990’s, knowledge management as we have come to know it began to be written about and discussed in earnest. Major contributors include Davenport, Nonaka, Sveiby and Wiig. In 1993 the first conference devoted entirely to knowledge management took place in Boston (Prusak, 2001). The use of the term knowledge management in academic papers began to increase most notably in 1996 and again in 2001 (Wilson, 2002).

Some have accused knowledge management of being the rejuvenation of the waning process re-engineering movement. It is also said to be the latest in a long line of management consulting fads (downsizing, TQM, the matrix organization, to name a few). While there is some interesting support for this, it is perhaps more useful to view knowledge management as part of an evolution of the disciplines it grew out of. The fields of organizational learning, organizational memory and artificial intelligence have been the most influential (Maier, Hadrich and Peinl, 2005). Table 2.2 shows the specific fields of study out of which knowledge management grew.

research field characterization
Organizational change Supports changes within and changes of organizations with development, selection and learning models.
Organizational development Is a methodical strategy for intervention, initiated through consulting and planned by management with the assistance of a change agent, concerning personal, interpersonal, structural and technological aspects.
Organizational learning Claims that observable phenomena of change in organizations are connected with unobservable inter-personal processes of learning on a micro-social (group) as well as macro-social level (organization).
Organizational memory Is capable of storing things perceived, experienced or self-constructed beyond the duration of actual occurrence, and then retrieving them at a later point in time in analogy to an individual’s memory.
Organizational intelligence Provides a slightly different focus on organizational information processing than OL with an emphasis on collective processing of information and decision making.
Organizational culture Is largely an implicit phenomenon only indirectly observable through concepts such as trust, norms, standards, unwritten rules, symbols, artifacts which are the results of learning processes, provide orientation and are shared by the organization’s members in a process of socialization.
Theories of the evolution of organizations Apply evolution theories originally developed in philosophy, biology and social sciences to organizations, e.g., population-ecology approach, self-organizing systems, organized chaos and evolutionary management.
Human resources management In an institutional sense denotes an organizational subsystem that prepares, makes and implements personnel decisions to secure availability and effectiveness of personnel.
Information processing approach Explains individual behavior, e.g., problem solving, decision making, with concepts from cognitive psychology such as attitude, personality, definition of the situation as well as short and long term memory.
Systems theory Aims at the formulation of general laws and rules about states and behaviors of systems and provides the basis for many investigations, theories and concepts developed within organization science and MIS.
Artificial intelligence Tries to establish the analogy between human and computer problem solving and applies a common set of methods, e.g., mathematical logic, pattern recognition or search heuristics, to a variety of problem domains.
Strategic management Determines long-term goals and positioning of organizations and encompasses the entire process of formulation, implementation and evaluation of strategies to link strategic and operational decision-making.
Other management approaches Focus on certain aspects of management, such as innovation management, or provide an alternative view on management, such as systemic or system-oriented management and evolutionary management.

Table 2.2 Academic fields out of which knowledge management grew (Maier, Hadrich and Peinl, 2005, p. 31)

This view also distinguishes between human-oriented and technology-oriented knowledge management approaches. This corresponds to a distinction in the two different backgrounds of knowledge management, one being the organizational science/organizational learning background and the other being the IS/MIS/AI background. Human-oriented knowledge management focuses on organizational and individual knowledge while technology-oriented knowledge management focuses on IT tools and technology platforms.

Given the multidisciplinary roots of knowledge management, it is easy to see why academics might disagree over what knowledge management is about. A look at syllabi from courses taught on the subject shows an array of covered research domains: data mining, data management, process management, strategic management, organizational learning, adult learning, decision support systems. In short, knowledge management is a broad category of research areas that run the gamut from technology-oriented to human-oriented approaches to managing data, information and knowledge within an enterprise.

2.4 Why manage knowledge?

On the one hand, we have a slippery definition of knowledge and a field of study vaguely defined, with multidisciplinary roots, accused of being a fad. On the other hand, it’s hard to deny that knowledge is a core competency that needs to be managed.

As mentioned, the most basic idea driving knowledge management is that knowledge is a strategic asset that must be managed, something that must be created and shared. What contributed to this shift in thinking about knowledge was the shift away from an industrial society. As Peter Drucker points out (Drucker, 2005), knowledge is a key production factor in the post-industrial society. Knowledge is a core competency. Companies don’t sell specific products anymore but rather knowledge of how to do things. In the industrial economy, a company like Coca-Cola would become famous for selling a specific product; its core competency would be the production of that product. Nowadays, it’s hard to imagine a company staying on top of its game for long by continuing to produce a single product. Companies now try to make their own products obsolete before the competition does. They then become a sort of “new product engine” (Davenport and Prusak, 1998, p. 13). At the core of this ability to be a new product engine, the core of innovation, is knowledge creation. So a company’s new core competency is knowledge. “In a global economy, knowledge may be a company’s greatest competitive advantage” (Davenport and Prusak, 1998, p. 17).

2.5 Practitioner view: it’s all knowledge management

While it is clear that knowledge management is strategically important and therefore has its place in the ranks of management research, it is still a young field. The practitioner view of the field reflects just that. A study by Alavi and Leidner (2002) shows practitioners agree about the strategic relevance of knowledge management but find it difficult to establish clear, measurable goals and have no defined set of proven strategies. There are no standard solutions. What works for one company won’t work for another, and there are many different instruments in use (see Table 2.3).

instrument measures
Case debriefings Several information systems including yellow pages and a case database; new roles like knowledge stewards, coordinators and advocates and organizational rules
Best practice sharing A new organizational structure with several centers of excellence, an information system containing best practices and the adoption of benchmarking and models [can be implemented in the format: lessons learned – leads to – best practices – leads to – process change]
Externalization of knowledge Career plans, incentive systems, 360° evaluation, an electronic document management system and yellow pages, the introduction of so-called Intellectual Capital Teams that review new documents
Documentation/evaluation of customer feedback Establishing a new team and regular meetings, creating templates and organizational rules
Lessons learned Method for systematic harvesting of lessons learned in projects at defined project steps; consists of organizational rules, document templates and an IT system
Community of experts, interest, practice, purpose Foster networking between experts (community of experts), employees working on (community of practice) or interested in a topic (community of interest) or working towards a common goal (community of purpose)
Knowledge maps Consistent access to customer, product and process knowledge with the help of organizational rules and visualization tools
Corporate and team culture management Corporate culture: off-shore meetings, expert meetings and debriefings; team culture: new team structures, informal interviews and an education program
Documenting tacit knowledge, identifying and integrating external knowledge A new organizational unit, document management system, access to an online encyclopedia, lessons learned enforced through a workflow management system and “in-a-nutshell” videos
Competence management Recording skills and experience of staff… yellow pages with a taxonomy of skill set, links to past projects, pages authored, community membership and pictures
Personal experience management Recording personal experience… through notes, journal, blog
Community and knowledge networks Communication and collaboration among groups, network building… through forums, chat, yellow pages, shared information spaces
Knowledge process re-engineering Systematic analysis and redesign of business processes based on knowledge flow

Table 2.3 Knowledge management tools (Maier, Hadrich and Peinl, 2005, p. 41)

Initiatives aimed directly at turning tacit knowledge into explicit knowledge have not met with great success. But efforts that improve access to existing knowledge, communication and the location of experts have worked (Maier, Hadrich and Peinl, 2005, p. 355). Most companies have at least an intranet, email and some groupware that provide the foundation for knowledge management.

Similar to the division of knowledge management into human-oriented and technology-oriented views, the study by Alavi and Leidner (2002) identifies three dominant perspectives from the practitioner point of view:

Information-based: actionable information; categorization of data; corporate yellow pages; filtered information; free text and concepts; people information archive; readily accessible information
Technology-based: data mining; data warehousing; executive information systems; expert systems; intelligent agents; intranet; multimedia; search engines; smart systems
Culture-based: collective learning; continuous learning; intellectual property cultivation; learning organization

Table 2.4 Practitioner view of knowledge management orientations (Alavi and Leidner, 2002, p.22)

The information-based view is more concerned with access to information, information overload and the ability to inventory and locate information through knowledge mapping or corporate yellow pages efforts. The technology-based view is more interested in information systems, IT systems and infrastructure and the integration of various systems. The culture-based view works on learning, communication and the cultivation of knowledge. Practitioners in this category suggest that technology makes up 20% of the knowledge management concept while cultural aspects make up 80%, a split referred to as the 80/20 rule.

Interestingly, the distinctions between knowledge, information and data labored over by academics have little impact on practitioners. The study showed that knowledge practitioners use the three terms interchangeably. Practitioners did implicitly make a distinction when emphasizing the overabundance of information; for example, one used the phrase ‘one person’s knowledge is another’s data.’ But overall it was found that there are “no inherent ‘objective’ attributes that distinguish between the two constructs” (Alavi and Leidner, 2002, p. 20).


Sections 2.2 and 2.5 cover some of the main techniques and initiatives used by practitioners to achieve knowledge management goals, such as best practices, communities of practice and knowledge mapping. Though there are some similarities between particular IT tools and the main knowledge management techniques, there is no set of standard IT tools that directly supports the major initiatives like best practices, sharing tacit knowledge, etc. There are no major commercial off-the-shelf (COTS) products that are considered an industry standard or even a mildly successful “Knowledge Management System.” No attempt at building an overall KMS has yet succeeded. Instead, there appears to be a wide array of IT tools for knowledge management covering some aspects of the main techniques. For example, email, chat, instant messaging and forums are all IT tools that don’t match up specifically with a particular knowledge management initiative, but they all contribute to knowledge sharing in general. There are many IT tools that are incongruent with the main knowledge management techniques but which nonetheless aim toward the overall goal of managing knowledge like an asset in order to improve competitiveness. Moreover, it seems that innovation in the field of knowledge management can come from both angles: from advances in IT, like information management and communication technologies, as well as from new organizational initiatives (that is, from either the technology-oriented or human-oriented domains).

Part of the confusion over what is and isn’t a knowledge management tool comes from the overuse of the term knowledge to describe IT systems, where information or data may be more appropriate. It may even be argued that the term is used to sell information systems that have little to do with knowledge management proper. As shown in the study by Alavi and Leidner (2002, p. 21), practitioners associated all sorts of data management and other IT tools with knowledge management. They go on to say that there is no clear definition of a knowledge management tool:

“A clear view of a new type of technology specifically dedicated to KM did not emerge. Indeed, this is consistent with the fact that KM systems can be accomplished with different technologies, the most effective of which are likely to depend upon an organization’s size and existing technical infrastructure.”

Instead of having one particular piece of software to point to and say, “this is a knowledge management tool,” we have a wide array of IT tools that are combined, according to an organization’s needs, to reach the overall goal of managing knowledge to improve competitiveness.

We can identify two main categories of IT systems that support all business activity. One is ERP systems, which manage the process: they organize well-structured data and service work and are best suited for well-structured processes. Examples are systems to manage orders and track shipments, accounting software and inventory systems. On the other hand, businesses also need systems to deal with weakly structured data and processes (or, practice). Meetings, development projects, customer service, issue resolution, training, new idea generation: these are all examples of fuzzy or less structured processes to which more flexible software solutions apply. The systems that support this kind of weakly structured work more closely resemble what we could call knowledge management tools. Table 3.1 shows some categories of IT tools commonly associated with knowledge management.

technology description/examples
AI technology Expert systems, learning systems
Communication and collaboration systems Email, teleconference, videoconference, chat, IM, forum, listserv, groupware, group calendar, blog, shared information spaces, workflow management system, group decision support system
Document management system Management of electronic documents, a system to search, edit, distribute, retrieve, archive and otherwise manage the complete life-cycle of documents.
Content management system Management of electronic content, includes multimedia files
Intranet A network contained within the enterprise. It is used to share information and computing resources among employees as well as to facilitate group working.
Search engine Tool that searches the contents of a website or intranet
Learning systems Distance learning, e-learning and computer-based training
Knowledge mapping tools Any resource that locates people by their knowledge: corporate yellow pages, human resources skill set inventory system

Table 3.1 Types of knowledge management IT tools

Knowledge management can also be described as the most recent phase of an evolution from a managerial focus on data management, then information management and finally, knowledge management:

database administration (1970s): use of DBMS
data administration (1980s): data modeling, relational DBMS
data management (1980s): large data modeling and DBS
information management (1990s): data warehousing, data mining, document management, ERP, OODBMS
knowledge management (2000s): organizational memory: communication technologies, web content management, semi-structured data (XML)

Figure 3.1 Evolution of IT tools (Maier, Hadrich and Peinl, 2005, p. 35)

According to this evolution, database administration marks the beginning of data management in the 1970’s. The 1990’s mark the move from data management to information management. Information was seen as a production factor (like land and capital) that had to be managed. The last step, starting in the late 1990’s, introduces knowledge management. The difference here is the focus on knowledge systems integration and communication systems to support the creation and sharing of knowledge. It’s in this last phase that IT systems to support communication, collaboration and knowledge sharing really start to blossom.

Email and corporate intranets are the top two communication technologies used by knowledge workers. Davenport reports that 100% of knowledge workers use e-mail each week while close to 40% use corporate intranets (Davenport, 2005).


A new group of web-based information management tools has emerged, based on freeform social software that enhances individual knowledge work, group communication and collaboration (McAfee, 2006). Information management tools for knowledge work and communication are not new. While the so-called “Knowledge Management Systems” of the past decade were largely a failure, this does not mean that some IT tools have not been widely used as knowledge management tools. As noted in section 2, communication and collaboration tools have been used to aid knowledge management efforts. Companies have used email and instant messenger programs to aid one-to-one (and some limited group) communication. They have also used corporate intranets and content management systems to provide information required by knowledge workers – from a central source or funneled through a central source – in a one-to-many format. Whereas email supports one-to-one communication that is not shared with a larger group, the new tools open up one-to-one and group communication to be viewed by many users if not the public. And whereas corporate intranets funnel information through a central source or from the top down, the new technologies allow for decentralized authoring – direct editing by distributed users. So the new tools add to this suite a new form of support for communication, that of many-to-many collaboration.

The new tools differ fundamentally from the old suite of tools in that they are based on user participation. The new tools are mostly characterized by participatory services, where users create content. They usually allow users to manage and modify their own data within a given system – information that is usually made public for others’ benefit. Thus the services get better the more people use them. Organization and knowledge are drawn out of user actions like tagging or visiting sites (the basis for Google’s PageRank, for example). But by far the most dominant characteristic is that of participation. Participation is built into the actual architecture of the tool or service offering.
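The way structure can be drawn out of collective user behavior is well illustrated by a simplified version of the PageRank idea mentioned above: a page becomes important because other pages choose to link to it. The sketch below is a toy power-iteration version, not Google’s actual algorithm, and the link graph is invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small base rank, plus shares received from linkers
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# invented link graph: "home" is linked to by both other pages
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(links)
```

No one declares which page matters most; the ranking emerges from the aggregate linking choices, which is exactly the participatory principle described above.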

The new tools also differ from the old in that the web, not the desktop, is the dominant platform. Applications are built to run on the web, not on users’ PCs. They are meant to run from anywhere, requiring of users only an internet connection and a web browser.

Lastly, the new tools are predominantly freeform. They are designed with only minimal structure, to support the less structured processes – those without much procedure to them. This minimal structure in the software itself allows organizations to configure the tool to fit their needs exactly, or better yet to let the structure of the process emerge out of use of the tool (McAfee, 2006). As mentioned, knowledge management systems of the past have struggled to support just such semi-structured processes.

4.1 Web 2.0

These new tools are often described as belonging to a suite of new technologies and business models under the debated moniker "Web 2.0." At one point, the term "Web 2.0" referred to the development of the semantic web – a project spearheaded by Tim Berners-Lee, inventor of the World Wide Web. His vision for the web aims to make web documents' meaning understandable by computers.

But Web 2.0 has come to mean the transformation of the internet from a web of interconnected websites to a platform for web applications. This shift certainly has a lot to do with the semantic web. The semantic web calls for the separation of web document data into different elements (visual, structural, semantic, etc.), which has led to the development of technologies like CSS and XML. These technologies in turn pushed along the development of richer web pages and applications. Ajax, one of Web 2.0's star programming tools, is a combination of JavaScript, CSS and XML. But the semantic web is more about increasing the functionality of the web by increasing the meaning of web pages to computers. Web 2.0 is more broadly about increasing the functionality of the web as a platform for web applications.

Depending on which definition of Web 2.0 you subscribe to, the list of associated tools, technologies, business models and services will differ. Here is a list of some of the things that, from at least one perspective, can be considered part of the Web 2.0 basket.

4.1.1 Web 2.0 as a business model

Web 2.0 has been described as a business model based in large part on user participation (O'Reilly, 2005). The core competencies of this business model include: selling a service as opposed to software, control over unique data sources that get richer as more users use them, user participation in development, harnessing collective intelligence, leveraging the participation of many obscure users rather than a few known ones, and integration of services across devices. Examples of organizations with these core competencies include eBay, Napster, Craigslist, Amazon, Apple iTunes and Google. Amazon, with its abundance of user book reviews, is a good example of control over a unique data source – undoubtedly a core competence of Amazon. Apple's iTunes and iPod system shows both the success of selling a service as opposed to software and the integration of services across devices: the combination sells a service – portable media delivery – utilizing a variety of players, a user's PC or other storage device, and a massive online music database. Finally, eBay is an example of a company that has leveraged the participation of many obscure users. The store would be much less rich had it catered to a few strong sellers; instead it focused on countless little sellers. It also harnesses collective intelligence, as its value is the direct result of the activity of all its users. Likewise, Amazon focused on developing a vast inventory of obscure books, profiting more from the many individual sales of obscure titles than from the sales of a few best sellers.


Figure 4.1 Anatomy of the Long Tail (Anderson, 2004)

This phenomenon is known as the Long Tail. The most illustrative example is given in a Wired magazine article by Chris Anderson (2004). A traditional commercial retailer like Wal-Mart sells the top 39,000 best-selling songs. Rhapsody, an online music download store, sells over 735,000 songs. Rhapsody observed that, as it increased its inventory of songs from 100,000 to 200,000 and beyond 400,000, each new song added, no matter how obscure, sold at least once each month. No matter how obscure the tune, somebody somewhere wants it. With a big enough inventory, the combined sales of obscure songs rival the sales of popular songs.
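The Long Tail arithmetic can be sketched with a toy power-law demand curve. All numbers here are invented for illustration; they are not Rhapsody's actual sales figures:

```python
# Illustrative sketch of the Long Tail: with power-law (Zipf-like)
# demand, even very obscure items keep selling, and the tail as a
# whole contributes a substantial share of total sales.

def zipf_sales(rank, scale=1_000_000, exponent=1.0):
    """Hypothetical monthly sales of the song at a given popularity rank."""
    return scale / rank ** exponent

head = sum(zipf_sales(r) for r in range(1, 39_001))        # top 39,000 songs
tail = sum(zipf_sales(r) for r in range(39_001, 735_001))  # the long tail

print(f"head sales: {head:,.0f}")
print(f"tail sales: {tail:,.0f}")
print(f"tail share of total: {tail / (head + tail):.0%}")
```

With these made-up parameters, even the song at rank 735,000 sells at least once a month, and the tail contributes a substantial share of total sales.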

4.1.2 Web 2.0 as a service

Web 2.0 also refers to a group of organizations that have created or popularized a new people-driven, participation-driven service. These include Wikipedia, Flickr, Myspace, Cloudmark, Sourceforge and Youtube. Indeed these organizations exhibit nearly all the core competencies of Web 2.0 as a business model. But in addition, they can also be thought of as a new service model. Wikipedia, for example, is a new model for organizing information in a way that draws the information out of the obscure knowledge, and ultimate consensus, of many individuals. This service is not limited to organizing encyclopedic knowledge of the world but can be used for any subset of knowledge; Intuit, for instance, has created a wiki to organize information about tax law.

4.1.3 Web 2.0 as a technology

Web 2.0 can mean the technologies that form the basis for these new tools, some of which include: CSS, Ajax, Rails, JavaScript, RSS/Atom, SQL, XHTML and XML. In the early days of the internet, a web designer needed only to know HTML. Web developers soon realized that through CGI, they could add a database to the back end. Since then, the technologies involved in building a web page have grown immensely. In particular, the aforementioned set of technologies, as part of the Web 2.0 phenomenon, has notably expanded the capabilities of web pages. These technologies have helped separate the various data elements involved in presenting web content to users (CSS, XML, SQL). They have also made web pages more dynamic and usable (Ajax, XHTML, RSS). In short, these technologies take web pages from static documents filled with meaningless text to more structured, integrated, meaningful documents with dynamic user interaction.

4.1.4 Web 2.0 as a tool

Web 2.0 is also (perhaps most) often described as a group of people-driven tools that allow collaboration. These include blogs, wikis, syndication, tags, and mash-ups.

Blog

Blogs are most commonly used as an online version of a personal journal. Essentially, a blog is simply a webpage that contains periodic, chronologically ordered posts, additionally grouped by categories. Users visiting the blog can often add comments to posts. Administering a blog (updating it with new posts, creating links to other web pages, adding pictures, categorizing posts, etc.) is extremely simple. Setting up a blog can be more complicated, but keeping it updated and posting end-user comments is very easy. User settings are typically highly configurable. Because of the emphasis on reverse-chronological posting, blogs are often characterized as promoting form over content.

Blogs are most commonly used as an online daily journal or personal knowledge management tool. For example, teenagers may post photos, poetry, game scores and other content to share with their friends. But businesses too have found them to be powerful internal and external communication tools. Some companies have harnessed the power of a star blogger within the company or used a CEO blog for corporate communications. These uses are close to the most common use of blogs: broadcasting information from one writer, or creating discussion based on that writer's posts. But companies have used blogs in other creative ways. Some use them as an easy web publisher, publishing new content in the form of new posts. Others use them as a company notice board that can be fed through an RSS feeder, doing away with the need for constant broadcast emails. Still others have used them as a log file, to record chronological data like system updates.

Wiki

A wiki is fundamentally a web of interlinked pages, where each page typically contains a concept (a name) and a description of that concept (an article). Figure 4.2 shows what a typical wiki page looks like. The wiki engine used here is MediaWiki, the open source wiki engine developed for Wikipedia. The body of the page (the article) contains links to other pages or concepts within the wiki (in Figure 4.2, 'Article 2' and 'Article 3'). Users are allowed to edit any part of the article: modify the description, add new names, add external links and add links to names (and their corresponding articles) that don't exist yet, so that another user can fill in the description of the new concept. In Figure 4.2, 'Article 3' is an example of such a link to a non-existent page; clicking on the link takes you to a page to create content for that topic. Users can modify existing pages or add new pages, filling them with whatever content they want, including pictures, links and attached files.


Figure 4.2 Sample wiki page

Being essentially just a web of interlinked pages created by users, wikis are freeform, informal and emphasize content over form. They are also often minimalist in design. They are extremely easy to navigate and add content to. User security is usually set low, giving users as much power as possible to change the content. Page changes, old versions of pages and recent changes to pages are all well documented and manageable by users and/or administrators.
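The name/article structure described above can be sketched as a minimal data model. The page names and content are invented; the `[[double-bracket]]` link syntax is MediaWiki's:

```python
import re

# Minimal sketch of a wiki's core data structure: a mapping of page
# names to article text, with [[double-bracket]] links between pages.

pages = {
    "Main Page": "Welcome. See [[Article 2]] and [[Article 3]].",
    "Article 2": "More detail, linking back to [[Main Page]].",
}

def links_in(text):
    """Extract the names of all pages an article links to."""
    return re.findall(r"\[\[([^\]]+)\]\]", text)

# 'Wanted' pages: linked to but not yet written -- the incremental
# principle lets any user fill them in later.
wanted = {name for text in pages.values()
          for name in links_in(text) if name not in pages}

print(wanted)  # {'Article 3'}
```

Here 'Article 3' is exactly the kind of red link shown in Figure 4.2: a name that exists only as a reference until someone writes its article.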

Wikis, like blogs and forums, have become a tool for online collaboration and community building. They differ from blogs in several ways. The fundamental difference is that wikis do not contain chronological posts and are otherwise not a tool for recording chronological data. Though some wikis have added functionality to show recently added content in a certain place, this is intended to fold some blog functionality into wikis. Wikis are best suited to building a knowledge base from the input of many users. Blogs are better suited to communicating recent or chronological data from one source to many users. Forums are best suited to letting users discuss and answer each other's particular questions about topics.

Wikis have been used by businesses in a variety of ways. Some have used them for project management: because wiki page content is so easy to add and modify, project status information can be updated quickly and easily, and links to project information can be given to all parties involved, who can check progress as needed. Wikis have also been used for a wide variety of collaboration work. Any time a document needs to be seen and modified by more than one user, it can be posted on a wiki and all revisions done there. Documents, lists, meeting agendas, reports and RFPs can all be uploaded to a wiki page and revised from there, saving time by reducing the amount of email flying around related to document revisions. Wikis have also been used to create all kinds of knowledge bases; examples include product documentation, corporate yellow pages, a case base, even an entire intranet. These kinds of projects work because they cut down on the bottleneck of updates. Any time an administrator must be involved to update a web page, a bottleneck is created that slows updates. Wikis reduce that bottleneck by giving users the power to do updates themselves, increasing the quality of information online.

In response to these successful uses of what started out as rather simple open source software, a few companies have sprouted up with proprietary versions of wikis that enhance the functionality of a basic wiki, specifically for business use (open source versions do much of this as well). These enhancements include the ability to email a page's content to a group of users, to email changes made to a page, and to upload a Word or Excel file to a wiki page and edit it in a browser. There are add-on applications to support common business processes like project management and corporate yellow pages, and user security is highly configurable.

Syndication

RSS/Atom feeds and podcasts are common examples of syndication technologies. They work by polling an information source, either from a website or from a piece of software called a feeder, to retrieve updated articles or information. For example, the show Forum airs on National Public Radio daily and is then recorded as a podcast posted to the NPR website. A feeder can be added to an individually configured MyYahoo page, which will pull the Forum podcast into the feeder and onto the user's MyYahoo page daily, or however often the podcast is updated. Another example is an RSS feed of a blog. The blog of Brad DeLong of UC Berkeley can be fed through an individual's MyYahoo page so that links to new articles show up there as soon as they are added to the blog. MyYahoo is an example of an aggregator, a program that aggregates different news sources as well as other sources such as blogs and podcasts. But syndication can work with a single feeder as well, which can filter content from either RSS feeds or podcasts. So in the blog example, links to new articles can be fed through an RSS feeder program that sits on a user's desktop, with old or read articles marked as such.
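What a feeder does when it polls a source can be sketched with Python's standard XML parser. The feed content below is a made-up stand-in for a real RSS 2.0 document:

```python
import xml.etree.ElementTree as ET

# Sketch of a feeder polling an RSS 2.0 source: parse the XML and
# pull out each item's title and link. Feed content is invented.

feed_xml = """\
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Post one</title><link>http://example.com/1</link></item>
    <item><title>Post two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

channel = ET.fromstring(feed_xml).find("channel")
items = [(i.findtext("title"), i.findtext("link"))
         for i in channel.findall("item")]

for title, link in items:
    print(title, "->", link)
```

An aggregator like MyYahoo does essentially this for many feeds at once, re-polling each on a schedule and marking items a user has already read.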

The corporate application of this technology is obvious. The format is perfectly suited to replace mass communications coming from corporate headquarters, or any subscription-based communications for that matter. Users can subscribe to certain RSS feeds – or be automatically subscribed based on their job function – and content is sent directly to their feeder or aggregator page. This cuts down on email clutter and lets users maintain their own subscription lists.

Tags

Tagging is essentially a form of social bookmarking. It allows users to tag, or categorize, web pages with words they create. At a basic level, a tagging service allows for importing and exporting categorized web pages and otherwise easy sharing of bookmarks. However, such services go much further than that. When tagging a particular page, you can see the words others have used to categorize it, thereby synchronizing your own categories with others' to create an overall order. You can also see how many others have tagged the same page, any notes they wrote about it and what other pages they have tagged, and you can add them to a network of contacts and contact them. Tagging is also present in services like Flickr, YouTube and Yahoo's MyWeb.
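How an overall order emerges from individual tags can be sketched with a few invented users and bookmarks:

```python
from collections import Counter

# Sketch of emergent structure from tagging: each user attaches
# free-form words to a page, and the aggregate tag counts become a
# de-facto category scheme. Users, pages and tags are invented.

taggings = [
    ("alice", "example.com/wiki-article", "wiki"),
    ("bob",   "example.com/wiki-article", "wiki"),
    ("bob",   "example.com/wiki-article", "collaboration"),
    ("carol", "example.com/wiki-article", "km"),
]

page = "example.com/wiki-article"
page_tags = Counter(t for _, p, t in taggings if p == page)
print(page_tags.most_common(1))  # [('wiki', 2)]

# Everyone who tagged the same page -- the seed of a contact network.
co_taggers = {user for user, p, _ in taggings if p == page}
print(sorted(co_taggers))  # ['alice', 'bob', 'carol']
```

The most common tag for a page is the community's consensus category for it, arrived at without any top-down taxonomy.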

Tagging is a way to let structure emerge. Andrew McAfee (McAfee and Sjoman, 2006) argues for the importance of letting structure emerge out of systems rather than planning and implementing it from the top down. In this way, users decide what categories should exist based on their own needs.

Mash-ups

Mash-ups are yet another addition to the Web 2.0 toolbox. Mash-ups take web content from different sources and combine it in a single web service, creating value by collating information from different places. A simple example of a mash-up is the mapping function in Craigslist's housing search, which matches the address of a location for rent with a Google or Yahoo! map of that location. Another example is Podbop, which allows users to search by city to find music downloads of bands coming to that area soon.

4.2 The Wiki in Detail: Culture, Strengths, Weaknesses, Features

4.2.1 Culture

History

The first wiki was created by Ward Cunningham in 1995. It was called the WikiWikiWeb, based on the Hawaiian word for "quick." The idea was to build a website that was quick and easy for anybody to edit.

The Wiki Way

Two key and lasting characteristics of wikis are openness and simplicity. Wikis are typically open to editing by anyone, often without even requiring the user to log in. And their design is unusually simple. Ward Cunningham is famously quoted as asking: "What's the simplest thing that could possibly work?" A wiki is essentially a database of the simplest form, each entry containing a name and an article explaining that name. The connection between names is made by reference to them from articles other than their own. Wikis grow by virtue of the incremental principle. Within an article, users can add links to other names that either already exist or have yet to be created. In this way, one user can create the article describing a name using a concept that they themselves don't have to describe; another user can create the article for that concept, thereby incrementally expanding the database. With a system open to participation by anyone, truth and accuracy are the result of collective influence: if anyone is able to edit a document, the only information that remains after constant revision is the information that everyone can agree to.

Wikipedia

The most well-known example of a successful wiki is Wikipedia, a free online encyclopedia composed of articles written by the general public. It was started in 2001 by Jimmy "Jimbo" Wales and Larry Sanger. By August 2004 it contained 330,000 articles; it has since grown to over 1,279,000 articles in English alone, and Wikipedias have been set up in over 100 languages. The site contains encyclopedic information on anything and everything, almost entirely built by users. Interestingly, the project started as a way to collect information from the public for later validation by an editorial board. Now, of course, the only editorial board is the collective influence of the general public. The site is officially managed by a small, mostly volunteer group of individuals operating as the Wikimedia Foundation. Figure 4.3 and Figure 4.4 show the main page of Wikipedia and a sample article from the site, respectively.


Figure 4.3 Wikipedia home page


Figure 4.4 Sample Wikipedia article page

It is easy to imagine how the integrity of Wikipedia could be quickly compromised by the actions of just a few wrongdoers. Surprisingly, there have been very few cases of users intentionally entering bad information. One such case of vandalism involved the biography of US journalist John Seigenthaler. Someone posted a biography of the journalist containing several false statements, accusing him of being involved in the assassination of JFK and of having lived in the Soviet Union for 13 years. It was later revealed that the post was a joke and that the author didn't know Wikipedia was intended to be a factual encyclopedia of knowledge. But a study done by Nature magazine showed that the accuracy of Wikipedia is actually quite similar to that of Britannica: "The exercise revealed numerous errors in both encyclopedias, but among 42 entries tested, the difference in accuracy was not particularly great: the average science entry in Wikipedia contained around four inaccuracies; Britannica, about three" (Giles, 2005). In July 2006, the Wikipedia article on Kenneth Lay was altered five times within the first half hour after his death was announced. Naysayers think this shows the fallibility of the wiki way; defenders consider it evidence of the resilience of the system.

4.2.2 Strengths

The dominant strength of the wiki is the low transaction cost of entering data. With a simple wiki, it’s extremely easy to add and edit content. It is assumed that this ease of use leads to high participation. The Technology Acceptance Model (TAM) provides a theory as to why that is (Davis, 1989). It holds:

adoption rate = perceived usefulness * perceived ease of use

According to this theory, a user’s decision to use a system is dependent on their perception of how useful the system is to them and their perception of how easy it is to use. Perceived usefulness is also seen as being affected by the perceived ease of use. This theory was later challenged and expanded on but remains a good model of why participation is high in the case of wikis.
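A worked example of the multiplicative reading of the formula above. The 0-1 ratings here are hypothetical illustrations, not measured TAM constructs:

```python
# Worked example of the document's multiplicative reading of TAM.
# Scores are hypothetical ratings on a 0-1 scale, invented for
# illustration; TAM itself is usually estimated statistically.

def adoption_rate(perceived_usefulness, perceived_ease_of_use):
    return perceived_usefulness * perceived_ease_of_use

# A wiki: moderately useful, very easy to use.
wiki = adoption_rate(0.7, 0.9)
# A heavyweight legacy KM system: useful in theory, hard to use.
legacy_kms = adoption_rate(0.8, 0.3)

print(round(wiki, 2), round(legacy_kms, 2))  # 0.63 0.24
```

Because the terms multiply, a tool that scores poorly on ease of use drags down its predicted adoption even when it is rated as more useful, which is the argument the text makes for wikis.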

Wikipedia is a prime example of a wiki with high perceived ease of use and high participation. From 2001 to 2004, Wikipedia articles in English grew from nothing to over 300,000. Now, just two years later, there are over 1,279,000 articles in English. The database grows by over a thousand articles every day.


Figure 4.5 Growth of Wikipedia

Interestingly, when we look at who is contributing to Wikipedia, we find that a core group of individuals – 0.5% of users (500 people) – accounts for 50% of the edits. This phenomenon was also found in a study of the Apache web server project, where 80%-90% of submissions came from a core of 15 programmers in a community of more than 3,000 (Figure 4.6) (Mockus, Fielding and Herbsleb, 2000).


Figure 4.6 The cumulative distribution of contributions to the Apache code base (Mockus, Fielding and Herbsleb, 2000)

These observations suggest that user participation follows a power law probability distribution. Power laws were observed by economist Vilfredo Pareto and linguist George Kingsley Zipf. The Pareto distribution, or 80/20 rule, holds that 80% of the wealth ends up in the hands of 20% of the population. Zipf observed similar distributions in word frequencies in bodies of text.
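The concentration a power law implies can be checked with a quick simulation. The curve below is a Zipf-like distribution over 100,000 hypothetical users, not actual Wikipedia data:

```python
# Sketch of a power-law participation curve: if per-user edit counts
# fall off as 1/rank, a tiny core accounts for a large share of all
# edits. 100,000 simulated users; all numbers invented.

n_users = 100_000
edits = [1_000_000 / rank for rank in range(1, n_users + 1)]

total = sum(edits)
core = sum(edits[: n_users // 200])   # the top 0.5% of users
print(f"top 0.5% of users make {core / total:.0%} of edits")
```

Under this curve, the top half-percent of users contributes more than half of all edits, which is the same order of concentration reported for Wikipedia and Apache above.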

Despite the fact that everyone can participate easily, the system migrates toward a situation where communities of very active contributors form, and eventually contributions by these users make up the bulk of participation. But that is not meant to discourage participation by the masses. On the contrary, Ross Mayfield and others argue that it's important to support both low-threshold participation and high engagement. In other words, it's important to foster an environment conducive to community building at the level of the core high-participation contributors, but with an open periphery encouraging participation by the masses as well.

It is the contributions of the masses that make up the value of this tool. In the case of Wikipedia, the total volume of obscure articles exceeds the volume of popular articles, and it is because of these obscure articles that Wikipedia has such great value. This is the Long Tail phenomenon from Section 4.1.1: the more obscure articles Wikipedia contains, the more valuable it is. This simple tool has the ability to tap the obscure knowledge of millions of users and, out of that obscure knowledge, to build a deep knowledge base of great value.

Besides encouraging participation, the simplicity of a wiki also makes for easy use and intuitive navigation. The simple title/contents structure is easy to follow. And the interlinking structure makes it easy to quickly learn about a subject and all its related subjects. This kind of structure is intuitive and easy to follow.

In terms of implementation, wikis are extremely easy to set up and can be free. There are several open source wiki engines to choose from, including enterprise flavors. For the examples in Sections 6-10, it took no more than a few hours to set up the wiki engine itself. A tool so simple might not need funding approval or help from the IT department.

Lastly, knowledge work is messy. Processes and required knowledge can't be known beforehand. It's often necessary to create new knowledge ad hoc and in cooperation with other people. Software to support this kind of work should have an open and collaborative structure. A wiki is just this kind of software: a freeform tool that supports collaboration.

4.2.3 Weaknesses

All this discussion of user contributions to Wikipedia is great, but the same laws may not apply in a corporate setting. For one, most enterprises don't have millions of users, out of whom a core of hundreds of super-active contributors can emerge; the power law of participation may not hold in smaller numbers. Also, there are authority, power structures and politics in a business setting, so there are different forces at work than in Wikipedia. Lastly, knowledge in a business setting might not be so decentralized: those with expertise may be the least willing, or have the least time, to contribute it. These are all factors that differ in a business setting, about which we don't yet have much information.

A study of the benefits achieved by users in a corporate setting showed slight differences in user motivations to contribute to wikis (Majchrzak, Wagner and Yates, 2006). The users surveyed sought benefits such as enhanced reputation, work made easier and improved processes for the organization; the last two apply specifically to an organizational (corporate) setting. The study did conclude that corporate wikis are sustainable in light of the benefits achieved by wiki users.

One of the great attributes of a wiki can also be its weakness: the lack of structure. Having little structure is great for unpredictable processes. But most business processes have at least some structure to them, and so should the tools that support them. Knowledge work tends to be less structured, but even it must have some form. A wiki can end up a nest of linked pages with no clear picture of what it all contains. A great big messy web is fine unless you need to find something, and a search box won't always cut it. There need to be ways to browse information or to search by categories of some sort. Though most wikis allow for the creation of categories, these must all be set up; for the time being, there is no general business taxonomy plug-in. And with content creation coming from the bottom up rather than through a central point, there is the possibility of duplication.

Vandalism and errors are not as much of a risk as in the case of Wikipedia, because a corporate wiki would most likely be implemented behind a firewall and users would be held accountable for their contributions. Ross Mayfield of Socialtext reports that in four years of building wikis for corporations, he has never seen vandalism. On the other hand, in a business environment there needs to be a way to implement user security. No matter what content is being displayed, there will always be different groups of users with different needs and authority, and most enterprise wikis provide this functionality. However, this goes against the essence of what a wiki is. The whole idea is based on openness and simplicity: articles are created democratically, weaving together a neutral point of view through constant revision. But in a business setting, authority and procedure take precedence; truth may be determined by the higher-ups, not democratic discussion. All this emphasis on tight security controls might dampen user contributions.

Lastly, like any IT implementation, setting up a wiki does not guarantee its success. People need to buy into it, just like any other tool. They may need training to overcome fears or adjust expectations. There needs to be a strong user base who support the wiki and find value in it. Processes may need to be folded into the wiki, or other measures established, to increase the usefulness of the tool. Maintaining this – the community behind the wiki – is much harder than setting up or maintaining the wiki itself.

4.2.4 Features

A wiki engine is the software that runs a wiki system. In the Wikipedia example, the software behind it is called MediaWiki, a free and open source engine maintained by the Wikimedia Foundation. There are hundreds of wiki engines; lists of them can be viewed on Ward Cunningham's site at <> or in the Google directory of wiki engines <>. Wikis can be open source or proprietary, free or fee-based, and their features vary widely. It may be too early to identify specific groups of wikis based on their features, but some main variants have evolved. An enterprise wiki, for example, is a wiki intended for use by businesses. Examples of enterprise wikis include Socialtext, Jotspot, Twiki, Atlassian's Confluence and Mindtouch.

The main features of a wiki were discussed in Section 4.1.4. Table 4.1 provides an overview of the more specific, often configurable features of a wiki. Wikis differ in their intended use, so they differ with respect to these features. This list can provide a starting point when choosing a wiki engine. For in-depth comparisons of all major wiki engines and further help choosing one, see <> or Ward Cunningham's <>.

Licensing Wikis come in open source or proprietary versions. They can be free or fee-based. And they may or may not be hosted (hosted services are known as wiki farms).
Language Most enterprise wikis come only in English for the moment. But some, like MediaWiki and TWiki, support several languages.
User/group security Most wikis support at least minimal user security. You can block or allow editing or viewing based on user or group identification.
Page security Many wikis also support some kind of page security. You can set viewing and editing capability page by page.
Common Features
Sandbox The sandbox is a hallmark wiki feature. It is a space for new or existing users to fool around and try things out.
Search A text based search box is a mandatory feature.
Open editing and page creation Page editing can include the ability to allow users to change content on a page, create new pages, as well as move, redirect and delete pages.
Preview changes Preview changes allows users to preview any changes they made before submitting.
Minor changes Minor changes allows users to mark changes as minor. This feature can keep those changes from appearing on a recent changes list. Minor grammatical changes for example, might not be of concern to a moderator.
Email changes This feature allows users to have all changes from a certain page emailed to them. Again this feature is valuable for any moderator.
Changes summary This allows users to make a short summary of the changes that they made to a page. This is valuable for wikis with very large articles.
Section changes This divides the page up into many different editable sections. It allows users to make changes to only a small part of the page, without having to scroll down the length of the article to find the desired section.
Watch pages This refers to functionality that allows users to keep track of all changes to certain pages.
Recent changes This is a running list of pages that have been changed recently.
Page history Page history stores old versions of a page so that content can be reverted to old versions. Version tracking is a way to keep track of various versions of a page. There is also functionality to show a side-by-side comparison of different versions.
Discussion Discussion is a core value of the wiki system. Discussion functionality can be added as a separate page connected to a particular page. It can also be in the form of comments tacked onto a page. In any case, it’s the ability to add comments and discussion to a topic or page.
Wysiwyg editing Wysiwyg – what you see is what you get – editing is the ability to edit pages in a way that shows you exactly what it will look like. The wiki syntax can be a bit cryptic for the lay user. This feature is important in an enterprise wiki.
Online file editing This allows files to be uploaded to the wiki and then modified in the wiki itself. Instead of being uploaded, downloaded, edited and then re-uploaded, the file can be worked on collaboratively in the browser window. Not all file formats are supported in this way, but some wikis support Word, Excel and image files.
Concurrent access Most wikis have some way of reconciling issues related to concurrent access. If two users are trying to modify the same content at the same time, there must be a way to resolve the conflict.
Scripting Some wikis allow scripting functionality to be added to pages.
Content includes Content includes are pieces of content or code that can be pulled directly from another location, usually through scripting. In this way, if the content changes in the other location, it is automatically updated on all pages that pull content from it.
Feed aggregation Feed aggregation allows users to combine syndicated content into one page. An example is MyYahoo!, which allows users to feed a variety of news sources into the page. Content is automatically updated, or pushed by the source.
Incremental linking Incremental linking allows for the creation of links to pages that don’t exist yet.
Interlinking Interlinking is a way to easily link to pages within the wiki. They are distinguished syntactically from external links.
Camelcase Camelcase is a particular syntax used for interlinking in wikis. The first letter of each word is capitalized and all words run together with no spaces. For example, a link to MainPage would link to a page so named.
Wiki/html syntax Wiki editors use a particular syntax for functions like bold, underline, linking, etc. Often they can also support using html tags to display content.
Other syntax elements Other syntax elements include: emoticons, foot notes, quoting, tools for creating faq pages, etc.
Index Most wikis have an index of all pages in the wiki. Some have more advanced ways to browse content.
Categories, Namespaces, Tagging Categories are groupings of pages. They can have subcategories as well. Namespaces are another way to group pages together. They allow use of identical page names in separate areas, separate namespaces. Tagging might also be supported as another way to create a kind of taxonomy for the wiki and integrate it with other taxonomies.
User pages User pages are a formal structure for creating a page of information on each user. In a business setting, it could serve as a kind of corporate yellow pages. Or it could be used to create a portal page unique to each user.
User tracking User tracking involves creating a space for each user to track their involvement. This would include all watched pages and edits made by the user as well as any discussions or communication with the user.
Wanted pages This would show any wanted pages, or pages which are empty. They have been linked to without having any content.
Orphaned pages Orphaned pages are pages with nothing linking to them.
Popular pages This shows a list of pages in order of popularity.
Recent visitors This shows a list of users who visited the wiki recently.
Print view This shows a page that can be printed correctly.
Skins Skins or themes allow the look and feel of the wiki to be modified. They are usually a set of CSS files that change the color scheme and other visual elements.
Syndication Syndication means the ability to have page content sent to a feeder of some sort, depending on the technology. For example, updates to a page could be sent via RSS to an RSS feeder.
Pdf/html/xml Some wikis can export page content in certain formats. For example a page could be exported as a pdf file.
File attachment Most enterprise wikis support file attachment to pages.
Media search Some wikis allow users to search the content of media files, instead of just the file names.
Embedded video Some wikis can allow video or flash elements to be embedded into the page.

Table 4.1 Wiki features
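Several of the features above (interlinking, CamelCase, incremental linking and wanted pages) can be illustrated with a single rendering pass. The following is a minimal sketch only, with hypothetical names and URLs; it is not taken from any particular wiki engine:

```python
import re

# CamelCase: two or more capitalized words run together, e.g. MainPage.
CAMELCASE = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")

def render_links(text, existing_pages):
    """Turn CamelCase words into internal links; flag links to missing pages."""
    def to_link(match):
        name = match.group(1)
        if name in existing_pages:
            # Interlinking: an ordinary internal link.
            return f'<a href="/wiki/{name}">{name}</a>'
        # Incremental linking: the target page does not exist yet, so the
        # link points at the edit form -- this is what makes it a "wanted" page.
        return f'<a class="wanted" href="/wiki/{name}?action=edit">{name}</a>'
    return CAMELCASE.sub(to_link, text)
```

With `existing_pages = {"MainPage"}`, the text "See MainPage and ProjectPlan" would link MainPage normally and mark ProjectPlan as wanted, inviting a reader to create it.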


5.1 Where do wikis fit in?

Wiki software was not created as a commercial off-the-shelf (COTS) knowledge management product to fill a specific need of knowledge management practitioners. It is a disruptive innovation in the field of knowledge management. But it has a lot in common with, and is already being compared to, other IT tools for knowledge management.

Generally, wikis are considered a knowledge management tool because they can be used as a collaboration tool and because they are relatively free of structure. Initiatives that have proven particularly successful, like communities of practice and best practices, all rely heavily on communication and collaboration. And knowledge work in general is less structured, with less well defined processes. A freeform tool is well suited for this kind of work.

Based on the categories of IT tools discussed in Section 2, we can see that wikis are similar to a content management system (CMS). Wiki software provides a way to create and organize content in the form of web pages and file attachments. More specifically, a wiki resembles a web content management system (WCMS). This is a system to manage the publication of web content. It can also support the design and maintenance of a web site including all tasks related to authoring, publishing and delivering content in web format. A WCMS is essentially a system that allows anyone to publish content directly without going through a web designer. They usually collect content from users through forms, apply templates and store the data in a structured way so that it can be indexed and searched effectively. They also can provide sophisticated tools to enhance the look of the website.

A wiki is a unique kind of CMS for a couple of reasons. One is that it is extremely easy to use: to edit a page or add content, a user simply clicks a link on the page. There is no admin interface or template to go through. Navigation in a wiki is also simple and intuitive. Because of interlinking, definitions of related concepts and information can be linked to easily. This organizes information naturally and contextualizes it semantically. But, as mentioned, it can also create an unorganized nest of pages.

Wikis are also described as a collaboration tool. Specifically, they can be considered a kind of groupware (also called computer-supported collaborative work, or social software): any kind of system that supports collecting, organizing and sharing information within a group of distributed people (email, group calendars, directories, shared spaces, etc.). Groupware supports less structured, informal work and collaboration, whereas workflow management systems, by contrast, support structured processes. Wikis are like groupware because they provide a way to collaborate on documents online. They also typically have discussion spaces attached to pages and ways to email other users. And generally they provide a way for users to get their knowledge onto the web and shared with other users.


Figure 5.1 Communication-oriented classification (Fuchs-Kittowski and Kohler, 2006)

Clearly then, a wiki combines aspects of a CMS and groupware. This is interesting because it combines a focus on both the results of communication and the process of communication (Fuchs-Kittowski and Kohler, 2006). A CMS is focused on organizing and sharing the results of communication: explicit, recorded information. Groupware is focused on fostering communication, a process that is requisite for knowledge creation. Together, the two form what could be an effective new tool for both the management of knowledge assets and knowledge creation: at once a tool for creating knowledge and a tool for sharing explicit knowledge. In practical terms, it could allow users to discuss a project while working on the very results they are discussing. The knowledge is at once created, stored and shared.

5.2 Corporate Uses of Wikis

As mentioned, wikis are a disruptive innovation. They are a new tool whose features are still being developed, and a freeform tool, intentionally designed to be simple, with few features. Because of this, it makes sense that companies use wikis in a variety of ways. In addition, wikis tend to spread through an organization virally. Many stories of corporate wiki use show that one part of the organization started using a wiki and it spread as more and more of the organization realized its value. Wikis tend to start out in one place, like software development, and end up being used all over. The case study of Dresdner Kleinwort Wasserstein showed that traffic on their wiki quickly grew to surpass the traffic on their intranet (McAfee and Sjoman, 2006).

Table 5.1 shows the different ways wikis have been used in a corporate setting. The list shows that wikis have been used mainly to support a particular process, to distribute or share information, or to aid in communication and collaboration. Using wikis as a document management system, a knowledge base or a corporate yellow pages, or for storing product information and documentation, are all examples of wikis being used to share or distribute information. Using wikis for user feedback, idea generation, project management and communities of practice are ways of using wikis for communication and collaboration. Lastly, using wikis for resource management, group email and logging sales leads shows how wikis are used to support particular work processes.

Document management system: Documents (Word, Excel, PDF, etc.) can be attached to wiki pages, and often the document contents are indexed and searchable. In this way, a wiki can function as a rudimentary document management system.
Project management: Wikis allow project status to be tracked easily by disparate groups through a web browser. From smaller ad hoc projects, like software implementation status reports, to full-blown project management, deliverables, meeting agendas, status reports, progress reports, and standards and practices can all be built in a wiki. Some wiki engines have project management add-on applications.
Collaborative work space: Wikis are widely used as a collaborative workspace or whiteboard. Documents can be uploaded and modified through a web browser, allowing many users to view and change a document without uploading, downloading or emailing anything. In addition, some wikis allow changes to a document to be submitted by email. Users can work collaboratively on drafts, documents, lists and reports without sending files back and forth, and all changes are usually tracked and reversible.
Communication between geographically distributed groups: Because wikis are accessible through a web browser and require no additional software, information can be shared easily among users across time zones and environments.
Group email: Wikis provide an efficient substitute for group emails. A wiki page (or a link to it) can be emailed to a group of users at once; users can then view, comment on and change the information on the page itself. All subsequent information and changes can be communicated through the page, which functions as a message board: users post information to the page and it is immediately viewable by all.
Private and public knowledge base: With most wiki engines, each page can have user security attached to it, so information can be made public or private. Some companies have used wikis to create a knowledge base used internally, externally, or both.
Product information and product documentation: Some companies have used wikis to store product information and product documentation. Pages can be modified quickly and easily by select users and made available internally as well as to customers and partners.
Marketing/CRM: Wikis have been used for tracking market trends, collecting data, logging daily sales lead counts, storing information on customers and partners, tracking sales prospects, announcing new features and communicating with customers.
Software development/IT support: Almost everything related to software development has been supported by a wiki: documentation, issue tracking, internal workflow, quality and process design, reference information, setup information, configurations, specifications and instructions for installing software, among others. IT departments have used wikis similarly, to track and record information about installations and bugs and to provide a reference to key network information. Technical support units have used them for recording best practices, sharing customer support information, how-tos, system requests for new hardware, setup instructions and software downloads.
Corporate yellow pages: Wikis can record information about users, their involvement with projects and their knowledge assets.
User feedback, discussion area: Wikis have been used as a space to gather user comments or discussion on a particular issue.
Intranet: Wikis have occasionally been used as an intranet backbone. Each page on the intranet is a wiki page, editable by all or select users. This creates a sort of bottom-up intranet.
Personal information management: Individuals use wikis to manage their own personal information. They can easily contain notes, to-do lists, how-tos, vacation schedules, photos, address books, calendars and blogs. Outside the corporation, they are used to keep in touch with, or share information with, family members and friends.
Community of practice: There are many examples of wikis used to share information among communities of practice or other user groups.
Idea generation: Wikis have been used as a whiteboard space for brainstorming and for generating and discussing ideas.
Resource management: Wikis have been used for tracking the usage of shared resources, e.g. shared machines.
Best practices: Wikis have been used to store best practices and innovative methods and processes.
Benefits: HR departments have used wikis to post benefits information, guidelines, insurance information, expense reimbursement procedures and time-off policies.
Policies and procedures: Wikis have been used to record various corporate policies and procedures.
Research and development: Wikis have been used to support all aspects of research and development. In one example, they were used to share clinical trial experiences.

Table 5.1 Corporate uses of wikis
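Several of the uses above, the private and public knowledge base in particular, rest on per-page access control. The following is an illustrative sketch only, with hypothetical names; it is not the API of any real wiki engine:

```python
class WikiPage:
    """Sketch of per-page security: a public-read flag plus a set of editors."""

    def __init__(self, title, public=False, editors=None):
        self.title = title
        self.public = public               # readable by anyone, e.g. customers
        self.editors = set(editors or [])  # users allowed to modify the page

    def can_read(self, user):
        # Editors can always read; others only if the page is public.
        return self.public or user in self.editors

    def can_edit(self, user):
        return user in self.editors
```

A product documentation page might then be public but editable only by the documentation team, while an internal-only page would simply leave `public` set to `False`.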

6 CASE EXAMPLE: The Reynolds and Reynolds Company

The Reynolds and Reynolds Company (NYSE: REY) was founded in 1866 as a forms company. In 1927 it began to specialize in services for automotive retailing, and it has since developed into the leading provider of services related to automotive retailing. These services consist largely of forms management and data management. Reynolds and Reynolds now serves, in some way, 80% of all automotive dealers in North America. Among these services, software for automotive retailing figures prominently, in particular the ERA Dealer Management System. This software solution supports all aspects of automotive dealership management, including accounting, employee management, business development, finance and insurance, parts and service sales, and customer relationship management, among others.

The company is headquartered in Dayton, Ohio and employs about 4,400 associates worldwide. Their annual net sales are over $1 billion. They recently entered the European market with the acquisition of Incadea AG and DCS Automotive. But their presence is mostly concentrated in the U.S. and Canada.

The company’s operations are divided into segments like software solutions, documents, product development, sales, etc. This analysis will focus on software solutions, which comprises hardware and software solutions and any related deployment and support services. In particular, it will focus on the deployment and support services related to software and hardware solutions. These support services include field installation and issue resolution groups, call center support, remote installation services like conversion programmers and software configuration teams, consulting services, and product and process documentation.

The following groups combine to form the bulk of services for the deployment and support of the company’s software and hardware solutions:

Account Managers: The sales channel for all products. Account managers keep track of customer information with respect to prospecting and products sold. Distributed across regions: west, northwest, southwest, east, etc.
Consulting: Provides services above basic system functionality: improving system utilization, performance and overall operational efficiency. Distributed across regions: west, northwest, southwest, east, etc.
Conversion, Data Management and Backline Support Groups: Conversion programmers are responsible for copying customer data from an existing system to Reynolds and Reynolds’ system. Conversion and backline support provide complex data manipulation and system configuration. Located in Ohio.
Technical Assistance Center – HW/SW: The TAC provides telephone support for hardware- and software-related issues, both directly to customers and to internal support groups (like Field Engineers and Customer Training Professionals). Located in Ohio.
F&I Support: Supports Finance and Insurance applications and configures forms for F&I applications. Located in California.
Intellipath and Integrated Document Solutions: IDS manages forms sales. Intellipath provides forms support for any document printed in a dealership that has a predefined format (an invoice, for example); they configure the forms servers and the format of particular forms. Located in Ohio.
Customer Training Professionals: CTP’s provide on-site customer training for all software products. They assist product deployment as well as provide on-site ad hoc training. Distributed across regions: west, northwest, southwest, east, etc.
Field Engineers: FE’s provide on-site deployment of new hardware and on-site repair and maintenance of existing hardware. Distributed across regions: west, northwest, southwest, east, etc.
Distance Learning Center: The DLC provides telephone training for all software products and configures the software for new product deployments. Located in Maryland.
Service Readiness and Knowledge Products: SR and KP produce product documentation, product installation guides, new product version and release information, and process guides. They also determine the overall processes for customer support. Located in Ohio.
Reynolds University: RU is a unit based at headquarters that trains internal staff (mostly CTP’s, DLC staff, Consultants and FE’s). It also provides off-site classroom training for dealerships. Located in Ohio.

Table 6.1 Reynolds and Reynolds product support groups


At the outset, Reynolds and Reynolds would appear to be a good candidate for knowledge management initiatives. Almost everyone in the company is a knowledge worker. From product managers to software developers to technical writers, software trainers and consultants, the entire product lifecycle involves knowledge-based work. The company’s dominant software product, the ERA dealer management system, is a large and complex ERP product, and using the product itself is knowledge intensive. It is sold with training, and additional training is offered on-site, on the web and by telephone. The amount of customer support, customer training and consulting services that accompany the ERA system is unusually robust; these are essentially all services that sell knowledge of the ERA system. In addition, the system is sold to dealers all over the U.S. and Canada, and the company employs field support (Consultants, CTP’s and FE’s) in several regions spanning North America. All that knowledge must be communicated among a geographically dispersed group.

One effective way the company achieves knowledge sharing about the ERA system among field representatives (anyone working at customer sites) is to pair them up on jobs. Juniors are paired with senior representatives in a mentoring system for their first year on the job. After that, field representatives continue to learn from each other as they work in teams, usually of at least two, on one- to four-week projects. The other way knowledge is shared is through calls to the support center. When a field representative is stuck on a problem, they call the support center (TAC) and work with someone back at headquarters to solve the problem together. Both the field representative and the TAC representative have access to the same product documentation and online help documents, so knowledge is shared both through the use of online help and through the conversation between the two representatives. If a problem is not resolved between them, it is sent to a backline programmer through a tiered support system.

This system works well to solve problems and to give field representatives sufficient knowledge of the ERA system and the hardware that supports it. It ensures that most representatives have a functioning level of knowledge, or a way to obtain the knowledge, to resolve problems at the first-tier level. But it does not encourage the development of deep expertise, and it does not insure against redundant problem resolution. The former is not really a problem, except for certain issues specific to field representatives. Certain processes, like new product installations, are handled only by field representatives, so knowledge of these processes will never be as great in the TAC as it is buried in the minds of a few experienced field reps. For example, the knowledge of how to install a particular software package would be fresh in the mind of a field rep who has just completed four of those installs. Other representatives have no way of knowing that this representative has that knowledge, and the knowledge is not shared with the TAC because the representative no longer has any reason to call them about it. Moreover, a year later the same representative may not have completed any more of those installs and may have completely forgotten how to do them. Nowhere in this process was there a way to capture the expertise of this representative while it was fresh and share it with others. This points to the second problem: redundancy. Since there is no way to share the expertise of the representative who just completed four installs of a particular software product, other representatives will fumble over the same mistakes in other places at other times. They may call the TAC and speak with someone who may or may not have heard of the problem. If a problem with these installs were big enough to become widely known at the TAC, an email might circulate among the TAC and maybe even among the field groups. But even if an email does circulate, it is rarely considered relevant until the moment you yourself face the problem, which may be six months later, at which point you scramble to sift through hundreds of emails, if you even remember receiving the email at all.

It is easy to imagine how a wiki could help this company increase knowledge sharing among support groups and thereby encourage the development of deep expertise. A full and ideal implementation of a wiki for knowledge sharing would probably look something like a knowledge base for the service groups. It would turn the existing product documentation into an interactive knowledge base by adding to the documentation itself a collaborative space in which the TAC, the field service groups and customers could all communicate and collaborate in conjunction with the documentation. It would combine the experience of all these groups in one space, giving them a place to discuss a product and its utilization: uses of it, problems with it, configurations of it, creative workarounds, and so on. Ideally, it would reach out to the long tail of product users with interesting input, where the real product expertise resides, and draw out the contributions of individuals with particularly deep, even obscure, knowledge of parts of the system. This would encourage the development of deep expertise. It would also help to create a community of active knowledge contributors and provide a kind of knowledge map to their particular expertise.

This ideal knowledge base would be a large undertaking. Perhaps there are also more grassroots uses of wikis that would improve smaller processes within the organization. This analysis looks at whether a wiki-based knowledge base, created from the existing product documentation, is feasible, and whether there are other possible wiki uses that would improve the process or practice of support groups at Reynolds and Reynolds.

The overall goal, it is assumed, is to increase knowledge sharing among support groups. This analysis will look at the use of wikis in two particular contexts: how they can improve process or practice, and whether that contributes to the overall goal of knowledge sharing among support groups.

The first context relates to support groups’ need for information about the particular project they are working on. They need to know which other representatives have been or will be working with the client, and the status of the work they are assigned to do there. Each new install of the ERA system involves nearly all of the aforementioned support groups: account managers, conversion programmers, FE’s, CTP’s, Intellipath, the DLC and sometimes Consulting. The work of these players must be coordinated. FE’s must have the sales information from the account managers. The Distance Learning Center must have information from Intellipath before configuring part of the system. Trainers must have equipment installed by FE’s and configuration done by the DLC before beginning their work. Conversion programmers need the trainers and the DLC to finalize configuration before transferring the customer’s data. Coordinating these groups requires information about who is involved, and the status of their work, to be reported to project managers as well as to the groups involved.

There is currently limited communication among these groups. There is a process to collect information, at the beginning of a project, about who (the names of the various representatives) will be involved. But this information changes often, the process to update it is complex, and it is often not updated. Information about the status of each group’s work is not reported to a central location and not shared among all the groups involved. And yet this information is extremely valuable to the project manager (the regional CTP manager) as well as to those involved.

The second context relates to the original idea of sharing knowledge through a common knowledge base built on the product documentation used by support groups. Product documentation, produced by the Service Readiness and Knowledge Products divisions, is the core of the body of knowledge about the ERA system and its supporting hardware. These are the documents referred to by all support representatives, field and TAC (as well as customers). They provide specific information about what a product is and what its functionality and features are, as well as how to use and install it. The accuracy and quality of the information these documents provide is of paramount importance to support groups. They form a knowledge base used to disseminate information from the top down to representatives. This analysis will look at whether the process used to create these documents can be improved so that they share knowledge better.


As mentioned in Section 5.2, wikis have been used for project management, collaboration, collaborative work space, communication between geographically distributed groups, a knowledge base, product documentation, group email, discussion and resource management. At Reynolds and Reynolds, two of these uses seem appropriate: project management and product documentation.

Both of these uses require a certain level of collaboration. Project management for ERA system installs is entirely collaborative. As described, certain groups need to know the status of other groups’ work progress. At a minimum, all groups involved need to know accurately who is involved in the project. This information is currently insufficient or unavailable. Project management systems have been considered in the past; in particular, MS Project Server was proposed, but the system proved too complicated for users (Hattner, 2006). The most important attributes identified for this process are ease of use and the ability of various groups and individuals to note the status of specified tasks. Additionally, project status and information should be available to all groups involved. This process was chosen as a candidate for support by a wiki because of the high level of collaboration involved. Since there are many different groups involved (account managers, conversion programmers, FE’s, CTP’s, Intellipath, the DLC and Consulting), a collaboration tool, or groupware, is in order. A wiki allows users to modify pages, which means users themselves could correct wrong information without going through a central point. And they could add status information to pages, making it immediately viewable by anyone. In this way, information about an install and all the associated tasks and their status could be constantly up to date and available to everyone.
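If many groups update a shared install-status page, simultaneous edits have to be reconciled (the concurrent-access feature from Table 4.1). The following is a minimal sketch under the assumption of an optimistic revision check; all names are hypothetical and not taken from any particular wiki engine:

```python
class ConflictError(Exception):
    """Raised when a save is based on an out-of-date copy of the page."""

class StatusPage:
    def __init__(self):
        self.revision = 0
        self.tasks = {}  # task name -> status string

    def read(self):
        # An editor fetches the content together with its revision number.
        return self.revision, dict(self.tasks)

    def save(self, base_revision, task, status):
        # Optimistic check: reject the write if the page has changed
        # since this editor read it, forcing a re-read and merge.
        if base_revision != self.revision:
            raise ConflictError("page changed; re-read and merge your edit")
        self.tasks[task] = status
        self.revision += 1
```

An FE and a conversion programmer could then update different tasks safely: the second writer working from a stale copy is told to refresh rather than silently overwriting the first writer's status.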

The level of collaboration required for creating product documentation is different. The process of creating and organizing product documentation requires some kind of content management system; if the documents are all stored on the web, this could be a web content management system. Since the process of creating documentation is usually collaborative, the best tool for it probably combines the elements of a collaboration tool and a web content management system. A wiki is just this combination. In particular, this could suit a process that requires input from a group of distributed users, where that input needs to be managed in the form of online content by an independent system.

A wiki can be useful solely as a collaborative tool for creating single web documents, as a form of groupware. It can be valuable for managing online content that is not modified by users, as a simple CMS. Or it can be valuable as a tool that allows users to collaborate on content (updating it, discussing it) in a distributed way, as a WCMS.

A prime example of using a wiki as a collaborative tool is posting meeting agenda items before a meeting: a group of people can each post their own items and together create a single document, without a flurry of emails. This kind of collaboration is not how product documentation is created at Reynolds and Reynolds. First, the document itself is usually written by one person. Second, the groups involved mostly work in the same physical location, off shared drives. Third, the process of creating documentation is rather complex and involves many phases. In each phase, information about the product is collected, both to include in the documentation and to assess other support needs (training and other resources) in an overall effort to support the new product; the process of writing documentation is thus embedded in other processes. Information is also gathered through a variety of channels: phone calls, meetings, system testing and feedback from pilot installs. Sometimes information about the product is discovered by the documentation team itself, so information flows in many directions. Overall, the process of creating documents is rather structured and functions to the satisfaction of those involved (Sauby, 2006). So, while the process of creating documentation is collaborative, it is so in a complex and structured way, and it is embedded in other processes that combine to provide overall support for the product, including training, marketing, development and budget analysis. Moreover, the product documentation is not the culmination of this collaboration. You could not simply put up a wiki page and tell everyone to post his or her part of the information, as you would with a meeting agenda. Information must come from these groups, yes, but it must be filtered through a person and worked out through a series of communications with other entities. Ultimately, one person must be in control of the document.

In addition, the support website that contains all the product support documentation is very well structured, which goes against the very nature of a wiki as a freeform tool. Certainly, structure can be imposed, but that would all have to be set up, and the value of the wiki would have to be otherwise justified. Another benefit of wikis is how easy it is to update information; but updating information on the support site with the current content management system is already easy. Everything is fed into templates and uploaded to a development site, which is accessible to many, and changes can be made by any authorized person. Changes are pushed to the live site weekly. Further, any update to a product document must be cleared through certain channels. Again, this goes against the grain of wiki culture, where changes can be made by anyone and accuracy is ensured through consensus. In this company’s culture, accuracy is ensured through process and organizational structure.

Perhaps the document could benefit from functionality that allows user feedback. Wikis usually have a discussion page attached to each page or a way to add comments. This allows users to create threaded discussions on whatever content is displayed on that page. This might be an interesting way to elicit feedback on the document or the product, or to discuss anything relevant to it. In the early stages of product development, this might be a way to centralize discussion of the product. But if most communication takes place over the phone or by walking across the office, it might not be used often. Alternatively, there could be a wiki devoted to the product itself, used primarily by the product developers. During the pilot phase, a discussion area on the product document would be valuable. In this phase there normally is feedback about the document itself, coming primarily from field representatives and the TAC, but possibly also from account managers, marketing and others. The current documentation does, however, offer a way to send feedback via email. In addition, the TAC has a direct contact to which it regularly sends feedback on documentation. In the last phase, when the product is released, there still might be an opportunity to collect feedback from customers. Again, there is already a feedback mechanism in place that allows users to send feedback via email. But there is a difference in transaction cost and user motivation between sending feedback via email and posting comments to a page. A user is certainly less motivated to send feedback through email than to post comments. As mentioned in Section 6.0, having a public customer discussion area for each page would allow for discussion of product utilization, information that might be of value to the company. Using a wiki for product documentation would allow the creation of public-facing customer discussion forums. 
Moving all of Reynolds and Reynolds' current product documentation to a wiki format just for this purpose would certainly be a large undertaking, but probably of great value in this respect. There was, however, concern expressed that bad information could too easily circulate with a system like this.

For the most part, a wiki is not appropriate as a collaborative web content management system for the creation and maintenance of all product documentation at Reynolds and Reynolds. The reasons are mostly due to inconsistencies with the wiki way. The wiki is about openness and free form, whereas their documentation process is well structured and imbued with hierarchical authority. And the documents themselves are loaded into an already well-structured website that doesn't necessarily benefit from either the simple wiki-style navigation or ease of updating by many users.

There is, however, one part of the process of creating product documentation that might really benefit from a wiki. Developing installation guides is notably more difficult (and more collaborative) than developing other forms of product documentation (Robison, 2006). This is because the development teams are primarily concerned with developing the product and ensuring its functionality. They do lots of testing of product features and functionality, so there is plenty of information about the product itself available from the developers. But they are not focused on developing a procedure for installing it. So gathering information about how to install a new product, before it is actually installed for a customer, can be difficult. Resources permitting, the documentation group will run tests of the installation procedure. But this is not always possible, so the quality of set-up instructions may suffer. The group tries to gather information from field representatives when the product is in the pilot phase. This is often the first opportunity to gather substantial knowledge about how to install a product. But field representatives are often very busy and stressed and don't have a lot of time to worry about sending feedback on their installation guide. So for this particular process, the creation of installation guides, getting user feedback is difficult. This process might be improved with the use of a wiki. Feedback can be quickly and easily added to wiki pages in the form of comments. And that information (or any change to the page) can be automatically and immediately emailed to the responsible person in documentation.

9 PROPOSAL: wiki for product documentation

9.1 The process for product documentation

The process for creating product documentation runs alongside product development. As products are developed, the documentation for them is developed. This process follows five stages: selection, definition, development, pilot (delivery) and release (GCA).

In selection, some need is determined for a new product or a new product feature. This can be a hardware or software product. In this stage the teams involved determine what kind of effort is required to develop and support the product. A rough estimate of this effort is created.

In definition, development works to define detailed functional specifications for the product. They will go into as much detail as possible given certain assumptions about the functionality of the product and what will be required to support it. A more detailed estimate is defined including specific estimates for what kind of training and documentation will be required, what role the different teams will play in the process of creating and deploying the product, and any role any third party vendor might play.

In the development stage, development teams are working on developing the product. Documentation will work together with development to determine the specific details of the product. The product will go through iterations of development and testing. Documentation will work at a very technical level, determining and documenting details about the architecture and functionality of the product.

In the pilot stage, the product is tested at a real customer site. At this point, all documentation for the product has been created. This includes documents on how the product works, product specifications, installation guides and any supporting third party vendor documentation. As the product is installed and tested at customer sites, feedback is sent from a variety of groups to address the product documentation. Feedback is primarily sent from representatives in the field (CTP’s, FE’s) or in the customer support center (TAC) involved in pilot installs. But it can come from anyone else involved (account managers, regional managers, consultants, forms representatives, marketing groups, etc) as well. Feedback on documentation is sent either directly to the documentation group or it is sent to the group from the TAC. It can be sent from the online documents themselves by email. Feedback is often solicited by the documentation groups for certain documents or products and especially for any installation guides.

In the last stage, the product is officially released for sale to customers. At this point, the documentation is more or less complete. Most feedback on product documentation will come through the customer support center or by email from the documents themselves.

9.2 The problem with creating product documentation

One of the problems for documentation groups is getting feedback on product installation guides. As mentioned in Section 7, during the development stage, the development group is working primarily to develop architectural and functional aspects of the product. So there is plenty of information available from development about architectural and functional attributes of the product. But they don't have as much information about how to install the product. Sometimes they will try to do simulated test installations. But it's very difficult to simulate the large variety of environments that exist at customer sites. Often they rely heavily on the feedback that comes from field representatives during the pilot stage. But field representatives working on a product pilot are usually extremely busy. They don't have much time to devote to sending feedback on their installation guides. Use of a wiki for installation guides should provide a way for field representatives to quickly and easily provide feedback.

This analysis looks at the use of wiki pages for installation guides. The wiki engine chosen for this is MediaWiki. There is almost certainly a more appropriate wiki engine; Section 3.2.4 provides links to information on how to choose a wiki. I chose MediaWiki because it was easy to install and easy to modify how information is displayed (by creating new skins). It took me half a day to set up MediaWiki on a server and a few hours to create a new skin. It also seemed a logical choice for organizing product documentation – information that functions like a knowledge base. In choosing a wiki for this particular use, these features are ideal: user/page security configuration options, skins, wysiwyg/wiki/html editing, the ability to create tables and tables of contents, file attachment, emailed changes, watched pages, section editing, discussion and a variety of output options like html/pdf/word/printer-friendly/etc.

9.3 Solution: MediaWiki

MediaWiki is the software used to support Wikipedia. It is an open source wiki engine developed by the Wikimedia Foundation specifically for use by the website and all other Wikimedia Foundation projects. It is used widely by other organizations, including enterprises. See Section for more on Wikipedia. See Figure 4.2 for a sample MediaWiki page. This wiki engine was built for the collaborative creation of a document base, so it has a lot of functionality to support discussion about topics and page editing.

Documentation for a product can include a whole set of documents. These can be web pages or pdf files. And they can be for internal and/or external use. Figure 9.1 shows a template for a typical product page. This page contains basic information about the product and references documents for different aspects of the product like specifications, user guides, installation guides, images and other guides. Page navigation breadcrumbs are at the top, showing users where they are. There is a feedback button that launches the user's email application to send an email to an individual in the documentation group. You can see in the documentation section links to the 'Quick Reference Guide' and 'Installation Guide' as well as a pdf file, the 'Reynolds Site Preparation Guide.'


Figure 9.1 Sample page of product documentation

In the introduction section, there is a table showing the various types of files included in the documentation. Here, it is suggested that another file type be added, a wiki file (see Figure 9.2).

Figure 9.2 Table of file types

The ‘Installation Guide’ currently exists as a web page in the support site. But it could exist as a wiki page. The link ‘Installation Guide’ would link to the wiki page. The wiki page could have the navigation bar at the top, by integrating the wiki page into the support site using frames or includes. Or the main document could simply link to the wiki page as it exists in the wiki, like it would a stand-alone pdf file. Figure 9.3 shows the ‘Installation Guide’ template as it exists now in the support website.

Figure 9.3 Sample installation guide

The overall layout of this document (minus the navigation and feedback button) can be nearly exactly duplicated on a wiki page. To do this, I created a new skin for MediaWiki. All formatting is controlled through CSS in the form of skins. Figure 9.4 shows the installation guide with the new skin, intended to mimic the formatting rules of Reynolds and Reynolds documentation. The skin could be modified even further to exactly replicate the original document.

Figure 9.4 Sample installation guide built in wiki

The original wiki skin (Figure 4.2) shows the wiki navigation bar on the left. This allows users to navigate within the wiki and perform certain functions like uploading files and searching within the wiki. For our purposes, these functions are not necessary. The installation guide is set up as a stand-alone wiki page linked to from the product's main page and possibly fed through a frame or include. So there is no need to navigate within the wiki. The top part of the page is the same in both documents (Figure 9.5). This part of the page shows the wiki functionality that is most useful.

Figure 9.5 Top navigation bar of wiki page

Starting at the left, the article tab shows the page contents, the installation guide itself. The discussion tab shows a list of comments made on this page. Once you click on the discussion tab, another tab appears (Figure 9.6) with a '+.' This tab allows you to post a new comment, after which your comments will appear on the discussion page. The edit tab allows you to edit the entire page. Editing can be turned off for users who are not logged in, or limited to administrators only. The history tab shows the history of all changes made to the page. It allows side-by-side comparisons of versions and reverting to previous versions. The move tab allows users to move the page to another location. This too can be turned off for registered or unregistered users. Finally, the watch tab allows users to "watch" the page, which adds the page to the user's 'my watchlist' with the option to have an email sent whenever changes are made to a watched page. The upper right bar shows tools available to logged-in users; if no user is logged in, it shows a link to log in. First it shows the name of whoever is logged in. The 'my talk' page is the discussion page attached to the user's page. It can be used to leave comments directly for users, which can be emailed to the user as well. The 'preferences' page allows users to configure their settings. 'My watchlist' shows the pages they are watching and 'my contributions' shows the pages they have edited or commented on.

Figure 9.6 Adding comments

There is a wealth of tools available for communication and collaboration. In addition, users can email each other from within the wiki, so that they don’t have to open an email client. And their email addresses can be hidden.

Moving down to the body of the page, as mentioned, the formatting is nearly exactly replicated (comparing Figure 9.3 and Figure 9.4). The table of contents is automatically generated by the wiki; its placement on the page is controlled with the wiki syntax '__TOC__'. The formatting of the table of contents is generated by CSS and so is configurable in the skin. Each section of content has a horizontal rule, a title and a link back to the table of contents. On the wiki page this is replicated. Also on the installation guide template are lists and tables. These too are possible to replicate. Tables are more complex to create in a wiki than in an html editor. At the moment there is no reputable wysiwyg editor for MediaWiki, so creating tables requires coding them in wiki syntax. All table functionality is possible, but writing them is not easy.
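To illustrate, the forced table of contents and a simple table might look like this in MediaWiki markup (the section heading and table contents here are invented for the example):

```wikitext
__TOC__

== Site requirements ==
<!-- Hypothetical section; a heading is created with double equals signs -->

{| border="1"
|-
! Component !! Minimum requirement
|-
| Server memory || 512 MB
|-
| Free disk space || 2 GB
|}
```

Here `{|` opens a table, `|-` starts a row, and `!` and `|` mark header and data cells. Compared to filling in an html table in a wysiwyg editor, this is the kind of hand coding referred to above.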

Based on the installation guide template, this web document can be nearly exactly replicated as a wiki article. As a wiki page, the installation guide can have a page of comments directly attached to it. Comments can be added by anyone, anywhere with only an internet connection. The user does not even need to log in to the wiki to add comments (depending on the security set-up). But logging in is fast and easy. Once users are logged in they can sign their comments and use all the email functionality.
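As a sketch of what this looks like in practice, a signed exchange on the guide's discussion page might read as follows in MediaWiki markup (the names and content are invented; '~~~~' is expanded by the wiki into the logged-in user's signature and a timestamp, and a leading colon indents a reply):

```wikitext
== Feedback from pilot install ==
The cable lengths listed in step 4 were too short for our site layout.
Suggest adding a note about longer runs. ~~~~

:Thanks – the table in step 4 has been updated. ~~~~
```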

To use a specific example, imagine a Field Engineer installing a new kind of PC at a pilot install. The FE could go to the support site and find the product page for the new PC. There would be a link to the installation guide as a wiki page. Going to the wiki page, the FE would first see all the information in the installation guide. There would be options to add comments to the page. These comments can be immediately emailed to the individual responsible for the document, who is ‘watching’ the page. If the FE logged in to the wiki, the comments can be signed and the person in documentation can email the FE. Even if the FE didn’t log in, the person in documentation can immediately post comments responding to the FE. This can all happen from any terminal anywhere with nothing more than an internet connection and a browser. All collaboration between the documentation group and the FE is made easier and faster.

Another option would be to give the FE access to edit the install guide itself. If the FE finds something inaccurate, or finds something that should be added, he/she could add or change the material on the guide itself. The change would be immediately emailed to whoever is watching the page. If the information added was incorrect, the page could be easily reverted to an earlier version.

The use of a wiki page for installation guides provides a fast and easy way for field representatives to provide feedback on the document. This solution is better than a feedback button or mailto link because it does not require the user to open a separate email application. It’s an easier way to discuss a piece of documentation, as the comments are directly attached to it.

The problem with this solution is that the wiki page is a new and different format. Documents have to be created in a different way. Creating documents in a wiki would not necessarily take longer than creating them in an html editor. But if the documents do not stay in wiki format, and are later re-published as static web pages, they must be converted. Currently MediaWiki does not support output as pdf or html (though it could very soon, and other wiki engines do).

10 PROPOSAL: wiki for installation project management

10.1 The process for ERA installation projects

The planning stages of the process for an ERA install include: a solution review, slotting the deal and creating the planware. Prior to the solution review, sales conducts various pre-sale activities like product demos and various needs assessments (forms, data conversion and customer reports). Then, the first three stages are:

  • Solution review
  • Deal slotted
  • Planware created

In a solution review, regional managers from hardware, software, forms, sales and sometimes consulting meet, over a conference call, to discuss the new business opportunity. They discuss the details of the sales proposal and needed/available resources to deploy whatever the proposed sale is.

After this, once the sale is finalized it is slotted – actually put onto the schedule. At this point, the regional manager of software support (manager of CTP’s) will assign CTP’s to the install for a four-week period. This information is emailed to the hardware regional manager (manager of FE’s) who will assign FE’s to the install for the period leading up to the CTP’s arrival. Consulting resources might be added to the install as well, arriving last. Also scheduled are: forms processing (intellipath), configuration (DLC), conversion, F&I forms and any third party vendor involvement.

Then, all the information about the install is collected by an individual in Dayton. This information includes: the dealership contact information (name, location, phone numbers and hours of operation), the names and phone numbers of all the people involved in the install from various groups (sales, hardware, software, consulting, DLC, TAC, Intellipath and F&I support), dates that certain groups will be working on the install (FE's, CTP's, and the DLC), names of key people at the dealership like the system administrator and management, and package information – what applications were sold to the customer. The Excel document created to store all this data is called planware and is emailed to all groups and individuals involved in the install.

At this point, the installation preparation activities begin. The following steps are taken during this phase:

  • Hardware walkthrough, to determine hardware required
  • Conversion preparation activities and testing
  • Intellipath configures forms
  • Hardware is configured and shipped to customer
  • F&I configures forms
  • DLC conducts remote specification configuration
  • Data management configures custom reports
  • Hardware monitors 3rd party vendor activity
  • Hardware notifies others that hardware is installed
  • First pass conversion and verification of data (DLC, conversion, data management)
  • DLC notifies others that the first pass conversion and specifications are done

The final phase is the execution of the installation with customer training, the final conversion and monitoring of live operations. The following steps are taken during this phase:

  • Kick-off meeting starts on-site presence
  • CTP’s create training schedules, conduct training and verify configuration
  • CTP’s conduct weekly status meetings (throughout all four weeks on-site) and send status report to regional manager
  • Final conversion and verification of data (conversion, CTP’s)
  • CTP’s monitor live operations on the new system
  • Final transition meeting before all field representatives (CTP’s) depart
  • Weeks later, one CTP returns for a follow-up visit

10.2 The problem with ERA installs

Communication among the different groups during this whole process occurs mainly through telephone calls, conference calls and emails. There is not really a single project manager tracking the entire process. All groups involved are basically doing their part in isolation, usually from different locations. Status reports from each group or task are sent to the people assumed to be involved or in need of the information.

A couple of problems have been identified with this process. One is that almost everyone involved suffers from email overload. Any change causes a chain reaction of emails flying through the system. This is especially true for the regional software and hardware managers, since they are a central point of contact between many of the different groups. If, for example, a CTP arrives and a hardware terminal is missing, the CTP will email the regional software manager, who will then email the hardware regional manager, who will then email the responsible FE, who will then email back a response that will be filtered all the way back down to the CTP. Even if the communication is by phone, it's still convoluted. If the CTP has the planware, it might show who the FE is, who could then be contacted directly. But the CTP doesn't always have this, and the information is not always correct. This is another problem with the current process: the process to change who has been assigned to an install is too difficult, and so it is often not done. So the information in the planware is rarely accurate. Lastly, the status reports coming from the various groups and individuals are not consistent, frequent or reliable enough. There isn't enough process in place to guarantee that information about project items is produced and circulated to the right people. This can cause major bottlenecks. Imagine from the previous example that the terminal the CTP is missing is the only terminal allocated for training purposes. The CTP can't begin training and may lose a whole day of productivity waiting for a response from hardware. Meanwhile, the response could be disturbingly trivial – that the terminal had been installed in another building. Perhaps the only person on-site who knew about the change, the customer's system administrator, was off work that day. It's easy to see how little pieces of status information can be so valuable to unexpected people.

This analysis looks at two wiki-based solutions to support the ERA install process. One is a wiki-based project management software application. The other is just a bare-bones wiki. Both are fee-based systems built for enterprise use. The particular wiki engines chosen might not be the best for this process but will work for the purpose of demonstrating the value of a wiki. They were chosen for two main reasons. One is that they both support several basic features that are critical for any enterprise wiki used for collaboration: tight and configurable user/group/page security, rich page email features, page duplication, wysiwyg editing, syndication and rich file attachment capabilities, especially the ability to handle Excel files (view and edit in the browser). The other main reason I chose these two vendors is that they were easy to test, since they are both hosted with free trials. I simply signed up for trial or personal accounts. There was almost nothing to configure. Setting up the wikis for the ERA install process took only a few hours (each).

10.3 Solution A: Socialtext Enterprise Wiki

The Socialtext Enterprise Wiki is a basic wiki with some enterprise functionality added. The overall strategy for the development of this product has been to resist the temptation to add structure to the software. Instead of, for example, having a host of add-on applications to keep a company directory, manage issue tracking or manage sales proposals, it's just a simple wiki that can be used for any of those things. And by a simple wiki, I mean a set of blank web pages that any user can modify (with configurable restrictions).

The value of this approach is that you can start out with a set of blank pages and develop the necessary structure. Structure is created both through links on pages and through categories and workspaces (workspaces are like namespaces) that pages can be assigned to. You could give the wiki to all regional managers as a blank set of pages without any guidelines, to see what kind of structures emerged. Or you could set up some structure beforehand based on expected needs and make revisions along the way. However it is implemented, what’s important is that the structure emerges out of what is really needed by users. Over time, you should end up with a tool that exactly fits your needs.

Figure 10.1 Sample Socialtext wiki page

There are four main elements of this wiki page (Figure 10.1). In the page edit options section there are typical wiki functions like page edit, page history and add comments (page discussion). You can also duplicate a page, send the page via email or see the email address for sending an email to this page. Just below the page title, you can see it's possible to add this page to a category. If the category does not exist yet, you can create it as you add it. You can also attach a file to this page. A file attached to this page may be linked to from anywhere in the wiki.

The most obvious place to add this kind of wiki into the process of ERA installs would be to use it to create planware. The distribution of planware as a static document through email is an inefficient way to communicate. The difficulty of updating information in planware has been discussed. A wiki would provide a better way to share this information among groups in a way that can be easily updated.

There are a couple of really fast and easy ways this could be set up. One would be to simply upload the planware files to a wiki page as attachments. This page could serve as an index of planware files. Or it could be an individual page dedicated to the particular install the planware was for. Either way, the files could be uploaded to a central location for all to view or download. A link to, or the page itself could be easily emailed out to a group. Then the files could be downloaded, changed and re-uploaded with any subsequent changes. Another quick and painless way would be to upload all the information from the planware files as they exist, page by page. Each wiki page would contain the information from a particular file or spreadsheet in the planware. In the same way, the main page or a link to the main page of the planware could be emailed to the groups involved and subsequent changes made on those pages.

In addition, users have the option of ‘watching’ any particular page they want. This allows them to be notified by email or RSS whenever any changes are made to that page. For example, the training manager could be notified if there was a change in whoever was responsible for installing the hardware.

The wiki application could go even further and allow functionality to track project status. The following demonstration will show just one way a wiki could be configured to support the ERA install process so that 1) contact information can be kept current and 2) task items and/or status reports can be added.

Figure 10.2 shows the main index page for all projects happening in one region, in the San Francisco region. A separate workspace has been created called 'Projects: San Francisco' or 'projects-sf' for short, so that pages common to all regions, like 'Current ERA Installs' or 'People Report', can have the same name. On the lower right corner, under the last item on the right navigation bar 'My Workspaces', you can see this user id has access to three regions, 'projects-sf', 'projects-la' and 'projects-stl.' On this same navigation bar, under the section 'My Favorites', the user can list links to other sections of this or other wikis or workspaces or particular files, like 'people report.xls.'

Figure 10.2 Main project index page per region

Of course, this main project page can be created with any information desired, but I have divided it into sections to provide 1) basic information about the San Francisco Region, 2) ERA installs current and completed, 3) other ad hoc projects and 4) administrative information for CTP’s, FE’s, or any other remotely managed field representatives.

The link to Current ERA Installs in this example is actually a link to a category, not a wiki page. Figure 10.3 shows the page for this category. If configured this way, the install page for an individual dealership is put into the category ‘Current ERA Installs’ and when it is completed it is moved to the category ‘Completed ERA Installs.’ Another way to do this without using categories is to create a page titled ‘Current ERA Installs’ that lists links to all current installs. When an install is complete, the link is removed from this page and created on the page titled ‘Completed ERA Installs.’ Figure 10.4 shows the ‘Current ERA Installs’ page created as a web page instead of a category. Distinctions like this in the structure of the information will affect navigation and search capabilities and should be well thought out.
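Socialtext's exact markup aside, the category mechanism can be sketched in MediaWiki terms, where a page joins a category simply by containing a category link (the category names here follow the example above):

```wikitext
<!-- On an individual dealership's install page: adds the page to the category -->
[[Category:Current ERA Installs]]

<!-- On the main index page: a link to the category listing itself
     (the leading colon links to the category instead of joining it) -->
[[:Category:Current ERA Installs]]
```

Moving an install to 'Completed ERA Installs' is then just a matter of changing the category link on that one page.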

Figure 10.3 List of installs, created as a category

Figure 10.4 List of installs, created as a page

Figure 10.5 shows what a page for a particular install looks like. Here I have included a page for each group involved in an install. Removing a group is as simple as removing the link to it; adding one is as simple as adding a link. This page may also contain basic information about the dealership and/or any overall status or notes about the dealership or install. Any file related to the dealership or install may be attached as well.

Figure 10.5 Individual install page

Drilling down further, each individual resource page can contain whatever information is necessary both for that group and for others. For example, the resource page for FE's (Figure 10.6) could contain the names of the regional manager and the FE's assigned to the install. To change this information, you would click on 'Edit Page' and change the user name(s) listed. These names could automatically link to a page with information about that employee. The resource page could also contain any files necessary for that group. For hardware, for example, this could include the hardware layout plan and other documents needed. It could also contain status information provided by the FE on-site. The status information on this page was added by adding comments to the page, by clicking on 'Add Comment.' Once the comment is typed in and saved, it is automatically signed with the user name (which hyperlinks to his or her user page) of whoever is logged into the wiki, along with the date and time of the comment. Comments appear at the bottom of the page in chronological order.

Figure 10.6 Individual resource page
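The comment-signing behavior described above can be sketched as follows (a toy model; the function name and page structure are invented for illustration, not drawn from Socialtext):

```python
from datetime import datetime

def add_comment(page, user, text, now=None):
    """Append a comment signed with the user name and a timestamp,
    mimicking the auto-signing behavior described above."""
    now = now or datetime.now()
    stamp = now.strftime("%Y-%m-%d %H:%M")
    # Comments accumulate at the bottom of the page in chronological order.
    page.setdefault("comments", []).append(f"{text} -- [{user}] {stamp}")
    return page
```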

This format, and this page in particular, meets the requirements established for this project. It allows the names of the people involved in the project to be changed easily, and it allows project status to be updated easily. In addition, all the information about an install is stored in a central location for all to see. This information can be stored in the system indefinitely under completed installs, creating a rich knowledge base of customer history.

There are a few drawbacks to this approach. First, unlike a full-fledged project management system, there is very little task management. Nothing forces the FE to write a status report. There is no priority on items, nor any system to alert the FE when a task is overdue. Moreover, there is no way to alert the project manager that tasks are overdue. It might be possible to configure the wiki, with watched pages or RSS, in a way that provides this sort of functionality. But it is not evident and would require much more effort to design such a system.
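As a rough sketch of how RSS might provide such alerting, the following function scans a feed of recent page changes and flags install pages that have not been updated recently. It assumes the wiki publishes a change feed with RFC-822-style dates, which would need to be verified for a given wiki engine:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta

def stale_install_pages(rss_xml, now, max_age_days=2):
    """Return titles of pages whose last change in the RSS feed is
    older than max_age_days.  A sketch only: the feed format and
    date style are assumptions, not a documented wiki feature."""
    stale = []
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        title = item.findtext("title")
        pub = item.findtext("pubDate")
        updated = datetime.strptime(pub, "%a, %d %b %Y %H:%M:%S")
        if now - updated > timedelta(days=max_age_days):
            stale.append(title)
    return stale
```

A project manager could run something like this on a schedule and email the result, which is the kind of extra design effort the paragraph above anticipates.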

Another drawback is that there is currently no way to template an entire install. To set up a new install, you must duplicate each page and rename the links in the first page. Under this set-up, there are about nine pages to an install. This might take longer than it currently takes to create a set of planware. Installs have more or less the same structure every time. With the Socialtext wiki, creating templates might be possible by using the open source version and making programmatic changes. Some other wiki engines do have template functionality. On the other hand, the flexibility offered by this lack of structure might be better for supporting all the smaller customer projects like EDM or Contact Management installs.
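If programmatic page creation were available (for example, through the open source version), templating an install might look something like the sketch below; the group list and the 'Dealership - Group' naming convention are assumptions based on the structure described above:

```python
# Groups taken from the install steps discussed in this chapter;
# the exact set for a real install would vary.
GROUPS = ["Intellipath Forms", "F&I Forms", "Specifications",
          "Custom Reports", "Hardware Install", "Training",
          "Final Conversion", "Follow-up"]

def new_install_pages(dealership):
    """Return the ~9 pages for a new install: one index page linking
    to one page per group, using a 'Dealership - Group' convention."""
    pages = {dealership: {
        "categories": {"Current ERA Installs"},
        "content": "\n".join(f"[{dealership} - {g}]" for g in GROUPS),
    }}
    for g in GROUPS:
        pages[f"{dealership} - {g}"] = {"categories": set(), "content": ""}
    return pages
```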

Another problem is that page security, naming conventions and the overall structure must all be configured. This is an asset as well as a drawback. These things are all open to configuration, but they all must be configured in order for the system to make sense. Navigation can be a nightmare if naming conventions are not adhered to or if there is very little use of categories and namespaces for adding structure. As for page security, wiki purists would argue that pages should be open to modification by anyone. But this would allow users to accidentally delete critical information, a risk countered by the fact that all pages can be reverted to an earlier point in their history.

10.4 Solution B: Jotspot Wiki Project Manager Application

Jotspot, like Socialtext, has built a wiki engine for enterprise use. But the Jotspot wiki is different in several ways. First, it is a simple wiki that allows users to create pages with a title and content. Each page (see Figure 10.7) has all the typical wiki functions: linking, adding photos and image attachments, emailing to and from pages, adding comments, duplicating pages and adding page security.

Figure 10.7 Sample Jotspot wiki page

Unlike other simple wikis, Jotspot has created a suite of add-on applications to support common business processes. These applications include: project manager, company directory, bug reporter, recruiting manager, call log manager, group calendar, meeting manager and contact manager. These wiki-based applications all run alongside or integrate fully with the freeform wiki. For example, the company directory creates a formatted page of information on each employee, so that links to the user go to this formatted page, as opposed to a blank page as in Socialtext. Jotspot, unlike Socialtext, is less queasy about adding structure to its enterprise wiki. It still offers a freeform tool, but with options for utilizing a host of structured applications. It should be noted, however, that these add-on applications are designed the wiki way – minimalist, flexible and very easy to set up and use.

The advantage of this approach is that fewer structural elements need to be set up. Some processes, like a company yellow pages, are fairly standard. And many smaller organizations, as well as smaller divisions or smaller processes within larger organizations, don't need much more than a simple tool. More importantly, they need something quick and easy to set up and use.

The Jotspot wiki is an interesting tool for the ERA install process for exactly those reasons. A full-fledged project management application is difficult to set up and difficult to use. At one time, Microsoft Project was considered for this process but proved too complicated to set up and use. The Jotspot project manager application is the antithesis of Microsoft Project: it is the simplest thing that could possibly manage a project.

The Jotspot project manager application has very minimal, basic functionality. You can create a list of tasks (to-dos) and mark them complete. You can invite various people to the project and assign them to tasks. You can attach files and comments to the project. In addition, you can create client progress reports and quickly view any recent changes to the project.

Projects are arranged by who has access to them, instead of by region. Whatever projects I have created or been invited to work on are what I will see when I log into the system and go to the project management application (Figure 10.8). In this example, the FE John Smith has only been invited to one project, Lexus of Reno. Upon login, each person will only see projects that concern him or her.

Figure 10.8 Project manager application

Once inside the project itself, you can see all the information relevant to the project: who is involved, the tasks and their status, any files attached to the project and any comments attached to the project or to individual tasks. You can either view “My To-Do’s”, which shows the tasks assigned to whoever is logged in, or all to-dos, which shows all the tasks on the project. You can also filter to-dos by viewing those due soon or those done. The system is designed so that priority items and items due soon appear prominently.

Figure 10.9 Project dashboard view

Following the same example as in the previous section, the planware for the install could be uploaded as files to the ‘Files’ section. Any changes to the planware would require the documents to be downloaded, changed and re-uploaded. Each individual assigned to the install would be invited, either by entering their email address or by selecting them from a list of system users.

Tasks would be created for each step of the install process. There could be as few or as many tasks as needed. For example, they could track just the completion of each group's task: intellipath forms, F&I forms, first pass conversion, specifications, custom reports, hardware install, training, final conversion and follow-up. Or the tasks could include further detail, like verification that hardware shipped or separate tasks for each department's specifications set-up. Each task has a person responsible, a priority and an optional due date (Figure 10.10). There is also space to write a description of the task, but for this process it would be better to write the entire description of the task in the title and use the description field for status information. There is also an option to receive an email when the task is completed. Anyone can add a task to a project for anyone else to complete. Changing who is assigned to a task is also very simple: from a particular ‘To-do’ screen, you can click on the chevron next to “Assigned to …” (see Figure 13). If the person has not been invited to the project, you can do it then, in the same screen.

Figure 10.10 Create task
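The task behavior described here can be modeled minimally as follows (an illustration of the described behavior, not Jotspot's actual data model; the field and function names are invented):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Task:
    """Minimal model of a Jotspot-style to-do."""
    title: str
    assigned_to: str
    priority: int = 3            # 1 = highest
    due: Optional[date] = None   # the due date is optional
    done: bool = False
    description: str = ""        # used here for free-form status notes

def dashboard_order(tasks):
    """Pending tasks first, then higher priority and earlier due dates,
    matching the prominence rule described in the text."""
    return sorted(tasks, key=lambda t: (t.done, t.priority, t.due or date.max))
```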

Any time leading up to the completion of the task, the FE can use the description field to write notes about the status of the task (Figure 10.11). Each time this field is changed, it is noted under ‘Recent Activity’, along with who made the change and when. This is similar to the signing of comments in the Socialtext wiki. In this way, status information for a specific task can be continually updated, right up until the moment the task is finally completed.

Figure 10.11 Add task status

Completion of a task is very easy. It can be done either from the project dashboard or inside the individual task. The user clicks on the green circle next to the task name, after which a check mark appears (Figure 10.12).

Figure 10.12 Mark task as done

Once again, this system meets the minimum requirement of 1) allowing the names of people involved in the project to be changed easily, and 2) allowing for project status to be updated easily. In addition, this system is better designed to manage tasks. Tasks with higher priority that are due sooner will appear first. The project can be tracked better with less risk of tasks falling through the cracks. The view of projects is clear; it’s easy to see what tasks are pending and which are complete. The view of the project is tailored to who is logged in, so that I can see tasks assigned to me in order of priority and due date. Lastly, the planware can be better managed because it is attached to the project. Changes can be made to the documents by anyone throughout the life of the project so that they are constantly kept current.

This solution suffers from the same problem as the previous one in that it is not possible (yet) to create a project template. Creating a new project would involve uploading the planware and creating/assigning each task.

The most serious flaw in this application is that it does not integrate well with Jotspot's freeform wiki. If, for example, I created a home page for myself in the wiki and wanted to create a link to each of my assigned projects on that page, I would have to know how to create that link. It is possible, but not evident. The projects do exist as wiki pages that can be linked to, but their names are not evident. In this example, Lexus of Reno was named ProjectManagerProject3 (because it was the third project I created). Also, if I see that John Smith is the FE assigned to Lexus of Reno and want to contact him, there is no quick and easy way to get to his contact information (i.e. his name is not hyperlinked to it). I would most likely have to go from the project manager application to the company directory and look him up.

The project manager application functions primarily as a tool to ensure that tasks get completed and that people have the information needed to complete a project. Once the project is complete, it is deleted and the information is gone. Any derivative information that might be gained by looking at past installs, or by looking at install information in a certain way, is lost. Suppose, for example, I am a trainer who wants to find another trainer who is assigned to, or has just completed, a Lexus install in my region or another. Since I only see projects that are assigned to me, this information would be difficult to obtain.

This last point highlights the real value of the freeform wiki. The flexibility of a freeform wiki allows many more possibilities for gathering information than the canned transactions preconceived by a structured application. The ability to gather information about the information you have (i.e. knowledge) is much greater. Information in a wiki is stored in a sort of extremely basic database. It can always be searched by a simple text-based search (on title or page contents). But if the information is organized by categories, workspaces and certain naming conventions, the ways of searching are even richer. With a structured application like a project management tool, information is hidden in tasks, descriptions and comment fields. In a wiki, everything is searchable. If a need for viewing information in a certain way is identified later, the way the information is formatted in the wiki can be changed to accommodate that need. Say, for example, that down the road the trainers often find themselves wanting to know what other trainers are working on installs of the same car make. The install data can be reorganized so that this kind of view is easily available. With a freeform tool, the structure can emerge to support both the needs of the process and the knowledge that can be gained from the data that is stored.
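As an illustration of such an emergent query, the following sketch finds the trainers on installs of a given car make, assuming install pages are tagged with a make category and contain a 'Trainer:' line (conventions invented for this example):

```python
def trainers_on_make(pages, make):
    """Return {install page title: trainer name} for current installs
    of the given car make.  Relies entirely on the naming and category
    conventions assumed above; change the conventions and this query
    can be rewritten without touching the underlying data."""
    matches = {}
    for title, page in pages.items():
        if make in page["categories"] and "Current ERA Installs" in page["categories"]:
            for line in page["content"].splitlines():
                if line.startswith("Trainer:"):
                    matches[title] = line.split(":", 1)[1].strip()
    return matches
```

A structured project manager would need this view designed in advance; in the wiki it is just one more way of reading pages that already exist.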


This analysis did not find a specific wiki use that could contribute immediately to the two problem areas defined in section 5.0: lack of development of deep expertise and problem resolution redundancies among support groups. The idealized vision of a knowledge base to encourage the development and sharing of deep expertise is probably too large a task, and one that goes against the culture of the company. Using a freeform wiki to support ERA installs might go part of the way toward reducing problem resolution redundancies, as it keeps the various support groups better informed about their own and others' projects. But it is not certain that this would directly reduce problem resolution redundancies.

On the other hand, this analysis has found two processes at Reynolds and Reynolds that would benefit from a wiki, in terms of improving work processes and overall knowledge sharing. The needs of the organization, as defined in sections 8.2 and 9.2, would be met by the wiki engines tested. Whether these solutions would be useful enough to justify the effort of setting up and maintaining them is another question. On the one hand, compared to other IT tools, wikis are extremely easy to set up and maintain. On the other hand, they are not without some effort. This would need to be assessed in more detail.

This analysis has shown that use of a wiki to support these processes would increase knowledge sharing among support groups. In the first example, knowledge of the installation process is shared during the pilot stage between the field representative and the documentation group. This example also shows that it is relatively easy to create Reynolds and Reynolds' product documentation in a wiki. While it would be an overwhelmingly difficult task to move all documentation to wiki format, particular documents here and there could be moved to wiki format one by one, as seen fit. If it were eventually determined that a wiki format would be a better way to present documentation, this is one grass-roots way to do it.

In the second example, information about ERA installs including status information and all related documents (planware, etc.) is shared among all groups involved in an install. This information, in the Socialtext example, can be used to provide additional knowledge, like knowledge of past installs or who has worked where. It is easy to see how this wiki could grow to support other knowledge needs related to these processes, as they are discovered. Or, it could grow to support more knowledge sharing among the field service representatives themselves. Individuals could use the space to post information relevant to others like training documents or customer notes.

In both of these examples, the problems encountered were in implementation. In the first example, it was difficult to integrate the wiki pages into the existing documentation. Product features like output as html/pdf and wysiwyg editing of tables were important. In the second example, templating projects was important. These features can be added to a wiki and are already part of at least one other wiki engine. So it is easy to imagine that soon a wiki tool will exist with all the appropriate features for each of these particular implementations. The perfect wiki for both of these implementations would have the following features:

user/page security configuration options, rich page email features (email to and from a page), modifiable skins, wysiwyg, wiki and html editing, the ability to create (wysiwyg) tables and tables of contents, file attachment, emailed change notifications, page duplication, templating of pages and sets of pages, scripting, content includes, syndication, watched pages, section editing, page discussion (and/or page comments), online editing (especially of Excel files) and a variety of output options (html/pdf/word/printer-friendly, etc.).

Lastly, a wiki is a freeform tool, the uses of which are discovered. Each individual and each group in an organization has particular collaboration needs, which a wiki might exactly meet. From a global point of view, it is difficult to say where those needs are. Wikis tend to enter organizations in one place and grow from there like a virus. In the case of Reynolds and Reynolds, I have identified a few needs. If a wiki were introduced to fill just a few needs, it would undoubtedly grow to other parts of the organization and improve efficiency elsewhere. Ideally, it could grow into something like the knowledge base for field representatives originally conceived.


Sections 1 and 2 showed that there is not much agreement on an academic definition of knowledge management, but that there are some main and lasting ideas. This disagreement is due to the fact that knowledge management is a diverse field that grew out of a variety of disciplines. Practitioners in the field have a broad view of what knowledge management is and pay little attention to the academic discrepancies. Moreover, it is hard to deny an overall need for knowledge management, as the new economy is driven by knowledge. The main idea driving knowledge management is that knowledge must be managed like an asset. This involves creating, codifying and sharing knowledge.

There isn’t much agreement on what a knowledge management tool is either. This is partly because the term has been overused and partly because a lot of different tools that were not built to be knowledge management tools per se are used towards the overall goal of knowledge management. What we have instead is a large group of tools like content management systems, decision support systems and collaboration tools. The trend for IT tools, especially those that contribute to knowledge management, is towards more communication and collaboration tools.

The new tools being dubbed ‘Web 2.0’ are unique because they are web-based applications that are freeform and thrive on user participation. They are tools that support individual knowledge work, communication and collaboration. Wikis are a prime example of such tools. As a tool for knowledge management, they most closely resemble a cross between a content management system and groupware. Corporate uses reflect this, as wikis have been used in ways that run the gamut from pure content management, to a mix of content management and groupware, to pure groupware.

Sections 6 through 11 used an example of a company with distributed field operations to show how wikis could be used to increase knowledge sharing among distributed support groups. Three demonstrations of proposed wiki implementations showed that use of a wiki for each process would increase knowledge sharing among the support groups involved.
