Comparing IMS and KMS
True to observations made by Galandere-Zile and Vinogradova (2004), information management systems (IMS) and knowledge management systems (KMS) are often used for the same purpose. Their common usage notwithstanding, one cannot ignore the fact that IMS and KMS are different. At the most basic level, an IMS is used for the storage and management of information. A KMS is more involved because it includes the creation of knowledge (often obtained by combining information with experience and context), storing it, retrieving it, using it, and auditing the efficacy of its use. Additionally, a KMS has functions that are more complex than the corresponding functions in an IMS. These include communication, document management, access, visualization, and basic search functions.
In some organizations, an IMS is used for the management of incompatible explicit knowledge (Galandere-Zile & Vinogradova 2004). A KMS, on the other hand, is used for managing knowledge processes. The latter is also used to integrate networks, computers, and databases to support KM. A KMS is also more suitable where an organization-wide communication and information infrastructure exists. Notably, technologies such as computer systems facilitate KMS; in other words, they act as enablers of KMS. An IMS is often used for the management of information in different departments of the same organization.
Another major difference between IMS and KMS is that while the former is meant to provide better information for use in the organization, the latter is meant to provide intelligence, which forms the basis of better decision-making in an organization. Since knowledge is generated from information based on context and experience, it is arguable that knowledge management systems are often supported by information management systems.
Thoughts on KMS Components Using Agricultural DSS
Before delving deeper into this discussion, it is important to identify what KMS components are and what an agricultural DSS is, and thereafter to establish their connection.
KMS components are the mechanisms that ensure that a KMS is effective. They include the aspects that ensure that the KMS is designed in a manner that fits a company’s structure. KMS components also include appropriate software on the system; the right knowledge among employees; the retrieval and use of knowledge; and the evaluation of the KMS to ensure its effectiveness.
On its part, an agricultural DSS is a computer-based system that supports agriculture-related managerial and technological decision-making (Matthews et al. 2008). An agricultural DSS would be expected to serve different planning levels in an organization, including people in management and operations. Knowledge management systems are often part of a DSS since a properly designed agricultural DSS is based on interactive software that can take and analyze raw data, existing knowledge, and/or business models to produce information to be used in decision-making.
Like other DSS, an agricultural DSS should have three components, namely the knowledge base, the decision model, and the user interface. The KMS components should therefore be expected to be most applicable in the knowledge base component of the agricultural DSS. Since users are also an important factor in the DSS architecture, KMS components in an agricultural DSS context would need to consider how best to create and use knowledge among many farmers. Considering that farmers may have different crops in the field, different animals on their farms, and different farming preferences, the software used in the KMS should be flexible enough to accommodate these differences.
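To make the three-component structure concrete, the division of responsibilities can be sketched in code. The class names, crop names, and moisture thresholds below are illustrative assumptions for this sketch only, not details drawn from Matthews et al. (2008):

```python
# Illustrative sketch of the three generic DSS components: a knowledge base,
# a decision model, and a user interface. All data and rules are hypothetical.

class KnowledgeBase:
    """Stores domain knowledge, e.g. per-crop irrigation thresholds."""
    def __init__(self):
        # crop -> soil-moisture level (%) below which irrigation is advised
        self._thresholds = {"maize": 30, "wheat": 25}

    def threshold_for(self, crop):
        return self._thresholds.get(crop)


class DecisionModel:
    """Combines raw field data with stored knowledge to reach a decision."""
    def __init__(self, kb):
        self.kb = kb

    def recommend(self, crop, soil_moisture):
        threshold = self.kb.threshold_for(crop)
        if threshold is None:
            return "unknown crop: no recommendation"
        return "irrigate" if soil_moisture < threshold else "no action needed"


class UserInterface:
    """Presents the decision to the farmer in plain language."""
    def __init__(self, model):
        self.model = model

    def advise(self, crop, soil_moisture):
        decision = self.model.recommend(crop, soil_moisture)
        return f"{crop} at {soil_moisture}% soil moisture: {decision}"


ui = UserInterface(DecisionModel(KnowledgeBase()))
print(ui.advise("maize", 22))  # maize at 22% soil moisture: irrigate
```

The point of the separation is the one made above: the KMS concerns sit almost entirely inside the knowledge base component, so supporting a new crop or farming preference means extending the stored knowledge rather than rewriting the decision logic or the interface.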
Knowledge Analysis
Knowledge analysis is the process by which an organization seeks to understand knowledge. The process involves an exploration of situations relating to how “knowledge is created, transferred and utilised” (Goldkuhl & Braf 2001, p.4). In other words, knowledge analysis involves investigating knowledge origins, knowledge deployment methods, and knowledge utilization approaches.
Arguably, the quality of knowledge is only as good as the information from which it originated. How knowledge is deployed also affects how its recipients decode and interpret it, as well as their understanding, appreciation, and enthusiasm to work with the available knowledge, and hence its utilization. It would thus be expected that effective knowledge analysis allows an organization to determine whether the origins of its knowledge are of credible quality and whether the right channels of knowledge deployment have been used. The analysis also establishes whether the available knowledge has been utilized as it should have been.
In some cases, knowledge analysis can happen in context – a situation referred to as “contextual knowledge analysis” (Goldkuhl & Braf 2001, p.4). In such a situation, the creation, transfer, and utilization aspects of knowledge are scrutinized based on specific organizational contexts. During analysis, a distinction is drawn between general and specific knowledge, with the former being the kind of knowledge that almost all people in an organization are expected to have. The latter form of knowledge, on the other hand, is tacit, meaning that it takes practice and experience to acquire it. When analyzing tacit knowledge, an organization needs to consider whether the people therein were deliberately targeted to be equipped with it. Knowledge analysis also allows for the categorization of knowledge that is acquired for different reasons and purposes.
Selection of KMD Approaches
According to Rinkus, Johnson-Throop, and Zhang (2003), KM design approaches typically follow standardized design and engineering principles. Such principles are preferable since they assure organizations that their KMS will fit into pre-existing, tried, and tested methods of managing knowledge. Notably, however, the principles do not take into account the unique organizational, social, and cognitive issues present in a diverse work environment. Rinkus et al. (2003) observe that this failure by designers to account for the diversity of work environments can be remedied through a more human-centered computing system. Most KM designs are computer-based; as such, enhancing human-computer interaction would be an ideal approach to identifying a design that will most likely cater to all the knowledge management needs present in an organization.
Another viable approach is ensuring that the basic structure of the KMD is user-friendly and efficient. If meant for use in a group context, the KMD should be sufficiently equipped to handle group capabilities in the capture, distribution, and use of knowledge.
It would also be important for the people responsible for the development of KM designs to recognize and appreciate the need for the KMS to have collaborative and asynchronous workspaces, especially if the KMS is meant for use in a group environment.
Overall, Rinkus et al.’s (2003) sentiments that the design of KM should be based on an in-depth understanding of prevailing technical knowledge in the organization are convincing. Furthermore, the designers should consider (and include) the cultural, organizational, cognitive, and social aspects of individual or group KMS users. Only then can they develop a KMD that will be effective for use in a given organizational context.
The Importance of the Knowledge Cycle
Knowledge is not constant; it undergoes a cycle that starts with its creation. According to Lenci (2010, p. 289), knowledge “changes through time; it reproduces itself, generating new knowledge”. Knowledge also dies, as evidenced by the fact that some of what we knew in the past is lost from our memories, never to be recalled. Technically, there are times when knowledge becomes obsolete, thus marking its death.
People create or acquire knowledge for a specific purpose (Lenci 2010). In the course of using the acquired knowledge, a person may develop knowledge about processes and entities, and this brings about innovations and developments. The generation of new knowledge sometimes occurs within the knowledge cycle. Once the new knowledge takes over, the old knowledge dies off or is no longer considered necessary. When laptops were first developed, for example, they were based on the knowledge that had earlier enabled the making of desktop computers. As the preference for portable personal computers increases, PC manufacturers who still possess the knowledge of making desktop PCs may find themselves concentrating more on the production of laptops and other portable devices. While current manufacturers still possess the knowledge to manufacture desktop PCs, future generations may not master the same knowledge and expertise. In other words, the knowledge to make desktop PCs generated new knowledge that enabled the production of laptops, and the knowledge about making laptops has generated new knowledge needed for the development of smaller gadgets like tablets and smartphones. The knowledge cycle can hence be credited with the innovations and developments that occur within organizations and society at large.
Lambe on Knowledge Audits
In his first presentation, Lambe (2009) discusses knowledge audits and indicates that organizations conduct such audits for different reasons. These include: identifying knowledge assets and where to find them; identifying existing knowledge gaps in the organization; using the identified knowledge gaps as evidence justifying the development of corporate taxonomies; identifying priority documents for migration into a portal; and using the findings of an audit to set knowledge management priorities and KM strategies.
As a knowledge consultant, Lambe (2009) seems to have correctly identified the main reasons why a knowledge audit is necessary. From his presentation, one gets the impression that a knowledge audit is critical for any organization that wishes to establish its knowledge status. It also appears that through a knowledge audit, an organization can better understand several aspects of its assets. For example, the organization can know the users of its knowledge, the services or products they generate, and the factors that give it a competitive position in the marketplace. An organization is also able to understand its human-centered knowledge, especially the management, leadership, creativity, and expertise skills contained in its human resources.
Lambe’s presentation further reveals that a knowledge audit enables the organization to better understand its culture, processes, leadership, standards, and management values. In other words, the audit is a good source of infrastructure knowledge about the organization. The organization is also able to understand its intellectual property better, since the auditors also assess the organization’s product design, brand name, logo, and in some cases, trade secrets. The effectiveness of a knowledge audit would, however, depend on how the data was collected and analyzed.
Wilson on Tacit and Implicit Knowledge
Wilson views tacit knowledge as a hidden form of knowledge which, according to him, is in some cases not even known to its holder. He thus argues that tacit knowledge cannot be captured and can only be learned by observing people who demonstrate such knowledge. He blames specific authors (i.e. Nonaka 1991; Nonaka & Takeuchi 1995, cited by Wilson 2002), arguing that they either misunderstood the tacit knowledge concept or deliberately distorted it, hence creating the notion that tacit knowledge can be captured.
Any knowledge that can be expressed – e.g. through beliefs, viewpoints, paradigms, and/or schemata – is, according to Wilson (2002), implicit knowledge and not tacit knowledge as suggested by Nonaka and Takeuchi (1995, cited by Wilson 2002). Wilson observes that implicit knowledge can be expressed although, normally, it is not. Specifically, Wilson (2002) argues that implicit knowledge is the know-how that people take for granted, just because they know that they know. As such, he argues that it should not be confused with tacit knowledge, which is the knowledge that most people do not even know they possess.
Wilson’s arguments seem to make a lot of sense, especially since he has used examples to detail the confusion that Nonaka and Takeuchi (1995 cited by Wilson 2002) introduced in the knowledge literature by not using the appropriate term for the type of knowledge they were describing.
If indeed Wilson’s (2002) observations are true (and I am convinced they are), it would appear that the confusion between tacit and implicit knowledge is far-reaching. For example, a brief look at the literature reveals that the terms tacit and implicit knowledge are often used synonymously.
References
Galandere-Zile, I & Vinogradova, V 2004, ‘Where is the border between an information system and a knowledge management system?’, Managing Global Transitions, vol. 3, no. 2, pp. 179-196.
Goldkuhl, G & Braf, E 2001, ‘Contextual knowledge analysis – understanding knowledge and its relations to action and communication’, The 2nd European Conference on Knowledge Management, IEDC-Bled School of Management, Slovenia, November 8-9, pp. 1-11.
Lambe, P 2009, ‘Conducting a knowledge audit’, Green Chameleon and Straits Knowledge, Web.
Lenci, A 2010, ‘The lifecycle of knowledge’, in C Huang, N Calzolari & A Gangemi (eds), Ontology and the lexicon: a natural language processing perspective, Cambridge University Press, Cambridge, pp. 289-305.
Matthews, K, Schwarz, G, Buchan, M, Rivington, M & Miller, D 2008, ‘Wither agricultural DSS?’, Computers and Electronics in Agriculture, vol. 61, no. 2, pp. 149-159.
Rinkus, S, Johnson-Throop, KA & Zhang, J 2003, ‘Designing a knowledge management system for distributed activities: a human-centred approach’, AMIA Annual Symposium Proceedings, pp. 559-562.
Wilson, TD 2002, ‘The nonsense of “knowledge management”’, Information Research, vol. 8, no. 1, Web.