I think the lack of a standardized definition of information may somewhat impede the progress of information research, but not drastically so. The problem of exact word meaning is a problem in every field, and probably even more so in the social sciences, which tend to involve more interdisciplinary research. It frequently occurs that the same word is used in different fields to designate closely related, but not exactly equivalent, concepts.
The term information has different implications in different fields. In physics, it is the mathematical negative of entropy--sometimes referred to as negentropy. In descriptive terms, information is the structure of a system, in contrast to the entropic disorder.
From the mathematical/statistical view, information is the reduction of uncertainty through successive selections. This take on information, originally put forth by Shannon and Weaver, is most helpful in engineering and data communication.
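As a toy sketch of this idea (my own illustration, not drawn from either article), Shannon's measure quantifies the uncertainty of a selection in bits. Picking one of eight equally likely messages takes three binary selections, while a skewed, more predictable source carries less information per message:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average uncertainty of a selection."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Eight equally likely messages: selecting one resolves 3 bits of uncertainty,
# i.e., three successive yes/no selections.
uniform = [1/8] * 8
print(entropy(uniform))  # 3.0

# A skewed source is more predictable, so each message carries less information.
skewed = [0.5, 0.25, 0.125, 0.125]
print(entropy(skewed))  # 1.75
```

The "reduction of uncertainty through successive selections" shows up directly here: each ideal yes/no question halves the remaining possibilities, removing one bit.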
Yet these two views of information differ from the common, everyday, prototypical notion of "a lot of text." We may also consider verbal directions, gossip, graphs, charts, or even images to be information. We often conceptualize information as a fluid--a sea of information, information flow, and drowning in information. Information is the synthesis of related data to form some idea, and that idea contributes to an individual's knowledge.
These three views--physical, mathematical, and everyday--have enough similarity to share the same term information. The decrease in disorder, decrease in uncertainty, and increase in knowledge have many parallels. But these theories were devised to investigate separate problems in different contexts, and trying to smash them together into a single theory can lead to confusion.
Hence the need for authors to differentiate which connotation they intend, especially when their works might be read by an audience outside their particular specialization. The two articles we read for this week are a step in this direction: they review the definitions of information offered by a variety of authors and theories. Being aware of these differences will aid future authors in describing more precisely what they mean by information.
In my own CIS work, I'm most interested in information from an information architecture view, which corresponds most closely to the everyday sense of information. However, if I write on this in depth, I may find myself adding yet another subtle definition of what we mean by information.
Meadow & Yuan use Cherry's work to explore the levels of information in information theory. Cherry states that there are semantic and syntactic levels (which together correspond to Weaver's semantic level), and a pragmatic level (which corresponds to Weaver's effective level). They also quote Cherry on how context can affect the meaning of a word.
Skyttner uses Cherry to describe how the pragmatic elements of a message depend on the "earlier experiences by the sender and receiver; present circumstances; [and] individual qualities." He also quotes Cherry to compare "uncertainties inherent in the use of natural language" to noise, thus highlighting the similarities between the physical and mathematical conceptions of information.
Both articles seemed to quote Cherry simply as an authoritative description of particular theories or phenomena, rather than to use him as any sort of essential frame for their entire article.
Meadow & Yuan mention information systems, but they refer only to information infrastructures--that is, information retrieval and transmission systems.
Skyttner uses the word system in the most general sense, which is appropriate considering his article appears in a cybernetics and systems journal. Here, a system can be a living system, such as a person; a social system, such as a government; or a physical system, such as a cup of hot coffee on a desk. Skyttner also mentions communication or information systems of signs or encodings--such as those used for binary or digital transmission.
Both of the major information theories (the physical and the mathematical) reviewed in these two articles discuss information transfer within a system. In the physical view, the entropy of a system tends to increase over time; accordingly, the information of that system tends to decrease. In the mathematical view, there exists a system of a transmitter sending a signal to a receiver. This system may need to adjust whenever noise is introduced.
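One such adjustment can be sketched in a few lines (my own toy example, not a method from either article): a channel flips each bit with some probability, and the sender compensates by repeating each bit and letting the receiver take a majority vote.

```python
import random

def transmit(bits, flip_prob, rng):
    """Send bits through a noisy channel that flips each bit with flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def send_with_repetition(bits, flip_prob, rng, n=3):
    """One way the system 'adjusts' to noise: repeat each bit n times
    and have the receiver take a majority vote over each group."""
    repeated = [b for b in bits for _ in range(n)]
    received = transmit(repeated, flip_prob, rng)
    return [int(sum(received[i*n:(i+1)*n]) > n // 2) for i in range(len(bits))]

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
# A raw transmission may corrupt the message; repetition usually recovers it,
# at the cost of sending n times as many bits.
```

The repetition code is the simplest possible adjustment; it trades channel capacity for reliability, which is exactly the kind of tradeoff the mathematical theory was built to analyze.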
The strength of designating a system boundary around a topic of research is that it compartmentalizes your area of interest from everything else. If we consider only the coffee cup as our system, we can ignore the effects of the sun's cosmic rays or what song is playing on the radio right now. We can pretend the coffee cup is a closed system. (Unless we find out that cosmic rays are significantly affecting our coffee, in which case we either have an open system or we need to include the sun within the bounds of our system.) Having such a boundary lets us focus our research on only those elements that are important to us.
Describing the focus of our research as a system also allows us to abstract the interactions taking place. We can make statements about this particular coffee cup system that should apply to all similar coffee cup systems. Or we can make statements that should apply to all physical systems, such as how they tend to increase in entropy.
The main weakness of a systems approach is determining what is or is not part of the system. In human communication, there can be a huge range of minor factors that are ignored. For instance, though we may be thinking of a person only as a receiver in an information system, that person may be relying on their past experience and knowledge to decode the messages sent to them. The fact that they tended to skip 10th-grade English class and the fact that they broke up with their significant other this morning--both past events considered to be outside the system--could be affecting their performance. Yet if we include everything in our system--so that all of existence is contained within a single system--the benefits of using a system model to direct our research are lost.
Skyttner, Lars. "Information theory - a psychological study in old and new concepts." Kybernetes: The International Journal of Systems & Cybernetics (1998) 27:3. p. 284-311.
Meadow, Charles T. & Weijing Yuan. "Measuring the impact of information: Defining the concepts." Information Processing & Management. (Nov 1997) 33:6. p. 697-714.
CIS: Week 9
Last Edited: 22 Oct 2004
©2004 by Z. Tomaszewski.