Summary of Chapter 3: Rigidity and Learning: Two Patterns of Communicative Behavior
In this chapter Wiener's point is to delineate two patterns of communicative behavior: one, exemplified by ants, is rigid (instinctual or pre-programmed); the other, exemplified by humans, is adaptive and capable of learning. He demonstrates, with biology and actual bodily structures, how and why the ant cannot develop the kind of memory and process of learning that a human can. Part of the point is to show how human learning [and culture, though this is not raised] is made possible by our biology. There is a reversal going on in the language of this chapter: whereas in the previous chapter machines were described as being like living animals, now living animals are described as being like machines. Wiener stresses the importance of feedback and insists that communication must be a two-way phenomenon.
He briefly ventures into social organization with a simplistic typology: from the Eskimo, who are leaderless and apparently seen as living in a state of nature (though this is cooperative, not Hobbesian); through the Indian caste system and “oriental despots” (the extreme example at the other end); to the US as a “moderately loose” structure somewhere in the middle. The US fails to achieve its potential because some people are psychologically attracted to fascism and “white supremacy.” In Wiener's account these seem to be individual psychological flaws, or errors in ways of thinking, rather than actual features of how US society works. In any case, these “worshippers of efficiency” want a society modeled on that of the ant, and fail to see that humans are distinct from ants. Wiener details the differences between humans and ants, interspersing cybernetic pronouncements and lessons, such as “Cybernetics takes the view that the structure of the machine or of the organism is an index of the performance that may be expected from it” (57). He talks about the wastefulness of telephone exchanges, which take the same amount of time and technology to connect you with anyone on the network, instead of remembering whom you call most frequently and connecting you more rapidly with them (60-1) [and from such thinking has been born so much that is crappy and manipulative about current internet design].
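The exchange Wiener imagines amounts to a frequency-weighted lookup: track how often each destination is called and privilege the frequent ones. A minimal illustrative sketch in Python (the class name and interface here are my own invention, not Wiener's or any real switching system's):

```python
from collections import Counter

class FrequencyAwareExchange:
    """Hypothetical 'exchange' that remembers call frequencies,
    so the most frequently called subscribers can be surfaced
    (and, in Wiener's imagined design, connected) faster."""

    def __init__(self, subscribers):
        # Full directory: every subscriber remains reachable.
        self.directory = set(subscribers)
        self.call_counts = Counter()

    def connect(self, destination):
        if destination not in self.directory:
            raise KeyError(f"unknown subscriber: {destination}")
        self.call_counts[destination] += 1  # learn from each call
        return destination

    def speed_dial(self, n=3):
        # The n most frequently called subscribers, best first.
        return [name for name, _ in self.call_counts.most_common(n)]

exchange = FrequencyAwareExchange(["alice", "bob", "carol", "dave"])
for name in ["alice", "alice", "bob", "alice", "carol"]:
    exchange.connect(name)
print(exchange.speed_dial(2))  # → ['alice', 'bob']
```

The design choice is exactly the one the bracketed aside complains about: the system adapts to observed behavior, which is efficient but is also the seed of the profiling that shapes much of current internet design.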
He talks about the difference between analogue machines, which operate by analogy, and digital machines, which work on a “yes-no scale”: the analogue machine is limited by its use of analogy, while the more abstract and numerical operation of the digital frees it for more uses. However, pace Wiener, this limitation seems to have more to do with the process of translating from the analogue into the digital. As an example of an analogue device he gives a slide rule, which is ultimately limited in precision because the numbers on it need to be large enough to read. But this seems incorrect: the numbers on the slide rule are themselves digital, and so are in fact a translation from the analogic relationship into the digital language of numbers. So what is happening here is not a demonstration of the limitation of analogue measurement per se (given, for instance, a more precise technology for reading it than the human eye), but rather a demonstration that translating from analogue into digital is inherently lossy.
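The lossiness claimed above can be made concrete: any analogue-to-digital translation rounds a continuous value to the nearest representable step, and the rounding error, though bounded by half a step, can never be recovered from the digital reading alone. A minimal sketch (the step sizes are arbitrary, chosen only for illustration):

```python
def quantize(value, step):
    """Round a continuous ('analogue') value to the nearest
    multiple of `step` -- its digital representation."""
    return round(value / step) * step

analogue = 3.14159  # a continuous quantity to be read off
for step in (0.1, 0.01, 0.001):
    digital = quantize(analogue, step)
    error = abs(analogue - digital)
    # Finer steps shrink the error but never eliminate it.
    print(f"step={step}: digital={digital}, error={error:.5f}")
```

A finer scale (smaller `step`) is the analogue of engraving smaller numbers on the slide rule: the loss shrinks, but some loss is built into the translation itself.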
Wiener provides a brief scheme of history, particularly the break from the old Aristotelian view, in which the goal of science was to determine categories into which to put things, to the modern view, in which science conducts experiments and in fact breaks down old categories or invents new ones. Newton is the big figure of the change here. Wiener puts forward, then walks back, suggestions that the human brain could be seen as digital, or that emotions are similar to responses in machines (and thus may actually serve a purpose). One goal of what he is working towards: "I wish to give a method of constructing learning machines, a method which will not only enable me to build certain special machines of this type, but will give me a general engineering technique for constructing a very large class of such machines" (66).