Still working... another excerpt about the computational metaphor

For decades, the bidirectionality of the Computational Metaphor has worried prominent computer scientists, who have argued that treating machines like people can lead to treating people like machines. Joseph Weizenbaum, the developer of the chatbot Eliza, was perhaps the most prominent critic of computer anthropomorphization for this reason (Weizenbaum, 1976). And Edsger W. Dijkstra, who coined the phrase “structured programming,” also voiced his concerns about reversing the Computational Metaphor:

“A more serious byproduct of the tendency to talk about machines in anthropomorphic terms is the companion phenomenon of talking about people in mechanistic terminology. The critical reading of articles about computer-assisted learning... leaves you no option: in the eyes of their authors, the educational process is simply reduced to a caricature, something like the building up of conditional reflexes. For those educationists, Pavlov’s dog adequately captures the essence of Mankind —while I can assure you, from intimate observations, that it only captures a minute fraction of what is involved in being a dog,” (Dijkstra, 1985).

The bidirectionality of the Computational Metaphor is also apparent in our everyday conversations. Consider the following figures of speech, which entail THE BRAIN IS A COMPUTER: “I can’t process all that information”; “Let me crunch the numbers”; “You can ping me later”; “He doesn’t have the bandwidth for this”. The reverse, THE COMPUTER IS A BRAIN, is encountered just as frequently: “My computer is sleeping”; “The upgraded model has tons of memory”; “The camera sees my face”; “My laptop won’t talk to the projector”. These examples are so unremarkable that it is easy to feel they are not metaphorical at all. In fact, the blurring of metaphor and literality has been yet another concern within the computer sciences; critics argue that taking the Computational Metaphor literally not only limits scientific creativity, but also constrains how we conceive of human attributes:

“Metaphors can be most dangerous when one forgets they are metaphors; one can become beguiled by familiarity rather than by corroborating evidence into accepting a metaphor as literal ... leading us to assume that the attributes normally possessed by either referent are possessed in the same way by the other. If humans and computers both possess ‘beliefs’, then a person may be led by the metaphorical usage to assume that the properties of human ‘belief’ should be limited to dispositions to act, since they are so limited in computers,” (MacCormac, 1984).

Yet the literal reading of the Computational Metaphor has become quite popular of late (Marcus, 2015), and the mathematical reasoning behind it is convincing (Richards, 2018). The general argument is not that brains are equal to laptops or smartphones, but rather that brains, laptops, and smartphones all fall under an abstract, mathematical definition of a computer. While this may be true in the literal sense, it does not align with how most people conceptualize computers, and so the brain/computer comparison still risks being widely mishandled. To put it another way, from a public outreach standpoint, it would be problematic to campaign on the statement “a gun is a hole punch.” This is literally true, as ultimately the purpose of both objects is to put holes in things. One could even study the properties of the punching element to design a better bullet, or vice versa. But if people start using rifles to perforate their sales reports, one may want to re-evaluate the cost of canvassing that statement.

This is not to say that scientific concepts have no place in the public arena, only that the position of influence from which those concepts are communicated ought to be considered. Further, to downplay the importance of how non-experts, that is, the public, frame the Computational Metaphor based on common understandings is to forget that basic scientific research is largely funded by the public. As such, scientists are obligated to ensure that highly influential work is communicated in language that is not easy to abuse.