Excerpt from upcoming post on the social implications of the brain-computer metaphor



I've been working on an essay in which I point out an ignored implication of a very popular neuroscience debate -- whether the brain is a computer. In it, I focus on the idea that by reinforcing the metaphor, the tech industry leads society to grant AI a special status in making important decisions, a status it shouldn't have. And because the neuroscience and computer science fields share a historically close relationship, neuroscientists may be inadvertently promoting pervasive forms of tech-solutionism which have been shown to harm marginalized folk. It's taking longer to put together than I had planned, but here's a small piece (to be edited):

If the computer is a brain, what’s missing from the metaphor?

Input:

To make data usable for a computer, a series of human interventions must often first be performed: collecting, cleaning, processing, and labeling are all crucial steps in getting a computer to “learn” patterns in the data. But what are these steps, and what is the data, in the COMPUTER IS A BRAIN metaphor? One could argue that these processes are part of the neural periphery, the nerve endings, the early stages of data processing, while the data is simply part of the world. But then the individuals charged with data collection, cleaning, and labeling would have to be considered part of the nervous system, and (by design) people are rarely, if ever, acknowledged as part of AI’s functioning. One could also argue that these steps are performed by the parents of an infant AI brain, helping it learn about the world. But again, the parents of an AI are not commonly mapped as part of the metaphor, especially when the AI behaves poorly. This is ironic given that parents are legally responsible for the behavior of their children, while AI is often leveraged as a means to obscure accountability and deter recourse.

The fact that people are absent from the COMPUTER IS A BRAIN metaphor is not a fluke. In all predominant metaphors for data -- data are resources to be consumed, and data are forces of nature to be tamed -- people are absent. When tech evangelists promote the wonders of AI, they never mention the people from whom the data is extracted, or the data scientists and engineers who clean and process the data, or the gig workers making sub-minimum wage to label all that data, or those who are unknowingly subject to its beta-testing. These are all vital roles which make AI possible, yet they are afforded no slots in the metaphor schematic.

The same is true of the converse. When one suggests THE BRAIN IS A COMPUTER, it's very easy to strip important aspects of the person (e.g. body and sensation) away from one’s logic and intellect. This disregard of the brain’s role in sensation and emotion is deeply embedded in our language. For example, we may say, “We need a big brain to figure this out.” This is a figure of speech known as metonymy, and it is inherently dehumanizing because it locates a person’s value in a single part of themselves. The “brain” that is needed in this example is actually a whole person. But this person has been reduced to their talent for solving a very specific, perhaps mathematical, problem. We are typically more likely to refer to someone as “brainy” if they exhibit high-level STEM skills, but perhaps not so likely if they are instead a gifted musician, a caregiver, or someone who can truly see another person and empathize. When, in our language, did the brain become dissociated from these things? Why is someone considered smart when they speed through calculus, but not when they tear up reading poetry? This is noteworthy given that the brain’s limbic and reward systems make up a significant chunk of its volume, and removal of or damage to these structures can be profoundly devastating to one’s ability to function.

The brain is so much more than the logical, computable functions that serve as a muse for computer scientists designing object recognition tools. The neural networks which enable computer vision are inspired by the architecture of the brain’s visual cortex because it exhibits hierarchical behavior that is relatively easy to make sense of, compared to the vital systems in the brain which make socializing, caring, joy, and suffering possible. To design a tool that can compute one’s emotions takes much more than recognizing the difference between a sad face and a happy one. It takes more than vision. It takes seeing.
