A conversation with a fellow writer turned into a miniature culture war last month when I mentioned that I didn’t believe that the human mind can be reduced to purely biochemical processes – that we are in fact more than a highly complicated biological machine.
“Really?” my friend said. He was baffled. “I’ve never met anyone intelligent who thinks that way.”
How culture has changed. The argument that we are nothing more than the sum of our parts wasn’t thought up by the New Atheists – Aristotle devoted significant time to rebutting it, while the Buddha’s primary breakthrough as a philosopher was to redefine “self” in both material and spiritual terms. Across the millennia, much of the best that has been written and thought has been an examination of what it means to be human above and beyond our biology.
So what kind of cultural amnesia is it that my friend, an educated man in his 30s with a degree in neuroscience and an MFA, can honestly believe that he’s never met anyone intelligent who believes that mind is not reducible to brain and body?
It’s not that we have new evidence – we’ve always known that chemicals can have an impact on the mind (alcohol made you drunk in ancient Egypt, too) and we’ve always known that physical changes can lead to personality changes. The use of fMRI and PET scans changes the magnification of these observations, but not their substance.
But for whatever reason … and there are many potential culprits ranging from the cynicism brought on by two world wars to a historically unprecedented divorce in the west between the religious and academic cultures (which for most of history were inseparable) that hurts them both … it is now unfashionable in many educated circles to think that our human selfhood has to be taken seriously on its own terms.
Fortunately we are not alone. Over the last year I’ve seen some excellent writing, even in the “popular media,” that takes our human condition into account rather than wishing it away. As a public service to those who may feel alone and hopelessly out of intellectual fashion, I’d like to share a few.
The most recent, “Anything But Human,” was in the New York Times online just this week. Philosopher Richard Polt points out that attempts to reduce human behavior to natural or computational processes fail precisely because they try to explain away the very experiences – like right and wrong, like beauty and integrity – that human beings grapple with. “(T)he human race has evolved to be capable of a wide range of both selfish and altruistic behavior,” he writes. Absolutely. But this biological fact tells us nothing about how we should act – it assumes the acknowledgment of biology is enough to answer human questions when in fact it hasn’t even posed them. It is the fact that we grapple at all, in a way that machines and arguably many other animals don’t, that is most important: it’s what our humanity is in the first place, and biology has little to say about that process.
“What Consciousness Is Not,” on the wonderful website “The New Atlantis,” takes the argument from the nature of our humanity into the nature of consciousness itself – and professor of geriatric medicine Raymond Tallis expertly demolishes the assumption that hard science has got “consciousness” all figured out … or will once the MRI scans get more precise. He writes:
“(T)here are near-fatal difficulties posed by the idea that experience is, as philosophers of mind call it, “epiphenomenal”: how could a trait that is incapable of affecting an organism’s behavior, and so its ability to survive and reproduce, be effective at propagating itself through evolution? (Why would evolution continue the existence of pain if your body could simply withdraw your hand automatically without it?) Moreover, even if pain were epiphenomenal, the experience would nonetheless still exist, and would need to be accounted for and explained. The supposedly “easy” problems, no matter how one comes at them, still have irreducible, “hard” elements.”
Tallis by no means suggests that consciousness is immune from scientific investigation – but the assumption that our current paradigm has it figured out is intellectually unsustainable. We need a revolution.
“Just as rethinking the nature of light transformed our understanding of the physical world, shattering seemingly secure theories of physics to give rise to relativity theory and quantum mechanics, when we are finally able to account for the unfathomable depths of our own minds, it is sure to have profound and transformative consequences for our understanding of what kind of world we live in, and what manner of being we are.”
The Atlantic Monthly cover story “Mind vs. Machine” tells the extraordinary story of Brian Christian, a human who competed in the Turing Test. Normally the test is to make computers seem as human as possible – but Christian had a secret mission: to make himself a human being who seemed as human as possible. Along the way he discovered the dirty secret behind artificial intelligence: it’s not getting more human in any sense of the term. We human beings are increasingly limiting ourselves to act more like machines.
“We forget how impressive we are,” he writes. “Computers are reminding us.”
But why would we ever reduce ourselves to the level of machines, even unknowingly?
David Auerbach’s piece “The Stupidity of Computers” has a nice play-by-play. From search to chatbots, from language processing to Wikipedia, Auerbach shows how efficiency in routine processes is facilitated by humans limiting themselves to what computers can do better – or do at all. As Christian wrote in the Atlantic, a good coffee barista exhibits far more intelligence, in the true sense of the term, than even the most advanced data processor: all the processor can do is run simple repetitive calculations on a mass scale, and only when it’s told to. A barista has to exercise judgment on hundreds of unique issues every day.
But economic efficiency means we’re happy to pay less for a computer that does less more efficiently, and customers will have to learn to accommodate it. Once a company installs a voice answering system in its call center, you find yourself saying “Yes,” “No,” and “Account Balance” over and over, no matter what your problem actually is.
“We will bring ourselves to computers,” Auerbach writes. “This will bring about a flattening of the self—a reversal of the expansion of the self that occurred over the last several hundred years. While in the 20th century people came to see themselves as empty existential vessels, without a commitment to any particular internal essence, they will now see themselves as contingently but definitively embodying types derived from the overriding ontologies. This is as close to a solution to the modernist problem of the self as we will get.
“This will not mean a killing of creativity or of ineffable spirit, but it will change the nature of our creativity. The increasingly self-referential and allusive nature of art has already made “derivative” less of a pejorative, and the ability to mechanically process huge amounts of data with computer assistance will play a larger role in the construction of art of all kinds.”
Pleasant? No. But a great read – and an acknowledgement that humanity is so much more than its component parts and physical laws … if we let ourselves be.
In “Man’s Search for Himself,” Rollo May wrote that each human being has to grow into freedom and responsibility. We’re not born with them so much as with the potential for them. Something about modern intellectual culture finds that potential very threatening indeed.