Penn State, Goldman Sachs, Enron, the University of Virginia, Super PACs, the Catholic Church–we live in an era of institutional scandal. If you want to know why we are careening from one major institutional scandal to the next, there’s a simple answer: the psychology of power has changed.
To be sure, there’s nothing new about a scandal. The oldest human texts, from the Bible to the Bhagavad Gita, are full of them–and it wouldn’t surprise me at all if the very first cave painting was an editorial cartoon exposing a hunting accident.
There’s also nothing new about a powerful institution getting embroiled in a scandal–“scandal” is practically the twin of “monarchy.”
But in the past, there’s been a sense that when powerful institutions fail to police themselves effectively, they have in fact failed. The moral codes they lived up to may have been deficient, but they were at least trying to live up to them. In the modern era, which is filled with more institutions of greater complexity than ever before, we seem to be seeing an increasing inability of powerful institutions (and the people who run them) to follow even the most cursory moral code. Continue reading With Great Power Comes…Amorality?
Those lucky enough to have jobs are spending more and more time at them—so what makes people happy on the job matters more than ever to our mental health and psychological well-being.
A recent poll asked Americans what it is that makes them happy at work, and the answers aren’t really surprising to anyone with an existential bent.
An overwhelming majority (88 percent) said it mattered to them that they could believe in the mission and purpose of their company. Meanwhile, employees at companies with fewer than 100 workers were more likely to say they were satisfied than employees at companies with more than 2,500.
And the ability to telecommute or schedule one’s time in a way that supports work/life balance was seen as crucial by many employees too. Continue reading What Gives Your Job Meaning?
If there is an “original sin” in the intellectual culture of the last 500 years, it is the Big Data fallacy—the idea that if we can just gather up enough raw data, we can finally understand everything we need to know about the world.
Is someone not happy? All you have to do is develop a better set of metrics for personality tests. Is consciousness not revealing its secrets? All you need to do is model the brain more perfectly. Are people behaving in unpredictable ways? You just need more information on their habits and shopping patterns.
More data, the theory goes, always provides more clarity.
Practitioners of the human sciences, like existential-humanists, have always known this isn’t true: Nietzsche’s perspectivism, the idea that there is no privileged or uniquely correct view of reality, presaged Einstein’s relativity. Continue reading Numbers That Lie
The legendary movie about the conflict between rote learning and passionate engagement with the humanities at a boys’ prep school in the 1950s recently celebrated its 25th anniversary—and came in for a resounding barrage of criticism.
Kevin Dettmar, an English professor at Pomona College, penned a piece for The Atlantic entitled “Dead Poets Society Is a Terrible Defense of the Humanities,” in which he accused the movie of shorting serious scholarship for “a fan’s relationship to the humanities.”
“Dead Poets Society,” he writes, “finally comes down to a preference for fans over critics, amateurs over professionals. Everyone engaged in the debates swirling around the humanities, it seems, is willing to let humanists pursue their interests as amateurs, letting ‘poetry work its magic … in the enchantment of the moment.’ … Scholars and teachers of the humanities, however: We will insist on being welcomed to the table as professionals.” Continue reading Why Can’t “Dead Poets Society” Get Any Love?
A conversation with a fellow writer turned into a miniature culture war last month when I mentioned that I didn’t believe that the human mind can be reduced to purely biochemical processes – that we are in fact more than a highly complicated biological machine.
“Really?” my friend said. He was baffled. “I’ve never met anyone intelligent who thinks that way.”
How culture has changed. The argument that we are nothing more than the sum of our parts wasn’t thought up by the New Atheists – Aristotle devoted significant time to rebutting it, and the Buddha’s primary breakthrough as a philosopher was to redefine “self” in both material and spiritual terms. Across the millennia, much of the best that has been written and thought has been an examination of what it means to be human above and beyond our biology.
So what kind of cultural amnesia is it that my friend, an educated man in his 30s with a degree in neuroscience and an MFA, can honestly believe that he’s never met anyone intelligent who believes that mind is not reducible to brain and body? Continue reading The sad split between intellectual culture and human nature – a little light reading
Are we addicted to evil?
That’s the provocative question asked by Stephen Metcalf in an article for Slate.com.
By piecing together the etymology of the word “amok” (as in, “he ran amok”), examining the scripted quality of media coverage of spree violence like the shootings in Aurora, and looking at the difference between the Batman movies’ bland heroes and colorful, personality-filled villains, Metcalf comes to a depressing conclusion:
The only way to truly end the violence of these inevitably male “pseudocommandos” is to “divest evil of its grandiosity or mythic resonance by completely banalizing it.” Which we could do if we wanted – but we don’t.
The problem is that as the world has gotten increasingly complicated and harder to make our way through; as we find ourselves more and more at the mercy of abstract institutions that cannot be effectively reasoned with or protested against; as our individual humanity seems to be smaller and less meaningful every day … evil becomes more and more alluring because someone who goes on a rampage is at least relevant. Continue reading We turn into villains because we don’t know how to be heroes: the appeal of “evil” in the 21st century
If you get depressed when you turn on the radio, it could be the news – or it could be the music.
According to new research, pop music has gotten significantly sadder over the last half-century. That’s measured in terms of tempo (it’s gotten slower), key (minor keys have come to predominate), and subject matter (songs are more “self-focused and negative”) … all combining into a serious case of the blues.
Indeed, the proportion of songs in a minor key reaching the top of the pop charts has doubled in the past 50 years, reaching almost 60 percent by the second half of the last decade.
Could this fact be making a statement about the society we live in – or about music itself?
Intriguingly, researchers say, the evolution of pop music toward increasing complexity and gloom mirrors the evolution of classical music. Continue reading Our Music Is Getting Sadder. What Does That Say About Us?
There are a lot of hypotheticals around the question: “if we are lonelier, is our technology to blame?”
A recent article in The Atlantic says “yes” and “yes,” with the title alone being a giveaway: “Is Facebook Making Us Lonely?”
Writing in Slate, Eric Klinenberg (author of “Going Solo”) says “no” and “no.” We’re not lonelier, and it’s not Facebook’s fault.
“The quality and quantity of Americans’ relationships are about the same today as they were before the Internet,” he writes, citing the work of Claude Fischer, the author of “Still Connected.”
Which is lovely if true—and yet the question persists.
The question “are we lonelier today than we were back when?” has a long pedigree—and the fact that it won’t go away does suggest that at the very least a substantial portion of the population thinks they are, in fact, lonelier. And isn’t theirs the opinion that counts?
At the same time, the fact that the question has been asked for so long—Rollo May was writing about it in the 1950s in books like “The Meaning of Anxiety” and “Man’s Search for Himself”—suggests that perhaps we are not lonelier now – but we were plenty lonely then. In fact, while May couldn’t have imagined Facebook 50 years ahead, he did suggest that the loneliness and isolation people were coming to therapists to address in the 1950s was predictive: it was going to become much more widespread in the coming decades. Continue reading Facebook Isn’t Making Us Lonely, But It’s Making It Hard For Us To Be Anything Else
Did you know we’re running out of neurotics?
According to an essay in the New York Times, “one modern American type is slipping into the past without a rattle or even its familiar whimper – the neurotic.”
The problem, though, isn’t that there are too few neurotics in the 21st century: it’s that there are too many.
Reporter Benedict Carey laments that the diagnosis of “neurotic” no longer means much because we’re all kind of “neurotic” now. He suggests we need to keep the diagnosis precisely because admitting “we live in a time that makes us neurotic” is better than diagnosing everyone with something stronger.
It’s a fair point, but it has larger implications. Many of the predecessors of Existential and Humanistic psychology, including Carl Jung and Rollo May, explicitly said that neurotics are canaries in a coal mine: the people who seek out therapy today are often the most sensitive to larger social trends that eventually are going to snatch everybody up … at which point today’s “symptoms” become tomorrow’s “normal.” Continue reading In the 21st century, being a “neurotic” is a good thing
Is constant immersion in digital technology changing kids’ brains for better or for worse? A recent Pew poll of experts that is now getting a lot of media attention determined that 55 percent of those polled think it’s for the better, and 43 percent think it’s for the worse.
Well, gee, that was helpful.
The inconclusive results aside, is just picking a lot of people who have written on the subject and polling them really the best way to get a handle on as complicated and treacherous a question as whether the internet is changing the cognitive processing of the young in a meaningful way, and (if so) the kind of impact it’s having?
Because I’d like to suggest that it’s not. Especially when many of the “experts” are “experts” precisely because they made their names by already having settled opinions about this stuff. In effect the Pew poll asked Clay Shirky “Do you think Clay Shirky is on to something when he talks about digital technology?” Continue reading How not to study the “web generation”