Bullshit may be the dominant form of expression in the early 21st century: we’ve reached a point where it’s impossible to have any cultural literacy at all if bullshit isn’t your second language.
So I was one of the people who celebrated when Harry Frankfurt published “On Bullshit,” his philosophical study of the unique language characteristics of bullshit. I’m not sure he really added anything to Orwell’s take on the subject, but the more rigorous looks we get at bullshit – and at why it’s a weed infesting our language and choking our culture – the better.
Except that a recent study out of the University of Waterloo (Canada) illustrates just how careful we have to be when interrogating this subject. One of bullshit’s most dangerous characteristics is that it’s sticky – and if we get it on our hands we have a hard time not spreading it around.
The study (read an interview with the author here) used a fake “Deepak Chopra quote generator” to produce pseudo-random Chopra-esque quotes (yes, this is hilarious) and then presented them to study participants as the real thing. The question the researchers wanted answered: what kind of people can detect the “pseudo-profundity” of the mishmash quotes, and what kind of people can’t?
The answer: people of “lower intelligence,” religious people, people who believe in the paranormal, and people who believe in conspiracies.
People with “more analytic” cognitive styles, on the other hand, were better at detecting it.
Now the propensity of people who believe in conspiracy theories to believe pretty much anything is well documented and understood and doesn’t need to be remarked on here. But the rest of this – well – doesn’t it seem a little too calculated to make the kind of people who would conduct a study like this feel good about themselves?
This isn’t a casual question: social scientists inadvertently finding the results they want is a huge and well-documented problem in the discipline (Andrew Ferguson is the journalist to read on it). More than that, studies have clearly illustrated what artists and historians have been telling us all along: our psychological biases strongly shape the way we approach and interpret quantitative data.
A study out of Yale showed that the more mathematically literate people are, the more likely they are to unconsciously twist mathematics to support their previously held conclusions. And a three-year study out of Dartmouth showed that most people – even very intelligent people – will cherry-pick their facts to fit their preconceptions, even when explicitly presented with evidence that contradicts their beliefs.
Now it’s not all doom and gloom: obviously people do change their minds, and it’s only when someone’s beliefs are strongly tied to their sense of self (and self-worth) that their thinking becomes so rigid. But that’s just the point here: whenever we come across a study that flatters the self-image of the kind of people who conduct such studies (“We’re so smart and analytic! It’s people who believe in stuff that we don’t believe in who are susceptible to bullshit!”), we should be extra cautious. Even vigilant.
Indeed, the methodology of this study has an obvious flaw: by generating bullshit statements in the style of Deepak Chopra, they are naturally going to seem more sensible to the people who are interested in the kinds of things Chopra talks about. Now the response from some self-styled “analytic” types might very well be “But everything Chopra says is already bullshit” – which may be true but is indicative of precisely the kind of bias we’re talking about here.
Given that scientific and technical publications have a history of being fooled by bullshit science (and also this, among many others), I would be awfully curious to see how these same experiments would go if, instead of new-age-ish platitudes, they used the scrambled verbiage of prominent evolutionary biologists and string theorists and attributed the statements to those experts. Would it still be the religious people who had a harder time identifying the bullshit? Or would the kind of people who take pride in their scientifically based skepticism and call themselves “analytic thinkers” be just as easily bamboozled?
The fact that nobody tried it suggests the size of the blind spot involved here. Bullshit researchers, more than anyone, should be careful of their own preconceptions.