Intellectual black holes

An October painting and article

One’s attempt to reckon with one’s lived experiences at a certain point in life, or one’s effort to heal from trauma and loss, is often a lengthy and sometimes rocky process, more so in the absence of a support group rooted in a value system that supports change, clarity, and freedom of thought and expression. As one disrupts silence and sets out to understand and tackle loss and/or injustice, one soon realizes that each small or deeper personal wound and injustice latches onto a broader societal or cultural dysfunction. One then realizes that increasing self-awareness and waking up to more reality are not necessarily synonymous with healing and recovery from post-traumatic effects; and yet healing does not take place in the absence of some level of clarity and the capacity to situate one’s experience within the broader sociocultural context. Stephen Batchelor says, “When we wake up from sleep we return to our-life-as-it-is, not as we dream it. Waking up can also be the realization that one is caught up in one’s own views or reactivity and not seeing or responding to things-as-they-are.” As it were, the journey to one’s Ithaca contains ample rabbit holes to tumble down and many more intellectual black holes to get sucked into.

In today’s lengthy post I will touch a bit on the intellectual black holes we may all (both those of us who acknowledge that we have been traumatized, oppressed, and conditioned, and the rest of us) at times, to one extent or another, get sucked into, considering how much information is out there and how we may not always be alert to or aware of the strategies used to influence people’s choices, views, and interpretations of experiences. Today’s topic is something I have been meaning to write about for a while. As I have now almost completed the recycling process (over many months) of the piles of journals I have kept over the last fifteen years or so, and have also been evaluating the many podcasts and books I have consumed along the journey, the need to write this piece became more salient. Going through all this material and my recorded observations brought the journey and new insights into focus again. It seems that both understanding and healing occur in layers, and thus the learning process can be a mix of acquiring valuable and useful knowledge and encountering intellectual black holes. Finally, I will be using Stephen Law’s book as a basic resource in order to organize this lengthy piece and make my points as brief and succinct as possible. Law provides many examples in his book to support his arguments, which I will not go into in depth here. My intention in this post is mostly to bring awareness to the mechanisms, because I think that understanding these strategies can make it easier for us to apply critical thinking and discern when they are employed in transactions, political debates, educational or religious contexts, conversations, and so on.

In his book Believing Bullshit (2011), Stephen Law identifies eight key mechanisms that can potentially transform ideas into a psychological fly trap: a bubble of belief that is seductively easy to enter but maybe impossible to think your way out of again. He calls these sets of beliefs intellectual black holes, and he writes: “Cosmologists talk about black holes, objects so gravitationally powerful that nothing, not even light, can break away from them. Unwary space travelers passing too close to a black hole will find themselves sucked in. An increasingly powerful motor is required to resist its pull, until eventually one passes the “event horizon” and escape becomes impossible. My suggestion is that our contemporary cultural landscape contains, if you like, numerous Intellectual Black Holes— belief systems constructed in such a way that unwary passersby can find themselves similarly drawn in. While those of us lacking robust intellectual and other psychological defenses are most easily trapped, we’re all potentially vulnerable.” As I mentioned, there is a lot of information out there and many people vying for our attention. How many of us are fully immune and adequately skeptical? How many of us are fully present and aware of our buttons, have robust intellectual and psychological defenses, and are free of early cultural training or conditioning? To what extent are we immune to the media, the news, other authorities, and cultural lies?

Stephen Law refers to eight belief-inducing mechanisms, which he calls: Playing the Mystery Card; “But It Fits!” and The Blunderbuss; Going Nuclear; Moving the Semantic Goalposts; “I Just Know!”; Pseudoprofundity; Piling Up the Anecdotes; and Pressing Your Buttons. He believes that Intellectual Black Holes can exist without causing great harm, but they can also be dangerous; the effects of being sucked into them lie on a continuum. There are plenty of historical examples of desperate youths who have been convinced to commit suicidal terrorist acts believing in an afterlife paradise, of people idolizing and blindly following the leaders of inhumane and oppressive regimes, or of people supporting leaders’ decisions to embark on wars. More commonly, Intellectual Black Holes can allow people to be pushed around or taken advantage of financially. Law writes: “Indeed, they are big business. But victims can be taken advantage of in other ways too. Intellectual Black Holes can also lead people to waste their lives.” He mentions that people may get sucked into Intellectual Black Holes even though in other areas of their lives they may be cautious, subject claims to critical scrutiny, and weigh evidence.

Law clarifies that Intellectual Black Holes lie at the end of a continuum and that almost all of us may engage in some of these eight strategies to some extent, particularly when beliefs to which we are strongly committed face a rational threat. However, he suggests that what transforms a belief system into a Black Hole is the extent to which we rely on such mechanisms when we try to deal with intellectual threats or want to generate an appearance of reasonableness. He also discusses a few possible reasons why we are prone to getting sucked into intellectual black holes in the first place. For instance, he refers to a theory that suggests we have evolved to be overly sensitive to agency because we evolved in an environment that contained many agents, from family members to friends to rival tribes to predators. Being aware of other agents helped us survive and reproduce, so we evolved to be oversensitive to them. We hear a sound behind us, our subcortical brain lights up, and we instinctively turn around, looking for an agent. Most times there is no one there, but in the few cases where there was a tiger in the bushes, our vigilance would have saved our lives. It is similar to our evolved negativity bias. He suggests that this evolutionary view could partly explain our tendency to believe in the existence of invisible agents, which can then make us susceptible to believing others’ often unfounded claims. Law also refers to the theory of cognitive dissonance, which could explain our propensity to use the eight strategies described in his book. Dissonance is the psychological discomfort we feel when we hold conflicting beliefs and attitudes. It is suggested that we are motivated to reduce dissonance by adjusting our beliefs and attitudes, or by rationalizing them, so that we may deal with the discomfort the dissonance causes us. So, Law says, this psychological defense may in some sense allow us to convince ourselves and others, through applying the eight mechanisms, that our belief in the healing power of crystals, for instance, is not contrary to reason, even though research has proven otherwise. Or we might believe that unicorns once roamed the earth, even though we can establish beyond reasonable doubt that they probably never did, because one would expect to find evidence of their presence, such as fossils of unicorns or of closely related animals from which they might have evolved.

By applying these strategies we could, for instance, convince ourselves or persuade others that oppression, cruelty, or poverty on a grand scale are part of some grand scheme and that those who suffer the most will be rewarded in an afterlife. Thinking that our belief system is the only truth, or that we are the chosen warriors, or whatever, can lull our consciousness, soothe our despair and fear, distract us from tangible problems, and often justify the persecution of those who hold a different worldview. This kind of thinking has political implications as well, because our belief systems can support particular economic and political systems. Questioning established opinions and beliefs can get people killed in many countries, whereas in other, more democratic societies people may experience assaults and marginalization. Another reason we may take up certain sets of beliefs is that even though science has progressed, and many of the supernatural agents once invoked to explain natural phenomena have received plausible naturalistic explanations, there will probably always be questions science cannot answer, and therefore it is often tempting to invoke some supernatural agent to explain that which we cannot yet understand or that which may be unknowable.

In brief, the eight mechanisms discussed in this book are:

Playing the Mystery Card, which involves immunizing our beliefs against refutation by making unjustified appeals to mystery. Law writes that one way someone might deal with scientific evidence against their paranormal beliefs or claims is by insisting, without justification, that what they believe is “beyond the ability of science to decide,” or by accusing the critic of scientism, the idea that science can answer every possible question. Law writes: “Actually, few scientists embrace scientism. Most accept that there may well be questions science cannot answer. Take moral questions, for example…. the philosopher David Hume famously noted that science ultimately reveals only what is the case; it cannot tell us what we morally ought or ought not to do. Nor, it would seem, can science explain why the universe itself exists— why there is anything at all. Scientific explanations involve appealing to natural causes or laws.” So a scientist can explain why water freezes, but not why there is a natural world in the first place. And if there are questions that extend beyond science’s domain, then whenever the credibility of a belief comes under scientific threat and rational scrutiny, this strategy lets one protect it a priori by declaring it simply something science cannot explain. He does, however, clarify that mystery is no bad thing, that pointing out mysteries can be a valuable exercise in firing up our curiosity and getting us to engage our intellects, and that there is nothing wrong with acknowledging that some things may remain a mystery or be in principle unknowable.

“But It Fits!” involves coming up with ways of making evidence and theory “fit.” In the book, Law shows that any theory, no matter how absurd, can be made consistent with the evidence; he even provides a fictional example of a theory that dogs are Venusian spies. He writes that “any belief, no matter how ludicrous, can be made consistent with the available evidence, given a little patience and ingenuity.” Law also refers to The Veil Analogy, invoked when someone suggests that the observable, scientifically investigable world is not all there is, that a further mysterious reality lies hidden from us as if behind a veil, and that only some of us can glimpse behind the curtain. Using these mechanisms, people attempt to make their claims and beliefs immune to any kind of rational or scientific refutation. At a different level, this kind of immunizing strategy can often be combined with an attack on the character of the critic or the person holding a differing view. Law expands on how, for a theory to be strongly confirmed by the evidence, at least three conditions must be met: the theory must make predictions that are clear and precise, surprising, and true. Additionally, if a scientific theory is to be credible, it must be both consistent with and strongly confirmed by the evidence. One example of a theory that is strongly confirmed is the theory of evolution. Apart from the scientific method, he also suggests the use of conceptual and empirical refutation, which is based on ordinary observation; people have been using this method to deal with claims for millennia, long before the development of the refined and specialized tools known as the scientific method. So it might be wise to rely on scientific information, and to use our senses, observational skills, and critical thinking, before we get sucked into worldviews, ideas, claims, and beliefs. It is also wise to be aware of the strategies used to influence people, the power of suggestion, and the placebo effect.

Coming back to the harm that Intellectual Black Holes can cause, I will mention one example from the book concerning education. Law claims that his central criticism of teaching nonscientific theories about our evolution as a species is that it teaches children to think “in ways that, under other circumstances, might justifiably lead us to suspect the thinker is suffering from some sort of mental illness….. We run not only the risk that children will end up believing ridiculous falsehoods, which is bad enough, but, worse still, that they’ll end up supposing that the kind of warped and convoluted mental gymnastics…..is actually cogent scientific thinking. We may end up corrupting not just what they think but, more important, how they think.”

“Going Nuclear” involves exploding a skeptical or relativist philosophical argument that appears to bring all beliefs down to the same level, rationally speaking. Law writes, “Going Nuclear is an attempt to unleash an argument that lays waste to every position, bringing them all down to the same level of ‘reasonableness.’” He describes it as utterly annihilating the rationality of every belief: all positions, no matter how sensible or ludicrous, come out as equally rational or irrational. Of course, those who use this strategy rely on reason in their daily lives, and also to make their own case; they only reach for the nuclear button when confronted with rational arguments against their own beliefs. As he points out, all these strategies are used selectively, when needed. Spending too much time with people who use this tactic extensively can bring about a sense of helplessness. Related to this strategy is the belief that there is no objective Truth with a capital “T,” but rather that truth is always our own construction. Law writes: “In its simplest form, this sort of relativism says that what is true is what the individual believes to be true.” So, suppose I believe we are visited by aliens or fairies; then, according to such a relativist, it is true, and if you believe we are not, then for you it is true that we are not. Law explains that there are certainly a few beliefs for which this might actually hold. He gives the example of witchetty grubs, the large larvae eaten live by some Aboriginal Australians, who consider the grubs a delicacy, while most Westerners would find them revolting. In this case there might be no truth with a capital T, because the property of being delicious is ultimately rooted not objectively in the grubs themselves but in our subjective reaction to them.

Law also discusses what we might call the Disney theory of truth. Here there is a switch from the view that reality is whatever we believe it to be to the view that the truth is whatever we want it to be: in order to make something come true, we need only wish for it (on a star, perhaps). Common statements reflecting these views might be “Life doesn’t happen to us; we make it happen” or “Reality is not what we perceive or believe it to be but what we want it to be.” He asks: “Before Copernicus, was it true that the sun really went around the earth, because that’s how it looked to people? Had Neil Armstrong and enough others believed the moon was made of cheese, might the Eagle have landed on a sea of Camembert?” Most of us would answer no. He comments that relativism offers a useful get-out-of-jail-free card when one finds oneself cornered in an argument.

Moving the Semantic Goalposts involves dodging possible refutations by switching back and forth between meanings. This strategy relies on seesawing between two meanings of an expression, and on seesawing between stating an opinion and retreating from it, using language to suit oneself. Law provides lengthy examples of people employing this strategy, especially to support religious beliefs and theories.

“I Just Know!” involves suggesting that the truth of your belief has somehow been revealed to you, by, for example, “some sort of a psychic or god-sensing faculty (this suggestion is unreasonable if you are aware of grounds for supposing that at least a large proportion of these supposedly revelatory experiences are, in fact, delusional).” He explains that our gut feelings can be insightful, and “I just know!” can definitely be an appropriate response. We all go with our gut, intuition, or instinct on occasion, and sometimes it is unavoidable, for example when we lack information, when we are uncertain, or when there is an emergency. However, when used as a strategy, it can allow anything to be claimed as true. He provides a vignette about George W. Bush. He writes: “…none of this is to say that it’s sensible to go with your gut feeling when you don’t need to because, say, there’s ample and decisive evidence available. We are also ill advised to place much confidence in the instincts of someone whose particular gut has a poor track record, or on topics on which we know that gut feeling has generally proven unreliable….. Bush was distrustful of book learning and those with established expertise in a given area. When he made the decision to invade Iraq, and was subsequently confronted by a skeptical audience, Bush said that ultimately, he just knew in his gut that invading was the right thing to do…… How did Bush suppose his gut was able to steer the ship of state? He supposed it was functioning as a sort of God-sensing faculty…”

In this chapter Law discusses what knowledge is, along with the strengths and weaknesses of Plato’s account of what it means to know, of evidentialism, and of reliabilism. Reliabilism, in addition to Plato’s prerequisites for knowing, asserts that someone’s belief must be brought about via a reliable mechanism, that is, a mechanism that tends to produce true beliefs. Fairly reliable mechanisms include our senses of sight, smell, and so on, which allow us to track reasonably well how things are in our environment. He does point out, though, that in a fictional laboratory setting where people have compelling visual hallucinations (about fruit, in his example), relying on our sight would not be wise; in such an environment we should remain skeptical and aware of our experiences. He goes on to suggest that if there is plenty of evidence that many religious experiences, or other experiences claimed by some psychics at least, are delusional or fake, then we should be skeptical about our own and others’ similar experiences or psychic claims and not take them at face value, no matter how compelling they might be. I could add that such experiences could be undeciphered unconscious material or the result of unmetabolized old conditioning; experiences are also often shaped by our cultural expectations and by the power of suggestion. Law suggests that certain activities and experiences might have a marked psychological effect, produce some interesting, and possibly beneficial, psychological states like peace and contentment, and help us gain valuable insights into ourselves and the human condition. But once the power of suggestion is mixed in, people can interpret their experiences in many ways, and those interpretations do not necessarily bring genuine insight into reality or mean that one has become attuned to some sort of ineffable transcendence.

Pseudoprofundity is the art, used by many, of making the trite, false, or nonsensical appear both true and deep. Law writes: “Pseudoprofundity is the art of sounding profound while talking nonsense. Unlike the art of actually being profound, the art of sounding profound is not particularly difficult to master. As we’ll see, there are certain basic recipes that can produce fairly convincing results— good enough to convince others, and perhaps even yourself, that you have gained some sort of profound insight into the human condition.” He suggests that this technique works best if pronouncements focus on life’s big themes. Some examples from his book: We were all children once. Money can’t buy you love. Death is unavoidable.

A second technique is to select words with opposite or incompatible meanings and cryptically combine them in what appears to be a straightforward contradiction. Because such sentences are interpretable in all sorts of ways, they can easily appear profound. Some examples from his book: Sanity is just another kind of madness. Life is often a form of death. The ordinary is extraordinary.

A third recipe for generating Pseudoprofundity, identified by the philosopher Daniel Dennett, is the deepity, which involves saying something with two meanings, one trivially true, the other profound sounding but false or nonsensical, for instance, “Love is just a word.” Law explains that on one reading “the sentence is trivially true. On the other reading, the sentence is not about the word love but love itself— that which the word love refers to. Love is often defined as a feeling or emotion. Love may even, arguably, be an illusion. But the one thing it definitely isn’t is a word. So on this second reading, ‘Love is just a word’ is obviously false.”

A fourth technique is using jargon. Law writes: “Whether you’re a business guru, lifestyle consultant, or mystic, introducing some jargon can further enhance the illusion of profundity…. For example, don’t talk about people being sad or happy; talk about them having ‘negative or positive attitudinal orientations.’ Next, translate some truisms into your new vocabulary. Take the trite observation that happy people tend to make other people feel happier. That can be recast as ‘positive attitudinal orientations have high transferability.’”

Another technique involves using science and scientific terms. He writes that “references to quantum mechanics are particularly popular among peddlers of pseudoscientific claptrap. Quantum mechanics is widely supposed to make weird claims, and hardly anyone understands it, so if you start spouting references to it in support of your own bizarre teachings, people will assume you must be very clever and probably won’t realize that you are, in fact, just bullshitting.” He continues: “In 1997, Alan Sokal, a professor of physics at New York University…. annoyed with the way in which some postmodern writers were borrowing terms and theories from physics and applying them in a nonsensical way, published, along with his colleague Jean Bricmont, the book Intellectual Impostures. Impostures carefully and often hilariously exposes the scientific jargon–fueled nonsense of various intellectuals writing in this vein…… Intellectual Impostures followed the ‘Sokal Hoax’ in 1996. Sokal submitted to the fashionable postmodern journal Social Text an essay packed full of pretentious-sounding, pseudoscientific claptrap.” The publication of that hoax became an “Emperor’s New Clothes” moment.

Piling Up the Anecdotes. Law says that anecdotes are in most cases almost entirely worthless as evidence, particularly in support of supernatural claims, but they can be persuasive, especially when collected together. Another popular type of narrative involves interpreting coincidence in all sorts of ways, even though coincidences are inevitable. As Law says, “There are billions of people living on this planet, each experiencing thousands of events each day. Inevitably, some of them are going to experience some really remarkable coincidences. Such coincidences will be thrown up by chance. The odds of flipping a coin and getting a run of ten heads by chance is very low if you flip the coin only ten times. But if billions of people do the same thing, it becomes very likely indeed that a run of ten heads will occur. Such coincidences can easily generate the appearance of supernatural activity.”
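
To make the arithmetic concrete, here is a minimal sketch of Law’s coin-flip point (my own illustration in Python, not from the book): an event with probability (1/2)^10, roughly one in a thousand, is rare for any one person yet virtually guaranteed to happen to someone in a large population.

```python
# Probability of ten heads in ten fair coin flips.
p_run = 0.5 ** 10  # ~0.000977, about 1 in 1,024

# If n people each flip ten coins, the chance that at least one of them
# sees the "miraculous" run is 1 minus the chance that every one misses it.
def p_at_least_one(n: int) -> float:
    return 1 - (1 - p_run) ** n

for n in (1, 1_000, 1_000_000):
    print(f"{n:>9,} people -> P(at least one run of ten heads) = {p_at_least_one(n):.4f}")

# 1 person:         0.0010  (remarkable if it happens to you)
# 1,000 people:     0.6236  (more likely than not)
# 1,000,000 people: ~1.0000 (virtually certain to happen to someone)
```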

Consider psychic claims concerning phone-ringing episodes (an example from the book): “I know I’m psychic. For example, last week I was thinking about Aunt Sue, whom I hadn’t talked to for ages, when the phone rang.” However, every now and again we will meet or hear from people we have been thinking about, and we fail to consider all the times we have been thinking of other people but have neither bumped into them nor heard from them. Or consider stories of spontaneous remission attributed to supernatural causes. Law writes that “interestingly, reports of ‘miraculous’ medical recoveries tend to be largely restricted to the kinds of cases in which such spontaneous remission is known to occur …..” We also tend to remember the hits or successes, but not all the times something has failed; we remember the cases that support a belief and ignore those that don’t. This is called confirmation bias. Law cites Francis Bacon, a pivotal figure in the development of the scientific method, who once said, “The general root of superstition is that men observe when things hit, and not when they miss; and commit to memory the one, and forget and pass over the other.”
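
As a rough illustration of why such “hits” are expected by chance alone, here is a small simulation (mine, with hypothetical base rates, not from the book) that tallies the memorable coincidences against the forgotten misses:

```python
import random

random.seed(42)

# Hypothetical base rates, chosen purely for illustration:
P_THINK = 0.05   # on ~5% of days you happen to think of a given old friend
P_CALL = 0.01    # on ~1% of days that friend happens to call you
DAYS = 365 * 10  # simulate ten years

hits = misses = 0
for _ in range(DAYS):
    thought_of_them = random.random() < P_THINK
    they_called = random.random() < P_CALL
    if thought_of_them and they_called:
        hits += 1    # the striking, memorable "psychic" coincidence
    elif thought_of_them:
        misses += 1  # the non-events that confirmation bias quietly discards

print(f"over ten years: {hits} coincidences, {misses} forgotten misses")
# Expected hits = DAYS * P_THINK * P_CALL, about 1.8: a story or two per
# decade by chance alone, against roughly 180 unremembered misses.
```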

Coincidence and the power of suggestion also account for things like sightings of monsters in lakes (e.g., the Loch Ness monster). All sorts of shapes can be created in the water by floating wood, fish, otters, the wind, and so on, and, like the shapes we see in clouds, some of this activity may resemble a dinosaur or a monster. If we are primed through stories and beliefs to expect a monster, then a few people will probably report sightings of monsters. The power of suggestion is also illustrated by Kenneth Arnold’s famous sighting of the first “flying saucer” back in 1947, while he was flying his light plane. On landing, he reported unidentified flying objects; the news media picked up the story and, soon after, many people were reporting saucer-shaped objects in the sky. Law writes: “But here’s the thing— Arnold did not report seeing flying saucers. What Arnold said he saw were boomerang-shaped craft that bobbed up and down, somewhat like a saucer would do if skimmed across a lake.” Law also recounts how, on one occasion, a supposed UFO was finally identified when a photographer looked through his long telephoto lens and said, “Yep … that’s the planet Venus alright.”

But it is not just perception that can be led astray by the power of suggestion. The book mentions the well-known psychologist Jean Piaget, who had an early memory of nearly being kidnapped at the age of two while being walked in his pram by his nurse. However, when Piaget was about fifteen, his family received a letter in which the nurse admitted that the story she had told was false; Piaget’s “memory” was in fact a memory of the story being told. Law writes that “studies reveal that in somewhere between 18 and 37 percent of subjects, researchers can successfully ‘implant’ false memories of events…”

Pressing Your Buttons involves reliance on certain kinds of non-truth-sensitive techniques for shaping belief. Isolation, control, uncertainty, repetition, and emotion can play a major role in producing and sustaining an Intellectual Black Hole. Law writes that these techniques, if applied in a consistent and systematic way, amount to brainwashing, and that they are a mainstay of the “educational” programs of many cults and totalitarian regimes. Beliefs can also be shaped through the use of reward and punishment. He writes that someone may, for instance, influence the beliefs of children by giving them a sweet whenever they express the approved kind of beliefs, and by ignoring or punishing them when they express the “wrong” sort of belief. Some of these early-transmitted attitudes and beliefs may persist in our adult implicit and explicit beliefs and biases.

According to Law, isolation is a useful belief-shaping tool: an isolated individual is more vulnerable to various forms of psychological manipulation. A related mechanism is control. If you want people to accept your belief system, you need to gain control over the kinds of ideas to which they are exposed and have access, and to censor, discredit, or raise doubt about beliefs and ideas that threaten to undermine your own. Oppressive regimes, but also schools and other institutions, can employ these strategies to one extent or another. Mindless repetition is encouraged or reinforced instead of critical thinking, and any sense of meaning, purpose, and belonging is preferably derived from one’s membership in the system of belief. Emotions, fear in particular, are often manipulated to shape beliefs. As one can understand, these mechanisms are particularly potent when applied to children and young adults, whose critical defenses are weak and who more easily accept whatever they are told.

He acknowledges that emotional manipulation, peer pressure, censorship, and so on can also be used to induce beliefs that happen to be true, and that the extent to which we shape the beliefs of others by pressing their buttons, rather than by relying on rational means, is often a matter of degree: there is a sliding scale of reliance on non-truth-sensitive mechanisms, with brainwashing located at the far end. He writes: “There’s clearly a world of difference between, on the one hand, the parent who tries to give her child access to a wide range of religious and political points of view; who encourages her child to think, question, and value reason; and who allows her child to befriend children with different beliefs; and, on the other hand, the parent who deliberately isolates her child, who ensures her child has access only to ideas of which the parent approves, who demands formal recitation of certain beliefs, who allows her child to befriend only children who share the same beliefs, and so on.”

Law concludes that “there’s at least one very obvious and important difference between the use of reason to persuade and the use of these kinds of belief-shaping techniques. Reason is truth sensitive. It favors true beliefs over false beliefs…… Reason functions, in effect, as a filter on false beliefs….. it’s not 100 percent reliable, of course— false beliefs can still get through. But it does tend to weed out false beliefs. There are innumerable beliefs out there that might end up lodging in your head, from the belief that Paris is the capital of France to the belief that the earth is ruled by alien lizard-people. Apply your filter of reason, and only those with a fair chance of being true will get through. Turn your filter off, and your head will soon fill up with nonsense.” Finally, he writes that when we rely on reason to try to influence the beliefs of others, we respect their freedom to make, or fail to make, a rational decision, whereas when we resort to pressing their buttons, we strip them of that freedom and render them our puppets. The button-pressing strategy is, in essence, a dehumanizing approach.
