We are all born empiricists. As infants, we begin to learn about the world through our senses. We watch, we listen, we feel, we taste. We learn to manipulate objects. We learn to crawl, and eventually to walk, by trial and error. From the beginning, we are endowed with both a curiosity about our surroundings and a capacity to experiment and observe. This is our first and purest way of knowing.
Later, when we acquire some facility for language, we learn a second way to know about the world. Typically, we start this new exploration by asking questions of our parents. “What is that?” “Will it bite?” Using language, we can ask questions about things that are beyond our present means of discovering firsthand. “What is the sun made of?” “How old do trees get?” We learn to incorporate our empirically-acquired knowledge and our linguistically-acquired knowledge into some tentative, incomplete, but more-or-less functional picture of reality. So armed, we venture forth into the world.
These two ways of learning, experience and language, are the only two ways of learning we will ever possess. It is worth examining how these two methods differ in the kind of knowledge1 they provide, and what the consequences are of ignoring this distinction.
An important, almost defining, characteristic of empirical knowledge is that its truth has nothing to do with how we might feel about it. On a clear summer day the sky is its own particular shade of blue -- whether you happen to like that shade of blue or not. While we can change some parts of the world in certain ways, the lesson we learn by observation is that most things have stable characteristics, or predictable transitions through a series of characteristics, that define them. To stay with childish discoveries for now, we discover that rocks are hard, heavy, and relatively changeless on our human timescale. Apples, on the other hand, begin small and green, grow large and (usually) red, and (if uneaten) turn brown, soft, and inedible. While we are capable of imagining mushy rocks and indestructible green apples, we will not find such things in nature. Illusions and other failings of our senses aside, our experience reliably shows us what is.
The knowledge we gain through language, on the other hand, is of a very different character. Even in the realm of what we might call material facts, experience and language spawn different kinds of ideas. To know by word of mouth that your grandmother is seventy-eight years old is not the same as being aware of her wrinkles, grey hair, and bent posture. In this case both the physical perceptions and the linguistic expressions do, in some sense, refer to the same underlying reality, but while wrinkles have a physical presence, “seventy-eight years old” is a conceptual entity – something we cannot physically point to. The ideas we express in words, while they may refer to the world2, are fundamentally constrained not by nature, but merely by the rules of language. One could as easily say that your grandmother is 678 years old. While probably not true, the proposition is just as expressible as one that is true. If the empirical realm corresponds to what is, the realm of language corresponds only to what is imaginable.
Every bit of knowledge we acquire through language alone is, in an important sense, imaginary. If I tell you, for example, that I once walked from Fort William to Glenelg in the Scottish highlands, unless you happened to witness my whole trek from end to end, you cannot know what I’m saying is true in anything like the same sense that you know what you ate for breakfast or where you spent yesterday evening. You imagine my journey, however abstractly, and you make a certain assessment about whether or not it actually occurred. If I tell you that I walked from Ascraeus Mons to the Fesenkov crater on the planet Mars you would also imagine my journey in precisely the same sense – even though you would probably make a different assessment regarding the truth of such a claim.
Knowledge we acquire through language is contingent in a way that empirical knowledge isn’t. When we are told something or read something, we automatically measure its plausibility against other ideas we have already accepted as true. In parallel to this, we also assess the credibility of the source. Were this credibility assessment always a measure of past reliability it would not be particularly problematic. (e.g., Jones has rarely been wrong about his forecasts of the weather, so if he says that it will rain today it probably will.) Unfortunately, credibility often rests on far less rational grounds. (e.g., Jones is my uncle, a model citizen, and a freemason in good standing -- so if he says that it will rain today it probably will.) Authority often has dimensions that have nothing to do with empirical reality.
Consider our earliest social relationships. It is normal for young children to accept the words of their parents as facts. In part, this may be based on empirical past performance: as long as parents are not grossly incompetent or deeply pathological they can be expected to answer most of their children’s questions about everyday matters accurately. It would be foolish, though, to imagine the trust that children have for their parents is altogether rational. In the first place, young children don’t have much existing knowledge to measure new knowledge against. They are innocent, ignorant or pathetically unskeptical – depending on one’s perspective. In the second place, parents have a privileged position as providers and protectors, and we are probably predisposed by millions of years of evolution to accept their authority, at least as children.
Parents, or at least some adult assuming some semblance of a parental role, serve as our first examples of credible authorities. It follows, then, that the parent-child relationship must be the model for all subsequent authority relationships, both with other persons and with non-personal entities such as gods and nations. Annoyingly Freudian as this may sound, I believe it is self-evident. If you have doubts about this claim, you need only ask yourself -- How could it be otherwise? What possible model for authority relationships could any individual experience earlier? Even if we assume the pattern for authority relationships is not learned but wholly ingrained in us genetically, we cannot escape the primacy of the parent-child relationship.3 Genetics, driven by natural selection, would tend to produce traits that are advantageous to our survival, and if our genes predispose us to trust anyone for the sake of our own survival it would certainly be our parents.
The connection between parental relationships and relationships to adult institutions permeates language. “Our father in heaven.” “The fatherland.” “Mother Russia.” “America’s founding fathers.” The word “patriot” derives, ultimately, from the Greek patēr, meaning father. Even phrases like “international brotherhood” imply a paternal relationship indirectly. In fact, one struggles to find a relationship with a national or religious institution expressed in any other terms. One may speak of “friendly nations,” but this “friendship” refers to a relationship between one nation and another, not an individual’s relationship to a nation.
Inevitably, we model new relationships on old ones. Having established a certain level of trust for an external authority from infancy, we are predisposed to look for truth in the linguistic constructions of others from then on. This is why the theist believes in scripture, why the patriot is stirred by patriotic speeches, and also, at least in part, the reason that the scientist trusts the contents of a professional journal. We cannot learn everything we want to know empirically, so we accept the assertions of those authorities with whom we feel connected. The professional group, the political party, the nation, the state, the church – all become surrogates for the family.
It is true that what human beings want to know is a different matter than who they are inclined to trust. Once the strictly necessary knowledge of how to cope with our everyday environment is learned, the quest for further material with which to occupy our minds can proceed in various directions. It varies according to one’s culture, one’s class, and no doubt plenty of highly individual factors. Typically, however, the quest for such non-essential knowledge is bound up with the quest for personal identity. Having worked out how to eat and not to be eaten, one can dabble in the luxury of experimenting with who one is. One can learn how to be a Christian, a communist, or a certified public accountant. Each of these identities has its own associated group of adherents and its own unique set of social rules. In terms of providing an identity they all meet the same need, even though they offer drastically different ways of looking at the world. Social identity is, as it were, familial identity writ large. It is an expression not of the need for a reliable source of knowledge, but of the need for a reliable source of personal context and security. To be a Christian, a communist, or even (to an admittedly lesser extent) a certified public accountant is to project what is essentially a family identity onto a group of individuals far too large and diverse to be a family. It is to expect a certain level of protection from inclusion in this group, even if, in some cases, this protection only amounts to a vague sense of social legitimacy.
The idea that all human beings are irresistibly drawn to seek the truth is a naïve one. We may all be born empiricists, but, as I’ve already at least implied, the practical function of empiricism is survival. Having burned one’s hand on a hot stove, one does not generally seek the opinion of an authority, even a parent, to confirm that it would not be wise to repeat the process. Most decisions, however, are more tolerant of error. Questions like “What is the sun made of?” may devolve naturally from the same innate curiosity that moved us to touch the stove, but whether the sun is a large gaseous ball of mostly hydrogen or a god with certain peculiar attributes matters little to one’s immediate survival. This is another sense in which our “knowledge” is of two kinds: those things we substantially need to know and those which we merely want to know. Whether or not false explanations of apparently non-threatening phenomena might be dangerous to the species is another question, and we shall touch on that later.
It is worth noting that almost everyone makes at least an unconscious distinction between beliefs passed to them by language and hard empirical truth. Even people who claim to believe fervently in the power of prayer, for example, rarely attempt to stop their cars with a prayer when a nice substantial brake pedal is available. Likewise, they would seldom pray for purely spiritual sustenance as a replacement for physically nutritious food. People pray for love, for cures from diseases, for emotional strength, etc. In short, they pray for things they do not know how to attain by any means known to them through actual experience. They do not, in general, pray for alterations of the physical world that their experience tells them do not occur. They do not ask God to do the laundry because they know intuitively that nothing will happen.
Religion, ideology, culture, and even science are cognitive luxuries – things we indulge in because language gives us the capacity to do so, and which then take on their own peculiar trajectories. They are byproducts of abilities we’ve evolved for other purposes. We can use our legs to dance if we are so inclined, but natural selection did not produce our legs for dancing.
Making matters even worse for the cause of truth is the fact that erroneous beliefs and destructive ideologies can be personally advantageous under certain circumstances. For an average Russian in the 1940s, a worshipful attitude toward Stalin was a better survival strategy than one of outspoken dissent, no matter how much Stalin may have lied or how many millions may have died as the result of his decisions. Or, to offer a less brutal example, given that the actual goal of the average Christian is a sense of security within the family of the church, an open and rigorous skepticism is rather counterproductive, especially if one lives in a predominantly Christian community. The believer believes that God exists in more-or-less the same way that the avid football fan believes his last-place local team is still, somehow, the best. It is not a matter of what is, it is a matter of who one is – and who one’s friends are.
The esoteric nature of much of human knowledge, along with the natural tendency to put undue trust in the often arbitrary constructs that make up human institutions, conspires against our native empiricism. This is a truth which the most educated among us often find especially difficult to grasp. Watch any debate between one of the more articulate advocates of atheism and any devout believer and you will witness the same tragic pattern, almost without fail. The atheist begins with the assumption that God’s existence or non-existence is a question about nature -- something you can work out empirically, like the age of a tree or the distance from here to the moon. The believer begins with the assumption that God is the head of a grand spiritual family, a figure whose existence is as unquestionable as the existence of the believer him or herself. The atheist lays out a concise, empirical argument. Unconvinced, the believer counters with the truth according to scripture. “Don’t insult my intelligence,” is more-or-less what the atheist is saying. “Don’t insult my family,” is more-or-less what the believer is saying. The debate ends in mutual perplexity and irritation, and no one’s mind is changed.
I believe (if you will pardon the irony with which I employ the term) that empiricism is generally a good thing. To unpack this slightly, understanding nature is a good thing, and one achieves that understanding by observation. Being able to learn by observation was a useful capacity when we were infants, and it’s a useful capacity to us still. The real universe is a far richer and, if I may say so, a more interesting place than any of the illusory worlds human beings have managed to invent. Physical reality is not without its dangers and we have to face them, but we in no way lessen those dangers by inventing new ones of our own. For many, I realize, a universe denuded of gods of one sort or another would seem an unbearably lonely place. This is a sensibility I do not share and frankly cannot share. Ultimately, such imaginary parents always demand more of us than their dubious comfort is worth. You may love God, your country, or the ideology of V.I. Lenin – but there can be no meaningful sense in which any of these ghosts can ever love you. That you mean anything to your God, your country, or your ideology is the saddest of self-deceptions. It isn’t hard to see that the willingness of people to slaughter one another over religious or ideological differences is not a trait that benefits our species as a whole, even if the odd individual might profit from it here and there. To be in love with one’s illusions is a personal tragedy; to be willing to kill for them is a tragedy for us all.4
If scientific and philosophical advancement is to be a boon to humanity we must guard against a tendency to become smug about our grasp of the truth. Knowledge should not be reduced to a mere initiation criterion for yet another narrow social group. If you are a humanist, a skeptic, or anyone with a generally empirical frame of reference, it is virtually certain that your views are the product of a unique and fortunate personal history. Illusions abound in human society and there is plenty of opportunity to succumb to them. If you have avoided or overcome such illusions, it is only because your personal circumstances have endowed you with a certain critical mass of empirical knowledge that has allowed you to see another way. In other words, you have learned enough real, substantiated facts to have a general sense of how things actually work, and have had enough experience to learn what sort of explanations are likely to bear scrutiny. If you have reached that point, then blind belief is simply not an option for you. Adopting such an empirical perspective is not a grand, heroic choice, but merely the result of a certain series of events. Had your life been only slightly different you might well have arrived at very different conclusions. If we are truly proponents of reason, we can hardly be outraged that the often intellectually stultified lives of others may have led them to adopt greatly different worldviews. Though we may naturally find their illusions frustrating, we may no more fairly despise the unenlightened than we may fairly despise the disabled or the illiterate. To do so would be, in itself, unreasonable. In the struggle against ignorance, intolerance is counterproductive.
To be a proponent of reason in an often irrational society is to strike a very difficult balance. While intolerance is counterproductive, a blind, all-encompassing tolerance can be nearly suicidal. One cannot placate the fanatically religious or the fanatically ideological with patience and civility. One must be willing to resist anyone who actively denies empirically substantiated knowledge, who impedes the progress of knowledge on purely superstitious grounds, or who seeks to impose ideas on another by brute physical force. Anti-evolutionists, bomb-wielding fundamentalists, political extremists and holocaust deniers must be opposed. On the other hand, we should never become so self-righteous in our dedication to the real that we feel it necessary to immolate Santa on a pile of Harry Potter novels. It must be admitted that some false beliefs can be genuinely harmless, and that often even deeply deluded people can be naturally humane enough to refrain from hostile or intolerant actions. The difficulty, of course, is that in any more-or-less functional democracy other people’s personal beliefs manifest themselves in public policy. The next-door neighbor who believes in a celestial father figure may not be a problem, but the millions of neighbors who elect a demagogue who undermines your civil liberties certainly are. Unable to trust in deities, we must put our faith in education. Ironically, the best tool we have to dispel humanity’s delusions may be the very language from which those delusions are ultimately made.
1 I am using the word “knowledge” in an everyday sense here, not in a narrow, epistemological one. I will use “knowledge” as a synonym for “belief” in certain cases. While the distinction between the two concepts is obviously important, I am striving for readability rather than philosophical rigor.
2 More rigorously, to “states of affairs”.
3 There are, perhaps, two distinct kinds of genetic predispositions to see the world in terms of authority, which we might call the competitive and the cooperative. Competitive hierarchies are the result of struggles for dominance between members of different species, or between members of the same species that are not closely related. Birds contesting for mates and territories would be an obvious example. While this sort of hierarchy may have nothing to do with parental relationships, it is not the kind of authority I am referring to here. We do not, after all, adopt the beliefs of our enemies because we fear them. Cooperative hierarchies, on the other hand, are the result of a struggle to advance the common interests of close relatives. Animals as diverse as ants and human beings engage in this sort of authoritarian organization, and it may be said that even the lowly worker ant’s activities are driven by a parent-child relationship – in this case almost wholly driven by its genes.
4 Of course, by my own earlier admission, my beliefs are probably no more than the reflection of my own self-identity. If I did not identify with certain views about what such a slippery term as “good” ought to mean, I would not indulge such a passion to proselytize.