“Science and religion are two windows that people look through, trying to understand the big universe outside, trying to understand why we are here. The two windows give different views, but they look out at the same universe. Both views are one-sided, neither is complete. Both leave out essential features of the real world. And both are worthy of respect.”
-- Freeman Dyson
It seems that at the beginning of the third millennium of Western civilization, we are still faced with the same perplexing question as our ancestors: What do we really know about our world and ourselves? By “know”, I do not mean to have cognizance or awareness, but to perceive and intellectively grasp a clear and certain understanding. But what exactly is a clear and certain understanding?
The Roman scholar Pliny the Elder posited in the first century A.D. that the only certainty is that nothing is certain. In the 17th century, the French mathematician, physiologist, and philosopher René Descartes started with this idea as a premise (actually an equivalent proposition using the word doubt) and used logic to establish a foundation for human knowledge in a similar manner as Euclid established a basis for geometry. But in his exhaustive search for certainty, he created an intellectual impasse that has since divided the natural world into mental and physical realms -- the Cartesian duality of “mind” and “matter”.
In 1739, the Scottish philosopher David Hume declared two types of truth: “truths of reason” (1+1=2) and “matters of fact” (If I release this ball, it will fall). Hume argued that all knowledge of the physical world is independent of reason and consists of only sensory experience. For instance, we see objects fall all the time, and thereby become cognizant or aware that it is the custom or habit for objects to fall. It is a “matter of fact” that things fall. Any explanation of how and why things fall is a “truth of reason,” and reason does not have any cause-and-effect relationship with the event of the object falling. The “law” of gravity expounded in 1687 by the English mathematician and physicist Isaac Newton demonstrates mathematically how things fall, and it precisely describes our observations of falling objects. But this is a product of reason, a human concept or idea of the cause and effect, not the cause and effect itself. The ancient Greeks called this “saving the appearances” -- i.e. this is a “mental model” or “construct” that we can use to attempt to understand and communicate how things appear to us to be true and real.
The 5th-century B.C. Greek sophist Protagoras, whose works were destroyed in antiquity and whose ideas survive chiefly through Plato’s dialogue that bears his name, refused to differentiate between sensory experience and reason by denying altogether any possibility of objective knowledge -- that is, we cannot know the reality of material phenomena independently of the concepts derived from our senses, and so all knowledge of reality is rendered subjective by each individual’s unique ability to perceive and reason.
The effectiveness of our thinking depends on how closely our perceptions correspond with objective reality while being confined within the subjective framework of a personal point of view. This personal point of view comprises the accumulated knowledge and experience that each of us uses to interpret new experiences and form new beliefs, but it also limits the very perceptions that give meaning to sensory stimuli. For example, we see light waves and we hear sound waves, but we perceive automobiles and music. Perception can only occur from within a personal frame of reference: from the same light waves in which a person living in Chicago would perceive an automobile, an indigenous inhabitant of the Amazon rain forest with no prior knowledge of automobiles would perceive a strange beast. A dissonant frame of reference thus restricts how we interpret new evidence, whether to solve problems or to acquire new knowledge. A personal frame of reference containing false perceptions can be difficult to overcome; had we not been able to discard the medieval belief that the Earth is the center of the universe, there would be no astrophysics today. Yet without some sort of guidance from our preconceptions, we would be unable to do anything at all.
Self-esteem can be a powerful stimulus to search for truth, but when threatened it can be an equally powerful stimulus to self-deception. Resistance to new ideas is directly related to their threat to self-esteem. When orthodoxy is seriously threatened, the self-esteem that is derived from it is also seriously threatened. When this happens, reactions to heresy are often emotional and physically violent. When Galileo’s Dialogue Concerning the Two Chief World Systems was published in 1632, it was not just a threat to the orthodox Ptolemaic worldview of medieval Scholastics. Pope Urban VIII’s self-esteem was also threatened when he was persuaded by enraged Aristotelians that Galileo had intended the butt of the dialogue, the scholastic Simplicio, to be none other than himself. Galileo was summoned before the Inquisition and was threatened with torture. He was forced to recant his thesis and was placed under house arrest for the rest of his life. His treatise was placed on the Index, where it would remain along with those of Kepler and Copernicus for over two centuries.
Galileo was spared the full wrath of the Inquisition, but three decades earlier the Italian philosopher Giordano Bruno was not as fortunate. February 17, 2000 marked the fourth centenary of his being burned at the stake as a symbol of magisterial intolerance of new ideas. Bruno’s quarrel was with the authority of Aristotle and the Church’s methods that encouraged ignorance and conformity through coercion. He incorporated Copernican principles in a pantheistic worldview that postulated an infinite universe; but unlike Galileo, Bruno’s theory was based chiefly on metaphysical grounds. Even though his era was a time, not unlike our own, when science was at odds with metaphysics, Bruno’s remarkable insight presaged modern relativistic cosmology in his book On Cause, Principle, and Unity:
“There is no absolute up or down, as Aristotle taught; no absolute position in space; but the position of a body is relative to that of other bodies. Everywhere there is incessant relative change in position throughout the universe, and the observer is always at the center of things [emphasis added].”
Bearing in mind these complications, let’s look at what we think we know about our world and ourselves at the beginning of the 21st century. What we think, we know. Precisely how this works is not known, but our brains probably use some of the strange properties of “matter” that will be discussed later. Human vision responds to only a small fraction of the electromagnetic spectrum; and since our brains ignore much of that, our visual experience is extremely limited. We believe what we see, but our beliefs result from each human “mind” being uniquely colored and influenced by the contingencies of many additional factors, including heredity, education, culture, environment, and historical circumstance. Consequently, how can we know whether there is any basis of empirical fact for the validity of these beliefs? Furthermore, how can we be certain that this is even possible?
We have journeyed into the 21st century, but not without passing through tremendous suffering and death imposed by false beliefs along the way. Morality myths are very seductive, and the misguided application of their inferences often leaves a tragic legacy that is difficult to absolve. In the late 5th century B.C., Hippocrates taught that pernicious disease was not sent by angry gods as punishment, but this lesson was not learned until well after the early 17th century, when what we now ascribe to germ theory was still perceived as the working of a divinely inspired morality play -- as were the Egyptian plagues described in the book of Exodus in the Hebrew Bible. Huge numbers of Native Americans died from smallpox and other epidemics while European Americans usually recovered. The European Americans were inspired by the belief that their god was on their side; the contrary was true for the Native Americans, whose gods ignored their plight, whose medicines did not work, and whose population was decimated. The resulting ethnocentrism of the European Americans further objectified the natives as inferior heathens, and this eventually led to acculturation and genocide. From today’s historical perspective, this was a major 17th-century geopolitical event that was initiated by chance natural occurrences, reshaped by false beliefs born of ignorance, and combined with intolerance stemming from fear and greed to form the 19th-century ideology of Manifest Destiny. If you live in the United States, you are part of this legacy of conformity.
Much of human history is a chronology of chance occurrences that evolved into major geopolitical events that determined who we are and what we believe. The fact that you and I exist at all as individual human beings is the result of innumerable chance events. In addition, according to the late Harvard University paleontologist Stephen Jay Gould, if life on Earth were to re-evolve, it is most probable that we would never happen again as a species. Yet it is interesting that Carl Jung, the 20th-century psychiatrist who founded analytical psychology, speculated that even what we attribute to “chance” as being merely ignorance of future events could be the result of unknown physical “laws” in cases where known “laws” of simple cause-and-effect do not work. Nonetheless, the question remains whether “laws” pertaining to matter and energy actually govern nature, or are merely descriptive abstractions of our perceptions of nature’s habits.
The scientific method that is based on observation and experiment was introduced to the modern world in 1620 by the English philosopher Francis Bacon, and it has served to liberate humankind from many of its false and crippling beliefs, particularly in the areas of biology and medicine. The term “agnostic” was coined by the 19th-century English biologist Thomas H. Huxley. Huxley’s “agnosticism” has suffered the same fate as Darwin’s “evolution” by being misinterpreted, misrepresented, and misapplied. Science uses Huxley’s “method” by starting with a testable hypothesis, such as the circumstantial evidence of Hume’s “matters of fact,” and follows reason as far as it will allow while not pretending certainty in matters that cannot be demonstrated. Even if this search terminates with negative results, the search alone is important by demonstrating what is not “true.” In addition, some important breakthroughs in science have been serendipitous. Useful things that were not the object of the original search simply have been stumbled upon. The people whose lives have been saved by the antibiotic penicillin aren’t concerned that its discovery was an accident.
An interdisciplinary approach with its diversity of ideas is as essential to the pursuit of knowledge as diversity of the gene pool is to the survival of a species. Since there was so much yet to be discovered during the 17th century, the explosion of knowledge about our world that began at this time could be attributed to the open-minded approach of the field naturalist. It was a time when one could afford to be interested in everything. Today it is a different story since it is necessary to specialize in order to succeed. Huge amounts of material with esoteric terminology obfuscate the curricula of specialized disciplines to the extent that hardly anyone can understand any specialty other than their own. The resulting fragmentation of perspective limits the possibility of discovering a larger significance for things encrypted within the minutiae of a single discipline. Even the word academic has been reduced to mean something of no practical significance.
Honest intellectual debate is important and productive, while the clash of partisan politics among specialized disciplines is counterproductive. Given sufficient time, often the only difference between heresy and prophecy has been their sequence in history. The persecution of Galileo in 1633 by the papal Inquisition for his advocacy of the Copernican heliocentric model of our planetary system is an important example of witch-hunts that date as far back as the conviction of Socrates. Plato realized that with the conviction of Socrates, no one could long maintain his independence and integrity within the framework of partisan politics. If history teaches us anything it is that orthodoxy is transitory, and each new worldview is eventually replaced with another -- the most recent being the rejection by mainstream science of Einstein’s view that God does not play dice with an eternal universe. I cannot imagine what a Weltanschauung will be like a thousand years hence.
A more recent example of a witch-hunt was the 1950s equivalent of a book burning: the suppression of Dr. Immanuel Velikovsky’s best-selling book Worlds in Collision. Velikovsky, a Russian-American physician, took a psychoanalytic approach to investigating how the events of the Exodus, which the Hebrew Bible claims occurred by divine intervention, could instead be attributed to natural causes. He speculated that there had been a close encounter of Earth with a cometary Venus, and proceeded to corroborate it historically with ancient legends from texts around the world that recount contemporary perceptions of extraordinary occurrences. His theory flew in the face of orthodox Judeo-Christian beliefs and the prevailing uniformitarian doctrine of science, which attributed all geological change to existing forces operating uniformly from the origin of the solar system to the present time. The threat of large comets and meteorites was believed to have ceased some 3,800 million years ago.
Surprisingly little was said in religious circles. A few influential scientists in specialized disciplines declared his theory scientifically invalid -- which is remarkable, since some of them also admitted to not having read the book. That was inconsequential because it was not a matter of hermeneutics: it was a heretical book that jeopardized their proselytism. They threatened a boycott of his publisher and forced it to relinquish the rights to this best-selling book to another house. Velikovsky’s editor was dismissed. It was not unorthodox theology that was causing the stir; it was unorthodox science. Intolerance is not an exclusive club.
The renowned theoretical physicist Albert Einstein reflected on such matters with this aphorism: “Yesterday idolized, today hated and spit upon, tomorrow forgotten and the day after tomorrow promoted to Sainthood. The only salvation is a sense of humor.” Velikovsky was never given an impartial and fair hearing on this matter, and the entire affair caused him to suffer from serious depression. Velikovsky’s dour mood was expressed in a gift inscription to his editor in a copy of Ages in Chaos, a sequel to Worlds in Collision: “Another heretical book that will cause wrath and indignation of those whose teaching is threatened by it.” Einstein compared the reception with that accorded Johannes Kepler. He noted that contemporaries often could not differentiate between a genius and a crank, and he encouraged Velikovsky to see the humor in the entire affair. Einstein conceded the possibility of a catastrophic scenario, although he would not consider Venus as the culprit. But at the time of Einstein’s death, some five years after the book’s publication, Worlds in Collision was said to have been open on his desk. He was keeping an open mind, since throughout his life Einstein was unconcerned about conformity and loathed hypocrisy and dogma.
The last fifty years of spectacular advances in technology have enabled discoveries that confirm some of the specific predictions of Velikovsky’s “crackpot” theory, though these are more simply explained within the scope of other theories. However, consider that people in modern times had never witnessed the collision of a comet with a planet until the fragments of comet Shoemaker-Levy 9 smashed into Jupiter in July 1994 and produced several impact areas the size of the Earth. Also consider the conclusive crater evidence recently found near the Yucatan peninsula in Mexico that further confirms the theory of Luis and Walter Alvarez that the impact of a large object caused the dinosaur extinction event at the end of the Cretaceous some 65 million years ago. In addition, there is new evidence implying that the impact of a comet or asteroid caused the most severe mass extinction event in the geological record, which ended the Paleozoic Era some 250 million years ago. Couple these unique chance occurrences with the “many-body” problem of nonlinear dynamics, and it becomes clear that a strict uniformitarian doctrine is not correct. In a self-evolving complex system that is sensitive to its initial conditions, very small unrelated chance events will produce unpredictable and sometimes drastic results by triggering a series of increasingly significant events. However, it has been problematic for science to use historical accidents as proof of any theory, even though this fundamental contingency is intrinsic in nature and pervasive in all of science, from particle physics and astrophysics to climatology and developmental biology.
In 1912, Alfred Wegener’s “crackpot” theory of continental drift was ridiculed to a somewhat lesser degree. It turned out to be correct, though Wegener had proposed the wrong mechanism; the actual mechanism, plate tectonics, was established by geophysicists some fifty years later. This further substantiated the theory of evolution by providing a mechanism for the particularities in the geographical distribution of animal species first noted by Darwin’s contemporary Alfred Russel Wallace. Similarly, Velikovsky’s theory was labeled “crackpot” because the mechanism of its causality disagreed with prevailing models of celestial mechanics, which called for a stable and peaceful solar system in recent history. But if Velikovsky’s theory were to turn out to be “right” in the same sense as Wegener’s, think of the profound implications this would have for the foundation of faith of two-thirds of the world’s population. That faith would rest on a unique chance occurrence of nature instead of a divinely inspired morality play, and the Hebrew Bible would be nothing more than a natural history of an ancient people, chronicling their emergence from barbarism to form a small nation. However, since truth has little to do with ideology, it probably wouldn’t make any difference except to the descendants of people on the wrong side of history’s bias.
Science rejects eyewitness accounts of unique occurrences because they cannot be subjected to empirical testing, and since such accounts lie outside the purview of science, this rejection is warranted. However, science cannot provide all the answers, and we should not discount what the ancients knew or observed. Historians are not restricted to discovering predictive patterns, but instead merely hope to re-create accurate historical records as a means of preserving knowledge. Similar accounts of extraordinary events that occurred in the same time frame to geographically and culturally isolated peoples around the world are likely to be descriptions of the same event happening on a global scale. Myths bear the stigma of being fables, and many are works of sensational fiction. Yet some are historical accounts that have been reshaped by false beliefs created in the collective imagination of an era. Our ancestors tried to make sense of events by creating connections and relationships with the use of metaphors that mirrored their ancient worldviews, and they left a rich legacy of written accounts of what they perceived to be happening in their world. If Heinrich Schliemann, a 19th-century amateur archaeologist, had accepted his critics’ preconceived notions of what archaeology ought to be, and their conviction that there wasn’t a shred of truth in Homer’s Iliad, he would never have excavated Hisarlik (Troy) or made his other discoveries at Mycenae, Ithaca, and Tiryns.
It is difficult to visualize the importance of celestial phenomena to ancient societies that did not have the technology of telescopes, since today only a few people with an unaided eye could locate Venus in the night sky from among the countless stars in the heavens, let alone distinguish any terrestrial significance from this tiny point of light. Yet the sun, moon, planets, and comets were paramount in ancient worldviews as gods, and they were afforded as many human attributes as the human mind could improvise. Since science along with its technology had not yet been invented, our ancestors could only imagine the unseen causes of phenomena in human terms -- not seeing things objectively as they are, but subjectively as they, themselves, were. Throughout human history, each and every view of the universe has contained some aspect of the processes of the human brain. Today “matter” is perceived differently, but it is still subjective. Evolving from the anima mundi of the ancient Romans, “matter” now consists of “fields” and “forces” instead of possessing “souls.” Empty space is teeming with unseen fluctuating energies of the virtual particles of a modern Aether. Yet nothing has changed except human perception.
To be continued . . .
Harold Williamson is a Chicago-based independent scholar. He can be reached at: email@example.com. Copyright © 2005, Harold Williamson