The larger political and philosophic phenomenon that may be called “modernity” was the result of an effort, by and large successful, to bring about a fundamental change in human life—a change as significant as that brought about by the victory of Christianity in the fourth-century Roman Empire. This change was proposed by a group of thinkers in the 16th and 17th centuries, among the most prominent of whom were Niccolò Machiavelli, Francis Bacon, René Descartes, Thomas Hobbes, Benedict Spinoza, and John Locke.
This claim is admittedly astounding and may well seem incredible to some. Indeed, many scholars have rejected it, interpreting the thinkers noted above as much less innovative than the notion of a “modern project” implies. Machiavelli, far from being a founder and promoter of “new modes and orders,” has been understood as striving for the revival of classical republicanism. The English editor of Bacon’s works, Thomas Fowler, saw him as “mid-way, as it were, between Scholasticism, on one side, and Modern Philosophy and Science, on the other.” John Locke has been seen not as a bold innovator in the theory of natural law but as a follower of “the judicious Hooker,” an Anglican theologian in the Thomistic tradition. This article will not enter into this debate. However, whatever their intentions and self-understandings may have been, the changes in human life that have come into being since their time reflect much of what they wrote.
What was the modern project? In brief, it involved a new political approach aimed somewhat single-mindedly at security and prosperity, and a reformulation of human life on the basis of a new philosophic/scientific method aimed at increasing man’s power over nature. The former became the triumph, however beleaguered and uncertain, of liberal democracy as a mode of governance. The latter, which is most important for my purposes here, became modern science, with all the advances in technology and medicine it made possible.
The new science of nature can be regarded as new in at least two respects: a new approach, and a new goal. The new approach may be described as dogmatism based on skepticism: in other words, as Descartes explained in his Discourse on Method, the proper procedure for science is to discard every idea and notion that can possibly be doubted and then build up a solid structure of knowledge on the basis of what remains—that is, what is indubitably true. Any conclusions that could be reached by means of this method, Descartes claimed, would necessarily be known as confidently as the proofs of geometry.
The comparison to geometry is not accidental: Whereas the philosophic schools of antiquity continued arguing with each other for centuries without reaching any sort of agreement, the truths of geometry were unchallenged. Thus, the geometric method—producing by strictly logical means air-tight conclusions based on seemingly indubitable first principles or axioms—recommended itself as the way out of the “scandal” of the schools, the constant debate among them that seemed to go nowhere.
The radicalism of this approach may be seen in Francis Bacon’s discussion, in The New Organon (henceforth, TNO), of the “idols” which he claims “are now in possession of the human understanding.” Our ordinary ability to understand nature is so deficient that basically nothing we believe can be trusted. We can’t begin with our ordinary notions and then seek to refine them: “the human understanding is like a false mirror, which, receiving rays irregularly, distorts and discolors the nature of things by mingling its own nature with it.”
To understand nature, we must do more than observe it and reflect on what we see. We must question it by means of carefully designed experiments and precisely record the answers it gives us: “all the truer kind of interpretation of nature is effected by instances and experiments fit and apposite; wherein the sense decides touching the experiment only, and the experiment touching the point in nature and the thing itself.”
Instead of relying on our ordinary observations of nature, we must discover a solid basis for founding a reliable structure of knowledge. That basis can’t come from (unreliably perceived) nature; it must be found in ourselves. We need an “Archimedean” point (the justification for using that phrase will become clear below) from which to begin our construction of the scientific edifice. As Hobbes notes, we can know with certainty only what we ourselves make.
As important as this change of approach is—science would no longer be the refinement and correction of common opinion but a humanly constructed structure of logically consistent propositions—the even more important innovation of modern science is its goal. As Bacon emphasizes in The Advancement of Learning (henceforth, AL), the biggest change he is advocating has to do with the purpose of science: Knowledge should be not “a couch whereupon to rest a searching and restless spirit; or a terrace for a wandering and variable mind to walk up and down with a fair prospect,” but rather “a rich storehouse . . . for the relief of man’s estate.” The goal of knowledge is no longer to enhance the good of the individual knower (by, for example, freeing him from superstitious terrors or satisfying his innate desire to know), but to “establish and extend the power and dominion of the human race itself over the universe” (TNO). Ancient philosophy’s failure to adopt this as the goal of its activity represents, according to Bacon, “the greatest error of all the rest” (AL) with which it may be charged.
For Bacon, the goal of human mastery of nature comes down to this: “On a given body, to generate and superinduce a new nature or new natures is the work and aim of human power” (TNO). In other words, humans would be able to change any substance into any other, or, for that matter, into new, hitherto unknown substances that will have whatever qualities we want.
Although Descartes does not call attention to this point to the extent that Bacon does, he is in agreement with him: In the Discours de la méthode he wrote that he felt compelled to publish his ideas once he saw that, as opposed to the “speculative philosophy which is taught in the schools,” they could enable us to become “as masters and possessors of nature.”
The key to developing this kind of science is to focus on efficient causes: “Human knowledge and human power meet in one; for where the cause is not known the effect cannot be produced . . . that which in contemplation is as the cause is in operation as the rule” (TNO). By knowing the efficient causes of various effects, humans may be able to produce them—“artificially,” as we would say. This, of course, cannot be guaranteed, but, without knowing the efficient cause of something, it would be sheer luck if humans stumbled across a method of producing it.
Underlying this project is the assertion that “in nature nothing really exists besides individual bodies, performing pure individual acts according to a fixed law . . . the investigation, discovery, and explanation [of this law] is the foundation as well of knowledge as of operation” (TNO). These laws have nothing in common with the notion of “natures” in the Aristotelean sense. Bacon certainly understands the source of the Aristotelean understanding: “when man contemplates nature working freely, he meets with different species of things, of animals, of plants, of minerals; whence he readily passes into the opinion that there are in nature certain primary forms which nature intends to educe” (TNO). However, as Bacon’s goal cited above makes clear, he regards Aristotelean “natures” as superficial; if his scientific project is successful, we will achieve, among other things, the alchemists’ dream of transmuting lead into gold—there is nothing about the “natures” of lead and gold which makes this impossible. (Indeed, from the point of view of modern physics, it is a matter of removing three protons and seven neutrons from each lead atom.)
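As an aside, the nucleon arithmetic behind that parenthetical can be checked in a few lines. This is an illustrative sketch only; the choice of the stable isotopes lead-207 and gold-197 is an assumption (the figures differ slightly for other lead isotopes):

```python
# Nucleon bookkeeping for the alchemists' dream: Pb-207 -> Au-197.
# Assumption: we compare the stable isotopes lead-207 (Z=82) and gold-197 (Z=79).

def nucleons(Z, A):
    """Return (protons, neutrons) for atomic number Z and mass number A."""
    return Z, A - Z

lead_p, lead_n = nucleons(82, 207)   # lead-207: 82 protons, 125 neutrons
gold_p, gold_n = nucleons(79, 197)   # gold-197: 79 protons, 118 neutrons

protons_removed = lead_p - gold_p    # 82 - 79 = 3
neutrons_removed = lead_n - gold_n   # 125 - 118 = 7

print(protons_removed, neutrons_removed)  # -> 3 7
```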
However, the real causes of the phenomena we see are not visible to us unless we approach the problem methodically. Until we understand these real causes, we won’t be able to effect the changes we want: “For seeing that every natural action depends on things infinitely small, or at least too small to strike the sense, no one can hope to govern or change nature until he has duly comprehended and observed them” (TNO). In most cases, such “observation” can only be done by means of experiments, including the use of instruments which can detect these sub-microscopic events and reveal the results to us via counters, dials, and so forth.
Evolution of Modern Science
Despite Bacon’s importance for the development of modern science, it took several centuries before his vision began to take shape in reality. The first major development in science following the publication of Bacon’s works—Newton’s laws of motion and gravitation, which effectively did away with the notion that terrestrial and celestial objects were essentially different—didn’t require any investigation or understanding of the “infinitely small” bits of matter whose behavior, according to Bacon, underlies the observable phenomena. Similarly, it took a while before any technological innovations arose which depended on Baconian science—for example, the steam engines developed in the 18th and 19th centuries, which played such a big role in the industrial revolution, could be understood on the basis of pre-scientific common sense.
In the 19th century, however, with developments in areas such as electro-magnetism and chemistry, we began to enter a Baconian world in which the “secret springs” of nature were being understood and then harnessed by man for useful purposes, to produce effects of which common sense and naive observation would never have given us the smallest inkling. Bacon’s prediction, made centuries earlier, had been vindicated: After discussing the fortuitous discoveries of gunpowder, silk, and the magnet (he claimed no amount of speculation based on naive observation would have led men to suspect the existence of these items), he concluded: “There is therefore much ground for hoping that there are still laid up in the womb of nature many secrets of excellent use, having no affinity or parallelism with anything that is now known, but lying entirely out of the beat of the imagination, which have not yet been found out” (TNO).
Since those discoveries of the 19th century, the pace of scientific and technological progress has only accelerated. There is no need to catalogue all the ways in which scientific progress made possible the technologies that have changed our lives so much in the 20th and 21st centuries. While we can expect that this scientific progress will continue, in ways that will make real all sorts of technological possibilities, including some that we might regard today as redolent of science fiction, I believe we are at a point where we can fruitfully take stock of the modern scientific project and probe the challenges and paradoxes into which it is running—difficulties which, in hindsight at least, we can see were inherent in its initial structure and intention.
Divorce of Science from Philosophy
It is a commonplace to say that science became divorced (or, perhaps, emancipated) from philosophy in the modern period. From the scientific perspective, it is fair to say that philosophy is seen as a “handmaiden,” whose job it is to clear away any linguistic misunderstandings or puzzles, so that scientific progress can continue. At most, it can explain and justify the procedures scientists actually use, such as Karl Popper’s theory of falsification, which sought to “solve” the “problem of induction” as discussed by writers such as David Hume.
From the philosophic perspective, however, science may be characterized by its constrained ambition: It explicitly renounces any attempt to understand why we, or other beings, exist. As Bacon explained in his discussion of the “idols of the tribe,” the “human understanding” restlessly seeks for ultimate answers. We aren’t satisfied with “general principles in nature” (such as the laws of nature as science discovers them) but wish to attribute them to “something prior in the order of nature.” However, according to Bacon, we should treat the “general principles in nature” as “merely positive” (TNO); they cannot be referred to another or higher cause. In other words, according to Bacon, modern science begins with a “self-denying ordinance”—it cannot ask “ultimate” questions, such as why we (or anything else) are here. That must be left to religion or philosophy, although Bacon himself admonishes that only an “unskilled and shallow philosopher [would] seek causes of that which is most general” (TNO).
This self-restraint of science need not, in itself, be the cause of a crisis. Most scientists probably accept the view that since science doesn’t deal with “ultimate” questions, it cannot opine on questions of religious belief—at least core religious beliefs such as the existence of God and an afterlife, the notion of God as the ultimate creator of all that is, and so forth.
It is true that some scientists—the “new atheists”—now claim that a refutation of religion is scientifically possible. To some extent, they produce arguments that religion is highly improbable and seek to conclude from this that it is impossible. But what religious believer ever thought that revelation was anything but miraculous—and thus improbable? In addition, they seek to give a “naturalistic”—that is, evolutionary—account of the development of religious belief, to counter the view that religious belief originated in revelation. We need not consider how compelling their accounts really are; of much greater theoretical importance is the argument that, if religious belief is the product of an evolutionary development process, it is hard to see why the same process does not explain philosophic beliefs as well, and ultimately the development of modern science itself. Thus science (despite what the new atheists assume) would have to allow that beliefs which evolve under evolutionary pressures can nevertheless be true.
Aside from the “new atheists,” many thoughtful scientists have looked to the very orderliness of nature—the fact that it obeys laws that can be expressed compactly and elegantly in mathematical formulae—for evidence that science has reached a level of fundamental truth. One of the 20th century’s leading physicists, Richard Feynman, said in a lecture on the law of gravitation that he was “interested not so much in the human mind as in the marvel of a nature which can obey such an elegant and simple law as this law of gravitation.” This (mathematically) elegant and simple law allows the scientist to predict how hitherto unobserved phenomena will play themselves out. Feynman asks: “What is it about nature that lets this happen, that it is possible to guess from one part what the rest is going to do? That is an unscientific question: I do not know how to answer it, and therefore I am going to give an unscientific answer. I think it is because nature has a simplicity and therefore a great beauty.” Somehow, Feynman is able to understand the results of modern science as pointing to a truth, albeit an “unscientific” one.
Divorce from the World of Experience
In his famous Gifford lectures of 1927, the astrophysicist Arthur Eddington began by distinguishing between the “familiar” table, which is reassuringly solid and substantial, and the “scientific” table, which is “mostly emptiness. Sparsely scattered in that emptiness are numerous electric charges rushing about with great speed; but their combined bulk amounts to less than a billionth of the bulk of the table itself.”
According to Eddington, this divorce between the humanly perceived world and the world according to science was a new development, due to scientific progress in delving into the composition of the atom:
Until recently there was a much closer linkage; the physicist used to borrow the raw material of his world from the familiar world, but he does so no longer. His raw materials are aether, electrons, quanta, potentials, Hamiltonian functions, etc., . . . There is a familiar table parallel to the scientific table, but there is no familiar electron, quantum, or potential parallel to the scientific electron, quantum, or potential. . . . Science aims at constructing a world which shall be symbolic of the world of commonplace experience. It is not at all necessary that every individual symbol that is used should represent something in common experience or even something explicable in terms of common experience.
As we have seen, this development was “baked into the cake” from the Baconian beginnings of the modern scientific effort. But contemporary science does much more than question our ordinary understanding of the character of the objects that we encounter in daily life. Due to developments such as relativity and quantum mechanics, science also questions our most basic understanding of space and time, not to say logic.
This is much more disconcerting. It is one thing to say that the “real” characteristics of external objects (that is, their characteristics as science describes them) may be distorted in the process of our perceiving them, so that what we perceive is not necessarily what is “really” there. After all, we are familiar with examples of this in daily life: We know that, for example, what appears to us as water far off in the desert is in fact a mirage. However, our concepts of space and time don’t come from the external world but rather are those we use to understand it. When science tells us that these concepts (Euclidean geometry, the three-dimensional character of space, the “absolute” character of time) are wrong, we are at a loss. How can we imagine—as special relativity tells us—that space and time are part of a single space-time continuum, such that two simultaneous events separated by a given distance (as it appears to us) can be equally validly described by another observer as sequential events separated by a different distance? Or that space itself can be distorted, compressed or expanded, as general relativity tells us? Or, perhaps even more strangely, that a measurement made on a particle at one point can “instantaneously” affect the characteristics of another particle indefinitely far away—that is, that “non-locality,” which Einstein derided as “spooky action at a distance,” not only exists but has been experimentally verified?
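The simultaneity claim can be made precise. Under the standard Lorentz transformation (a textbook formula, not anything peculiar to this article), an observer moving at speed v along the line joining the events assigns coordinates

```latex
t' = \gamma \left( t - \frac{v x}{c^{2}} \right),
\qquad
x' = \gamma \left( x - v t \right),
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}.
```

Two events that are simultaneous in the first frame (Δt = 0) and separated by Δx thus acquire a time separation Δt′ = −γvΔx/c², which is nonzero: the “simultaneous” events become sequential, and their spatial separation changes to Δx′ = γΔx, exactly the situation the question describes.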
As a result, the details of what, according to science, is “really” going on at any time bear no relation to the world with which we are familiar. Ultimately, just as Bacon suggested, the point of contact between the “real” world of science and the world with which we are acquainted is that the visible results of experiments agree with the predictions made on the basis of the scientific theories, regardless of the fact that the scientific theories themselves appear to describe physical situations that we cannot even imagine. Indeed, Feynman has gone so far as to assert: “I think I can safely say that nobody understands quantum mechanics.” As the flippant version of the reigning (“Copenhagen”) interpretation of quantum mechanics puts it: “Shut up and calculate.” The “real” world has indeed become, as Friedrich Nietzsche said, a fable.
None of this, however, need constitute a “crisis” of modern science, since none of it affects the intellectual and practical core of the enterprise. As long as science progresses, throwing off new technological benefits as it does, it can maintain its intellectual credibility and the necessary material support. Bacon had already predicted that the science he was proposing would be comprehensible only to a scientific elite: “It cannot be brought down to common apprehension save by effects and works only” (TNO). Now, as the Feynman quip indicates, it may be only partially comprehensible even to that elite.
So if the increasing divorce of science from “truth” and “experience” is not an impediment to its further progress, what then is the issue? The assertion of this article is that there are two looming theoretical, if not practical, crises: one having to do with the goals of science, the other with its means. They both stem from an issue that was present at the birth of modern science but that is only now on the cusp of making itself felt: the fact that modern science regards man as having two roles. As a scientist, man is the analyzer (and hence potential or actual manipulator) of nature, whereas, as a member of the animal kingdom, he is as much a part of nature as any animate or, for that matter, inanimate object, to be understood (and even manipulated) according to the same principles and processes.
Although man’s dual role as the manipulator and the manipulated was evident at the beginning of modern science, it was originally a purely speculative matter. For all practical purposes, “human nature” could be taken as a given. Man was only theoretically, not practically, able to manipulate his own nature.
This began to change in the late 19th and 20th centuries, when movements such as eugenics proposed to “improve” human beings as a species, using supposed scientific knowledge to identify those who should be encouraged to reproduce, and those who should be discouraged or even prevented. The “scientific” basis of this movement rested on some very simplistic notions about genetics: For example, Charles Davenport, the biologist who founded the American eugenics movement in 1898, “believed that complex human traits were controlled by single genes and therefore inherited in a predictable pattern.”
In the mid-20th century, American psychologist B. F. Skinner proposed the science of behaviorism, or operant conditioning, as a means of improving human nature. In his novel Walden Two, Skinner described a utopian society in which operant conditioning has successfully molded citizens’ behavior. In the Soviet Union, the even more ambitious task of creating a “new Soviet man” was proposed. In an extreme statement, Leon Trotsky wrote that, under socialism, man
will try to master first the semiconscious and then the subconscious processes in his own organism, such as breathing, the circulation of the blood, digestion, reproduction, and, within necessary limits, he will try to subordinate them to the control of reason and will. Even purely physiologic life will become subject to collective experiments. The human species, . . . in his own hands, will become an object of the most complicated methods of artificial selection and psychophysical training.
This ambition was subsequently bolstered by the Soviet adoption of the pseudo-science of Lysenkoism, according to which acquired characteristics (which do not affect an individual’s genetic make-up) could nevertheless be transmitted to progeny. This, according to one scholar, was “the essential magic key that would open up the possibility of reshaping man and creating the New [Soviet] Man.”
These 20th-century attempts to manipulate human nature rested on scientific bases that can now easily be seen as laughably inadequate. The technologies underlying the society of Aldous Huxley’s Brave New World—the “fine-tuning” of in vitro fertilization so as to produce castes of human beings with predictably different mental and physical abilities, as well as the existence of a drug that produced temporary euphoria with no “hangover” or other negative consequences—remained safely in the realm of science fiction.
However, one has to wonder whether, given the tremendous recent progress in genetics and neuroscience, we can be confident that this will remain the case in the present century. If not, then the ability to manipulate “human nature”—sought in vain by the visionaries of the past—may become thinkable.
This is not the place for a review of the status of genetics and neuroscience, and what their prospects are for the remainder of this century and beyond. Progress in practical aspects of genetics has been very rapid, and it is becoming possible to “personalize” medical procedures and cures according to a patient’s genetic make-up. Using a new genetic engineering technique (CRISPR-Cas9), “researchers have already reversed mutations that cause blindness, stopped cancer cells from multiplying, and made cells impervious to the virus that causes AIDS.” At the same time, it has become clear that almost all relevant human characteristics, as well as susceptibility to most diseases, depend on the complex interaction of many different genes and environmental factors; in other words, we are a long way from knowing which genes should be altered, and how, in order to produce “designer” babies with increased intelligence, athletic or artistic virtuosity, or whatever characteristics their rich parents may desire.
Progress in neuroscience has been equally rapid. New imaging techniques have increased our knowledge of how the brain functions and which of its parts are responsible for specific mental activities. The discovery of neurotransmitters such as serotonin, and the increased understanding of how they function in the brain, have enabled the development of such psychotherapeutic drugs as Zoloft, Paxil, and Prozac. Nevertheless, as one British researcher has concluded, “modern neuroscience research has, as yet, had minimal impact on mental health practice,” although he goes on to predict that “we are on the brink of an exciting period.”
In short, in both these crucial areas, it appears that we have, in the past decades, been accumulating basic knowledge and improving techniques at a rapid pace, but that the major pay-offs are still, by and large, in the future. Of course, we cannot be certain that even these recent advances will be enough to support the ambitious objectives that have been posited. Perhaps, to scientists of the 22nd century, genetics and neuroscience will appear as inadequate to the task of manipulating human nature as eugenics and Lysenkoism do to us now.
But what if genetics and neuroscience really do have the potential that their advocates believe? Under this assumption, the consequences may show up in at least two ways: with respect to the goals of the scientific enterprise, and with respect to how it understands its own functioning.
Science and the Human Good
As for the goals of science, we have noted the statements of Bacon and Descartes to the effect that the goal of science is to increase man’s power over nature. But these famous formulae are less precise about man himself and what constitutes his good. In particular, what are the good things for man that science will enable us to procure?
To some extent, of course, this question can be dismissed as unimportant. The abolition of hunger, the improvement of health, the invention or discovery of new and improved products for our convenience, comfort and amusement—all these things can be easily accepted as good for man without any need to philosophize about them. Underlying this easy acceptance is, however, our belief that we know what man is, and that we can accept as given our notion of what is good for him. Thus, regardless of the adoption of the “fact-value” distinction by contemporary social science, one might think that the modern natural scientific enterprise—in its role as the ultimate source of technological advances—would be justifiable only if it knew something about the human good.
However, it explicitly denies that it possesses any such knowledge. It contents itself with producing new technological possibilities and then is silent about whether these possibilities will be used for good, to say nothing of how to increase the chances that they might be. Thus when science gives rise to inventions, the potential of which for evil is manifest (nuclear weapons being the standard example), it has nothing, as science, to say. Given the division of humanity into competing nations, weapons can be developed even if everyone believes that their development is bad for mankind as a whole.
However, the advances on the horizon, which will increase human power over human “nature” itself, raise a much more fundamental question. How can science be for the good of man if it can change man and his “nature”? Would not a change in human “nature” change what is good for him? More fundamentally, is there any clear standard by which one could judge which changes in human “nature” are beneficial for human beings?
For example, liberal democracies hold that the opportunity freely to express one’s opinions and espouse one’s religious beliefs is something most men want and is, in fact, good for them. Thus political (and to some extent, technological) arrangements that facilitate this are to be favored. But could one not imagine the development of human beings, by genetic or other means, who would not feel such wants? Indeed Aldous Huxley’s Brave New World explicitly imagines this possibility. Of course, we citizens of liberal democracies are horrified at the thought of such things. But, in other regimes, the powers-that-be might find it an extremely attractive prospect, and they could argue that human beings who did not strongly care about their opinions and beliefs would be less likely to fight over them, making for a more harmonious society. They would argue that the liberal democratic position was just based on an unreasoning conservatism, a mindless preference for the way things have been rather than the way they could be.
Is this a problem for science itself? One result is that the goal of science could no longer be said to be the achievement of dominion over nature for the good of man, but instead the achievement of dominion over nature simply. To the extent that this power is used to manipulate man himself, the question would be: to whom are the scientists and technologists responsible? This has been a vexed question in any case. But certain aspects of human nature have tended to favor the victory of liberal democracy over the course of the centuries. Briefly, despite its weaknesses, liberal democracy has given most of the people most of what they want most of the time, which accounts for its relative strength and stability. But this also rests on certain other characteristics of human nature that on occasion motivate people to run great risks in the fight for freedom. If those characteristics can be manipulated, who is to say that other forms of government cannot be made stronger and more stable? We would not regard a science that serves to strengthen tyrannical forms of government—no matter how benevolent (along the lines of Brave New World) they could claim to be—as operating for the “relief of man’s estate.” But if human nature were suitably altered—that is, tamed—it is not clear why anyone would object to such a tyranny.
Furthermore, in the past the “good of man” would always have been understood to mean the good of the human species, including those members yet to be born. Indeed, one could argue that science saw itself as more in the interest of future generations than of the present one, since the notion of scientific progress (the accumulation of new knowledge and hence new power) implies that future generations will have more technology at their disposal than we do. Genetic engineering, however, carried to an unlikely but not unimaginable extreme, would imply that the current generation is in a position to determine the characteristics of future generations.
We could perhaps, for example, make them like Nietzsche’s “last men”—essentially contented beings with no ambition or longing. Presumably, this would be done on the basis that this would make future generations happier than we are—or at least more contented. It could also be argued that the existence of weapons that could destroy all human life implies a need to make human beings massively less bellicose: as Bertrand Russell wrote, even before the development of nuclear weapons,
Science increases our power to do both good and harm, and therefore enhances the need for restraining destructive impulses. If a scientific world is to survive, it is therefore necessary that men should become tamer than they have been.
Alternatively, if science should clear the way to the fulfillment of perhaps our fondest wish—indefinite continuance in life—we might decide to dispense with future generations altogether, or to create them with characteristics that we prefer but that might hinder their ability to live autonomous lives (for example, we might engineer them to be content to cater to our needs and desires ahead of their own).
Science and Reason
Notwithstanding all of the above discussion, it seems clear that the scientific enterprise can continue to function even if it is no longer able to show that it functions for the sake of the human good, or even if we no longer understand what that might mean. What it cannot dispense with is reasoning or, more to the point, its reliance on the human ability to reason. As long as science knew basically nothing about the human brain and its functioning, this dependency was unproblematic. One could simply assume, as the early modern writers did, that human beings somehow had the ability in general to reason correctly (and to recognize and correct any errors in reasoning that they might make).
The more we know about how the brain functions, the more we are able to correlate the subjective experience of reasoning with chemical and electrical activity in specific parts of the brain. At this point, however, we come across certain conundrums that are difficult to understand, let alone resolve.
Unlike a computer, which is designed and programmed from the start to accomplish a certain set of tasks, the human brain (as understood by modern science) presumably developed gradually in response to evolutionary pressures over the long pre-agricultural period during which present-day Homo sapiens came into being. It had to enable its possessor to acquire sustenance by hunting and gathering and to navigate the interpersonal relationships of the troop to which he or she belonged.
Evolutionary theory recognizes that certain characteristics may develop, not because they contribute to the survival and reproduction of the organism in question, but rather as chance byproducts of characteristics that do. Presumably, the human ability to engage in abstract mathematical reasoning (for example, about prime numbers) would have to fit into this category; it is difficult to see how our ability to discover and understand a proof of the theorem that there are an infinite number of prime numbers could have enhanced our fitness to survive and reproduce during the period in which we evolved into our present state. (Indeed, it is hard to see why evolution would select for such a characteristic even now.)
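Euclid’s classic argument, sketched here purely for illustration, is a representative instance of the kind of abstract reasoning in question—reasoning whose correctness we recognize with certainty despite its irrelevance to survival:

```latex
% Euclid's proof that there are infinitely many primes (sketch).
% Suppose, for contradiction, that p_1, p_2, \dots, p_n are ALL the primes,
% and consider the number
\begin{align*}
  N &= p_1 p_2 \cdots p_n + 1 .
\end{align*}
% Dividing N by any p_i leaves remainder 1, so no p_i divides N.
% But N > 1, so some prime must divide N (possibly N itself).
% That prime is not on the list; hence no finite list contains
% all the primes, and the primes are infinite in number.
```

Nothing in a hunter-gatherer’s environment rewards the ability to follow this chain of inference, yet once it is pointed out, we grasp its validity immediately and conclusively.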
This, in itself, may not be a difficulty. We could simply accept our ability to engage in mathematics and modern science as a whole as an inexplicable “gift” of nature. However, there is a deeper problem lurking here. If we knew that some object superficially resembling an adding machine had been developed for a different purpose, but that, as a byproduct of its serving that purpose, it turned out to be possible to enter a group of numbers into the machine and get another number back as an output, why would we ever trust that the output number represented the actual sum of the numbers we entered? Indeed, the situation is even worse than that. While our brains allow us to add up numbers, we sometimes make mistakes.3 When that happens, we are able—if we make the effort—to check our work and correct our mistake. Despite our vulnerability to arithmetical error, we are somehow able to correct our mistakes and arrive at an answer that we can know with certainty to be correct.
Science currently possesses no clear explanation for how this is possible, and a strong case can be made that, on a materialist/Darwinian basis, it never will. In Mind and Cosmos, Thomas Nagel points to three human phenomena that he claims cannot be explained within the modern scientific framework: consciousness, reasoning, and morality. But whereas modern science can view consciousness as epiphenomenal and morality as a cultural artifact with no scientific validity (consider the fact-value distinction), it cannot dispense with reasoning; our ability to reason correctly for the most part, and, more importantly, to recognize with certainty correct reasoning when it is pointed out to us, is essential to the scientific enterprise.
As long as this human ability could be taken as a given, this posed no problems. As we begin to understand the brain and its functioning in greater and greater detail, and hence, presumably, begin to acquire the capability of affecting its functioning in ways that we choose, the paradox becomes more evident. Can we alter the human brain so as to give it new ways of “reasoning” of which it is currently incapable? And if we could, should we trust those new methods?
The modern project, with respect to both politics and science, is prospering as never before, but its philosophical underpinnings appear weak. Liberal democracy, the Enlightenment’s most successful (but not only) political child, has proven able to satisfy enough human wants to give it the strength to combat successfully (so far, at least) the challenges that constantly arise against it. Modern science goes from triumph to triumph. To the extent that it can no longer claim to be unambiguously good for humanity (the development of nuclear weapons made that point clear to all), its practical position has been bolstered by the fact that no society can afford to fall far behind the scientific frontier if it wishes to safeguard its independence. Thus, as the recent detection of gravitational waves reminds us, science is still able to command the vast resources necessary for its work, regardless of the absence of any prospect of near-term benefit to society as a whole.
Nevertheless, the rapid progress in areas such as genetics and neuroscience, which promise an increase in the scientific understanding of human beings and, among other things, their cognitive functions, means that the perplexities resulting from the initial dualism between man as the subject of study and man as the studier are likely to become more prominent.
1The notion of nature “working freely” is an important one for Bacon; it refers to the phenomena we meet with in the course of our lives, as opposed to what we can observe by a carefully constructed and instrumented experiment. It is only the latter, according to Bacon (and to modern science), that can reveal to us the “secret springs.” Bacon would have understood the notion of “phenomenology” as the observation of nature “working freely”: he rejected it avant la lettre, as it were.
2See, for example, the long discussion of the “struggle for recognition” in Francis Fukuyama, The End of History and the Last Man (The Free Press, 1992), pp. 143ff. Or consider the closing words of the American Declaration of Independence, in which the signers “mutually pledge to each other [their] lives, [their] fortunes and [their] sacred honor.” Thus, to secure a government based on the protection of individual rights, they were willing to risk their lives and fortunes; but they intended to preserve their “sacred honor.”
3Strictly speaking, our brains, assuming, as science does, that they are natural objects working according to the laws of nature, cannot be said to make mistakes, any more than a watch which does not keep time accurately makes mistakes—such a watch operates according to the laws of nature, just as a “good” watch does. But “we” make mistakes all the time.