Is “Nonreductive Physicalism” an Oxymoron?
Introduction
My view of human nature is physicalist—in the sense of not dualist. Like many philosophers of mind these days, I call my position nonreductive physicalism. But when I began using this term ten or so years ago, the “nonreductive” part, I realized, was just a placeholder. I had no adequate answer to the question: if humans are purely physical, then how can it fail to be the case that all of their thoughts and behavior are merely the product of the laws of neurobiology? But doesn’t reductionism just have to be false? Otherwise we are not holding our positions for reasons—we are determined to do so. And in fact, we can’t make sense of a meeting such as the one at which this paper was presented—we must have just been taking turns making noises at one another.1
I believe that I now have the resources to provide an answer to the reductionists, due in large measure to collaboration with my colleague in neuropsychology, Warren Brown.2 However, our solution to the problem took us three hundred pages, so I can’t give an adequate argument in this short essay. I’ll focus on one aspect of the issue, the role of downward causation, and then shamelessly promote our book to provide the rest of the story. The other significant ingredient in our argument is the development of what we call a post-Cartesian, and particularly a post-Cartesian-materialist, account of the mental. A Cartesian-materialist account attempts to understand the mental almost entirely in relation to the brain—inside the head. We argue instead for a concept of the mental as essentially embodied, constituted by action-feedback-evaluation-action loops in the environment, and “scaffolded” by cultural resources.3
Understanding Downward Causation
The topic of downward causation (and its opposite, causal reductionism) is interesting in its own right. But it would also be an interesting topic from the point of view of the sociology of knowledge. What I mean is, first, that there are many ardent reductionists among philosophers and scientists, and that I would state their position not as “I have good grounds for this thesis” but rather as “I can’t imagine how reductionism could fail to be true.” On the other hand, one can do a literature search in psychology and cognitive neuroscience and find hundreds of references to downward causation. Presumably these scientists would not use the term if they thought there was anything controversial about it.
Meanwhile, in the philosophical literature, Donald Campbell published an article on downward causation in 1974, but the topic scarcely appeared again until the 1990s, when it began to show up in philosophy of mind. I believe that the most commonly stated position on the relation of mental phenomena to the brain among current philosophers of mind is nonreductive physicalism. Yet Jaegwon Kim has been remarkably effective in using the concept of downward causation as one of the horns of a five-horned dilemma: to avoid (5) reductionism, you must either (1) be a dualist, (2) countenance some spooky form of causal overdetermination, (3) accept an even spookier concept of downward causation, or (4) give up the causal closure of the physical. He has convinced a surprising number of philosophers that “nonreductive physicalism” is an oxymoron.
So I take the thesis of downward causation to be the denial of the thesis of causal reductionism. And we have scholars on both sides, some saying that reductionism must be true, others that it must be false. Ludwig Wittgenstein claimed that when we find ourselves saying that it just must be this way, we should suspect that our thinking has been captured by mental images rather than motivated by arguments. So in this essay I’ll do four things. First, I trace the source of the mental imagery that makes it seem that reductionism must be true. Second, I present a short history of developments in philosophy that have shown us how to get out of this particular Wittgensteinian fly-bottle.4 This account ends with the suggestion that downward causation is best understood in terms of “context-sensitive constraints” imposed by global characteristics of a dynamical system (a toy version of this idea is sketched just below). Third, I illustrate this claim by applying it to the behavior of an ant colony. And, finally, I mention some of the additional issues that the nonreductive physicalist needs to deal with.
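Since “context-sensitive constraints” can sound mysterious in the abstract, the following is a minimal computational sketch of one way to make the idea concrete, anticipating the ant-colony illustration. The grid size, parameter values, and update rule are illustrative assumptions, not details drawn from the essay or from ant biology: each ant follows a single fixed local rule, while a global property of the colony, the aggregate pheromone field, continually reshapes which moves are probable for every individual ant.

# Toy illustration (not from the essay) of downward causation as
# context-sensitive constraint: each ant obeys one fixed local rule, yet the
# colony-level pheromone field, a global pattern no single ant controls,
# biases which moves are likely for every individual ant.
import random

GRID = 20          # side length of a square foraging grid (hypothetical)
EVAPORATION = 0.05 # fraction of pheromone lost per time step (hypothetical)
DEPOSIT = 1.0      # pheromone an ant leaves on its current cell (hypothetical)

pheromone = [[0.0] * GRID for _ in range(GRID)]
ants = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(50)]

def neighbors(x, y):
    """Cells an ant could step to (the local, bottom-up ingredient)."""
    steps = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    return [(nx % GRID, ny % GRID) for nx, ny in steps]

def step():
    """One tick: the global field constrains each ant's local choice."""
    global ants
    new_positions = []
    for x, y in ants:
        options = neighbors(x, y)
        # Context-sensitive constraint: the colony-wide pheromone pattern
        # biases, without strictly determining, each ant's next move.
        weights = [1.0 + pheromone[nx][ny] for nx, ny in options]
        nx, ny = random.choices(options, weights=weights, k=1)[0]
        pheromone[nx][ny] += DEPOSIT  # the part, in turn, modifies the whole
        new_positions.append((nx, ny))
    ants = new_positions
    for row in pheromone:
        for i in range(GRID):
            row[i] *= 1.0 - EVAPORATION

for _ in range(200):
    step()
# After enough steps the ants cluster along self-reinforced trails, a pattern
# that belongs to the colony as a whole and that no ant-level rule mentions.

Nothing in the simulation violates the local rules, yet an adequate explanation of any ant’s trajectory must cite the colony-level trail pattern; that is the sense in which the whole constrains, without overriding, the behavior of its parts.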
The Atomist-Reductionist Fly-Bottle
When I first began teaching, my students tended to be innate reductionists. That is, when I presented them with the model of the hierarchy of the sciences, and a corresponding hierarchy of complex systems, I never had to explain why reductionists held the position they did. Within an interval of about fifteen years, though, I’ve found that many students are innate anti-reductionists; thus it has become important to be able to explain why causal reductionism seems necessarily true to so many. There is a worldview change going on now, and reductionism has been one of the central features of the modern worldview.5
To understand how reductionism could have gone unchallenged for so long we need to see its origin in early modern physics. Aristotelian hylomorphism (the thesis that material things are composed of matter and an activating principle called a form) had to be rejected due to the new astronomy; an alternative theory of matter was found in ancient atomism. Reductionism was the outcome of combining the atomism that early modern physicists took over from Epicureanism with the notion of deterministic laws of physics. Early modern atomism consisted of the following theses: First, the essential elements of reality are the atoms. Second, atoms are unaffected by their interaction with other atoms or by the composites of which they are a part. Third, the atoms are the source of all motion and change. Fourth, insofar as the atoms behave deterministically they determine the behavior of all complex entities. Finally, in consequence, complex entities are not, ultimately, causes in their own right.
When modern scientists added Newton’s laws of motion it was then reasonable to assume that these deterministic laws governed the behavior of all physical processes. All causation is bottom-up (this is causal reductionism) and all physical processes are deterministic because the ultimate causal players, the atoms, obey deterministic laws. The determinism at the bottom of the hierarchy of the sciences is transmitted to all higher levels.
When we recognize that all of the assumptions in this early modern picture have been called into question, the reductionist dogma loses some of its grip on the imagination. Atoms modeled as tiny solar systems have given way to a plethora of smaller constituents whose “particle-ness” is problematic. The original assumption that the elementary particles are unaffected by their interactions has certainly been challenged by the peculiar phenomenon of quantum nonlocality. Particles that have once interacted continue to behave in coordinated ways even when they are too far apart for any known causal interaction in the time available. Thus, measuring some characteristic of one particle affects its partner, wherever it happens to be. The main point of my paper will be that when we consider parts from levels of complexity above the atomic and sub-atomic, the possibilities for the whole to effect changes in its parts are dramatic; and in the case of complex dynamical systems, the notion of a part shifts from that of a component thing to a component process or function.
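The standard formal example behind this claim (not given in the essay itself) is a pair of particles prepared in the spin singlet state:

\[
|\psi\rangle = \frac{1}{\sqrt{2}}\left(|\uparrow\rangle_A |\downarrow\rangle_B - |\downarrow\rangle_A |\uparrow\rangle_B\right).
\]

If a measurement on particle A along some axis yields spin up, a measurement on particle B along the same axis must yield spin down, however far apart the two have traveled; neither particle possesses a definite state independent of the pair. The “atom” here is manifestly not unaffected by the composite to which it belongs.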
Scientific ideas about the ultimate source of motion and change have gone through a complex history. For the Epicureans, atoms alone were the source of motion. An important development was Newton’s concept of inertia: a body will remain at rest or continue in uniform motion in a straight line unless acted upon by a force. In Newton’s system, initial movement could only come from a first cause, God, and the relation of the force of gravity to divine action remained a problem for him. Eventually three other forces were added to the picture. Big-bang cosmology played a role too. The force of the initial explosion plays a significant part in the causes of motion, and it is an open question whether there can be an explanation of that singularity.
There is also the problem that we no longer know how to define determinism. For the Epicureans, determinism was in nature itself. After the invention of the concept of laws of nature, we have to distinguish the claim that things or events in nature determine subsequent events from the claim that the laws of nature are deterministic. But much has changed during the modern period. The concept of a law of nature began as a metaphor: God has laws for human behavior and for non-human nature. While it was thought that nature always obeyed God’s laws, God presumably could change or override his own laws. By Laplace’s day the laws of nature were thought to be necessary. But today, with multiple-universe cosmologies and reflection on the anthropic issue, there is again much room to regard the laws of our universe as contingent: it can be asked why, out of a vast range of possibilities, the universe has laws and constants belonging to the very small set that permits the evolution of life.
Jeremy Butterfield argues that the only clear sense to be made of determinist theses is to ask whether significant scientific theories are deterministic. This is more difficult than it first appears, however. It may seem that the determinism of a set of equations consists simply in the mathematical necessity of their transformations and of their use in predicting future states of the system. One problem, though, according to Butterfield, is that “there are many examples of a set of differential equations which can be interpreted as a deterministic theory, or as an indeterminate theory, depending on the notion of state used to interpret the equations.”6
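A standard textbook case (an illustration chosen here, not Butterfield’s own example) shows how even a simple equation can fail to fix a unique future. The initial-value problem

\[
\frac{dx}{dt} = \sqrt{x}, \qquad x(0) = 0,
\]

is satisfied by \( x(t) = 0 \) for all \( t \), but also, for every “waiting time” \( t_0 \ge 0 \), by

\[
x(t) =
\begin{cases}
0, & t \le t_0, \\
\tfrac{1}{4}(t - t_0)^2, & t > t_0,
\end{cases}
\]

so the initial state leaves it entirely open when, if ever, the system begins to move. Whether such a theory counts as deterministic depends on how its states and solutions are interpreted.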
Second, even if a theory is deterministic, it applies strictly to no actual system in the universe, because no system can be suitably isolated from its environment. The only way around this problem would be to take the whole universe as the system in question. And if the idea of a theory that describes the relevant (essential, intrinsic) properties of the state of the entire universe and allows for calculation of all its future states is even coherent, it is wildly speculative.
A third problem, raised by Alwyn Scott, is that many important theories dealing with higher levels of complexity (such as those governing the transmission of nerve impulses) can be shown not to be derivable from lower-level theories, and especially not from quantum mechanics.7
Finally, William Bechtel has called into question the pervasive emphasis on laws in scientific explanation. He argues that most scientific explanation proceeds by identifying a phenomenon (e.g., vision), then identifying the system involved in the phenomenon, and then decomposing the system into its functional parts. There is no need to refer here to any basic laws of nature. And if the decomposition itself sounds reductionistic, it is not, because the explanatory task is complete only when one understands how the functions of the parts are organized into the phenomenon of interest. So the existence of deterministic laws in some aspects of physics, or even of deterministic laws in neuroscience such as the Hodgkin-Huxley equations, has little or no relevance for explaining cognitive phenomena.8
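For reference (the essay cites but does not quote them), the central current-balance equation of the Hodgkin-Huxley model is

\[
C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h \,(V - E_{\mathrm{Na}}) - \bar{g}_{\mathrm{K}}\, n^4 (V - E_{\mathrm{K}}) - \bar{g}_{L}\,(V - E_{L}) + I_{\mathrm{ext}},
\]

where the gating variables \( m \), \( h \), and \( n \) obey further voltage-dependent differential equations. On Bechtel’s account, such equations characterize the workings of one functional part; the explanation of a cognitive phenomenon lies in how that part’s activity is organized with the activities of others.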
So, given all of these developments, we might say that the assumption of complete bottom-up determinism has had the rug pulled out from under it.
Thursday, June 11, 2009
Nancey Murphy - Is “Nonreductive Physicalism” an Oxymoron?
I have an interest in the notion of physicalism, but I don’t really have a complete grasp of the idea; this article goes a long way toward helping me understand it better.