Over at his Rationally Speaking blog, Massimo Pigliucci recently posted a 4-part series on Emergence, a still-controversial theory of higher-order properties. One of the earliest proponents of emergent properties was J.S. Mill:
All organized bodies are composed of parts, similar to those composing inorganic nature, and which have even themselves existed in an inorganic state; but the phenomena of life, which result from the juxtaposition of those parts in a certain manner, bear no analogy to any of the effects which would be produced by the action of the component substances considered as mere physical agents. To whatever degree we might imagine our knowledge of the properties of the several ingredients of a living body to be extended and perfected, it is certain that no mere summing up of the separate actions of those elements will ever amount to the action of the living body itself. (A System of Logic, Bk.III, Ch.6, §1)
In this series Pigliucci explores a variety of contemporary treatments of the idea.
Massimo Pigliucci is Professor of Philosophy at the City University of New York. His research focuses on the structure of evolutionary theory, the relationship between science and philosophy, and the relationship between science and religion. He received a Doctorate in Genetics from the University of Ferrara in Italy, a PhD in Botany from the University of Connecticut, and a PhD in Philosophy of Science from the University of Tennessee.
He has published over one hundred technical papers, as well as a number of books for both technical audiences and the general public.
Pigliucci has won the Dobzhansky Prize from the Society for the Study of Evolution. He has been elected fellow of the American Association for the Advancement of Science “for fundamental studies of genotype by environment interactions and for public defense of evolutionary biology from pseudoscientific attack.” He is also editor in chief of Philosophy & Theory in Biology and associate editor of Biology & Philosophy. He can be reached on the web at www.platofootnote.org.
On to the main event - I have included a couple of paragraphs from the beginning of each article. Follow the title link to read the whole article.
By Massimo Pigliucci | October 19th 2012
I am about to go to an informal workshop on naturalism and its implications, organized by cosmologist Sean Carroll. The list of participants is impressive, including Pat Churchland, Jerry Coyne, Richard Dawkins, Dan Dennett, Rebecca Goldstein, Alex Rosenberg, Don Ross and Steven Weinberg. You may have recognized at least four names of people with whom I often disagree, as well as two former guests of the Rationally Speaking podcast (not to mention Don Ross’ colleague, James Ladyman).
The list of topics to be covered during the discussions is also not for the faint of heart: free will, morality, meaning, purpose, epistemology, emergence, consciousness, evolution and determinism. Unholy crap! So I decided — in partial preparation for the workshop — to start a series of essays on emergence, a much misunderstood concept that will likely be at the center of debate during the gathering, particularly as it relates to its metaphysical quasi-opposite, determinism (with both having obvious implications for most of the other topics, including free will, morality, and consciousness).
It’s a huge topic, and the way I’m going to approach it is to present a series of commentaries on four interesting papers on emergence that have appeared over the course of the last several years in the philosophical (mostly philosophy of physics) literature. Keep in mind that — although I make no mystery of my sympathy for the idea of emergence as well as of my troubles with reductive physicalism — this is just as much a journey of intellectual discovery for me as it will be for most readers. (A good overview can be found, as usual, in the corresponding entry in the Stanford Encyclopedia of Philosophy.)
That said, let us begin with “Emergence, Singularities, and Symmetry Breaking,” by Robert W. Batterman of the University of Western Ontario and the University of Pittsburgh. The paper was published in Foundations of Physics (Vol 41, n. 6, pp. 1031-1050, 2011), but you can find a free downloadable copy of an earlier version here.
* * * * *
By Massimo Pigliucci | October 26th 2012
Last time we examined Robert Batterman’s idea that the concept of emergence can be made more precise by the fact that emergent phenomena such as phase transitions can be described by models that include mathematical singularities (such as infinities). According to Batterman, the type of qualitative step that characterizes emergence is handled nicely by way of mathematical singularities, so that there is no need to invoke metaphysically suspect “higher organizing principles.” Still, emergence would remain a genuine case of ontological (not just epistemic) non-reducibility, thus contradicting fundamental reductionism.
This time I want to take a look at an earlier paper, Elena Castellani’s “Reductionism, Emergence, and Effective Field Theories,” dated from 2000 and available at arXiv:physics. She actually anticipates several of Batterman’s points, particularly the latter’s discussion of the role of renormalization group (RG) theory in understanding the concept of theory reduction.
Castellani starts with a brief recap of the recent history of “fundamental” physics, which she defines as “the physics concerned with the search for the ultimate constituents of the universe and the laws governing their behaviour and interactions.” This way of thinking of physics seemed to be spectacularly vindicated during the 1970s, with the establishment of the Standard Model and its account of the basic building blocks of the universe in terms of particles such as quarks.
* * * * *
By Massimo Pigliucci | October 29th 2012
So far in this series we have examined Robert Batterman’s idea that the concept of emergence can be made more precise by the fact that emergent phenomena such as phase transitions can be described by models that include mathematical singularities, as well as Elena Castellani’s analysis of the relationship between effective field theories in physics and emergence. This time we are going to take a look at Paul Humphreys’ “Emergence, not supervenience,” published in Philosophy of Science back in 1997 (64:S337-S345).
The thrust of Humphreys’ paper is that the philosophical concept of 'supervenience', which is often brought up when there is talk of reductionism vs anti-reductionism, is not sufficient, and that emergence is a much better bet for the anti-reductionistically inclined.
The Stanford Encyclopedia of Philosophy defines supervenience thus: “A set of properties A supervenes upon another set B just in case no two things can differ with respect to A-properties without also differing with respect to their B-properties. In slogan form, ‘there cannot be an A-difference without a B-difference.’”
A typical everyday example of supervenience is the relation between the amount of money in my pockets (the A-property) and the specific makeup of bills and coins I carry (the B-property). I will have the same total (say, $20) regardless of the specific combination of coins and bills (say, no coins, one $10 bill and two $5 bills; or four 25¢ coins, nine $1 bills and one $10 bill), but the total cannot possibly change unless I change the specific makeup of the coins+bills set. The converse does not hold, as we have just seen: we can change the composition of coins+bills without necessarily changing the total.
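The money example can be put in code as a minimal sketch (my illustration, not from Pigliucci's post; the `total` function and the particular denomination counters are hypothetical): two distinct B-configurations can agree on the A-property, but any A-difference forces a B-difference.

```python
from collections import Counter

def total(makeup):
    """A-property: the total amount, fixed entirely by the B-property (the makeup)."""
    return sum(denom * count for denom, count in makeup.items())

# Two different B-configurations, denominations in dollars:
b1 = Counter({10: 1, 5: 2})           # one $10 bill, two $5 bills
b2 = Counter({0.25: 4, 1: 9, 10: 1})  # four quarters, nine $1 bills, one $10 bill

# A B-difference without an A-difference: the makeups differ, the totals agree.
assert b1 != b2
assert total(b1) == total(b2) == 20

# But "no A-difference without a B-difference": since total() is a function of
# the makeup alone, two pockets with different totals must differ in makeup.
```

The asymmetry is the whole point: `total` is a (many-to-one) function of the makeup, so the total supervenes on the makeup, while the makeup does not supervene on the total.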
Again according to the SEP, “Supervenience is a central notion in analytic philosophy.”
* * * * *
By Massimo Pigliucci | November 2nd 2012
The previous three installments of this series have covered Robert Batterman’s idea that the concept of emergence can be made more precise by the fact that emergent phenomena such as phase transitions can be described by models that include mathematical singularities; Elena Castellani’s analysis of the relationship between effective field theories in physics and emergence; and Paul Humphreys’ contention that a robust anti-reductionism needs a well articulated concept of emergence, not just the weaker one of supervenience.
For this last essay we are going to take a look at Margaret Morrison’s “Emergence, Reduction, and Theoretical Principles: Rethinking Fundamentalism,” published in 2006 in Philosophy of Science. The “fundamentalism” in Morrison’s title has nothing to do with the nasty religious variety, but refers instead to the reductionist program of searching for the most “fundamental” theory in science. The author, however, wishes to recast the idea of fundamentalism in this sense to mean that foundational phenomena like localization and symmetry breaking will turn out to be crucial to understand emergent phenomena and — more interestingly — to justify the rejection of radical reductionism on the ground that emergent behavior is immune to changes at the microphysical level (i.e., the “fundamental” details are irrelevant to the description and understanding of the behaviors instantiated by complex systems).
Morrison begins with an analysis of the type of “Grand Reductionism” proposed by physicists like Steven Weinberg, where a few (ideally, one) fundamental laws will provide — in principle — all the information one needs to understand the universe. Morrison brings up the by now familiar objection raised in the ‘70s by physicist Philip Anderson, who argued that the “constructionist” project (i.e., the idea that one can begin with the basic laws and derive all complex phenomena) is hopelessly misguided. Morrison brings this particular discussion into focus with a detailed analysis of a specific example, which I will quote extensively:
“The nonrelativistic Schrödinger equation presents a nice picture of the kind of reduction Weinberg might classify as ‘fundamental.’ It describes in fairly accurate terms the everyday world and can be completely specified by a small number of known quantities: the charge and mass of the electron, the charges and masses of the atomic nuclei, and Planck’s constant. Although there are things not described by this equation, such as nuclear fission and planetary motion, what is missing is not significantly relevant to the large scale phenomena that we encounter daily. Moreover, the equation can be solved accurately for small numbers of particles (isolated atoms and small molecules) and agrees in minute detail with experiment. However, it can’t be solved accurately when the number of particles exceeds around ten. But this is not due to a lack of calculational power, rather it is a catastrophe of dimension ... the schemes for approximating are not first principles deductions but instead require experimental input and local details. Hence, we have a breakdown not only of the reductionist picture but also of what Anderson calls the ‘constructionist’ scenario.”
Morrison then turns to something that has now become familiar in our discussions on emergence: localization and symmetry breaking as originators of emergent phenomena, where emergence specifically means “independence from lower level processes and entities.”