CHAPTER 15: Is Nature Real?

 

page 217 middle

Medieval Christianity denied the immanent reality of nature…

The notion of “immanence” is intended to counter the religious heritage of science, which considered the reality of the Creation to be derivative, and the heritage of Greek deductionism, which considered the material world reducible to thought.

Hume criticized the argument from design on the ground that the order in nature might be immanent rather than externally imposed. The world might be more like an organism than an artifact. Darwin furthered this criticism, arguing that nature showed only the relative adaptations required for selective advantage, not the perfect adaptation that would be expected of a providential Designer. [Ian Barbour op cit, pp 44 and 58] Intelligent Design advocates hold that complexity is the hallmark of creation. The situation is exactly the opposite: simplicity is the hallmark of all artifice, and complexity is rather the sign of nature’s immanent reality.

Cf. Deason, in Lindberg and Numbers God and Nature, p176-77: “As instruments of God’s work, natural things do not have an inherent activity or end. Although they may have received a certain nature or property at creation, this constitutes only a ‘tendency’ that is ineffective apart from the Word of God. For Calvin, as for Luther, the behavior of a thing depends entirely on God… As a result of their belief in the radical sovereignty of God, the Reformers rejected Aristotle’s view of nature as having intrinsic powers. In place of the Aristotelian definition of nature as ‘the principle of motion and change’, the Reformers conceived of nature as entirely passive. For them the Word or command of God is the only active principle in the world.”

 

page 217 middle

The Enlightenment continued…discounting natural reality…

According to Margaret Cavendish, an early feminist contemporary of Newton, the reason for demoting nature was to underline masculine supremacy. She criticized the use of efficient cause to deny the inherent powers of “self-motion” possessed by matter. She complained that the new scientists would not admit to being part of nature (and so not omniscient), nor to being biased by their own limited and partial point of view.

 

page 217 bottom

This autonomy guarantees that nature cannot be fully mapped conceptually.

Scientific realism agrees with the first part, but not necessarily with the second. It might be supposed, for example, that the Universe consists of a finite number of distinct elements, which can be mapped in a one-to-one relation with a logically complete deductive system. This, however, makes the further assumption that these elements can be correctly and exhaustively identified.

 

page 217 bottom

As a basic category, “realness” points not only to the world…

Cf. Weinberg Dreams of a Final Theory, p46: “When we say that a thing is real we are simply expressing a sort of respect. We mean that the thing must be taken seriously because it can affect us in ways that are not entirely in our control and because we cannot learn about it without making an effort that goes beyond our own imagination.”

 

page 218 top

It is characterized not by determinism but by randomness and irreversibility.

Cf. Prigogine and Stengers Order Out of Chaos, p9: “The artificial may be deterministic and reversible. The natural contains essential elements of randomness and irreversibility. This leads to a new view of matter in which matter is no longer the passive substance described in the mechanistic world view but is associated with spontaneous activity.” Random fluctuations (in the external energy flux that can lead to dissipative order) can produce new types of behavior, so that “noise” spontaneously gives rise to order. [p166]

 

page 218 lower top

Natural philosophy evolved into first-order science by ignoring the significance of those terms.

For example, Max Tegmark, “Is the ‘theory of everything’ merely the ultimate ensemble theory?” Annals of Physics, 270, 1-51 (1998), footnote 16: “…there is nothing about the physics we know today that suggests that the Universe could not be replaced by a discrete and finite model that approximated it so closely that we, its inhabitants, could not tell the difference.” I believe this statement is more a comment about “the physics we know today” than about the actual universe. It is a kind of restatement of Descartes’ skepticism. A demonstration that nature can be reduced to an algorithm would prove that nature is artificial, but it would not prove, as some religionists hope, that it is a divine creation. For it could just as well be the creation of superior aliens.

 

page 218 upper bottom

…which, of course, includes the human brain.

The brain—as opposed to the formal constructions it can produce (including models of itself!)—is a good example of something with discrete aspects yet not deterministic. (One can, of course, imagine it to be deterministic—a program—which seems a strong temptation for many.) Yet any model we could make of the brain, while deterministic, is our own creation. It would be a fallacy to think it adequately represents the real brain, which is “transcendent” precisely because it’s not our invention.

As to nature’s ultimate structure, here are three relevant scenarios: (1) the universe is finitely large and finitely detailed, in which case its complexity can be exhausted in human descriptions; (2) the universe is infinitely large and/or infinitely detailed, in which case no finite scheme can encompass it; (3) independent of the nature of the world, the very nature of mind as an open system is such that understanding can never complete itself in an “ultimate” vision of nature.

 

page 218 bottom

Such an imposition by itself would render physical reality ultimately a deductive system, a set of propositions.

This is indeed the modern strategy invoked, for example, in using cellular automata to model a discrete structure of space. Space, time and fundamental particles are explicitly redefined as mathematical objects, which can only behave as they have been programmed to. In effect, physical law is replaced by computer instructions, which create imaginary behavior in simulations. The made replaces the found.
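As a minimal sketch of the kind of model meant here (an elementary cellular automaton; rule 110 is chosen arbitrarily for illustration), the “physics” of the toy world below is exhausted by its update table: the cells can only ever do what the rule dictates.

```python
def step(cells, rule=110):
    """One update of an elementary cellular automaton: each cell's next state
    is read off a fixed rule table, so the 'law' is literally a program."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# a single "particle" in otherwise empty space
cells = [0] * 40 + [1] + [0] * 40
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```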

 

page 219 upper middle

Aristotle embraced this view, in tune with the archaic notion of the immanent reality of nature.

Aristotle’s primary matter is “without form and void,” like the primal waters before the biblical Creation and like Kant’s “thing-in-itself.” It is plastic potential, without any properties, shape or qualities. Modernity identifies matter with mass or energy, but Aristotle would have thought of these attributes as ‘form’. The relationship between mass and force found inspiration in the ancient metaphysical opposition of matter and spirit. [See: Max Jammer Concepts of Mass in Classical and Modern Physics Dover 1961/1997, p5]

 

page 219 bottom

Not an industrial product, the very nature of nature is to resist tidy schemes…

Some contemporary examples of this “resistance” include the fact that the critical cosmic density parameter, Ω, is close to one, but not exactly; that neutrinos are nearly massless, but not quite; that the proton and neutron have approximately, but not exactly, the same mass. While the world would be far tidier without such complications, it could only be so by virtue of being a deductive system, as the ancients had hoped (e.g., why couldn’t a year be exactly 360 days?).

 

page 220 top

Reality lies precisely in such “imperfection.”

Newton borrowed from Plato the idea that the passive matter of the mechanistic world could run down or get off track periodically and need to have its order restored by divine intervention. Plato’s primal chaos, Newton’s rundown clockwork, and the 19th-century heat death of the universe all expressed the sentiment that matter is inert, without self-organizing or self-maintaining capabilities, in fact without any power or reality of its own, but only that bestowed upon it by its Designer.

 

page 220 upper middle

This view had been handed down from antiquity through Christian theology.

But it was not without controversy during the Enlightenment. To Leibniz, vis viva (kinetic energy) was the immanent force that animates things in nature, an active power of things to affect one another—in modern parlance, the ability to do work. Newton focused rather on vis mortua (momentum), change in which he considered to be the passive result of forces applied from outside the system (for example, by God). Cf. Brian Ellis Scientific Essentialism Cambridge UP 2001, p264: “Throughout the eighteenth century the question of which is the true measure of force—mv2 or mv—was hotly disputed… It was an absolutely fundamental disagreement about the sources of power in the world. Do the laws of nature operate on an essentially passive world… or are the things in the world animated by living forces…?”

 

page 220 middle

…the Enlightenment notion of divine whim supports the contingency…of nature.

Outside a religious context, for natural details to be contingent means that they cannot follow from theory or be logically necessary, but must be discovered through observation and experiment. The idea that they are dependent on divine whim is a religious way to express the resistance of natural patterns to any reduction to simple schemes. Intended to uphold divine freedom, it incidentally upholds nature’s independence from theory.

 

page 220 middle

…logical necessity, which presumably even divine will could not defy.

Greek deductionism received mixed support from Semitic religion. According to Leslie Dewart, the ‘I am that I am’ of Yahweh should be interpreted to mean that the very realness of things is God’s doing and not an inherent property of the world; everything, including nature and one’s own private experience, happens by divine grace and decree, even moment to moment. This is no less the understanding of Islam. On the other hand, at least some Christian thought rejects fatalism. And, while determinism may be inescapable in an impersonal system, the God of Moses and Muhammad is personal—a father figure. In a sense, then, scientific materialism (mater = mother) reverts to a pre-patriarchal world-view at least insofar as it acknowledges the independent reality of nature.

 

page 220 upper bottom

The contradiction…reflected in Newton’s mutually defining concepts of force and mass.

While mass was defined as the ratio of force to acceleration, the concept of force itself came under critical scrutiny, and an attempt was made to define mass in kinematic terms only. Masses could be compared this way (by the changes in velocity during mutual impact) but not established in an absolute sense. If all particles of matter were the same, mass could be measured simply by counting the number of particles (assuming that were feasible).
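A minimal numerical sketch of the kinematic comparison, assuming an idealized, isolated two-body impact (the specific numbers are invented for illustration):

```python
# Mach-style mass comparison: momentum conservation in an isolated impact gives
# m1*dv1 = -m2*dv2, so only the ratio m1/m2 = -dv2/dv1 is fixed kinematically.
m1, m2 = 2.0, 5.0                      # "true" masses, used here only to fake the data
v1, v2 = 3.0, -1.0                     # velocities before a perfectly inelastic impact
V = (m1 * v1 + m2 * v2) / (m1 + m2)    # common velocity afterward
dv1, dv2 = V - v1, V - v2              # observed changes in velocity
print("recovered ratio m1/m2 =", -dv2 / dv1)   # 0.4, i.e. 2/5 -- no absolute scale
```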

Change of velocity (acceleration) is a visual concept, whereas force is at root a kinesthetic concept. As a sense modality, vision was considered more objective. Perhaps one reason for this is the fact that the eye is 10³² times more sensitive to energy than the proprioceptive sense is to mass, owing to the relativistic exchange rate of mass and energy. Cf. Jammer Concepts of Mass, p190: “If this ratio were of the order of unity… the identity of mass and energy would have been an obvious fact of experience. The human eye, perceiving light from the sun, would then also feel the impact of photons.”
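A back-of-the-envelope check of that order of magnitude, with assumed (purely illustrative) detection thresholds for eye and hand:

```python
# Rough sensitivity comparison across E = m*c**2 (the thresholds are assumptions):
E_photon = 3e-19                    # J, energy of one visible photon
eye_threshold = 3 * E_photon        # J, a dark-adapted eye can register a few photons
mass_threshold = 1e-3               # kg, roughly the smallest heft the hand can feel
c = 3e8                             # m/s
ratio = (mass_threshold * c**2) / eye_threshold
print(f"sensitivity ratio ~ {ratio:.0e}")   # on the order of 1e32
```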

 

page 222 upper middle

…an algorithm to fit the observed patterns and details of nature after the fact.

The problem of modeling real systems amounts to finding an algorithm to express an apparently random sequence. By definition, however, a true random sequence is not compressible in an algorithm; conversely, compressibility implies that the sequence is not truly random. The argument of this book, of course, is that nature—being real—is essentially incompressible; it is only compressible (in laws) to the extent it has been redefined as ideal.
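A minimal sketch of the point about compressibility, using zlib’s compression ratio as a crude practical stand-in for algorithmic (Kolmogorov) incompressibility, which is itself uncomputable; the data are invented for illustration:

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """Compressed size / original size: a rough proxy for incompressibility."""
    return len(zlib.compress(data, 9)) / len(data)

structured = b"0123456789" * 10_000       # a lawful, highly patterned sequence
patternless = os.urandom(100_000)         # no pattern a compressor can exploit

print(f"structured:  {ratio(structured):.3f}")   # far below 1: compresses well
print(f"patternless: {ratio(patternless):.3f}")  # close to 1: essentially incompressible
```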

 

page 222 lower middle

The apparent universality and constancy of physical laws is an empirical finding, not a necessary or transcendent truth.

The appearance of universality and constancy of laws could be an illusion, owing to our unique position in time and space. In particular, it could be an effect of living in a particular phase of the overall history of matter. See Roberto M. Unger and Lee Smolin, The Singular Universe and the Reality of Time: A Proposal in Natural Philosophy, Cambridge UP, 2015.

 

page 222 lower middle

If nature were literally…it could not manifest indefinitely varied or random detail.

The complexity of a simulation is defined by the algorithm that generates it. Even if unlimited in extension, it would be limited in intension. The infinite detail generated by fractal algorithms, for example, is not infinitely varied, but self-similar.
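A minimal sketch of that self-similarity, using the middle-thirds Cantor set as a stand-in for any fractal construction: the detail deepens without limit, yet every level is only a rescaled copy of the one before.

```python
def cantor(intervals, depth):
    """Remove middle thirds recursively: ever finer detail, yet each level is
    just two 1/3-scale copies of the previous one -- unlimited but not varied."""
    if depth == 0:
        return intervals
    refined = []
    for a, b in intervals:
        third = (b - a) / 3
        refined += [(a, a + third), (b - third, b)]
    return cantor(refined, depth - 1)

for d in range(4):
    print(d, cantor([(0.0, 1.0)], d))
```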

One can imagine a test to see whether nature is real or artificial, on the analogy—say—of testing a recording to see whether it is analog or has been digitized. However, a putatively analog domain could ultimately turn out to be digital on a finer scale. To prove that a domain is infinitely dense (analog) is equivalent to proving that it is random: it can’t be done. Hence, nature could be proven to be artificial, but it cannot be proven to be real!

 

page 222 lower middle

In that instance, moreover, no physical quantities would be truly random numbers.

A successful theory of everything—one that eliminates initial conditions and generates all known physical constants—would strongly suggest that nature is an artifact (which the first scientists took for granted). The universe could be artifactual, however, without implying that it was made supernaturally. It might have been engineered by ultra-capable aliens, following natural principles, as implied in the film and novel Contact, by Carl Sagan. For all practical human purposes, such beings would be divinely omnipotent, if not necessarily benevolent. However, as a theory of origins, this simply pushes back the explanation. We then have to explain the origin of the aliens and their materials.

In Carl Sagan’s book and film, extraterrestrial mathematicians have discovered a pattern deep within the expansion of π (written in base 11 rather than base 10). The hidden pattern generates a circle, which is supposed to imply that π itself is an artifact. By further implication, the Euclidean space in which it is defined is also an artifact. (Of course, it might have made more sense for Sagan to use a physical constant rather than a mathematical one, since π is a matter of definition rather than measurement and is therefore not tied to physical space.) In the novel, the pattern lies buried so deep in the sequence that only an intelligence with sufficient computational power could discover it. The ability to probe mathematically thus paralleled the ability to probe experimentally, which led to the discovery of the extraterrestrial radio message central to the novel, the first contact in the search for extraterrestrial intelligence.
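As a hedged sketch of what “probing π mathematically” amounts to in practice, the snippet below (using the mpmath library) writes out π in base 11 and searches it for an arbitrary token; the target string is illustrative only and has nothing to do with Sagan’s circle pattern.

```python
import math
from mpmath import mp, pi, floor

def pi_digits_base(base: int, n: int) -> str:
    """Return the first n fractional digits of pi written in the given base (<= 11)."""
    mp.dps = int(n * math.log10(base)) + 20   # enough working decimal precision
    x = +pi - 3                               # fractional part of pi
    out = []
    for _ in range(n):
        x *= base
        d = int(floor(x))
        out.append("0123456789A"[d])
        x -= d
    return "".join(out)

expansion = pi_digits_base(11, 5000)
print(expansion[:60])
# search for an arbitrary token; -1 means it does not occur this early
print("run of eight zeros at index:", expansion.find("00000000"))
```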

 

page 222 upper bottom

Conversely…this would prove that the natural world is not a simulation or artifact.

Thus Tegmark admits: “A convincing demonstration that there is such a thing as true randomness in the laws of physics (as opposed to mere ensembles where epistemological uncertainty grows) would therefore refute the [Mathematical Universe Hypothesis].” [Max Tegmark “The Mathematical Universe” 2007 sec IV B3]

 

 

page 222 bottom

We have mentioned nature’s general untidiness and quantum weirdness as examples.

Strictly speaking, there is a loophole in Bell-type experiments that demonstrate quantum weirdness (non-locality), since they depend on truly random inputs, which makes the argument circular. Cf. Nicolas Gisin, “Is realism compatible with true randomness?” [arXiv:1012.2536v1, Dec 2010]: “It is well known that a proper demonstration of quantum nonlocality (i.e. violation of a Bell inequality) requires that the inputs are truly chosen at random… More precisely, if the inputs were predetermined, then no proper violation of any Bell inequality is possible…” On the other hand, the very fact of violation strongly suggests that the inputs are random.
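A minimal simulation of the point, in the standard CHSH formulation (the two strategies below are invented for illustration): with freely random settings, no local deterministic model exceeds S = 2, but if the settings are effectively predetermined and known to the source, a purely classical source can fake the algebraic maximum S = 4.

```python
import random

# CHSH: S = E(0,0) + E(0,1) + E(1,0) - E(1,1); local models with free inputs
# obey |S| <= 2, while quantum mechanics reaches 2*sqrt(2) ~ 2.83.

def run(trials, strategy):
    sums = {s: 0 for s in [(0, 0), (0, 1), (1, 0), (1, 1)]}
    counts = {s: 0 for s in sums}
    for _ in range(trials):
        x, y = random.randint(0, 1), random.randint(0, 1)   # measurement settings
        a, b = strategy(x, y)                                # outcomes in {-1, +1}
        sums[(x, y)] += a * b
        counts[(x, y)] += 1
    E = {s: sums[s] / counts[s] for s in sums}
    return E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)]

def local_model(x, y):
    # Answers fixed in advance, independently of the settings (here always +1):
    # the best any such local deterministic model can do is S = 2.
    return 1, 1

def rigged_model(x, y):
    # The loophole: a source that already knows (x, y) tailors the outcomes
    # pair by pair and reaches the algebraic maximum S = 4.
    return 1, (-1 if (x, y) == (1, 1) else 1)

print("free random inputs, local model:", round(run(100_000, local_model), 3))   # 2.0
print("inputs known to the source:     ", round(run(100_000, rigged_model), 3))  # 4.0
```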

 

page 223 lower top

physical reality…is somehow self-programming and self-activating.

Of course, self-organization is now a rich field of study, in no little part because of the computer, which can quickly model complex non-linear processes. Accordingly, the computer provides an even more tempting mechanist metaphor. If advanced machines could self-organize and communicate, then why not imagine a more sophisticated mechanistic cosmos? The ongoing dualism of hardware and software, however, allows no explanation for how meaning can arise within and for a mechanistic system at all. The information output of existing computers serves the human programmer, not the computer itself.

 

page 223 middle

…variety of approaches…labeled “ontic pancomputationalism.”

Stanford Encyclopedia of Philosophy, “Computation in Physical Systems,” sec. 3.4: “Unlike [some] versions of pancomputationalism, which originate in philosophy, this ontic pancomputationalism originates in physics. It includes both an empirical claim and a metaphysical one… The empirical claim is that all fundamental physical magnitudes and their state transitions are such as to be exactly described by an appropriate computational formalism… They often make an additional metaphysical claim… that computation (or information, in the physical sense…) is what makes up the physical universe… Supporters appear to be motivated by the desire for exact computational models of the world rather than empirical evidence that the models are correct. Even someone who shares this desire may well question why we should expect nature to fulfill it. On the metaphysical front, … it is difficult to understand how abstract entities could give rise to physical qualities and their causal powers…”

 

page 223 upper bottom

all metaphors must be taken with a grain of salt.

The discrete states of a physical device, such as a digital computer, depend ultimately on quantum discreteness. [Roger Penrose, ENM, p403] Yet, discreteness is a matter of definition. Classical machines can be in discrete states because their parts correspond to their ideal counterparts (which are products of definition) well enough for their purpose, so that computers, for example, can transmit discrete states reliably. At a finer scale, however, the parts of macroscopic machines are fuzzy, and transmission of defined discrete states may be unreliable—through quantum fluctuation, for example. Yet, at an even finer (i.e. quantum) scale it is presumed that physical entities are identical to their theoretical counterparts, so that the discreteness of real particles and states corresponds by definition to their discreteness in theory. This is rather an article of faith, however, and even there discreteness could turn out to be a relative matter of scale.

 

page 225 lower top

Such an absurdity has no meaning unless one assumes, in the first place, the very thing that is to be concluded: that “reality” is not real.

Cf. the related “Boltzmann Brain paradox,” which “implies that over an eternity of time there are vastly more brains in the universe which are formed from small fluctuations than brains arising in the slow process of evolution… so, as conscious beings, it is overwhelmingly probable that we are Boltzmann brains.” [Smolin Time Reborn, p212]. A faulty notion of probability underlies both arguments: it assumes in the first place what is to be concluded, namely, that there is no essential difference between simulation and reality. The idea that one “could exist” (with equal prior probability) in either a real or a simulated world resembles the idea that one could just as well have been born into a different body.