CHAPTER 4: Law, Chance, and Necessity

 

page 59 middle

Yet, one could predict and control such a world only on condition…

The Cambridge Dictionary of Philosophy distinguishes between determinism and predictability. Determinism is the claim that “the full specification of the state of the world at one time is sufficient, along with the laws of nature, to fix the full state of the world at any other time” (p613). For one physical event to “fix” another is a different sort of thing from a subject’s specification of a state of knowledge. Even if a state of a system at one time could fix a future state, knowledge of either might not be available. As we shall see, the “full specification of the state of the world” is never available. It could only be so if the world itself were finitely specifiable because it is a mere product of definition.

 

page 59 bottom

…nature…capable of determining and limiting human experience and behavior.

Oddly, the question of nature’s intrinsic reality has remained somewhat independent of the question of its hold over human beings. Part of the reason for this may have to do with the reluctance of both science and religion to attribute any form of intentionality to the natural world. Our civilization has lived uneasily with dualism for centuries, and only recent scientific advances have made a reconciliation seem feasible or necessary.

 

page 59 bottom

…the scientist must be free to choose which experiment to perform.

Gisin purports to prove that violation of the Bell inequalities depends on at least a degree of choice in how the experiment is set up: “It is well known that a proper demonstration of quantum nonlocality (i.e. violation of a Bell inequality) requires that the inputs are truly chosen at random, or even better by the free will of two independent humans… Consequently, if one of the two partners lacks a single bit of free will, then no violation of any Bell inequality with binary outcomes could ever be observed, whatever the number of possible inputs.” [Nicolas Gisin “Is realism compatible with true randomness?” arXiv:1012.2536v1, Dec 2010] Note the equivalence of “free will” and “random.”

 

page 59 bottom

…the need to stand outside it…if only to supply the input.

One can argue that it is not possible to stand outside the whole universe even conceptually. While determinism is no threat to the conduct of science, a science of the whole universe poses a threat to determinism, since the observer is physical and must stand somewhere physically, within the universe. While the observer can hold a transcendent perspective with respect to any specified conceptual system, as seems guaranteed by Gödel’s theorems, she must still occupy some perspective at a given moment.

 

page 60 middle

no formula has causal power to affect the future…

Knowing the future does not (necessarily) bring it about or alter it. Predictability does not strictly imply determinism, nor vice-versa. (It is possible to predict the future without knowing why—for example, by guessing.) This applies as well to records of present experience (for example, in memory). Such a record, being an artifact, may be considered deterministic. Yet, to the degree that memory and records may be unreliable traces of supposedly real past events, the past is as much an unknown as the future. By definition, an algorithm that extrapolates the past to the future is also deterministic. Even so, any algorithm is only as reliable as its input. Ultimately, the present is simply what we find it to be.

Cf. George F. R. Ellis “Physics in the Real Universe: Time and Spacetime” (online), footnote 12: “There is uncertainty as regards both the future and the past, but its nature is quite different in these two cases. The future is uncertain because it is not yet determined: it does not yet exist in a physical sense. Thus this uncertainty has an ontological character. The past is fixed and unchanging because it has already happened, and the time when it happened cannot be revisited; but our knowledge about it is incomplete, and can change with time. Thus this uncertainty is epistemological in nature.” Unlike Ellis, I don’t see these uncertainties as “quite different.” I see no sense in “ontological” determinism, hence no sense in which the uncertainty of the future is ontological.

 

page 60 lower middle

Does causality even have a meaning apart from the assessment of conscious agents?

Later philosophy overlaid an ontological power of causality on Greek necessitarianism—an early version of determinism in science. Psychologically, the notion of cause is rooted in early experience of power over one’s own limbs and over other objects (cf. Piaget). Such cause therefore refers to personal agency, in contrast to the determinism of the Greeks, which refers essentially to logical necessity. In neither case is it a power of one object over another.

 

page 60 upper bottom

Finally, what does it mean for nature to be “free”—that is, undetermined?

Cf. Smolin Time Reborn, p148-9: “I find it marvelous to imagine that an elementary particle is truly free, even in this narrow sense [of unpredictability]. It implies that there’s no reason for what an electron chooses to do when we measure it—and thus that there’s more to how any small system unfolds than could be captured in any deterministic or algorithmic framework. This is both thrilling and scary, because the idea that choices atoms make are truly free (i.e. uncaused) fails to satisfy the demand for sufficient reason—for answers to every question we might ask of nature.”

The very notion of cause traditionally conflates reason with predictability—intentionality with observation. It is awkward in the first place to say that atoms and electrons “make choices,” when it is physicists who choose what antecedents or causes lead to given effects. “Free” refers 1st-personally to the human sense of volition, on the one hand, and 3rd-personally to our inability to predict, on the other. Both involve the subject as well as the object. Perhaps Smolin’s marvel at nature’s “freedom” comes from the implicit traditional expectation that it should not be free—that is, not independent of thought or theory, but confined as a mere product of definition.

On the other hand, should it be the case that nature has, in some sense, its own reasons, the problem of indeterminacy is simply that we cannot discern what those reasons are. The role we accord to human freedom hinges essentially on the inscrutability of human motivations. The freedom of nature is not essentially different.

 

page 61 top

…the medieval vision of planets carried around their paths by angels.

Kepler had postulated a force, inspired by magnetism, that emanated from the sun and swept the planets along in their orbits with the sun’s rotation. Though following an inverse square law, it operated only in the plane of the orbits.[see: Allen G. Debus Man and Nature in the Renaissance Cambridge UP 1978, p94] This represents a transitional idea between the theological conception, in which an angel or God himself is the motive power, and a naturalistic one in which the force emanates from “the sun.” In the light of solar mythologies, the sun is an obvious symbol of the divine.

 

page 61 upper middle

…God would have had no freedom in specifying its details.

Einstein differed significantly from Newton on this issue: “I cannot imagine a unified and reasonable theory which explicitly contains a number which the whim of the Creator might just as well have chosen differently, whereby a qualitatively different lawfulness of the world would have resulted…” [Einstein quoted in John D. Barrow Theories of Everything: the quest for ultimate explanation Fawcett/Ballantine 1991, p127] Einstein’s idealism, like the Greeks’, would have everything deduced from first principles. This is opposed to the idealism of Newton, who upheld the right of the Creator to set physical constants “by whim.”

 

page 61 middle

…Platonic forms, which could better be known through rational intuition than observation.

Cf. Richard Tarnas The Passion of the Western Mind Ballantine 1991 p45: “In the Platonic understanding, the irrational was associated with matter, with the sensible world, and with instinctual desire, while the rational was associated with mind, with the transcendent, and with spiritual desire. Ananké, the refractory purposelessness and random irrationality in the universe, resists full conformity to the creative Reason.”

 

page 61 lower middle

…perfect, infinitely precise, eternal mathematical relationships.

Paul Davies “On the Multiverse” fqxi.org conference on time, 2010: “The concept of an externally imposed set of infinitely precise transcendent immutable time-symmetric mathematical relationships (‘laws of physics’), together with their attendant mathematical entities such as real numbers, is an idealization with no justification in science, and represents an act of faith that is a hangover from European medieval theology and Platonic philosophy… The asymmetry between the laws and the states of the world is a direct reflection of the asymmetry between creature and creator.”

 

page 61 bottom

The scientific concept of chance begins with what can be approached mathematically…

Confusing mathematically idealized situations, such as geometrically symmetrical dice (or other well-defined probability setups, such as occur in casinos), with real-life situations yields what Nassim Nicholas Taleb calls the ludic fallacy. See his discussion of Platonically idealized chance in The Black Swan, especially 127ff.

 

page 62 upper bottom

The modern view is inconsistent insofar as it holds that the universe arose through deterministic processes yet ultimately by lottery.

Determinism should be distinguished from materialism, reductionism, and mechanism. Determinism assumes that specific causes can be found for any effect. Materialism assumes that all phenomena, physical or mental, can be rationally accounted for by physical processes. Reductionism assumes that the whole can be understood in terms of its parts—that any phenomenon can be understood in terms of simpler principles and elements. Mechanism assumes that natural systems can be understood as machines—i.e., closed reversible systems. The “modern view” usually encompasses all four assumptions.

 

page 63 bottom

It is not nature, then, that is deterministic, but human thought systems projected upon it.

It is sometimes said that the brain, unlike the mind, is a causal system and therefore must be deterministic. This is misleading, however. The mind is able to transcend any of its particular creations, each of which amounts to a deductive system, and as such is deterministic. The brain is not deterministic because it is not such a creation. On the other hand, any model of the brain (for example, as a logical system) is a creation. It cannot exhaustively represent the real brain, which is as ambiguous as any other part of nature.

 

page 63 bottom

…perhaps an alien engineering project, or a divine creation as originally supposed.

In short, determinism implies creationism. Einstein’s dream of a “complete” deterministic theory would imply that nature is an artifact. (He often expressed a desire to know the “thoughts of God,” and perhaps his deism should be taken more literally.) Similarly, proof of a deterministic “hidden variable” theory for the quantum realm, for instance, might imply creationism. On the other hand, success of a hidden-variable theory would only be relative to scale, since—logically at least—there could be an even deeper level of further hidden variables. This is simply the problem of the whole and its parts, with its possible infinite regression. Logical closure (completeness) can only apply to a deductive system (model), in which determinism is logical implication.

 

page 64 top

…the “immanent reality” of nature resides…in the very randomness of the details…

A computer simulation can provide both large-scale order and seemingly random details, but the details it provides are actually pseudo-random, produced by an algorithm. We know the rule of the algorithm, since it was deliberately programmed, whereas we are only guessing after the fact at rules that fit the observed details of nature. Of course, a computer program or simulation could incorporate a true random-number generator, based for example on atomic emissions.
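A minimal sketch of the difference (in Python; the seed values are arbitrary): a pseudo-random generator produces details that pass for random, yet the rule that generates them is fully known and the sequence is exactly reproducible.

```python
# A seeded pseudo-random generator: the "details" look random, but the rule
# (algorithm plus seed) is known in advance, so the same sequence can be
# reproduced at will -- unlike a genuinely random physical source.
import random

def simulated_details(seed, n=5):
    rng = random.Random(seed)      # deterministic algorithm, known seed
    return [round(rng.random(), 3) for _ in range(n)]

print(simulated_details(42))   # looks random...
print(simulated_details(42))   # ...but is exactly reproducible
print(simulated_details(7))    # a different seed yields different "details"
```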

 

page 65 lower top

So-called metaphysical necessity… is simply logical necessity…projected upon the external world.

Some philosophers distinguish varieties of necessity. In particular, they recognize “metaphysical necessity,” whose power lies “in virtue of the essences of things.” [Brian Ellis The Philosophy of Nature, p15] I hold rather that there is only logical necessity, which includes analytic truths. To assert metaphysical or “natural” necessity is to posit a deductive system that one believes corresponds with reality. This is a matter of faith, whereas the imputed necessity is but the logical necessity of the deductive system. Whatever might be the “essences” of things is debatable and must ultimately be decided empirically, and natural kinds are a matter of how you slice it. This is not to deny that there is something really there to which laws and kinds refer.

 

page 66 top

Determinism in physics suggests that the state of a system at a given time is “fixed” by its prior states.

The Cambridge Dictionary of Philosophy describes determinism as “the view that the state of the world at any instant determines a unique future, and that knowledge of all the positions of things and the prevailing natural forces would permit an intelligence to predict the future state of the world with absolute precision.” Similarly, the Stanford Encyclopedia of Philosophy describes causal determinism as “roughly speaking, the idea that every event is necessitated by antecedent events and conditions together with the laws of nature.”

 

page 66 middle

Time characterizes the real world, while timelessness characterizes deductive systems.

Cf. Lee Smolin Time Reborn, p 51: “What the Newtonian paradigm does is replace causal processes—processes played out over time—with logical implication, which is timeless.” Not just the “Newtonian paradigm,” however, but any form of deductionism does this.

 

page 66 lower middle

…the system must be well defined and follow mechanical rules.

As Earman notes, “determinism does not entail mechanism in the crude sense… of a mechanical contrivance composed of gears, levers, and pulleys. But it remains open that determinism involves mechanism in the more abstract sense that it works according to mechanical rules, whether or not these rules are embodied in mechanical devices. In the converse direction, we can wonder whether mechanistic rules are necessarily deterministic.” [John Earman A Primer on Determinism D. Reidel Publishing Co. 1986, p111] That is, we can wonder whether they produce unique outputs for given inputs.

 

page 66 bottom

It is only an assumption…and so there is no guarantee of predictability.

I have been arguing that causal (physical, or ontological) determinism is an empty concept. Deductive systems are predictable (non-random) because they are designed to be so. No natural system, however, is merely a product of design or definition, corresponding to an algorithm; hence, it must correspond effectively to a random sequence. The problem of exhaustively modeling a real system amounts to finding an algorithm to express a random sequence—yet, no deterministic system (such as a computer) can be counted upon to do this.

 

page 66 bottom

…only a unique mapping… regardless of the nature of those domains.

According to Earman, determinism doesn’t even imply sharp values (point by point correspondence), but could as well be understood more vaguely in terms of “interval valued magnitudes.” [Earman, op cit, p226] While first-order differential equations preserve the sense of a unique output, the situation is more complicated for partial differential equations. [p118]
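The sense in which a first-order equation gives a unique output can be stated compactly; the following is the standard Picard-Lindelöf result, supplied here for reference rather than drawn from Earman:

```latex
% If f(x,t) is continuous in t and Lipschitz in x, the initial value problem
\frac{dx}{dt} = f(x,t), \qquad x(t_0) = x_0
% has exactly one solution in a neighborhood of t_0:
% each initial state is mapped to a unique trajectory.
```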

 

page 67 top

What distinguishes…is not the realness of the entities, but the different statistics.

Quantum systems may appear to have “freedom,” in contrast to macroscopic systems, which are held to be deterministic with uncertainties owing merely to ignorance. However, this distinction is artificial; the difference is only of degree. In both cases, the more a system is predictable, the less “freedom” it has. Smolin [Time Reborn, p150] describes quantum systems as being “maximally free”: as having as much “freedom from determinism as any physical system described by probabilities can have.” However, the expression “freedom from determinism” is at best redundant. I believe he means simply that such systems are maximally unpredictable.

 

page 67 lower middle

In contrast…but their output is probability not individual events.

Schrödinger’s wave equation is deterministic, while nature is not deterministic, even at the classical level. As a physical behavior, wave motion is generally not reversible, even though a wave equation is. A water wave, for example, emanates from a point outward and does not generally re-converge on that point. (It could only do so if the reflecting shoreline were perfectly circular, centered on the origin.) Nevertheless, a light wave does (in effect) converge on a point of absorption, suggesting the behavior of particles rather than waves. This discrepancy is the basis of the “measurement problem.” The wave equation cannot predict where that absorption will take place; but a large sample of such events would correspond to an expanding spherical wave.
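For reference, the contrast can be written out in the standard textbook forms (not tied to any source cited here): the equation that evolves the wave function is deterministic, while what it yields for an individual detection is only a probability.

```latex
% Deterministic evolution of the wave function (time-dependent Schrodinger equation):
i\hbar \frac{\partial}{\partial t}\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t)
% What it predicts for a single detection event is only a probability density (Born rule):
P(\mathbf{r},t) = |\Psi(\mathbf{r},t)|^2
```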

 

page 67 bottom

…statistically precise averaged macroscopic pattern, which is then codified as a law.

Cf. Poincaré’s speculation that the law of gravitation might have only statistical precision: “Who knows if it be not due to some complicated mechanism, to the impact of some subtle matter animated by irregular movements, and if it has not become simple merely through the play of averages and large numbers?” [Poincaré Science and Hypothesis Walter Scott Pub. Co. 1905, p148]

 

page 68 top

If the motion of a planet…predicting its path would be impossible.

George F. R. Ellis (see endnote 12) gives the example of a massive object powered by rocket engines, whose guidance is triggered by random quantum events: its path would be unpredictable, like Brownian motion. Such amplification would require “great entropic resources.” He argues, moreover, that this would render the entire space-time structure undetermined; for such a massive object would affect it and the effects cannot be predicted because of quantum uncertainty.

 

page 68 upper middle

…(which is how “deterministic chaos” was discovered in the first place).

Or, rather, rediscovered. Poincaré had laid the foundations early in the twentieth century. In fact, chaos is a logical development of the gravitational “three-body problem.” Celestial mechanics led from the simple case of elliptical orbits with constant period to more complex cases involving multiple bodies, a broadened concept of periodicity, and so eventually to chaos theory. Statistical mechanics, in trying to treat molecules as mini planets, was forced to use statistical methods to deal with what were essentially chaotic micro orbits. The common assumption was an underlying determinism, which was also presumed by information theory, insofar as a signal is effectively (random) noise when no pattern can be specified. In these three independent areas of study, the starting point is a system presumed to be deterministic, yet rendered indeterminate by physical circumstances. This is simply another way to say that the presumption of determinism is not realistic, that a deterministic system is a simplified fiction only approximating reality. Cf. Massimiliano Badino “Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy” PhilPapers, March 28, 2013, p16.

 

page 68 middle

Any slight difference in the input results in enormous difference in the output…

Put this way, chaos seems to play havoc with prediction. Steven Weinberg casts it in a more positive light, by turning the relationship upside down: “…however far into the future we want to be able to predict the behavior of a [classical] physical system… there is some degree of accuracy with which a measurement of the initial conditions would allow us to make this prediction.” [Weinberg DFT p36] More neutrally, for a given margin of error in the prediction, there is some corresponding margin of error that must be achieved in the initial measurement, which varies exponentially with time in chaotic systems.
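This trade-off can be put as a rough formula (a standard estimate for chaotic systems, with λ the largest Lyapunov exponent; the notation is mine, not Weinberg's):

```latex
% Separation of nearby trajectories in a chaotic system (\lambda > 0):
\delta(t) \approx \delta(0)\, e^{\lambda t}
% To keep the prediction error below \epsilon out to time T, the initial
% measurement must therefore be accurate to roughly
\delta(0) \lesssim \epsilon\, e^{-\lambda T}
% i.e. the required initial accuracy shrinks exponentially with the forecast horizon.
```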

 

page 68 middle

…even though they are physical and not mathematical systems.

Supposedly one can test whether a natural process is deterministic or not by comparing the time evolution from slightly different inputs. The idea is that a deterministic system will show a continuous path (through phase space), regardless of whether it is linear or exponential. (Even when exponential, paths from similar initial states should appear more similar than if arbitrary or randomly distributed.) Such tests can be difficult in practice, however, because of the amount of computation required. [Wikipedia: “Chaos theory: distinguishing random from chaotic data”] There is a theoretical issue at stake, beyond practical considerations. For, a deterministic system cannot produce truly random behavior.
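As a minimal sketch of such a comparison (in Python, using the logistic map in its chaotic regime as a stand-in for a deterministic process; the initial values are arbitrary):

```python
# Evolve two nearly identical initial states under one deterministic rule
# (the logistic map with r = 4, a standard chaotic example). The paths
# diverge rapidly, yet each is perfectly reproducible from its input.

def logistic_trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)   # differs only in the sixth decimal place

for t in range(0, 31, 5):
    print(f"step {t:2d}: difference = {abs(a[t] - b[t]):.6f}")
```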

 

page 68 lower middle

…they are as different as pseudo-random and true random numbers.

A clear distinction might also be drawn between statistical samples that contain only relatively small variations (suggesting linear processes), and samples that may contain large and unpredictable variations. However, the type of sample cannot be known in advance: a sample may at first sight appear to be of the first type until an unexpected deviation is noticed.

 

page 69 upper middle

However perfectly…in the end they correspond only imperfectly to real systems.

Roger Penrose distinguishes between determinism and computability—in the hope that, if some aspects of the world were non-computable (though deterministic), there would be a place for the operation of free will. [Penrose, The Emperor’s New Mind, p170] Yet this distinction presumes that determinism is not simply a property of mathematical systems (logical necessity) but a kind of metaphysical necessity that is a property of the world. I claim (with Hume) that there is no such thing, and that the only necessity is logical—purely in the domain of “free will” (intentionality) to begin with.

 

page 70 middle

…only their relative frequency in indefinitely large samples.

It is not impossible, but simply unlikely, to generate non-random looking sequences—for instance, flipping 10 heads in a row. If coin tosses are continued indefinitely, such sequences within the longer sequence are to be expected. The average approaches a precise value as the number of trials approaches infinity.
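Both claims are easy to check numerically (a sketch in Python; the seed and sample size are arbitrary):

```python
# A long run of simulated fair coin tosses: the running average approaches 1/2,
# and a run of 10 heads, though individually unlikely (1 chance in 1,024),
# is to be expected somewhere within a long enough sequence.
import random

rng = random.Random(1)                                   # arbitrary seed
tosses = [rng.randint(0, 1) for _ in range(100_000)]     # 1 = heads

print("average:", sum(tosses) / len(tosses))             # close to 0.5

longest = current = 0
for t in tosses:
    current = current + 1 if t == 1 else 0
    longest = max(longest, current)
print("longest run of heads:", longest)                  # typically well over 10
```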

 

page 70 middle

Their imperfections may systematically skew the results—hence the use of loaded dice to cheat.

These differences, between theoretical and real situations, led to some confusion about their relationship in the minds of the early originators of probability theory. Pascal, for example, claimed that calculations of probability concerned “what is rightly due,” but that this can be “little investigated by trial.” To a modern mind, repeated trials of actual dice do give “what is rightly due”—in the limit of an infinite sequence. The idea of convergence to a limit was the key concept that eventually put the calculus, for example, on a rigorous footing.

 

page 70 lower middle

Logical reasoning about evidence has a long history, invoking diverse interpretations of probability.

Thus, a medieval commentator could state: “For nothing prevents some false propositions from being more probable than true ones.” [Pierre de Ceffons, quoted in James Franklin The Science of Conjecture: evidence and probability before Pascal Johns Hopkins U. Press 2001, p208-09] This sounds rather odd to a modern ear, for we regard propositions known to be true as 100 percent probable, and those known to be false as having zero probability! (Only when it is actually raining, for instance, is there a 100 percent chance of rain!) For medieval thinkers, however, a proposition could be certain because it was an indisputable article of the Catholic faith (in effect, an element of a deductive system). The merely probable, on the other hand, involved everything else: evidence of reason, the senses, personal testimony, or common sense.

 

page 71 top

Probability is a complex issue even in hard science.

Author Jonah Lehrer describes a phenomenon known as the “decline effect.” This refers to the inability of later experiments or studies to confirm original research findings. In part it can be explained by the skewed way in which data are interpreted in the first place. Research with an invested bias is particularly vulnerable, such as pharmaceutical research or studies of paranormal phenomena, especially wherever statistical correlations must be analyzed. [“The Truth Wears Off” The New Yorker Dec 13, 2010]

 

page 72 top

There may be minor variations in density, but never such a major fluctuation…

The rearrangement of molecules through their random motions in a gas is not analogous to the shuffling of cards in a deck (no matter how many cards). The latter represents a potentially well-defined operation, which could have an inverse operation taking the cards back to their previous ordering. Looked at that way, card shuffling is not truly randomizing. The random rearrangement of molecules, however, is not a well-defined operation and has no definable inverse, despite the reversibility of the “laws of motion.” In any case, a single shuffle (a “fluctuation”) cannot return a randomized deck to a completely ordered state. Poincaré recurrence would have to take place incrementally, through at least as many steps as were required to randomize in the first place. There is thus no basis to believe that a gas could spontaneously return to an ordered state as the result of a “fluctuation.” There are physical forces between molecules, but not between cards. Furthermore, individual molecules do not have the identity that individual cards have.
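The claim that a shuffle is a well-defined, invertible operation can be illustrated directly (a sketch in Python; the permutation is simply recorded and undone):

```python
# A card shuffle is a permutation: once the permutation is recorded, it has a
# well-defined inverse that restores the original ordering exactly.
import random

deck = list(range(52))               # an ordered deck, labeled 0..51
perm = list(range(52))
random.shuffle(perm)                 # one "shuffle", expressed as a permutation

shuffled = [deck[perm[i]] for i in range(52)]

inverse = [0] * 52                   # build the inverse permutation
for i, p in enumerate(perm):
    inverse[p] = i

restored = [shuffled[inverse[i]] for i in range(52)]
print(restored == deck)              # True: the operation is reversible
```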

 

page 72 upper bottom

…prior probability can only apply to well-defined situations—which the origin of the cosmos is not.

A similar argument can be made that the universe would either have collapsed back upon itself, or else expanded too rapidly for stars to form, if the initial expansion rate of the Big Bang had differed by as little as one part in 10^60. The recent discovery of the accelerating rate of expansion makes a mockery of this mechanistic dependence on precise initial conditions, which did not include the still mysterious role of “dark energy,” whatever that turns out to be.

 

page 73 top

each interpretation is appropriate in different circumstances.

Specifically, the frequentist approach is more appropriate for repeatable experiments whose results are inherently statistical, while the Bayesian approach is more appropriate for theoretical predictions regarding singular events, based on well-defined models whose fit to nature remains a matter of faith. Some situations call for a combination of both. “For example, when a patient is suspected of having cancer, the physician assigns an initial probability… based on data such as known incidence of the disease in the general population, the patient’s family history, and other relevant factors. On receiving the patient’s test results, the doctor then updates this probability using Bayes’ law. The resulting number is no more and no less than the doctor’s personal degree of belief.” [Hans Christian von Baeyer “Quantum Weirdness? It’s all in your mind” Scientific American June 2013] Note that the first assessment considers only the statistic, not the individual; the update may be a self-fulfilling prophecy—for example, if a biopsy induces the condition it is supposed to detect by releasing malignant cells into the blood stream, thus bringing about metastasis when such cells are in fact present but contained.
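A numerical sketch of such an update (in Python; the prevalence, sensitivity, and false-positive figures below are invented for illustration, not taken from the article):

```python
# Bayes' law applied to a diagnostic test: a prior (base-rate) probability is
# updated by a positive test result into a posterior degree of belief.
# All numbers are hypothetical, chosen only to show the arithmetic.

prior = 0.01            # assumed prevalence in the relevant population
sensitivity = 0.90      # P(positive test | disease)
false_positive = 0.05   # P(positive test | no disease)

# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
evidence = sensitivity * prior + false_positive * (1 - prior)
posterior = sensitivity * prior / evidence

print(f"prior:     {prior:.3f}")
print(f"posterior: {posterior:.3f}")   # about 0.154 with these invented figures
```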

 

page 73 bottom

…quantum statistics yields a modified pattern, characterized by superposed curves (interference).

It turns out that indefinitely many patterns are possible between the extremes of Bose-Einstein and Fermi-Dirac statistics. Hypothetical particles exhibiting these intermediate statistics are generically called “anyons”. [cf. Science Weekly April 1991: “Frank Wilczek on Anyons and their Role in Superconductivity”]

 

page 74 lower middle

Randomness has no rigorous meaning apart from such tests.

Many mathematical definitions of randomness have been proposed. In general, a random sequence should be 1) unpredictable, 2) incompressible, 3) statistically typical. The fair toss of a coin meets these criteria. Coin tosses are used for random decisions because the single event is unpredictable; the actual (indefinitely long) sequence of coin tosses cannot be generated by a formula (compressed); yet, that sequence averages out to a probability of ½ (statistically typical).

A random number is a non-terminating sequence of digits, which do not repeat in any predictable way and are not known to have been generated by any formula. There is no finite general procedure to prove that a number has not been algorithmically generated. Furthermore, any infinite sequence could contain indefinitely large (but finite) non-random stretches. Similarly, a random event means a single trial in a series that does not turn out to follow some pattern that can be compressed mathematically in a formula. (Thus, a random event corresponds to a digit in the decimal expansion of a random number.) It makes no more sense to speak of the randomness of the event in isolation, without reference to its context, than it does to speak of the randomness of a single digit without reference to the sequence of digits to which it belongs. Cf. John von Neumann’s paradoxical-sounding observation that there is no such thing as a random number, only methods to produce them. By definition, there is no method to produce “truly” random numbers, but there is a class of numbers (digits or sequences) that are non-terminating and look random, though they can be generated by formulas. These are “pseudo-random” numbers, such as π.
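A minimal sketch of such a formula-generated sequence (a textbook linear congruential generator in Python; the constants are common published choices and the seed is arbitrary):

```python
# A linear congruential generator: the output looks random to casual inspection,
# yet every value is produced by one short, fully known formula -- the defining
# mark of a pseudo-random, as opposed to truly random, sequence.

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    values = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m          # the entire generating "rule" is this line
        values.append(x / m)         # scale to the interval [0, 1)
    return values

print(lcg(seed=12345, n=5))
```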

The concept of the random can be associated with the so-called non-computable real numbers: what is left over when the rational numbers and the computable irrational numbers are removed from the real number continuum. (Irrationals such as √2 are computable because they can be generated through a procedure, such as taking a square root.) The idea of the non-computable numbers is somewhat paradoxical (as the irrationals must have seemed to the Greeks), since by definition they cannot be specified through a procedure. You cannot put your finger on them, so to speak, even though they can be proven to exist. Some mathematicians do not take them seriously. The early 20th-century mathematician Émile Borel called them “inaccessible.” Borel (and presumably other “constructivists” of his era) believed that a number is real only if it is computable (can be constructed). Even some contemporary mathematicians might prefer to ignore them. Max Tegmark, for example, seems to prefer to do without non-computable numbers, which he suspects might be “a mere illusion, fundamentally undefined and simply not existing in any meaningful sense.” [Max Tegmark “The Mathematical Universe” 2007, sec VII F]

Computable numbers have identity: they can be named, described, counted as distinct individuals. But they are infinitely outnumbered by other positions on the real number continuum that cannot be so specified. The computable numbers are separated by gaps in this continuum that are infinitely further divisible. The computable numbers correspond to instructions on how to specify certain points. The real number continuum corresponds to unspecifiable points as well. These categories seem as different as the found and the made.

The non-computable reals are inaccessible to a physics based on computing specifiable events, entities, or states. (Cf. James B. Hartle Sources of Predictability 1997: “… it is important for practical predictability that the known laws of physics predict computable numbers extractable in particular circumstances by the application of standard algorithms. This is not a trivial statement because there are infinitely many more non-computable numbers than computable ones.”) While prediction requires computable numbers, physical reality (or at least real space) is assumed in classical physics to correspond to the real number continuum. Physics could be re-formulated to eliminate non-computable numbers; but such a move would amount to treating nature as a deductive system. Randomness would be eliminated at the price of reality. The only possible events would be solutions to existing equations. In effect, such a world would be a simulation, generated by an algorithm, an artificial world in which there would only exist pseudo-random numbers and programmed events.

Indeed, in the mid-twentieth century, mathematicians discovered whole classes of simple functions that can generate indefinite complexity, based on pseudo-randomness. The Mandelbrot set is an example, along with many others that demonstrate self-similarity. Such functions are used routinely to create the computer enhancements and simulations that have become such a major part of motion picture entertainment. In our computer age, it is all the more tempting to imagine that this is how the universe really is: a deterministic place after all, generated by simple laws in the way that a virtual reality is. In such a world, the found is the made!
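A minimal sketch of the kind of simple rule meant here (the escape-time test for membership in the Mandelbrot set, in Python; the grid and iteration limit are arbitrary):

```python
# The Mandelbrot set: a one-line iteration rule, z -> z*z + c, generates
# endlessly intricate structure. All of the apparent "detail" is algorithmic.

def in_mandelbrot(c, max_iter=50):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:               # escaped: c lies outside the set
            return False
    return True

# Crude text rendering over a small grid of complex values.
for im in range(12, -13, -2):
    row = ""
    for re in range(-40, 21):
        row += "#" if in_mandelbrot(complex(re / 20, im / 10)) else "."
    print(row)
```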

 

page 74 lower middle

…the situation is less than straightforward, if only because randomness cannot be proven.

The Decision Problem is the challenge to find a general procedure that would decide whether any proposition formulated within a given deductive system is provable within that system. It was put forward by Hilbert in 1928 as one of the major remaining unresolved problems in mathematics. It was interpreted by Church, Turing, and others as the problem of finding a general program to decide the computability of functions. Later, it was re-interpreted by Gregory Chaitin as the problem of finding a general procedure to decide whether a given sequence is mathematically compressible. In all versions, the answer is: no. This ostensibly negative result at least represents progress in clarifying the concept of the random. A sequence of numbers is then considered random if no shorter expression of it can be found. It therefore amounts to a primitive “axiom” of some system, insofar as it cannot be derived from simpler expressions. However, this makes the concept depend on computational resources and techniques. A sequence of digits is evaluated according to its “complexity,” which measures the size of the compressing algorithm relative to the sequence itself. By such a definition, randomness is not an either/or affair, but a matter of degree, at least for finite sequences. This suggests that randomness, even mathematically defined, is less a property than a relationship.
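The “matter of degree” can be made concrete with an ordinary file compressor standing in, very loosely, for the ideal (and uncomputable) measure (a sketch in Python using zlib; the compression ratio is only a crude upper bound on complexity):

```python
# Compressibility as a rough practical proxy for complexity: a patterned
# sequence compresses to a small fraction of its size; a random-looking one
# hardly compresses at all. (A real compressor gives only an upper bound on
# the true, uncomputable complexity.)
import os
import zlib

patterned = b"0123456789" * 10_000        # 100,000 bytes with obvious structure
random_like = os.urandom(100_000)         # 100,000 bytes from the OS entropy source

for name, data in [("patterned", patterned), ("random-like", random_like)]:
    ratio = len(zlib.compress(data, 9)) / len(data)
    print(f"{name:12s} compressed to {ratio:.3f} of original size")
```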

Gregory Chaitin’s famous theorem does not exactly say that it is impossible to prove a number or sequence random. Rather, it says that “in a formal system of complexity n it is impossible to prove that a particular series of binary digits is of complexity greater than n+c, where c is a constant [reflecting the machine language] that is independent of the particular system employed.” In other words, a given program cannot prove a sequence to be more random (complex) than itself. Chaitin’s theorem concerns formal systems. By my hypothesis, nature—or any natural system—cannot be reduced to a formal system. Chaitin’s theorem says that even if it could, by extension it cannot be proven that it is more complex than the system (of complexity n) to which it is compared.

Furthermore, a random sequence may be either one for which no compressed expression can be found or one that has already been maximally compressed. Applied to information theory, for example, this means that if you eliminate all the redundancy in a message you end up with something that appears to be noise—that is, not a message at all! Cf. Gregory Chaitin, “How Real are Real Numbers?” [web archive]: “This may seem paradoxical, but it is a basic result in information theory that once you compress something and get rid of all the redundancy in it, if you take a meaningful message and do this to it, afterwards it looks just like noise.”