RhodesR@BottomLayer.com

This paper surveys evidence and arguments for the proposition that the universe as we know it is not a physical, material world but a computer-generated simulation -- a kind of virtual reality. The evidence is drawn from the observations of natural phenomena in the realm of quantum mechanics. The arguments are drawn from philosophy and from the results of experiment. While the experiments discussed are not conclusive in this regard, they are found to be consistent with a computer model of the universe. Six categories of quantum puzzles are examined: quantum waves, the measurement effect (including the uncertainty principle), the equivalence of quantum units, discontinuity, non-locality, and the overall relationship of natural phenomena to the mathematical formalism. Many of the phenomena observed in the laboratory are puzzling because they are difficult to conceptualize as physical phenomena, yet they can be modeled exactly by mathematical manipulations. When we analogize to the operations of a digital computer, these same phenomena can be understood as logical and, in some cases, necessary features of computer programming designed to produce a virtual reality simulation for the benefit of the user.

Ver. 2.0 July 11, 2001

http://www.bottomlayer.com/bottom/argument/Argument4.PDF

A. Waves with no medium, as though they were mathematical formulae only

If light were made of particles, they would travel in straight lines from the source and hit the screen in two places.

If light traveled as waves, it would spread out, overlap, and form a distinctive pattern on the screen.

**The First Computer Analogy.** One way to resolve this seeming paradox of waves without a medium is to note that there is another kind of wave altogether: a wave with which we are all familiar, yet which exists without any medium in the ordinary sense. This is the computer-generated wave. Let us examine a computer-generated sound wave.

*would look like* if it were played by a "real" instrument. The synthesizer's output is routed to a computer and stored as a series of numbers. The numbers are burned into a disk as a series of pits that can be read by a laser -- in other words, a CD recording. The CD is shipped to a store. You buy the CD, bring it home, put it in your home entertainment system, and press the play button. The "music" has traveled from the recording studio to your living room. Through what medium did the music wave travel? To a degree, you might say that it traveled as electricity through the wires from the keyboard to the computer. But you might just as well say it traveled by truck along the highway to the store. In fact, this "sound wave" never existed as anything more than a digital representation of a hypothetical sound wave which itself never existed. It is, first and last, a string of numbers. Therefore, although it will produce wave-like effects when placed in your stereo, this wave never needed any medium other than the computer memory to spread itself all over the music-loving world. As you can tell from your CD collection, computers are very good at generating, storing, and regenerating waves in this fashion.
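The journey described above can be reduced to code. A minimal sketch; the CD rate of 44,100 samples per second is real, but the 440 Hz tone and the 8-sample excerpt are arbitrary choices for illustration:

```python
import math

def synthesize_tone(freq_hz, sample_rate, n_samples):
    """Render a pure tone as a list of numbers -- the only form in which
    this 'sound wave' exists between the studio and the living room."""
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate)
            for n in range(n_samples)]

# A hypothetical 440 Hz note at the CD rate of 44,100 samples per second.
samples = synthesize_tone(440.0, 44100, 8)
# 'samples' is just a string of numbers; written to disk, shipped by
# truck, and later handed to a speaker, it regenerates the wave.
```

Nothing in the list of samples is a wave; it only becomes one at playback.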

Calculations from an equation [here, y = sin (x) + sin (2.5 x)] produce a string of numbers, i.e., 1, 1.5, 0.4, 0, 0.5, 1.1, 0.3, -1.1, -2, -1.1, 0.1, and 0.5.

These numbers can be graphed to create a picture of the wave that would be created by combining (interfering) the two simple sine waves.
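The same sampling can be carried out directly. A minimal sketch of the calculation; the sample spacing here is an arbitrary choice, so the numbers will differ from the illustrative list above:

```python
import math

def wave(x):
    # The composite wave from the text: y = sin(x) + sin(2.5 x)
    return math.sin(x) + math.sin(2.5 * x)

# Sampling the formula produces nothing but a string of numbers ...
xs = [i * 0.5 for i in range(12)]
ys = [round(wave(x), 2) for x in xs]
# ... and graphing those numbers recreates the interference picture.
```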

*exist* in the absence of any medium.

B. Waves of calculation, not otherwise manifest, as though they really were differential equations

*the underlying process itself was nothing more than calculation*.

**The Second Computer Analogy.** A process that produces a result based on nothing more than calculation is an excellent way to describe the operations of a computer program. The two-step procedure of the Schrodinger equation and the Feynman system may be impossible to duplicate with physical systems, but for the computer it is trivial. That is what a computer does -- it manipulates numbers and calculates. (As we will discuss later, the computer must then interpret and display the result to imbue it with meaning that can be conveyed to the user.)
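The two-step procedure is easy to state in code. This is a minimal sketch, not the Schrodinger equation itself: complex amplitudes for each path are summed (step one), and the squared magnitude of the sum gives the probability (step two). The phase values are illustrative:

```python
import cmath

def detection_probability(phases):
    """Step 1: add a complex amplitude for each possible path.
    Step 2: the probability is the squared magnitude of the sum."""
    amplitude = sum(cmath.exp(1j * phase) for phase in phases)
    return abs(amplitude) ** 2

# Two paths in phase reinforce (a bright fringe) ...
bright = detection_probability([0.0, 0.0])
# ... while two paths half a wavelength apart cancel (a dark fringe).
dark = detection_probability([0.0, cmath.pi])
```

For the computer, the "wave of calculation" is nothing but this arithmetic.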

**Wave summary.** Quantum mechanics involves "waves" which cannot be duplicated or even approximated physically; but which easily can be calculated by mathematical formula and stored in memory, creating in effect a static map of the wave shape. This quality of something having the appearance and effect of a wave, but not the nature of a wave, is pervasive in quantum mechanics, and so is fundamental to all things in our universe. It is also an example of how things which are inexplicable in physical terms turn out to be necessary or convenient qualities of computer operations.

A. "Collapse of the wave function" -- consciousness as mediator, as though the sensory universe were a display to the user

**However, a most mysterious thing** happens when we detect these particles at the slots: the interference pattern disappears and is replaced by a clumping in line with the source and the slots. Thus, if we thought that some type of wave was traveling through this space in the absence of observation, we find instead a true particle upon observation -- a particle which behaves just like a particle is supposed to behave, to the point even of traveling in straight lines like a billiard ball.

**The computer analogy.** As John Gribbin puts it, "nature seems to 'make the calculation' and then present us with an observed event."[2] Both the "how" and the "why" of this process can be addressed through the metaphor of a computer which is programmed to project images to create an experience for the user, who is a conscious being.

*position* and *momentum*. According to classical Newtonian physics and to common sense, if an object simply exists we should be able to measure both where it is and how fast it is moving. Measuring these two properties would allow us to predict where the object will be in the future. In practice, it turns out that both position and momentum cannot be exactly determined at the same moment -- a discovery that threw a monkey wrench into the clockwork predictability of the universe. Put simply, the uncertainty relationship is this: for any two complementary properties, any *increase in the certainty of knowledge* of one property will necessarily lead to a *decrease in the certainty of knowledge* of the other property.

*change* the momentum, as they made the momentum *less certain*, less predictable. On remeasurement, the momentum might be the same, faster, or slower. What is more, the *range* of uncertainty of momentum increased in direct proportion to the accuracy of the measurement of location.

*mathematical incompatibility* between the two properties. Heisenberg was able to state that there was a mathematical relationship between position and momentum such that the more precise your knowledge of the one, the less precise your knowledge of the other. This "uncertainty" followed a formula which, itself, was quite certain. Heisenberg's mathematical formula accounted for the experimental results far, far more accurately than any notion of needing better equipment in the laboratory. It seems, then, that uncertainty in the knowledge of two complementary properties is more than a laboratory phenomenon -- it is a law of nature which can be expressed mathematically.
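That law can be put in numbers. A minimal sketch using the modern form of Heisenberg's relation, delta-x times delta-p >= hbar/2; the relation's symbolic form and the atom-sized example are supplied here for illustration, not drawn from the text:

```python
HBAR = 1.054571817e-34  # reduced Planck constant (quantum of action), J*s

def min_momentum_uncertainty(delta_x):
    """Heisenberg's relation: dx * dp >= hbar / 2. Pinning position
    down to within delta_x forces at least this much spread in
    momentum -- by formula, not by faulty lab equipment."""
    return HBAR / (2 * delta_x)

# Confine a particle to an atom-sized region (~1e-10 m) ...
dp = min_momentum_uncertainty(1e-10)
# ... and the momentum spread below is guaranteed by the formula.
```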

**A good way to understand** the uncertainty principle is to take the extreme cases. As we will discuss later on, a distinguishing feature of quantum units is that many of their properties come in whole units and whole units only. That is, many quantum properties have an either/or quality such that there is no in between: the quantum unit must be either one way or the other. We say that these properties are "quantized," meaning that the property must be one specific value (quantity) or another, but never anything else. When the uncertainty principle is applied to two complementary properties which are themselves quantized, the result is stark. Think about it. If a property is quantized, it can only be one way or the other; therefore, if we know *anything* about this property, we know *everything* about this property.

*everything* about one complementary property is that, as a law of nature, we then would know *nothing* about the other complementary property. For our example, we must imagine that, by learning whether a married woman is pregnant, we thereby no longer know whether she is married. We don't forget what we once knew; we just can no longer be certain that we will get any particular answer the next time we check on her marital status. The mathematical statement is that, by knowing pregnancy, you do not know whether she is married; and, by knowing marital status, you do not know whether she is pregnant. In order to make this statement true, if you once know her marital status and then learn her pregnancy status (without forgetting your prior knowledge of her marital status), the very fact of her marital status must become a random yes or no. A definite maybe.

**A computer's data.** If we cease to think of the quantum unit as a "thing," and begin to imagine it as a pixel, that is, as a display of information in graphic (or other sensory) form, it is far easier to conceive of how the uncertainty principle might work. The "properties" we measure are variables which are computed for the purpose of display, which is to say, for the purpose of giving the user knowledge via the interface. A computed variable will display according to the underlying algorithm each time it is computed, and while the *algorithm* remains stable, the results of a particular *calculation* can be made to depend on some other factor, including another variable.
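A toy model can show how such a display variable might behave. This is a sketch assuming nothing about real physics: the class and its measurement rule are inventions for illustration only.

```python
import random

class QuantumPixel:
    """Toy model (not real physics): two complementary, quantized
    properties, a and b. Reading one re-randomizes the other, so
    knowing everything about a leaves you knowing nothing about b."""
    def __init__(self):
        self._a = random.choice([0, 1])
        self._b = random.choice([0, 1])

    def measure_a(self):
        self._b = random.choice([0, 1])  # the complement goes random
        return self._a

    def measure_b(self):
        self._a = random.choice([0, 1])  # the complement goes random
        return self._b

pixel = QuantumPixel()
first = pixel.measure_a()
# Repeating the same measurement is perfectly stable ...
assert pixel.measure_a() == first
# ... until the complementary property is measured in between.
pixel.measure_b()
```

The algorithm is stable; only the result of a particular calculation is made to depend on the other variable.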

*either* C *or* D. This affects the math predicting what will happen in any given quantum situation and, as it turns out, the final probabilities agree with this interchangeable state of affairs.

**The computer analogy.** Roger Penrose has likened this sameness to the images produced by a computer.[4] Imagine the letter "t." On the page you are viewing, the letter "t" appears many times. Every letter t is exactly like every other letter t. That is because on a computer, the letter t is produced by displaying a particular set of pixels on the screen. You could not, even in principle, tell one from the other because each is the identical image of a letter t. The formula for this image is buried in many layers of subroutines for displaying pixels, and the image does not change regardless of whether it is called upon to form part of the word "mathematical" or "marital".
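Penrose's point can be sketched directly. The tiny "font" below is invented for illustration: one subroutine, one data record, so every rendered "t" is identical in principle, not by accident.

```python
# A tiny "font": one data record produces the letter wherever it appears.
GLYPHS = {
    "t": (
        " # ",
        "###",
        " # ",
        " ##",
    ),
}

def render(char):
    """Every request for 't' runs the same code on the same data, so
    every on-screen 't' is, even in principle, the identical image."""
    return GLYPHS[char]

# The 't' in "mathematical" and the 't' in "marital" are one image.
assert render("t") is render("t")
```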

IV. Continuity and Discontinuity in Observed Behaviors

A. "Quantum leaps," as though there were no time or space between quantum events

*there is no time or space in which the process exists in any intermediate state*.

**The pre-computer analogy.** Before computer animation there was the motion picture. Imagine that you are watching a movie. The motion on the screen appears to be smooth and continuous. Now, the projectionist begins to slow the projection rate. At some point, you begin to notice a certain jerkiness in the picture. As the projection rate slows, the jerkiness increases, and you are able to focus on one frame of the movie, followed by a blanking of the screen, followed by the next frame of the movie. Eventually, you see that the motion which seemed so smooth and continuous when projected at 30 frames per second or so is really only a series of still shots. There is no motion in any of the pictures, yet by rapidly flashing a series of pictures depicting intermediate positions of an actor or object, the effective illusion is one of motion.

**The computer analogy.** Computers create images in the same manner. First, they compose a still image and project it; then they compose the next still image and project that one. If the computer is quick enough, you do not notice any transition. Nevertheless, the computer's "time" is completely discrete, discontinuous, and digital. One step at a time.
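A minimal sketch of that frame-by-frame procedure; the moving object, its speed, and the frame count are arbitrary:

```python
def animate(start, velocity, n_frames):
    """A discrete 'movie': each frame is a complete still image of the
    world; there is no position between one frame and the next."""
    return [start + velocity * frame for frame in range(n_frames)]

# Ten stills of an object moving 2 units per frame.
stills = animate(0.0, 2.0, 10)
# Projected quickly enough, the discrete jumps read as smooth motion.
```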

*only* these three numbers. By contrast, if you consider any *physical* object, it will have some size, which is to say it will have its own height, width, and depth. If you were to exactly place such a physical object, you would have to take into account its own size, and to do so you would have to assign coordinates to each *edge* of the object.

*as particles*, there does not seem to be any easy way to determine their outer edges, if, in fact, they have any outer edges. Accordingly, quantum "particles" are designated as simple points, without size and, therefore, without edges. The three coordinate numbers are then sufficient to locate such a pointlike particle at a single point in space.

*somewhere* short of the actual zero point. It seemed much too arbitrary. Nevertheless, this mathematical quirk eventually gave physicists a method for doing their calculations according to a process called "renormalization," which allowed them to keep their assumption that an actual zero point exists, while balancing one positive infinity with another negative infinity in such a way that all of the infinities cancel each other out, leaving a definite, useful number.

*any* arbitrarily small unit of distance -- is sufficient for the resolution of the paradox. For the physicist, however, there should appear some *reason* for choosing one small distance over another. None of the theoretical models has presented any compelling reason for choosing any particular distance as the "quantum of length." Because no such reason appears, the physicist resorts to the "renormalization" process, which is profoundly dissatisfying to both philosopher and physicist. Richard Feynman, who won a Nobel prize for developing the renormalization process, himself describes the procedure as "dippy" and "hocus-pocus." The need to resort to such mathematical sleight-of-hand to obtain meaningful results in quantum calculations is frequently cited as the most convincing piece of evidence that quantum theory -- for all its precision and ubiquitous application -- is somehow lacking, somehow missing something. It may be that one missing element is quantized space -- a shortest distance below which there is no space, and below which one need not calculate. The arbitrariness of choosing the distance would be no more of a theoretical problem than the arbitrariness of the other fundamental constants of nature -- the speed of light, the quantum of action, and the gravitational constant. None of these can be derived from theory; they are simply observed to be constant values. Alas, this argument will not be settled until we can make far more accurate measurements than are possible today.

**Quantum time.** If space is quantized, then time almost surely must be quantized also. This relationship is implied by the theory of relativity, which supposes that time and space are so interrelated as to be practically the same thing. Thus, relativity is most
commonly understood to imply that space and time cannot be thought of in isolation from each other; rather, we must analyze our world in terms of a single concept -- "space-time." Although the theory of relativity is largely outside the scope of this essay, the reader can see from Zeno's paradoxes how space and time are intimately related in the analysis of motion. For the moment, I will only note that the theory of relativity significantly extends this view, to the point where space and time may be considered two sides of the same coin.

*if* time is not continuous, then the changes are taking place too rapidly to measure, and too rapidly to make any detectable difference in any experiment that they have dreamed up. The theoretical work that has been done on the assumption that time *may* consist of discontinuous jumps often focuses on the most plausible scale, which is related to the three fundamental constants of nature -- the speed of light, the quantum of action, and the gravitational constant. This is sometimes called the "Planck scale," involving the "Planck time," after the German physicist Max Planck, who laid much of the foundation of quantum mechanics through his study of minimum units in nature. On this theoretical basis, the pace of time would be around 10^-44 seconds -- roughly a hundred-millionth of a trillionth of a trillionth of a trillionth of a second. And that is much too quick to measure by today's methods, or by any method that today's scientists are able to conceive of, or even hope for.
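The Planck time can be computed from the three constants the text names. A sketch using their standard SI values:

```python
import math

# The three fundamental constants named in the text (SI units).
C = 2.99792458e8        # speed of light, m/s
HBAR = 1.054571817e-34  # quantum of action (reduced), J*s
G = 6.67430e-11         # gravitational constant, m^3/(kg*s^2)

# Planck time: the natural "tick" that combines all three constants.
t_planck = math.sqrt(HBAR * G / C**5)
# t_planck comes out near 5.4e-44 seconds -- the scale quoted above.
```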

**Mixing philosophy, science, time, and space.** We see that the branch of physics known as relativity has been remarkably successful in its conclusion that space and time are two sides of the same coin, and should properly be thought of as a single entity: space-time. We see also that the philosophical logic of Zeno's paradoxes has always strongly implied that both space and time are quantized at some smallest, irreducible level, but that this conclusion has long been resisted because it did not seem to agree with human experience in the "real world." Further, we see that quantum mechanics has both discovered the ancient paradoxes anew in its mathematics, and provided some evidence of quantized space and time in its essential experimental results showing that "physical" processes jump from one state to the next without transition. The most plausible conclusion to be drawn from all of this is that space and time are, indeed, quantized. That is, there is some unit of distance or length which can be called "1," and which admits no fractions; and, similarly, there is some unit of time which can be called "1," and which also admits no fractions.

The essence of a local interaction is direct contact -- as basic as a punch in the nose. Body A affects body B locally when it either touches B or touches something else that touches B. A gear train is a typical local mechanism. Motion passes from one gear wheel to another in an unbroken chain. Break the chain by taking out a single gear and the movement cannot continue. Without something there to mediate it, a local interaction cannot cross a gap.

On the other hand, the essence of non-locality is unmediated action-at-a-distance. A non-local interaction jumps from body A to body B without touching anything in between. Voodoo injury is an example of a non-local interaction. When a voodoo practitioner sticks a pin in her doll, the distant target is (supposedly) instantly wounded, although nothing actually travels from doll to victim. Believers in voodoo claim that an action here causes an effect there; that's all there is to it. Without benefit of mediation, a non-local interaction effortlessly flashes across the void.[5]

*implied* that their properties thereafter would be connected regardless of separation in space or time (just as *x* + 2 = 4 *implies* that *x* = 2). It then turned out that these properties *are* connected regardless of separation in space or time. The experimentalists in the laboratory had confirmed that where the math can be manipulated to produce an absurd result, the matter and energy all around us obligingly will be found to behave in exactly that absurd manner. In the case of non-locality, the behavior is uncomfortably close to magic.

**The computer analogy.** The non-locality which appears to be a basic feature of our world also finds an analogy in the same metaphor of a computer simulation. In terms of cosmology, the scientific question is, "How can two particles separated by half a universe be understood as connected such that they interact as though they were right on top of each other?" If we analogize to a computer simulation, the question would be, "How can two pictures at the far corners of the screen be understood as connected such that the distance between them is irrelevant?"
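A minimal sketch of that answer: the hypothetical Screen class below backs both corners with a single shared record in memory, so their on-screen separation never enters the program at all.

```python
class Screen:
    """Toy display: pixels at opposite corners are rendered from one
    shared record. 'Distance' on screen is irrelevant to the program."""
    def __init__(self, size):
        self.size = size
        self.shared_state = {"color": "red"}

    def pixel(self, x, y):
        # Every corner reads the same underlying record.
        return self.shared_state["color"]

screen = Screen(1000)
screen.shared_state["color"] = "blue"
# The two far corners change together, instantly, with nothing
# traveling across the screen between them.
assert screen.pixel(0, 0) == screen.pixel(999, 999) == "blue"
```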

*always* prevail over physical insight -- the equivalence between quantum symbolism and universal reality must be more than an oddity: it must be the very nature of reality.

**The final computer analogy.** An example which literally fits this description is the computer simulation, which is a graphic representation created by executing programming code. The programming code itself consists of nothing but symbols, such as 0 and 1. Numbers, text, graphics and anything else you please are coded by unique series of numbers. These symbolic codes have no meaning in themselves, but arbitrarily are assigned values which have significance according to the operations of the computer. The symbols are manipulated according to the various step-by-step sequences (algorithms) by which the programming instructs the computer how to create the graphic representation. The picture presented on-screen to the user is a world executed in colored dots; the computer's programming is a world (the same world) executed in symbols. Anyone who has experienced a computer crash knows that the programming (good or bad) governs the picture, and not vice versa. All of this forms a remarkably tight analogy to the relationship between the quantum math on paper, and the behavior of the "quantumstuff" in the outside world.
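The arbitrariness of the assignment can be shown with a single byte. A sketch; the three "interpretations" are illustrative:

```python
# One and the same string of symbols, three meanings -- by assignment only.
bits = 0b01000001                 # a byte of program data: 01000001

as_number = bits                  # displayed as an integer: 65
as_text = chr(bits)               # displayed as a character: 'A'
as_pixel = (bits, bits, bits)     # displayed as a gray RGB dot

# The symbols have no meaning in themselves; the display routine
# arbitrarily assigns the values that reach the user.
```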

Great Neck, New York

May 2, 1999

1. M. Kaku, Hyperspace, at 8n.

2. J. Gribbin, In Search of Schrodinger's Cat, 111.

3. J. Gleick, Genius, 122.

4. R. Penrose, The Emperor's New Mind, 25-26. See also D. Eck, The Most Complex Machine, 8-9.

5. N. Herbert, Quantum Reality, 212-13.

6. "Entangled Trio to Put Nonlocality to the Test," Science 283, 1429 (Mar. 5, 1999).

7. N. Herbert at 41.

8. N. Herbert at 41.