VICUG-L Archives

Visually Impaired Computer Users' Group List

VICUG-L@LISTSERV.ICORS.ORG

Subject:
From: Kelly Pierce <[log in to unmask]>
Reply To: VICUG-L: Visually Impaired Computer Users' Group List
Date: Sun, 3 Jan 1999 11:00:27 -0600
Content-Type: TEXT/PLAIN
Parts/Attachments: TEXT/PLAIN (1752 lines)
In 1976, Ray Kurzweil launched the blind into the digital revolution when
he demonstrated the first Kurzweil Reading Machine, which converted
printed text into synthetic speech.  The machine and its constituent
technologies of flatbed scanning, optical character recognition and speech
synthesis began a process of increasing information independence for the
blind.  Today's New York Times Book Review features a review of a new book
by Ray Kurzweil.  The review appears below, followed by the full text of the
first chapter; the two items are separated by a line of asterisks (****).

kelly 


   
      January 3, 1999
      
     Hello, HAL
     ______________________________________________________________
     
     Three books examine the future of artificial intelligence and find
     the human brain is in trouble.
     

      By COLIN McGINN
      
     _________________________________________________________________
                                                                         
     THE AGE OF SPIRITUAL MACHINES
     When Computers Exceed Human Intelligence.
     By Ray Kurzweil. Illustrated. 388 pp. New York: Viking. $25.95.
     _________________________________________________________________

     ROBOT
     Mere Machine to Transcendent Mind.
     By Hans Moravec. Illustrated. 227 pp. New York: Oxford University Press. $25.
     _________________________________________________________________

     WHEN THINGS START TO THINK
     By Neil Gershenfeld. Illustrated. 225 pp. New York: Henry Holt & Company. $25.
     _________________________________________________________________

   Has the invasion already begun? Are the aliens already right under
   our noses? Are machines, the products of human engineering
   intelligence, poised to take over the world -- or is this an
   irrational fear, the latest spasm of the Luddite spirit? Finally, is
   the whole idea just a clever marketing ploy for the investment-hungry
   artificial intelligence industry? Here we have three books, all
   written by experts in computer intelligence, aimed to persuade us that
   the Age of Machines is nigh. We are to be eclipsed by our own
   technology, ceding our outdated flesh, blood and neural tissue to
   integrated circuits and their mechanistic progeny. The future belongs
   to the robots.
   
   The roots of this dystopian vision (or utopian, depending on your
   view) go back to a prediction made in the mid-1960's by a former
   chairman of Intel, Gordon Moore, that the size of each transistor on
   an integrated circuit would be reduced by 50 percent every 24 months.
   This prediction, now grandly known as Moore's law, implies the
   exponentially expanding power of circuit-based computation over time.
   A rough corollary is that you will get double the computational power
   for the same price at two-year intervals. Thus computers today can
   perform millions more computations per second than equivalently priced
   computers of only a few decades ago. It is further predicted that new
   computer technologies will take over where integrated circuits leave
   off and continue the inexorable march toward exponentially increasing
   computational power. The computational capacity of the human brain is
   only a few decades away from being duplicated on an affordable
   computing machine. Brains are about to be outpaced by one of their
   products. They are already being outdone in certain areas: speed of
   calculation, data storage, theorem-proving, chess.
   
   All three of these books provide a vivid window on the state of the
   art in artificial intelligence research, and offer provocative
   speculations on where we might be heading as the information age
   advances. Of the three, ''The Age of Spiritual Machines,'' by Ray
   Kurzweil, is the best: it is more detailed, thoughtful, clearly
   explained and attractively written than ''Robot: Mere Machine to
   Transcendent Mind,'' by Hans Moravec, and ''When Things Start to
   Think,'' by Neil Gershenfeld -- though all three are creditable
   efforts at popularization.
   
   Since the books cover much the same ground, with some difference of
   emphasis, Kurzweil's gives you the most bits for your buck.
   Gershenfeld's breezily chatty book sometimes reads too much like an
   advertisement for the Media Lab at M.I.T., of which he is director.
   There is much discussion (and not a little hype) of his many
   achievements in harnessing computer technology to more physical
   concerns: electronic books, smart shoes, wearable computers,
   technologically enhanced cellos.
   
   Moravec's book is more intellectually adventurous and free with
   confident futuristic speculation. He envisages autonomous robot-run
   industries that we tax to siphon off their wealth, and the gradual
   replacement of organic humans with mechanical descendants -- our
   ''mind children.'' His vision is of a world in which machines are the
   next evolutionary step, with organic tissue but a blink in the eye of
   cosmic history. Once intelligence is created by natural selection it
   will be only a matter of time (a very short one by cosmic standards)
   before the products of intelligence outshine their creators, finally
   displacing them altogether. This is good knockabout stuff, a heady and
   unnerving glimpse into a possible future. Where Moravec is weak is in
   attempts at philosophical discussion of machine consciousness and the
   nature of mind. He writes bizarre, confused, incomprehensible things
   about consciousness as an abstraction, like number, and as a mere
   ''interpretation'' of brain activity. He also loses his grip on the
   distinction between virtual and real reality as his speculations
   spiral majestically into incoherence.
   
   Kurzweil is more philosophically sensitive, and hence cautious, in his
   claims for computer consciousness; he develops the same kinds of
   speculations as Moravec, but with more of an emphasis on the meaning
   of such innovations for human life. He has an engaging discussion of
   the future of virtual sex once the technology includes realistic
   haptic simulations (what other bodies feel like to touch); here he
   envisages the eventual triumph of the virtual over the real. His book
   ranges widely over such juicy topics as entropy, chaos, the big bang,
    quantum theory, DNA computers, quantum computers, Gödel's theorem,
   neural nets, genetic algorithms, nanoengineering, the Turing test,
   brain scanning, the slowness of neurons, chess playing programs, the
   Internet -- the whole world of information technology past, present
   and future. This is a book for computer enthusiasts, science fiction
   writers in search of cutting-edge themes and anyone who wonders where
   human technology is going next.
   
   But the question must be asked: How seriously are we to take all this
   breathless compuhype? Will the 21st century really see machines
   acquire mentality?
   
   There is naturally a lot of talk in these books about the possibility
   of machines duplicating the operations of the human mind. But it is
   vital to distinguish two questions, which are often run together by
   our authors: Can machines duplicate the external intelligent behavior
   of humans? And can machines duplicate the inner subjective experience
   of people? Call these the questions of outside and inside duplication.
   What is known as the Turing test says in effect that if a machine can
   mimic the outside of a human then it has thereby replicated the
   inside: if it behaves like a human with a mind, it has a mind. All
   three authors are partial to the Turing test, thus equating the
   simulation of external manifestations of mind with the reality of mind
   itself. However, the Turing test is seriously flawed as a criterion of
   mentality.
   
   First, it is just an application of the doctrine of behaviorism, the
   view that minds reduce to bodily motions; and behaviorism has long
   since been abandoned, even by psychologists. Behavior is just the
   evidence for mind in others, not its very nature. This is why you can
   act as if you are in pain and not really be in pain -- you are just
   pretending.
   
   Second, there is the kind of problem highlighted by the philosopher
   John Searle in his ''Chinese Room'' argument: computer programs work
   merely by the manipulation of symbols without any reference to what
   these symbols might mean, so that it would be possible for a human to
   follow such a program for a language he has no understanding of. The
   computer is like my manipulating sentences of Chinese according to
   formal rules and yet having no understanding of the Chinese language.
   It follows that mimicking the externals of human understanding by
   means of a symbol-crunching computer program is not devising a machine
   that itself understands. None of our authors even so much as consider
   this well-known and actually quite devastating argument.
   
   Third, to know whether we can construct a machine that is conscious we
   need to know what makes us conscious, for only then can we determine
   whether the actual basis of consciousness can occur in an inorganic
   system. But we simply don't know what makes organic brains conscious;
   we don't know what properties of neurons are responsible for the
   emergence of subjectivity. We would need to solve the age-old
   mind-body problem before we could sensibly raise the question of minds
   in machines. My hunch is that it is something about specifically
   organic tissue that is responsible for consciousness, since this seems
   to be the way nature has chosen to engineer consciousness; but that
   can only be a guess in view of our deep ignorance of the roots of
   consciousness in the brain. In any case, lacking insight into the
   basis of consciousness, it is futile to ask whether a machine could
   have what it takes to generate consciousness.
   
   Passing the Turing test is therefore no proof of machine
   consciousness: outside duplication does not guarantee inside
   duplication. This bears strongly on a practical suggestion of Kurzweil
   -- that during the course of the 21st century we might decide to
   ''upload'' ourselves into a suitable computing machine as a way of
   extending our lives and acquiring a more robust physical constitution.
   Let us suppose that the machine you choose to upload into passes the
   Turing test; it had better or else you would not wish to inhabit it.
   The problem is that it might do so without containing the potential
   for any form of consciousness, so that uploading your mind into it
   amounts to letting your mind evaporate into thin air. You will pass
   from sentient being to insentient robot.
   
   That is a lot to risk on the veracity of the Turing test! And it is no
   good hoping that the robots themselves will tell you whether they are
   conscious, since they will say they are -- whether or not they are. If
   people become convinced of the validity of the Turing test on mistaken
   philosophical grounds, then we might find ourselves in the position of
   unknowingly extinguishing our consciousness by uploading into machines
   that are inherently incapable of feeling anything. If Kurzweil is
   right when he says that machines that mimic the externals of human
   performance will become available sometime during the next century,
   then I suggest that the human race ponder the merits of the Turing
   test very carefully before taking any drastic steps. I for one would
   prefer sentient mortality to insentient immortality, or, more
   accurately, to the end of my self and the creation of an unconscious
   machine that merely behaves like me.
   
   Kurzweil, Moravec and Gershenfeld take it as a given that the mind is
   essentially a computer. The question then is just how powerful a
   computer the mind is and whether a machine could duplicate this power.
   But the authors do not think hard enough about their basic assumption.
   It is true that human minds manipulate symbols and engage in mental
   computations, as when doing arithmetic. But it does not follow from
   this that computing is the essence of mind; maybe computing is just
   one aspect of the nature of mind. And isn't this already obvious from
   the fact that many nonmental systems engage in computations? Silicon
   chips are not conscious, nor are the components of any future
   molecular or quantum computer. The fact is that minds are just one
   kind of computational system among many, not all of which have any
   trace of mentality in them. So computation cannot be definitive of
   mind.
   
   One aspect of mind wholly omitted by the computational conception is
   the phenomenological features of experience -- the specific way a rose
   smells, for instance. This is something over and above any
   rose-related computations a machine might perform. A DNA computer has
   biochemical as well as computational properties; a conscious mind has
   phenomenological as well as computational properties. These
   phenomenological properties have a stronger claim to being distinctive
   of the mind than mere computational ones. There is thus no reductive
   explanation of the mental in terms of the computational; we cannot
   regard consciousness as nothing but a volley of physically implemented
   symbol manipulations. And this means that there is no reason at all to
   believe that building ever larger and faster computers will take us
   one jot closer to building a genuinely mental machine. The fallacy
   here is analogous to reasoning that if a human body is a device for
   taking you from A to B, and a car also does this, then a human body is
   the same thing as a car. Minds compute and so do silicon chips, but
   that is no reason to suppose that minds are nothing more than what
   they have in common with silicon chips (any more than silicon chips
   are nothing more than what they have in common with minds).
   
   If our three authors are wobbly on the philosophy of mind and
   artificial intelligence, they are strong on computer technology
   itself; and here is where their books are particularly interesting.
   The reader can simply detach all the dubious speculations about
   machine consciousness and focus on the authors' predictions about the
   future of computer and robot technology, its potential benefits and
   hazards. Consider two examples of the kind of technology that might
   well be just over the horizon: the foglets and the nanobots. Foglets
   are tiny, cell-sized robots, each more computationally powerful than
   the human brain, that are equipped with minute gripping arms that
   enable them to join together into diverse physical structures. At ease
   the foglets are just a loose swarm of suspended particles in the air,
   but when you press a button they execute a program for forming
   themselves into an object of your choosing. We may come to live in
   foglet houses whose rooms are formed from the same foggy swarm. We may
   come to have foglet friends and take foglet vacations. Our entire
   physical environment may come to consist of a 3-D mosaic of
   cooperating microscopic computers. This would be virtual reality made
   concrete.
   
   Nanobots are devices for nanoengineering, the manipulation of matter
   on the atomic scale. They are also high-power microcomputers, equipped
   with manipulative skills and an urge to perpetuate their kind. They
   can make copies of themselves by following a program for nano-scale
   operations on chunks of surrounding matter. Imagine you start with 10
   of them and that they can each make a copy of themselves in five
   seconds (they can do many millions of computations a second and their
   little mechanical limbs move, insectlike, with great rapidity). That
   means they double their numbers every five seconds, and an exponential
   nanobot population explosion is set to break out. These little
   blighters could consume the entire planet in a matter of weeks,
   including all the organic material on it! Nor would they be picked off
   by natural predators, being quite indigestible. In a very short time
   the nanobots will have razed everything in sight.
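
   To make the replication arithmetic concrete, here is a minimal sketch in
   Python. The starting count of ten and the five-second doubling interval
   come from the passage above; the function name and the sample elapsed
   times are illustrative only, and the model ignores every physical limit
   on how quickly real devices could gather raw material.

       # Unchecked doubling: population(t) = initial * 2 ** (t / doubling_time).
       # Starting count and doubling interval are taken from the text above.
       def nanobot_population(initial, doubling_seconds, elapsed_seconds):
           return initial * 2 ** (elapsed_seconds / doubling_seconds)

       for elapsed in (5, 60, 300, 600):  # 5 s, 1 min, 5 min, 10 min
           count = nanobot_population(10, 5, elapsed)
           print(f"after {elapsed:>3} seconds: about {count:.3g} nanobots")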
   
   Self-replication is perhaps the biggest hazard presented by advanced
   computer technology. Even today computers are routinely used to design
   other computers; in the next century they may be making computers that
   challenge humans in all sorts of ways. Victor Frankenstein refused to
   give his monstrous creation a bride for fear of their reproductive
   potential. Maybe we should be thinking hard now about the replicative
   powers of intelligent machines. If the 20th century was the century of
   nuclear weapons, then the 21st might be the century of self-breeding
   aliens of our own devising.
   _________________________________________________________________
   
   Colin McGinn, a professor of philosophy at Rutgers University, is the
   author of ''Ethics, Evil and Fiction'' and ''The Mysterious Flame:
   Conscious Minds in a Material World,'' to be published this spring.
   

*******************

   
     CHAPTER ONE 
     
     The Age of Spiritual Machines
     When Computers Exceed Human Intelligence
       ______________________________________________________________
     
     By RAY KURZWEIL
     Viking
     
     
     The Law Of Time And Chaos
     A (Very Brief) History of the Universe:
     Time Slowing Down
     
     The universe is made of stories, not of atoms.
     -- Muriel Rukeyser
     
     Is the universe a great mechanism, a great computation, a great
     symmetry, a great accident or a great thought?
     -- John D. Barrow
     
     As we start at the beginning, we will notice an unusual attribute
     of the nature of time, one that is critical to our passage to the
     twenty-first century. Our story begins perhaps 15 billion years
     ago. No conscious life existed to appreciate the birth of our
     Universe at the time, but we appreciate it now, so retroactively it
     did happen. (In retrospect -- from one perspective of quantum
     mechanics -- we could say that any Universe that fails to evolve
     conscious life to apprehend its existence never existed in the
     first place.)
     
      It was not until 10^-43 seconds (a tenth of a millionth of a
     trillionth of a trillionth of a trillionth of a second) after the
     birth of the Universe that the situation had cooled off
     sufficiently (to 100 million trillion trillion degrees) that a
     distinct force -- gravity -- evolved.
     
      Not much happened for another 10^-34 seconds (this is also a very
      tiny fraction of a second, but it is a billion times longer than
      10^-43 seconds), at which point an even cooler Universe (now only a
     billion billion billion degrees) allowed the emergence of matter in
     the form of electrons and quarks. To keep things balanced,
     antimatter appeared as well. It was an eventful time, as new forces
     evolved at a rapid rate. We were now up to three: gravity, the
      strong force, and the electroweak force. After another 10^-10
     seconds (a tenth of a billionth of a second), the electroweak force
     split into the electromagnetic and weak forces we know so well
     today.
     
      Things got complicated after another 10^-5 seconds (ten millionths
     of a second). With the temperature now down to a relatively balmy
     trillion degrees, the quarks came together to form protons and
     neutrons. The antiquarks did the same, forming antiprotons.
     
     Somehow, the matter particles achieved a slight edge. How this
     happened is not entirely clear. Up until then, everything had
     seemed so, well, even. But had everything stayed evenly balanced,
     it would have been a rather boring Universe. For one thing, life
     never would have evolved, and thus we could conclude that the
     Universe would never have existed in the first place.
     
     For every 10 billion antiprotons, the Universe contained 10 billion
     and 1 protons. The protons and antiprotons collided, causing the
     emergence of another important phenomenon: light (photons). Thus,
     almost all of the antimatter was destroyed, leaving matter as
     dominant. (This shows you the danger of allowing a competitor to
     achieve even a slight advantage.)
     
     Of course, had antimatter won, its descendants would have called it
     matter and would have called matter antimatter, so we would be back
     where we started (perhaps that is what happened).
     
     After another second (a second is a very long time compared to some
     of the earlier chapters in the Universe's history, so notice how
     the time frames are growing exponentially larger), the electrons
     and antielectrons (called positrons) followed the lead of the
     protons and antiprotons and similarly annihilated each other,
     leaving mostly the electrons.
     
     After another minute, the neutrons and protons began coalescing
     into heavier nuclei, such as helium, lithium, and heavy forms of
     hydrogen. The temperature was now only a billion degrees.
     
     About 300,000 years later (things are slowing down now rather
     quickly), with the average temperature now only 3,000 degrees, the
     first atoms were created as the nuclei took control of nearby
     electrons.
     
     After a billion years, these atoms formed large clouds that
     gradually swirled into galaxies.
     
     After another two billion years, the matter within the galaxies
     coalesced further into distinct stars, many with their own solar
     systems.
     
     Three billion years later, circling an unexceptional star on the
     arm of a common galaxy, an unremarkable planet we call the Earth
     was born.
     
     Now before we go any further, let's notice a striking feature of
     the passage of time. Events moved quickly at the beginning of the
     Universe's history. We had three paradigm shifts in just the first
     billionth of a second. Later on, events of cosmological
     significance took billions of years. The nature of time is that it
     inherently moves in an exponential fashion -- either geometrically
     gaining in speed, or, as in the history of our Universe,
     geometrically slowing down. Time only seems to be linear during
     those eons in which not much happens. Thus most of the time, the
     linear passage of time is a reasonable approximation of its
     passage. But that's not the inherent nature of time.
     
     Why is this significant? It's not when you're stuck in the eons in
     which not much happens. But it is of great significance when you
     find yourself in the "knee of the curve," those periods in which
     the exponential nature of the curve of time explodes either
     inwardly or outwardly. It's like falling into a black hole (in that
     case, time accelerates exponentially faster as one falls in).
     
     The Speed of Time
     
     But wait a second, how can we say that time is changing its
     "speed"? We can talk about the rate of a process, in terms of its
     progress per second, but can we say that time is changing its rate?
     Can time start moving at, say, two seconds per second?
     
     Einstein said exactly this -- time is relative to the entities
     experiencing it. One man's second can be another woman's forty
     years. Einstein gives the example of a man who travels at very
     close to the speed of light to a star -- say, twenty light-years
     away. From our Earth-bound perspective, the trip takes slightly
     more than twenty years in each direction. When the man gets back,
     his wife has aged forty years. For him, however, the trip was
     rather brief. If he travels at close enough to the speed of light,
     it may have only taken a second or less (from a practical
     perspective we would have to consider some limitations, such as the
     time to accelerate and decelerate without crushing his body). Whose
     time frame is the correct one? Einstein says they are both correct,
     and exist only relative to each other.
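
      As a minimal sketch of the formula behind Einstein's example, the
      traveler's elapsed time is the Earth-frame time multiplied by
      sqrt(1 - (v/c)^2). The twenty-light-year distance comes from the text;
      the sample speeds below are illustrative only.

          import math

          # Traveler's elapsed time for one leg of the trip, in years:
          # earth_years = distance / speed, shortened by the relativistic
          # factor sqrt(1 - (v/c)^2).
          def traveler_years(distance_ly, v_over_c):
              earth_years = distance_ly / v_over_c
              return earth_years * math.sqrt(1 - v_over_c ** 2)

          for v in (0.9, 0.99, 0.999999, 1 - 1e-15):
              print(f"v = {v:.15g} c -> {traveler_years(20, v):.6g} years one way")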
     
     Certain species of birds have a life span of only several years. If
     you observe their rapid movements, it appears that they are
     experiencing the passage of time on a different scale. We
     experience this in our own lives. A young child's rate of change
     and experience of time is different from that of an adult. Of
     particular note, we will see that the acceleration in the passage
     of time for evolution is moving in a different direction than that
     for the Universe from which it emerges.
     
     It is in the nature of exponential growth that events develop
     extremely slowly for extremely long periods of time, but as one
     glides through the knee of the curve, events erupt at an
     increasingly furious pace. And that is what we will experience as
     we enter the twenty-first century.
     
     EVOLUTION: TIME SPEEDING UP
     
     In the beginning was the word. . . . And the word became flesh.
     -- John 1:1,14
     
     A great deal of the universe does not need any explanation.
     Elephants, for instance. Once molecules have learnt to compete and
     create other molecules in their own image, elephants, and things
     resembling elephants, will in due course be found roaming through
     the countryside.
     -- Peter Atkins
     
     The further backward you look, the further forward you can see.
     -- Winston Churchill
     
     We'll come back to the knee of the curve, but let's delve further
     into the exponential nature of time. In the nineteenth century, a
     set of unifying principles called the laws of thermodynamics was
     postulated. As the name implies, they deal with the dynamic nature
     of heat and were the first major refinement of the laws of
     classical mechanics perfected by Isaac Newton a century earlier.
     Whereas Newton had described a world of clockwork perfection in
     which particles and objects of all sizes followed highly
     disciplined, predictable patterns, the laws of thermodynamics
     describe a world of chaos. Indeed, that is what heat is.
     
     Heat is the chaotic -- unpredictable -- movement of the particles
     that make up the world. A corollary of the second law of
     thermodynamics is that in a closed system (interacting entities and
     forces not subject to outside influence; for example, the
     Universe), disorder (called "entropy") increases. Thus, left to its
     own devices, a system such as the world we live in becomes
     increasingly chaotic. Many people find this describes their lives
     rather well. But in the nineteenth century, the laws of
     thermodynamics were considered a disturbing discovery. At the
     beginning of that century, it appeared that the basic principles
     governing the world were both understood and orderly. There were a
     few details left to be filled in, but the basic picture was under
     control. Thermodynamics was the first contradiction to this
     complacent picture. It would not be the last.
     
     The second law of thermodynamics, sometimes called the Law of
     Increasing Entropy, would seem to imply that the natural emergence
     of intelligence is impossible. Intelligent behavior is the opposite
     of random behavior, and any system capable of intelligent responses
     to its environment needs to be highly ordered. The chemistry of
     life, particularly of intelligent life, is comprised of
     exceptionally intricate designs. Out of the increasingly chaotic
     swirl of particles and energy in the world, extraordinary designs
     somehow emerged. How do we reconcile the emergence of intelligent
     life with the Law of Increasing Entropy?
     
     There are two answers here. First, while the Law of Increasing
     Entropy would appear to contradict the thrust of evolution, which
     is toward increasingly elaborate order, the two phenomena are not
     inherently contradictory. The order of life takes place amid great
     chaos, and the existence of life-forms does not appreciably affect
     the measure of entropy in the larger system in which life has
     evolved. An organism is not a closed system. It is part of a larger
     system we call the environment, which remains high in entropy. In
     other words, the order represented by the existence of life-forms
     is insignificant in terms of measuring overall entropy.
     
     Thus, while chaos increases in the Universe, it is possible for
     evolutionary processes that create increasingly intricate, ordered
     patterns to exist simultaneously. Evolution is a process, but it is
     not a closed system. It is subject to outside influence, and indeed
     draws upon the chaos in which it is embedded. So the Law of
     Increasing Entropy does not rule out the emergence of life and
     intelligence.
     
     For the second answer, we need to take a closer look at evolution,
     as it was the original creator of intelligence.
     
     The Exponentially Quickening Pace of Evolution
     
     As you will recall, after billions of years, the unremarkable
     planet called Earth was formed. Churned by the energy of the sun,
     the elements formed more and more complex molecules. From physics,
     chemistry was born.
     
     Two billion years later, life began. That is to say, patterns of
     matter and energy that could perpetuate themselves and survive
     perpetuated themselves and survived. That this apparent tautology
     went unnoticed until a couple of centuries ago is itself
     remarkable.
     
     Over time, the patterns became more complicated than mere chains of
     molecules. Structures of molecules performing distinct functions
     organized themselves into little societies of molecules. From
     chemistry, biology was born.
     
     Thus, about 3.4 billion years ago, the first earthly organisms
     emerged: anaerobic (not requiring oxygen) prokaryotes
     (single-celled creatures) with a rudimentary method for
     perpetuating their own designs. Early innovations that followed
     included a simple genetic system, the ability to swim, and
     photosynthesis, which set the stage for more advanced,
     oxygen-consuming organisms. The most important development for the
     next couple of billion years was the DNA-based genetics that would
     henceforth guide and record evolutionary development.
     
     A key requirement for an evolutionary process is a "written" record
     of achievement, for otherwise the process would be doomed to repeat
     finding solutions to problems already solved. For the earliest
     organisms, the record was written (embodied) in their bodies, coded
     directly into the chemistry of their primitive cellular structures.
     With the invention of DNA-based genetics, evolution had designed a
     digital computer to record its handiwork. This design permitted
     more complex experiments. The aggregations of molecules called
     cells organized themselves into societies of cells with the
     appearance of the first multicellular plants and animals about 700
     million years ago. For the next 130 million years, the basic body
     plans of modern animals were designed, including a spinal
     cord-based skeleton that provided early fish with an efficient
     swimming style.
     
     So while evolution took billions of years to design the first
     primitive cells, salient events then began occurring in hundreds of
     millions of years, a distinct quickening of the pace. When some
     calamity finished off the dinosaurs 65 million years ago, mammals
     inherited the Earth (although the insects might disagree). With the
     emergence of the primates, progress was then measured in mere tens
     of millions of years. Humanoids emerged 15 million years ago,
     distinguished by walking on their hind legs, and now we're down to
     millions of years.
     
     With larger brains, particularly in the area of the highly
     convoluted cortex responsible for rational thought, our own
     species, Homo sapiens, emerged perhaps 500,000 years ago. Homo
     sapiens are not very different from other advanced primates in
     terms of their genetic heritage. Their DNA is 98.6 percent the same
     as the lowland gorilla, and 97.8 percent the same as the orangutan.
     The story of evolution since that time now focuses in on a
     human-sponsored variant of evolution: technology.
     
     TECHNOLOGY: EVOLUTION BY OTHER MEANS
     
     When a scientist states that something is possible, he is almost
     certainly right. When he states that something is impossible, he is
     very probably wrong. The only way of discovering the limits of the
     possible is to venture a little way past them into the impossible.
     Any sufficiently advanced technology is indistinguishable from
     magic.
     -- Arthur C. Clarke's three laws of technology
     
     A machine is as distinctively and brilliantly and expressively
     human as a violin sonata or a theorem in Euclid.
     -- Gregory Vlastos
     
     Technology picks right up with the exponentially quickening pace of
     evolution. Although not the only tool-using animal, Homo sapiens
     are distinguished by their creation of technology. Technology goes
     beyond the mere fashioning and use of tools. It involves a record
     of tool making and a progression in the sophistication of tools. It
     requires invention and is itself a continuation of evolution by
     other means. The "genetic code" of the evolutionary process of
     technology is the record maintained by the tool-making species.
     Just as the genetic code of the early life-forms was simply the
     chemical composition of the organisms themselves, the written
     record of early tools consisted of the tools themselves. Later on,
     the "genes" of technological evolution evolved into records using
     written language and are now often stored in computer databases.
     Ultimately, the technology itself will create new technology. But
     we are getting ahead of ourselves.
     
     Our story is now marked in tens of thousands of years. There were
     multiple subspecies of Homo sapiens. Homo sapiens neanderthalensis
     emerged about 100,000 years ago in Europe and the Middle East and
     then disappeared mysteriously about 35,000 to 40,000 years ago.
     Despite their brutish image, Neanderthals cultivated an involved
     culture that included elaborate funeral rituals -- burying their
     dead with ornaments, including flowers. We're not entirely sure
     what happened to our Homo sapiens cousins, but they apparently got
     into conflict with our own immediate ancestors Homo sapiens
     sapiens, who emerged about 90,000 years ago. Several species and
     subspecies of humanoids initiated the creation of technology. The
     most clever and aggressive of these subspecies was the only one to
     survive. This established a pattern that would repeat itself
     throughout human history, in that the technologically more advanced
     group ends up becoming dominant. This trend may not bode well as
     intelligent machines themselves surpass us in intelligence and
     technological sophistication in the twenty-first century.
     
     Our Homo sapiens sapiens subspecies was thus left alone among
     humanoids about 40,000 years ago.
     
     Our forebears had already inherited from earlier hominid species
     and subspecies such innovations as the recording of events on cave
     walls, pictorial art, music, dance, religion, advanced language,
     fire, and weapons. For tens of thousands of years, humans had
     created tools by sharpening one side of a stone. It took our
     species tens of thousands of years to figure out that by sharpening
     both sides, the resultant sharp edge provided a far more useful
     tool. One significant point, however, is that these innovations did
     occur, and they endured. No other tool-using animal on Earth has
     demonstrated the ability to create and retain innovations in their
     use of tools.
     
     The other significant point is that technology, like the evolution
     of life-forms that spawned it, is inherently an accelerating
     process. The foundations of technology -- such as creating a sharp
     edge from a stone -- took eons to perfect, although for
     human-created technology, eons means thousands of years rather than
     the billions of years that the evolution of life-forms required to
     get started.
     
     Like the evolution of life-forms, the pace of technology has
     greatly accelerated over time. The progress of technology in the
     nineteenth century, for example, greatly exceeded that of earlier
     centuries, with the building of canals and great ships, the advent
     of paved roads, the spread of the railroad, the development of the
     telegraph, and the invention of photography, the bicycle, sewing
     machine, typewriter, telephone, phonograph, motion picture,
     automobile, and of course Thomas Edison's light bulb. The continued
     exponential growth of technology in the first two decades of the
     twentieth century matched that of the entire nineteenth century.
     Today, we have major transformations in just a few years' time. As
     one of many examples, the latest revolution in communications --
     the World Wide Web -- didn't exist just a few years ago.
     
     WHAT IS TECHNOLOGY?
     
     As technology is the continuation of evolution by other means, it
     shares the phenomenon of an exponentially quickening pace. The word
      is derived from the Greek tekhnē, which means "craft" or "art,"
     and logia, which means "the study of." Thus one interpretation of
     technology is the study of crafting, in which crafting refers to
     the shaping of resources for a practical purpose. I use the term
     resources rather than materials because technology extends to the
     shaping of nonmaterial resources such as information.
     
     Technology is often defined as the creation of tools to gain
     control over the environment. However, this definition is not
     entirely sufficient. Humans are not alone in their use or even
     creation of tools. Orangutans in Sumatra's Suaq Balimbing swamp
     make tools out of long sticks to break open termite nests. Crows
     fashion tools from sticks and leaves. The leaf-cutter ant mixes dry
     leaves with its saliva to create a paste. Crocodiles use tree roots
     to anchor dead prey.
     
     What is uniquely human is the application of knowledge -- recorded
     knowledge -- to the fashioning of tools. The knowledge base
     represents the genetic code for the evolving technology. And as
     technology has evolved, the means for recording this knowledge base
     has also evolved, from the oral traditions of antiquity to the
     written design logs of nineteenth-century craftsmen to the
     computer-assisted design databases of the 1990s.
     
     Technology also implies a transcendence of the materials used to
     comprise it. When the elements of an invention are assembled in
     just the right way, they produce an enchanting effect that goes
     beyond the mere parts. When Alexander Graham Bell accidentally
     wire-connected two moving drums and solenoids (metal cores wrapped
     in wire) in 1875, the result transcended the materials he was
     working with. For the first time, a human voice was transported,
     magically it seemed, to a remote location. Most assemblages are
     just that: random assemblies. But when materials -- and in the case
     of modern technology, information -- are assembled in just the
     right way, transcendence occurs. The assembled object becomes far
     greater than the sum of its parts.
     
     The same phenomenon of transcendence occurs in art, which may
     properly be regarded as another form of human technology. When
     wood, varnishes, and strings are assembled in just the right way,
     the result is wondrous: a violin, a piano. When such a device is
     manipulated in just the right way, there is magic of another sort:
     music. Music goes beyond mere sound. It evokes a response --
     cognitive, emotional, perhaps spiritual -- in the listener, another
     form of transcendence. All of the arts share the same goal: of
     communicating from artist to audience. The communication is not of
     unadorned data, but of the more important items in the
     phenomenological garden: feelings, ideas, experiences, longings.
      The Greek meaning of tekhnē logia includes art as a key
     manifestation of technology.
     
     Language is another form of human-created technology. One of the
     primary applications of technology is communication, and language
     provides the foundation for Homo sapiens communication.
     Communication is a critical survival skill. It enabled human
     families and tribes to develop cooperative strategies to overcome
     obstacles and adversaries. Other animals communicate. Monkeys and
     apes use elaborate gestures and grunts to communicate a variety of
     messages. Bees perform intricate dances in a figure-eight pattern
     to communicate where caches of nectar may be found. Female tree
     frogs in Malaysia do tap dances to signal their availability. Crabs
     wave their claws in one way to warn adversaries but use a different
     rhythm for courtship. But these methods do not appear to evolve,
     other than through the usual DNA-based evolution. These species
     lack a way to record their means of communication, so the methods
     remain static from one generation to the next. In contrast, human
     language does evolve, as do all forms of technology. Along with the
     evolving forms of language itself, technology has provided
     ever-improving means for recording and distributing human language.
     
     Homo sapiens are unique in their use and fostering of all forms of
     what I regard as technology: art, language, and machines, all
     representing evolution by other means. In the 1960s through 1990s,
     several well-publicized primates were said to have mastered at
     least childlike language skills. Chimpanzees Lana and Kanzi pressed
     sequences of buttons with symbols on them. Gorillas Washoe and Koko
     were said to be using American Sign Language. Many linguists are
     skeptical, noting that many primate "sentences" were jumbles, such
     as "Nim eat, Nim eat, drink eat me Nim, me gum me gum, tickle me,
      Nim play, you me banana me banana you." Even if we view this
      phenomenon more generously, it would be the exception that proves
     the rule. These primates did not evolve the languages they are
     credited with using, they do not appear to develop these skills
     spontaneously, and their use of these skills is very limited. They
     are at best participating peripherally in what is still a uniquely
     human invention -- communicating using the recursive
     (self-referencing), symbolic, evolving means called language.
     
     The Inevitability of Technology
     
     Once life takes hold on a planet, we can consider the emergence of
     technology as inevitable. The ability to expand the reach of one's
     physical capabilities, not to mention mental facilities, through
     technology is clearly useful for survival. Technology has enabled
     our subspecies to dominate its ecological niche. Technology
     requires two attributes of its creator: intelligence and the
     physical ability to manipulate the environment. We'll talk more in
     chapter 4, "A New Form of Intelligence on Earth," about the nature
     of intelligence, but it clearly represents an ability to use
     limited resources optimally, including time. This ability is
     inherently useful for survival, so it is favored. The ability to
     manipulate the environment is also useful; otherwise an organism is
     at the mercy of its environment for safety, food, and the
     satisfaction of its other needs. Sooner or later, an organism is
     bound to emerge with both attributes.
     
     THE INEVITABILITY OF COMPUTATION
     
     It is not a bad definition of man to describe him as a tool-making
     animal. His earliest contrivances to support uncivilized life were
     tools of the simplest and rudest construction. His latest
     achievements in the substitution of machinery, not merely for the
     skill of the human hand, but for the relief of the human intellect,
     are founded on the use of tools of a still higher order.
     -- Charles Babbage
     
     All of the fundamental processes we have examined -- the
     development of the Universe, the evolution of life-forms, the
      subsequent evolution of technology -- have progressed in an
     exponential fashion, some slowing down, some speeding up. What is
     the common thread here? Why did cosmology exponentially slow down
     while evolution accelerated? The answers are surprising, and
     fundamental to understanding the twenty-first century.
     
     But before I attempt to answer these questions, let's examine one
     other very relevant example of acceleration: the exponential growth
     of computation.
     
     Early in the evolution of life-forms, specialized organs developed
     the ability to maintain internal states and respond differentially
     to external stimuli. The trend ever since has been toward more
     complex and capable nervous systems with the ability to store
     extensive memories; recognize patterns in visual, auditory, and
     tactile stimuli; and engage in increasingly sophisticated levels of
      reasoning. The ability to remember and to solve problems --
     computation -- has constituted the cutting edge in the evolution of
     multicellular organisms.
     
     The same value of computation holds true in the evolution of
     human-created technology. Products are more useful if they can
     maintain internal states and respond differentially to varying
     conditions and situations. As machines moved beyond mere implements
     to extend human reach and strength, they also began to accumulate
     the ability to remember and perform logical manipulations. The
     simple cams, gears, and levers of the Middle Ages were assembled
     into the elaborate automata of the European Renaissance. Mechanical
     calculators, which first emerged in the seventeenth century, became
     increasingly complex, culminating in the first automated U.S.
     census in 1890. Computers played a crucial role in at least one
     theater of the Second World War, and have developed in an
     accelerating spiral ever since.
     
     THE LIFE CYCLE OF A TECHNOLOGY
     
     Technologies fight for survival, evolve, and undergo their own
     characteristic life cycle. We can identify seven distinct stages.
     During the precursor stage, the prerequisites of a technology
     exist, and dreamers may contemplate these elements coming together.
     We do not, however, regard dreaming to be the same as inventing,
     even if the dreams are written down. Leonardo da Vinci drew
     convincing pictures of airplanes and automobiles, but he is not
     considered to have invented either.
     
     The next stage, one highly celebrated in our culture, is invention,
     a very brief stage, not dissimilar in some respects to the process
     of birth after an extended period of labor. Here the inventor
     blends curiosity, scientific skills, determination, and usually a
     measure of showmanship to combine methods in a new way to bring a
     new technology to life.
     
     The next stage is development, during which the invention is
     protected and supported by doting guardians (which may include the
     original inventor). Often this stage is more crucial than invention
     and may involve additional creation that can have greater
     significance than the original invention. Many tinkerers had
     constructed finely hand-tuned horseless carriages, but it was Henry
     Ford's innovation of mass production that enabled the automobile to
     take root and flourish.
     
     The fourth stage is maturity. Although continuing to evolve, the
     technology now has a life of its own and has become an independent
     and established part of the community. It may become so interwoven
     in the fabric of life that it appears to many observers that it
     will last forever. This creates an interesting drama when the next
     stage arrives, which I call the stage of the pretenders. Here an
     upstart threatens to eclipse the older technology. Its enthusiasts
     prematurely predict victory. While providing some distinct
     benefits, the newer technology is found on reflection to be missing
     some key element of functionality or quality. When it indeed fails
     to dislodge the established order, the technology conservatives
     take this as evidence that the original approach will indeed live
     forever.
     
     This is usually a short-lived victory for the aging technology.
     Shortly thereafter, another new technology typically does succeed
     in rendering the original technology into the stage of
     obsolescence. In this part of the life cycle, the technology lives
     out its senior years in gradual decline, its original purpose and
     functionality now subsumed by a more spry competitor. This stage,
     which may comprise 5 to 10 percent of the life cycle, finally
     yields to antiquity (examples today: the horse and buggy, the
     harpsichord, the manual typewriter, and the electromechanical
     calculator).
     
     To illustrate this, consider the phonograph record. In the
     mid-nineteenth century, there were several precursors, including
     Édouard-Léon Scott de Martinville's phonautograph, a device that
     recorded sound vibrations as a printed pattern. It was Thomas
     Edison, however, who in 1877 brought all of the elements together
     and invented the first device that could record and reproduce
     sound. Further refinements were necessary for the phonograph to
     become commercially viable. It became a fully mature technology in
     1948 when Columbia introduced the 33 revolutions-per-minute (rpm)
     long-playing record (LP) and RCA Victor introduced the 45-rpm small
     disc. The pretender was the cassette tape, introduced in the 1960s
     and popularized during the 1970s. Early enthusiasts predicted that
     its small size and ability to be rerecorded would make the
     relatively bulky and scratchable record obsolete.
     
     Despite these obvious benefits, cassettes lack random access (the
     ability to play selections in a desired order) and are prone to
     their own forms of distortion and lack of fidelity. In the late
     1980s and early 1990s, the digital compact disc (CD) did deliver
     the mortal blow. With the CD providing both random access and a
     level of quality close to the limits of the human auditory system,
     the phonograph record entered the stage of obsolescence in the
     first half of the 1990s. Although still produced in small
     quantities, the technology that Edison gave birth to more than a
     century ago is now approaching antiquity.
     
     Another example is the print book, a rather mature technology
     today. It is now in the stage of the pretenders, with the
     software-based "virtual" book as the pretender. Lacking the
     resolution, contrast, lack of flicker, and other visual qualities
     of paper and ink, the current generation of virtual book does not
     have the capability of displacing paper-based publications. Yet
     this victory of the paper-based book will be short-lived as future
     generations of computer displays succeed in providing a fully
     satisfactory alternative to paper.
     
     The Emergence of Moore's Law
     
     Gordon Moore, an inventor of the integrated circuit and then
     chairman of Intel, noted in 1965 that the surface area of a
     transistor (as etched on an integrated circuit) was being reduced
     by approximately 50 percent every twelve months. In 1975, he was
     widely reported to have revised this observation to eighteen
     months. Moore claims that his 1975 update was to twenty-four
     months, and that does appear to be a better fit to the data.
     
    MOORE'S LAW AT WORK

    Year    Transistors in Intel's Latest Computer Chip*
    1972            3,500
    1974            6,000
    1978           29,000
    1982          134,000
    1985          275,000
    1989        1,200,000
    1993        3,100,000
    1995        5,500,000
    1997        7,500,000

    *Consumer Electronics Manufacturers Association
   
     The result is that every two years, you can pack twice as many
     transistors on an integrated circuit. This doubles both the number
     of components on a chip as well as its speed. Since the cost of an
     integrated circuit is fairly constant, the implication is that
     every two years you can get twice as much circuitry running at
     twice the speed for the same price. For many applications, that's
     an effective quadrupling of the value. The observation holds true
     for every type of circuit, from memory chips to computer
     processors.
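
      As a rough check of the two-year figure, a straight line can be fitted
      to the base-2 logarithm of the transistor counts in the table above.
      This is an illustrative calculation, not the author's own, and it
      assumes Python with NumPy available.

          import numpy as np

          # Year / transistor-count pairs from the "Moore's Law at Work" table.
          years = np.array([1972, 1974, 1978, 1982, 1985, 1989, 1993, 1995, 1997])
          transistors = np.array([3500, 6000, 29000, 134000, 275000,
                                  1200000, 3100000, 5500000, 7500000])

          # Slope of log2(transistors) versus year gives doublings per year.
          slope, _ = np.polyfit(years, np.log2(transistors), 1)
          print(f"estimated doubling time: {1 / slope:.2f} years")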
     
     This insightful observation has become known as Moore's Law on
     Integrated Circuits, and the remarkable phenomenon of the law has
     been driving the acceleration of computing for the past forty
     years. But how much longer can this go on? The chip companies have
     expressed confidence in another fifteen to twenty years of Moore's
     Law by continuing their practice of using increasingly higher
     resolutions of optical lithography (an electronic process similar
     to photographic printing) to reduce the feature size -- measured
     today in millionths of a meter -- of transistors and other key
     components. But then -- after almost sixty years -- this paradigm
     will break down. The transistor insulators will then be just a few
     atoms thick, and the conventional approach of shrinking them won't
     work.
     
     What then?
     
     We first note that the exponential growth of computing did not
     start with Moore's Law on Integrated Circuits. In the accompanying
     figure, "The Exponential Growth of Computing, 1900-1998," I plotted
     forty-nine notable computing machines spanning the twentieth
     century on an exponential chart, in which the vertical axis
     represents powers of ten in computer speed per unit cost (as
     measured in the number of "calculations per second" that can be
     purchased for $1,000). Each point on the graph represents one of
     the machines. The first five machines used mechanical technology,
     followed by three electromechanical (relay based) computers,
     followed by eleven vacuum-tube machines, followed by twelve
     machines using discrete transistors. Only the last eighteen
     computers used integrated circuits.
     
      I then fit a fourth-order polynomial to the points, a curve that
      allows for up to three bends. In other words, I did not try to
     fit a straight line to the points, just the closest fourth-order
     curve. Yet a straight line is close to what I got. A straight line
     on an exponential graph means exponential growth. A careful
     examination of the trend shows that the curve is actually bending
     slightly upward, indicating a small exponential growth in the rate
     of exponential growth. This may result from the interaction of two
     different exponential trends, as I will discuss in chapter 6,
     "Building New Brains." Or there may indeed be two levels of
     exponential growth. Yet even if we take the more conservative view
     that there is just one level of acceleration, we can see that the
     exponential growth of computing did not start with Moore's Law on
     Integrated Circuits, but dates back to the advent of electrical
     computing at the beginning of the twentieth century.
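
      The fitting procedure itself is easy to reproduce. The Python
      sketch below uses six hypothetical placeholder points, not the
      forty-nine machines plotted in the figure; it only illustrates
      the method: take the logarithm of speed per constant dollar, fit
      a fourth-order polynomial, and check whether the best fit is
      nearly straight and whether it bends upward.

        import numpy as np

        # Hypothetical placeholder data; the book's forty-nine actual
        # data points are not reproduced here.
        years = np.array([1900.0, 1920, 1940, 1960, 1980, 1998])
        cps_per_1000 = np.array([1e-5, 1e-3, 1e-1, 1e2, 1e5, 1e8])

        x = years - years.mean()      # center years for numerical stability
        y = np.log10(cps_per_1000)    # on a log scale, exponential growth
                                      # plots as a straight line

        coeffs = np.polyfit(x, y, deg=4)   # fourth-order polynomial fit
        fitted = np.polyval(coeffs, x)
        print("fitted log10(calcs/sec per $1,000):", np.round(fitted, 2))

        # Positive curvature along the fit would indicate growth in the
        # rate of exponential growth itself.
        curvature = np.polyval(np.polyder(coeffs, 2), x)
        print("curvature of the fit:", np.round(curvature, 5))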
     
     Mechanical Computing Devices
    1. 1900  Analytical Engine
    2. 1908  Hollerith Tabulator
    3. 1911  Monroe Calculator
    4. 1919  IBM Tabulator
    5. 1928  National Ellis 3000
       
     Electromechanical (Relay Based) Computers
    6. 1939  Zuse 2
    7. 1940  Bell Calculator Model 1
    8. 1941  Zuse 3
       
     Vacuum-Tube Computers
    9. 1943  Colossus
   10. 1946  ENIAC
   11. 1948  IBM SSEC
   12. 1949  BINAC
   13. 1949  EDSAC
   14. 1951  Univac I
   15. 1953  Univac 1103
   16. 1953  IBM 701
   17. 1954  EDVAC
   18. 1955  Whirlwind
   19. 1955  IBM 704
       
     Discrete Transistor Computers
   20. 1958  Datamatic 1000
   21. 1958  Univac II
   22. 1959  Mobidic
   23. 1959  IBM 7090
   24. 1960  IBM 1620
   25. 1960  DEC PDP-1
   26. 1961  DEC PDP-4
   27. 1962  Univac III
   28. 1964  CDC 6600
   29. 1965  IBM 1130
   30. 1965  DEC PDP-8
   31. 1966  IBM 360 Model 75
       
     Integrated Circuit Computers
   32. 1968  DEC PDP-10
   33. 1973  Intellec-8
   34. 1973  Data General Nova
   35. 1975  Altair 8800
   36. 1976  DEC PDP-11 Model 70
   37. 1977  Cray 1
   38. 1977  Apple II
   39. 1979  DEC VAX 11 Model 780
   40. 1980  Sun-1
   41. 1982  IBM PC
   42. 1982  Compaq Portable
   43. 1983  IBM AT-80286
   44. 1984  Apple Macintosh
   45. 1986  Compaq Deskpro 386
   46. 1987  Apple Mac II
   47. 1993  Pentium PC
   48. 1996  Pentium PC
   49. 1998  Pentium II PC
       
      In the 1980s, a number of observers, including Carnegie Mellon
      University professor Hans Moravec, Nippon Electric Company's David
      Waltz, and me, noticed that computers had been growing
      exponentially in power long before the invention of the integrated
      circuit in 1958 or even the transistor in 1947. The speed and
      density of computation have been doubling steadily, with the
      doubling period shrinking from three years at the beginning of the
      twentieth century to one year at its end, regardless of the type
      of hardware used.
     Remarkably, this "Exponential Law of Computing" has held true for
     at least a century, from the mechanical card-based electrical
     computing technology used in the 1890 U.S. census, to the
     relay-based computers that cracked the Nazi Enigma code, to the
     vacuum-tube-based computers of the 1950s, to the transistor-based
     machines of the 1960s, and to all of the generations of integrated
     circuits of the past four decades. Computers are about one hundred
     million times more powerful for the same unit cost than they were a
     half century ago. If the automobile industry had made as much
     progress in the past fifty years, a car today would cost a
     hundredth of a cent and go faster than the speed of light.
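
      A quick way to see that these figures hang together: a
      hundred-million-fold improvement over roughly fifty years works
      out to one doubling about every 1.9 years, which falls between
      the three-year doubling period of 1900 and the one-year period of
      the late 1990s. A short, purely illustrative check in Python:

        from math import log2

        # How fast a doubling period does a 100,000,000-fold improvement
        # over fifty years imply?
        improvement, years = 1e8, 50
        doublings = log2(improvement)
        print(f"{doublings:.1f} doublings: one every "
              f"{years / doublings:.2f} years")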
     
     As with any phenomenon of exponential growth, the increases are so
     slow at first as to be virtually unnoticeable. Despite many decades
     of progress since the first electrical calculating equipment was
     used in the 1890 census, it was not until the mid-1960s that this
     phenomenon was even noticed (although Alan Turing had an inkling of
     it in 1950). Even then, it was appreciated only by a small
     community of computer engineers and scientists. Today, you have
     only to scan the personal computer ads -- or the toy ads -- in your
     local newspaper to see the dramatic improvements in the price
     performance of computation that now arrive on a monthly basis.
     
     So Moore's Law on Integrated Circuits was not the first, but the
     fifth paradigm to continue the now one-century-long exponential
     growth of computing. Each new paradigm came along just when needed.
     This suggests that exponential growth won't stop with the end of
     Moore's Law. But the answer to our question on the continuation of
     the exponential growth of computing is critical to our
     understanding of the twenty-first century. So to gain a deeper
     understanding of the true nature of this trend, we need to go back
     to our earlier questions on the exponential nature of time.
     
     THE LAW OF TIME AND CHAOS
     
     Is the flow of time something real, or might our sense of time
     passing be just an illusion that hides the fact that what is real
     is only a vast collection of moments?
     -- Lee Smolin
     
     Time is nature's way of preventing everything from happening at
     once.
     -- Graffito
     
     Things are more like they are now than they ever were before.
     -- Dwight Eisenhower
     
     Consider these diverse exponential trends:
     * The exponentially slowing pace that the Universe followed, with
       three epochs in the first billionth of a second, with later
       salient events taking billions of years.
     * The exponentially slowing pace in the development of an organism.
       In the first month after conception, we grow a body, a head, even
       a tail. We grow a brain in the first couple of months. After
       leaving our maternal confines, our maturation both physically and
       mentally is rapid at first. In the first year, we learn basic
       forms of mobility and communication. We experience milestones
       every month or so. Later on, key events march ever more slowly,
       taking years and then decades.
     * The exponentially quickening pace of the evolution of life-forms
       on Earth.
     * The exponentially quickening pace of the evolution of
       human-created technology, which picked up the pace from the
       evolution of life-forms.
     * The exponential growth of computing. Note that exponential growth
       of a process over time is just another way of expressing an
       exponentially quickening pace. For example, it took about ninety
       years to achieve the first MIP (Million Instructions per Second)
       for a thousand dollars. Now we add an additional MIP per thousand
       dollars every day. The overall innovation rate is clearly
       accelerating as well.
     * Moore's Law on Integrated Circuits. As I noted, this was the fifth
       paradigm to achieve the exponential growth of computing.
       
     Many questions come to mind:
     
     What is the common thread between these varied exponential trends?
     
     Why do some of these processes speed up while others slow down?
     
     And what does this tell us about the continuation of the
     exponential growth of computing when Moore's Law dies?
     
     Is Moore's Law just a set of industry expectations and goals, as
     Randy Isaac, head of basic science at IBM, contends? Or is it part
     of a deeper phenomenon that goes far beyond the photolithography of
     integrated circuits?
     
      After I had thought about the relationship between these
      apparently diverse trends for several years, the surprising common
      theme became apparent to me.
     
     What determines whether time speeds up or slows down? The
     consistent answer is that time moves in relation to the amount of
     chaos. We can state the Law of Time and Chaos as follows:
     
     The Law of Time and Chaos: In a process, the time interval between
     salient events (that is, events that change the nature of the
     process, or significantly affect the future of the process) expands
     or contracts along with the amount of chaos.
     
     When there is a lot of chaos in a process, it takes more time for
     significant events to occur. Conversely, as order increases, the
     time periods between salient events decrease.
     
     We have to be careful here in our definition of chaos. It refers to
     the quantity of disordered (that is, random) events that are
     relevant to the process. If we're dealing with the random movement
     of atoms and molecules in a gas or liquid, then heat is an
     appropriate measure. If we're dealing with the process of evolution
     of life-forms, then chaos represents the unpredictable events
     encountered by organisms, and the random mutations that are
     introduced in the genetic code.
     
     Let's see how the Law of Time and Chaos applies to our examples. If
     chaos is increasing, the Law of Time and Chaos implies the
     following sublaw:
     
     The Law of Increasing Chaos: As chaos exponentially increases, time
     exponentially slows down (that is, the time interval between
     salient events grows longer as time passes).
     
     This fits the Universe rather well. When the entire Universe was
     just a "naked" singularity -- a perfectly orderly single point in
     space and time -- there was no chaos and conspicuous events took
     almost no time at all. As the Universe grew in size, chaos
     increased exponentially, and so did the timescale for epochal
      changes. Now, with billions of galaxies sprawled out over billions
      of light-years of space, the Universe contains vast reaches of
     chaos, and indeed requires billions of years to get everything
     organized for a paradigm shift to take place.
     
     We see a similar phenomenon in the progression of an organism's
     life. We start out as a single fertilized cell, so there's only
      rather limited chaos there. By the time we end up with trillions
      of cells, chaos has greatly expanded. Finally, at the end of our
      lives, our designs deteriorate, engendering even greater
      randomness. So the
     time period between salient biological events grows longer as we
     grow older. And that is indeed what we experience.
     
     But it is the opposite spiral of the Law of Time and Chaos that is
     the most important and relevant for our purposes. Consider the
     inverse sublaw, which I call the Law of Accelerating Returns:
     
     The Law of Accelerating Returns: As order exponentially increases,
     time exponentially speeds up (that is, the time interval between
     salient events grows shorter as time passes).
     
     The Law of Accelerating Returns (to distinguish it from a
     better-known law in which returns diminish) applies specifically to
     evolutionary processes. In an evolutionary process, it is order --
     the opposite of chaos -- that is increasing. And, as we have seen,
     time speeds up.
     
     Disdisorder
     
     I noted above that the concept of chaos in the Law of Time and
     Chaos is tricky. Chaos alone is not sufficient -- disorder for our
     purposes requires randomness that is relevant to the process we are
     concerned with. The opposite of disorder -- which I called "order"
     in the above Law of Accelerating Returns -- is even trickier.
     
     Let's start with our definition of disorder and work backward. If
     disorder represents a random sequence of events, then the opposite
     of disorder should imply "not random." And if random means
     unpredictable, then we might conclude that order means predictable.
     But that would be wrong.
     
     Borrowing a page from information theory, consider the difference
     between information and noise. Information is a sequence of data
     that is meaningful in a process, such as the DNA code of an
     organism, or the bits in a computer program. Noise, on the other
     hand, is a random sequence. Neither noise nor information is
     predictable. Noise is inherently unpredictable, but carries no
     information. Information, however, is also unpredictable. If we can
     predict future data from past data, then that future data stops
     being information. For example, consider a sequence which simply
     alternates between zero and one (01010101 . . .). Such a sequence
     is certainly orderly, and very predictable. Specifically because it
     is so predictable, we do not consider it information bearing,
     beyond the first couple of bits.
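
      A crude way to make this concrete: a compression program strips
      out whatever is predictable in a sequence, so the compressed size
      is a rough proxy for how much of the sequence is not redundant.
      The illustrative Python sketch below shows the alternating
      sequence collapsing to almost nothing while random noise barely
      shrinks; note that compression alone cannot tell meaningful
      information from noise, since both are unpredictable.

        import os
        import zlib

        # Compressed size as a crude proxy for non-redundant content.
        predictable = b"01" * 5000     # 010101...: orderly, predictable
        noise = os.urandom(10_000)     # random bytes: unpredictable

        for name, data in (("alternating 0101...", predictable),
                           ("random noise", noise)):
            ratio = len(zlib.compress(data)) / len(data)
            print(f"{name:20s} -> {ratio:.1%} of original size")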
     
     Thus orderliness does not constitute order because order requires
     information. So, perhaps I should use the word information instead
     of order. However, information alone is not sufficient for our
     purposes either. Consider a phone book. It certainly represents a
     lot of information, and some order as well. Yet if we double the
     size of the phone book, we have increased the amount of data, but
     we have not achieved a deeper level of order.
     
     Order, then, is information that fits a purpose. The measure of
     order is the measure of how well the information fits the purpose.
     In the evolution of life-forms, the purpose is to survive. In an
     evolutionary algorithm (a computer program that simulates evolution
     to solve a problem) applied to, say, investing in the stock market,
     the purpose is to make money. Simply having more information does
     not necessarily result in a better fit. A superior solution for a
     purpose may very well involve less data.
     
     The concept of "complexity" has been used recently to describe the
     nature of the information created by an evolutionary process.
     Complexity is a reasonably close fit to the concept of order that I
     am describing. After all, the designs created by the evolution of
     life-forms on Earth appear to have become more complex over time.
     However, complexity is not a perfect fit, either. Sometimes, a
     deeper order -- a better fit to a purpose -- is achieved through
     simplification rather than further increases in complexity. As
     Einstein said, "Everything should be made as simple as possible,
     but no simpler." For example, a new theory that ties together
     apparently disparate ideas into one broader, more coherent theory
     reduces complexity but nonetheless may increase the "order for a
     purpose" that I am describing. Evolution has shown, however, that
     the general trend toward greater order does generally result in
     greater complexity.
     
     Thus improving a solution to a problem -- which may increase or
     decrease complexity -- increases order. Now that just leaves the
     issue of defining the problem. And as we will see, defining a
     problem well is often the key to finding its solution.
     
     The Law of Increasing Entropy Versus the Growth of Order
     
     Another consideration is how the Law of Time and Chaos relates to
     the second law of thermodynamics. Unlike the second law, the Law of
     Time and Chaos is not necessarily concerned with a closed system.
     It deals instead with a process. The Universe is a closed system
     (not subject to outside influence, since there is nothing outside
     the Universe), so in accordance with the second law of
     thermodynamics, chaos increases and time slows down. In contrast,
     evolution is precisely not a closed system. It takes place amid
     great chaos, and indeed depends on the disorder in its midst, from
     which it draws its options for diversity. And from these options,
     an evolutionary process continually prunes its choices to create
     ever greater order. Even a crisis that appears to introduce a
     significant new source of chaos is likely to end up increasing --
     deepening -- the order created by an evolutionary process. For
     example, consider the asteroid that is thought to have killed off
     big organisms such as the dinosaurs 65 million years ago. The crash
     of that asteroid suddenly created a vast increase in chaos (and
     lots of dust, too). Yet it appears to have hastened the rise of
     mammals in the niche previously dominated by large reptiles and
     ultimately led to the emergence of a technology-creating species.
     When the dust settled (literally), the crisis of the asteroid had
     increased order.
     
     As I pointed out earlier, only a tiny fraction of the stuff in the
     Universe, or even on a life- and technology-bearing planet such as
     Earth, can be considered to be part of evolution's inventions. Thus
     evolution does not contradict the Law of Increasing Entropy.
     Indeed, it depends on it to provide a never-ending supply of
     options.
     
     As I noted, given the emergence of life, the emergence of a
     technology-creating species -- and of technology -- is inevitable.
     Technology is the continuation of evolution by other means, and is
     itself an evolutionary process. So it, too, speeds up.
     
     A primary reason that evolution -- of life-forms or of technology
     -- speeds up is that it builds on its own increasing order.
     Innovations created by evolution encourage and enable faster
     evolution. In the case of the evolution of life-forms, the most
     notable example is DNA, which provides a recorded and protected
     transcription of life's design from which to launch further
     experiments.
     
     In the case of the evolution of technology, ever improving human
     methods of recording information have fostered further technology.
     The first computers were designed on paper and assembled by hand.
     Today, they are designed on computer workstations with the
     computers themselves working out many details of the next
     generation's design, and are then produced in fully automated
     factories with human guidance but limited direct intervention.
     
     The evolutionary process of technology seeks to improve
     capabilities in an exponential fashion. Innovators seek to improve
     things by multiples. Innovation is multiplicative, not additive.
     Technology, like any evolutionary process, builds on itself. This
     aspect will continue to accelerate when the technology itself takes
     full control of its own progression.
     
     We can thus conclude the following with regard to the evolution of
     life-forms, and of technology:
     
     The Law of Accelerating Returns as Applied to an Evolutionary
     Process: 
     * An evolutionary process is not a closed system; therefore,
       evolution draws upon the chaos in the larger system in which it
       takes place for its options for diversity; and
     * Evolution builds on its own increasing order.
       
     Therefore:
     * In an evolutionary process, order increases exponentially.
       
     Therefore:
     * Time exponentially speeds up.
       
     Therefore:
     * The returns (that is, the valuable products of the process)
       accelerate.
       
     The phenomenon of time slowing down and speeding up is occurring
     simultaneously. Cosmologically speaking, the Universe continues to
     slow down. Evolution, now most noticeably in the form of
     human-created technology, continues to speed up. These are the two
     sides -- two interleaved spirals -- of the Law of Time and Chaos.
     
     The spiral we are most interested in -- the Law of Accelerating
     Returns -- gives us ever greater order in technology, which
     inevitably leads to the emergence of computation. Computation is
     the essence of order. It provides the ability for a technology to
     respond in a variable and appropriate manner to its environment to
     carry out its mission. Thus computational technology is also an
     evolutionary process, and also builds on its own progress. The time
     to accomplish a fixed objective gets exponentially shorter over
     time (for example, ninety years for the first MIP per thousand
     dollars versus one day for an additional MIP today). That the power
     of computing grows exponentially over time is just another way to
     say the same thing.
     
     So Where Does That Leave Moore's Law?
     
     Well, it still leaves it dead by the year 2020. Moore's Law came
     along in 1958 just when it was needed and will have done its sixty
     years of service by 2018, a rather long period of time for a
     paradigm nowadays. Unlike Moore's Law, however, the Law of
     Accelerating Returns is not a temporary methodology. It is a basic
     attribute of the nature of time and chaos -- a sublaw of the Law of
     Time and Chaos -- and describes a wide range of apparently
     divergent phenomena and trends. In accordance with the Law of
     Accelerating Returns, another computational technology will pick up
     where Moore's Law will have left off, without missing a beat.
     
     Most Exponential Trends Hit a Wall . . . but Not This One
     
     A frequent criticism of predictions of the future is that they rely
     on mindless extrapolation of current trends without consideration
     of forces that may terminate or alter that trend. This criticism is
     particularly relevant in the case of exponential trends. A classic
     example is a species happening upon a hospitable new habitat,
     perhaps transplanted there by human intervention (rabbits in
     Australia, say). Its numbers multiply exponentially for a while,
     but this phenomenon is quickly terminated when the exploding
     population runs into a new predator or the limits of its
     environment. Similarly, the geometric population growth of our own
     species has been a source of anxiety, but changing social and
     economic factors, including growing prosperity, have greatly slowed
     this expansion in recent years, even in developing countries.
     
     Based on this, some observers are quick to predict the demise of
     the exponential growth of computing.
     
     But the growth predicted by the Law of Accelerating Returns is an
     exception to the frequently cited limitations to exponential
     growth. Even a catastrophe, as apparently befell our reptilian
     cohabitants in the late Cretaceous period, only sidesteps an
     evolutionary process, which then picks up the pieces and continues
     unabated (unless the entire process is wiped out). An evolutionary
     process accelerates because it builds on its past achievements,
     which includes improvements in its own means for further evolution.
     In the evolution of life-forms, in addition to DNA-based genetic
     coding, the innovation of sexual reproduction provided for improved
     means of experimenting with diverse characteristics within an
     otherwise homogenous population. The establishment of basic body
     plans of modern animals in the "Cambrian explosion," about 570
     million years ago, allowed evolution to concentrate on higher-level
     features such as expanded brain function. The inventions of
     evolution in one era provide the means, and often the intelligence,
     for innovation in the next.
     
     The Law of Accelerating Returns applies equally to the evolutionary
     process of computation, which inherently will grow exponentially
     and essentially without limit. The two resources it needs -- the
     growing order of the evolving technology itself and the chaos from
     which an evolutionary process draws its options for further
     diversity -- are unbounded. Ultimately, the innovation needed for
     further turns of the screw will come from the machines themselves.
     
     How will the power of computing continue to accelerate after
     Moore's Law dies? We are just beginning to explore the third
     dimension in chip design. The vast majority of today's chips are
     flat, whereas our brain is organized in three dimensions. We live
     in a three-dimensional world, so why not use the third dimension?
     Improvements in semiconductor materials, including superconducting
     circuits that don't generate heat, will enable us to develop chips
     -- that is, cubes -- with thousands of layers of circuitry that,
     combined with far smaller component geometries, will improve
     computing power by a factor of many millions. And there are more
     than enough other new computing technologies waiting in the wings
     -- nanotube, optical, crystalline, DNA, and quantum (which we'll
     visit in chapter 6, "Building New Brains") -- to keep the Law of
     Accelerating Returns going in the world of computation for a very
     long time.
     
     THE LEARNING CURVE: SLUG VERSUS HUMAN
     
     The "learning curve" describes the mastery of a skill over time. As
     an entity -- slug or human -- learns a new skill, the newly
     acquired ability builds on itself, and so the learning curve starts
     out looking like the exponential growth we see in the Law of
     Accelerating Returns. Skills tend to be bounded, so as the new
     expertise is mastered, the law of diminishing returns sets in, and
     growth in mastery levels off. So the learning curve is what we call
     an S curve because exponential growth followed by a leveling off
     looks like an S leaning slightly to the right: S.
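
      The usual mathematical picture of such a curve is the logistic
      function, which grows almost exponentially at first and then
      flattens toward a ceiling. The Python sketch below uses
      illustrative parameters only.

        import math

        # A logistic "S curve": near-exponential early growth, then
        # diminishing returns as mastery approaches its ceiling.
        ceiling = 1.0    # maximum attainable mastery of the bounded skill
        rate = 1.0       # growth rate during the early, exponential phase
        midpoint = 5.0   # time at which mastery reaches half the ceiling

        def mastery(t):
            return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

        for t in range(0, 11, 2):
            bar = "#" * int(40 * mastery(t))
            print(f"t={t:2d}  {mastery(t):5.2f}  {bar}")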
     
     The learning curve is remarkably universal: Most multicellular
     creatures do it. Slugs, for example, follow the learning curve when
     learning how to ascend a new tree in search of leaves. Humans, of
     course, are always learning something new.
     
     But there's a salient difference between humans and slugs. Humans
     are capable of innovation, which is the creation and retention of
     new skills and knowledge. Innovation is the driving force in the
     Law of Accelerating Returns, and eliminates the leveling-off part
     of the S curve. So innovation turns the S curve into indefinite
     exponential expansion.
     
     Overcoming the S curve is another way to express the unique status
     of the human species. No other species appears to do this. Why are
     we unique in this way, given that other primates are so close to us
     in terms of genetic similarity?
     
     The reason is that the ability to overcome the S curve defines a
     new ecological niche. As I pointed out, there were indeed other
     humanoid species and subspecies capable of innovation, but the
     niche seems to have tolerated only one surviving competitor. But we
     will have company in the twenty-first century as our machines join
     us in this exclusive niche.
     
     A Planetary Affair
     
     The introduction of technology on Earth is not merely the private
     affair of one of the Earth's innumerable species. It is a pivotal
     event in the history of the planet. Evolution's grandest creation
     -- human intelligence -- is providing the means for the next stage
     of evolution, which is technology. The emergence of technology is
     predicted by the Law of Accelerating Returns. The Homo sapiens
     sapiens subspecies emerged only tens of thousands of years after
     its human forebears. According to the Law of Accelerating Returns,
     the next stage of evolution should measure its salient events in
     mere thousands of years, too quick for DNA-based evolution. This
     next stage of evolution was necessarily created by human
     intelligence itself, another example of the exponential engine of
     evolution using its innovations from one period (human beings) to
     create the next (intelligent machines).
     
     Evolution draws upon the great chaos in its midst -- the ever
     increasing entropy governed by the flip side of the Law of Time and
     Chaos -- for its options for innovation. These two strands of the
     Law of Time and Chaos -- time exponentially slowing down due to the
     increasing chaos predicted by the second law of thermodynamics; and
     time exponentially speeding up due to the increasing order created
     by evolution -- coexist and progress without limit. In particular,
     the resources of evolution, order and chaos, are unbounded. I
     stress this point because it is crucial to understanding the
     evolutionary -- and revolutionary -- nature of computer technology.
     
     The emergence of technology was a milestone in the evolution of
     intelligence on Earth because it represented a new means of
     evolution recording its designs. The next milestone will be
     technology creating its own next generation without human
     intervention. That there is only a period of tens of thousands of
     years between these two milestones is another example of the
     exponentially quickening pace that is evolution.
     
     The Inventor of Chess and the Emperor of China
     
     To appreciate the implications of this (or any) geometric trend, it
     is useful to recall the legend of the inventor of chess and his
     patron, the emperor of China. The emperor had so fallen in love
     with his new game that he offered the inventor a reward of anything
     he wanted in the kingdom.
     
     "Just one grain of rice on the first square, Your Majesty."
     
     "Just one grain of rice?"
     
     "Yes, Your Majesty, just one grain of rice on the first square, and
     two grains of rice on the second square."
     
     "That's it -- one and two grains of rice?"
     
     "Well, okay, and four grains on the third square, and so on."
     
     The emperor immediately granted the inventor's seemingly humble
     request. One version of the story has the emperor going bankrupt
     because the doubling of grains of rice for each square ultimately
     equaled 18 million trillion grains of rice. At ten grains of rice
     per square inch, this requires rice fields covering twice the
     surface area of the Earth, oceans included.
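
      The arithmetic is easy to verify with the figures given here, ten
      grains per square inch and an Earth surface area of roughly 197
      million square miles. The Python sketch below also computes the
      first-half total that the story returns to shortly.

        # Rice on the chessboard: grains double on each of 64 squares.
        total_grains = 2 ** 64 - 1     # sum over all 64 squares
        first_half = 2 ** 32 - 1       # sum over the first 32 squares

        grains_per_sq_inch = 10
        earth_sq_miles = 197e6
        sq_inches_per_sq_mile = (5280 * 12) ** 2

        fields = total_grains / grains_per_sq_inch / sq_inches_per_sq_mile
        print(f"total grains: {total_grains:.2e} "
              f"(about {total_grains / 1e18:.0f} million trillion)")
        print(f"rice fields needed: {fields / earth_sq_miles:.1f} "
              f"Earth surfaces")
        print(f"first half of the board: {first_half / 1e9:.1f} "
              f"billion grains")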
     
     The other version of the story has the inventor losing his head.
     It's not yet clear which outcome we're headed for.
     
     But there is one thing we should note: It was fairly uneventful as
     the emperor and the inventor went through the first half of the
     chessboard. After thirty-two squares, the emperor had given the
     inventor about 4 billion grains of rice. That's a reasonable
     quantity -- about one large field's worth -- and the emperor did
     start to take notice.
     
     But the emperor could still remain an emperor. And the inventor
     could still retain his head. It was as they headed into the second
     half of the chessboard that at least one of them got into trouble.
     
     So where do we stand now? There have been about thirty-two
     doublings of speed and capacity since the first operating computers
     were built in the 1940s. Where we stand right now is that we have
     finished the first half of the chessboard. And, indeed, people are
     starting to take notice.
     
     Now, as we head into the next century, we are heading into the
     second half of the chessboard. And this is where things start to
     get interesting.
     
     OKAY, LET ME GET THIS STRAIGHT, MY CONCEPTION AS A FERTILIZED EGG
     WAS LIKE THE UNIVERSE'S BIG BANG -- UH, NO PUN INTENDED -- THAT IS,
     THINGS STARTED OUT HAPPENING VERY FAST, THEN KIND OF SLOWED DOWN,
     AND NOW THEY'RE REAL SLOW?
     
      That's a reasonable way to put it. The time interval between
      milestones is now a lot longer than it was when you were an
      infant, let alone a fetus.
     
     YOU MENTIONED THE UNIVERSE HAD THREE PARADIGM SHIFTS IN THE FIRST
     BILLIONTH OF A SECOND. WERE THINGS THAT FAST WHEN I GOT STARTED?
     
     Not quite that fast. The Universe started as a singularity, a
     single point taking up no space and comprising, therefore, no
     chaos. So the first major event, which was the creation of the
     Universe, took no time at all. With the Universe still very small,
     events unfolded extremely quickly. We don't start out as a single
     point, but as a rather complex cell. It has order but there is a
     lot of random activity within a cell compared to a single point in
     space. So our first major event as an organism, which is the first
     mitosis of our fertilized egg, is measured in hours, not
     trillionths of a second. Things slow down from there.
     
     BUT I FEEL LIKE TIME IS SPEEDING UP. THE YEARS JUST GO BY SO MUCH
     FASTER NOW THAN THEY DID WHEN I WAS A KID. DON'T YOU HAVE IT
     BACKWARD?
     
     Yes, well, the subjective experience is the opposite of the
     objective reality.
     
     OF COURSE. WHY DIDN'T I THINK OF THAT?
     
     Let me clarify what I mean. The objective reality is the reality of
     the outside observer observing the process. If we observe the
     development of an individual, salient events happen very quickly at
     first, but later on milestones are more spread out, so we say time
     is slowing down. The subjective experience, however, is the
     experience of the process itself, assuming, of course, that the
     process is conscious. Which in your case, it is. At least, I assume
     that's the case.
     
     THANK YOU.
     
     Subjectively, our perception of time is affected by the spacing of
     milestones.
     
     MILESTONES?
     
     Yeah, like growing a body and a brain.
     
     AND BEING BORN?
     
     Sure, that's a milestone. Then learning to sit up, walking, talking
     . . .
     
     OKAY.
     
     We can consider each subjective unit of time to be equivalent to
     one milestone spacing. Since our milestones are spaced further
     apart as we grow older, a subjective unit of time will represent a
     longer span of time for an adult than for a child. Thus time feels
     like it is passing by more quickly as we grow older. That is, an
     interval of a few years as an adult may be perceived as comparable
     to a few months to a young child. Thus a long interval to an adult
     and a short interval to a child both represent the same subjective
     time in terms of the passage of salient events. Of course, long and
     short intervals also represent comparable fractions of their
     respective past lives.
     
     SO DOES THAT EXPLAIN WHY TIME PASSES MORE QUICKLY WHEN I'M HAVING A
     GOOD TIME?
     
     Well, it may be relevant to one phenomenon. If someone goes through
     an experience in which a lot of significant events occur, that
     experience may feel like a much longer period of time than a calmer
     period. Again, we measure subjective time in terms of salient
     experiences.
     
     NOW IF I FIND TIME SPEEDING UP WHEN OBJECTIVELY IT IS SLOWING DOWN,
     THEN EVOLUTION WOULD SUBJECTIVELY FIND TIME SLOWING DOWN AS IT
     OBJECTIVELY SPEEDS UP, DO I HAVE THAT STRAIGHT?
     
     Yes, if evolution were conscious.
     
     WELL, IS IT?
     
     There's no way to really tell, but evolution has its time spiral
     going in the opposite direction from entities we generally consider
     to be conscious, such as humans. In other words, evolution starts
     out slow and speeds up over time, whereas the development of a
     person starts out fast and then slows down. The Universe, however,
     does have its time spiral going in the same direction as us
     organisms, so it would make more sense to say that the Universe is
     conscious. And come to think of it, that does shed some light on
     what happened before the big bang.
     
     I WAS JUST WONDERING ABOUT THAT.
     
     As we look back in time and get closer to the event of the big
     bang, chaos is shrinking to zero. Thus from the subjective
     perspective, time is stretching out. Indeed, as we go back in time
     and approach the big bang, subjective time approaches infinity.
     Thus it is not possible to go back past a subjective infinity of
     time.
     
     THAT'S A LOAD OFF MY MIND. NOW YOU SAID THAT THE EXPONENTIAL
     PROGRESS OF AN EVOLUTIONARY PROCESS GOES ON FOREVER. IS THERE
     ANYTHING THAT CAN STOP IT?
     
     Only a catastrophe that wipes out the entire process.
     
     SUCH AS AN ALL-OUT NUCLEAR WAR?
     
     That's one scenario, but in the next century, we will encounter a
     plethora of other "failure modes." We'll talk about this in later
     chapters.
     
     I CAN'T WAIT. NOW TELL ME THIS, WHAT DOES THE LAW OF ACCELERATING
     RETURNS HAVE TO DO WITH THE TWENTY-FIRST CENTURY?
     
     Exponential trends are immensely powerful but deceptive. They
     linger for eons with very little effect. But once they reach the
     "knee of the curve," they explode with unrelenting fury. With
     regard to computer technology and its impact on human society, that
     knee is approaching with the new millennium. Now I have a question
     for you.
     
     SHOOT.
     
     Just who are you anyway?
     
     WHY, I'M THE READER.
     
     Of course. Well, it's good to have you contributing to the book
     while there's still time to do something about it.
     
     GLAD TO. NOW, YOU NEVER DID GIVE THE ENDING TO THE EMPEROR STORY.
     SO DOES THE EMPEROR LOSE HIS EMPIRE, OR DOES THE INVENTOR LOSE HIS
     HEAD?
     
     I have two endings, so I just can't say.
     
     MAYBE THEY REACH A COMPROMISE SOLUTION. THE INVENTOR MIGHT BE HAPPY
     TO SETTLE FOR, SAY, JUST ONE PROVINCE OF CHINA.
     
     Yes, that would be a good result. And maybe an even better parable
     for the twenty-first century.
     
     (C) 1998 Ray Kurzweil All rights reserved. ISBN: 0-670-88217-8
                                      
                 Copyright 1998 The New York Times Company


VICUG-L is the Visually Impaired Computer User Group List.
To join or leave the list, send a message to
[log in to unmask]  In the body of the message, simply type
"subscribe vicug-l" or "unsubscribe vicug-l" without the quotations.
 VICUG-L is archived on the World Wide Web at
http://maelstrom.stjohns.edu/archives/vicug-l.html

