Cyberspaces Inhabitable by Life

Tom Ray (Speech at the Doors of Perception 3 Conference)

Table of Contents:
Summary
Introduction
Complexity
Cyberspaces Inhabitable by Life
Two Thought Experiments
A Biodiversity Reserve for Digital Organisms
Resources

* * *
Summary
How are we to create the `collective intelligence' essential for coping with the increasingly large scale and complexity of the ecological problems confronting us? The complexity of the products of evolution exceeds by many orders of magnitude anything human beings have created. Evolutionary biologist and naturalist Tom Ray describes how he has created self-replicating `informational creatures' whose behaviour embodies some of the central processes of evolution. In this way, he hopes to find out whether the digital medium is a good studio for the artist evolution, where products may evolve that are many times more complex than our own. With two thought experiments, he demonstrates how unlikely it is that our present design strategies will produce anything as complex and multi-functional as evolution has. In his Tierra project, he observed how digital creatures in the form of `seed programmes' actually evolved parasitic and carnivorous behaviours on their own. The complexity of these creatures increased and decreased by turns, with a maximum increase of two to three times. His most recent Tierra project, a further development of the already-existing, single-computer Tierra, aims to create a biodiversity reserve for digital organisms. This is to be accomplished by releasing a number of `digital organisms' on the Net, creatures that could crawl about the globe in search of resources in the form of spare CPU cycles and memory, evolving complex behaviours as they adapt to their physical circumstances and each other. Will we one day be able to go out like naturalists on the Net to gather and domesticate these organisms as our ancestors did wild corn, wild rice and wild chicken?

* * *
Introduction
I am an evolutionary biologist. I chose to study evolution because it is the process that generated life. I want to discuss evolution as a designer or an artist. The creative products of evolution seem to me to be more complex and beautiful than anything I have seen produced by human artists.

The process of evolution has organised the form and process of matter and energy on earth from the molecular level all the way up to the level of the eco-system or, if you accept the Gaia hypothesis, all the way to the global level, spanning some 12 to 15 orders of magnitude. Life is organised hierarchically from the molecular up to the eco-system level.

The creative products of life have a richness and a depth of complexity that go way beyond anything that humans have been able to design and build. Evolution is a phenomenal designer, artist and creator. I want to talk today about how we can work with it as designers and use it as a tool to design things that move forward in their complexity, by orders of magnitude, beyond anything that we are able to design directly.

* * *
Complexity
We have had problems with some of the more complex things we have created -- I am thinking of the space shuttle, the Chernobyl power plant and the Pentium processor. It would seem that as the complexity of our creations increases, the failure rate and difficulties also increase. It is questionable whether we can advance to orders of magnitude beyond where we stand today. But certainly the complexity of the products of evolution exceeds by orders of magnitude anything we have created.

Imagine a hummingbird hovering in front of and pollinating a flower. Imagine a cheetah chasing down its prey. Those are the things that evolution is capable of designing.

Humans have been working with evolution for as long as civilisation has existed. It is the basis of agriculture and the domestication of plants and animals, which means bringing their evolution under our control. We found products of evolution by natural selection in nature -- the ancestors of rice, corn and wheat, of chickens, pigs and dogs. We saw great potential in them and established our control over them. We then guided the evolution of those populations and greatly enhanced their quality as products for our use.

More recently, people have come to use this process in the digital medium. Karl Sims uses evolution as a tool to create art. He has developed a genetic system that describes two-dimensional images. He randomly generates them so that he gets a collection to choose from, and then selects the ones he likes best using his own aesthetic criteria.
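
To make that generate-and-select loop concrete, here is a minimal sketch in Python; the numeric genome, the mutation step and the stand-in for the artist's aesthetic choice are all illustrative assumptions, not Sims's actual system.

```python
import random

# A toy 'genome' for a two-dimensional image: a few numeric parameters
# that a renderer could interpret (frequencies, phases, colour weights).
# This encoding is an illustrative assumption, not Karl Sims's system.
def random_genome(length=8):
    return [random.uniform(-1.0, 1.0) for _ in range(length)]

def mutate(genome, rate=0.2, scale=0.3):
    # Perturb some parameters to produce a variant of a chosen image.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

def interactive_evolution(generations=5, population=9):
    pool = [random_genome() for _ in range(population)]
    for _ in range(generations):
        # In a real system the artist would view the rendered images and
        # pick a favourite; here that aesthetic choice is stood in for
        # by picking one at random.
        favourite = random.choice(pool)
        # The next generation is built from variants of the favourite.
        pool = [mutate(favourite) for _ in range(population)]
    return pool

if __name__ == "__main__":
    print(interactive_evolution()[0])
```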

But the way I have chosen to work with it is related to the way evolution has generated its creative products in nature. Although we have established a very good working relationship with evolution on one level (for example, taking the wild ancestor of corn and guiding its evolution to turn it into the modern corn that we buy in the supermarket, which is much improved), we don't have any experience with taking algae and guiding its evolution to become corn, or taking protozoa and guiding their evolution to become dogs or chickens. I don't think there is any way that we can guide evolution at that level. We know that evolution, when left to itself without any so-called intelligent guidance, is capable of making transformations from protozoa to wildebeest and from algae to mahogany trees. And even we ourselves -- the human mind -- are products of evolution. The question is: how can we work with evolution at that level?

That is my objective, although it has certainly not been accomplished yet. I have created a system based largely on the notion that the process of life is not something that can only inhabit the medium of carbon chemistry. Just as artists work in many media, evolution itself can probably express its creative potential in a variety of media.

* * *
Cyberspaces Inhabitable by Life
Until now we have really only had a single example of evolution: life on earth, which evolved from a single common ancestor in the medium of organic chemistry. What we want to do now is put it into the medium of digital computation. This is the first fundamental concept: the digital medium, which I'll call cyberspaces inhabitable by life. It is an environment that can support the process of life. In my view, the process of life is evolution. That is the generating process and essentially the defining one.

In summary, the Darwinian evolutionary process can be said to require self-replicating entities in a finite environment with inheritable genetic variation. In my system, the RAM memory of the computer is the finite environment. It contains self-replicating entities: machine-code programmes. The programmes make copies of themselves, so you only need to put one in to begin with; it reproduces until the memory fills up. As for inheritable genetic variation, the genetics is simply the code itself, analogous to DNA.
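
As a rough illustration of those three ingredients -- a finite memory, entities that copy themselves into it, and a genome that is nothing but the code being copied -- here is a minimal sketch in Python rather than machine code; the soup size and the byte-string `genome' are illustrative assumptions, not the actual Tierra implementation.

```python
import random

SOUP_SIZE = 64          # finite environment: a fixed number of memory slots
soup = [None] * SOUP_SIZE

# The 'genome' of the seed organism is just its code, represented here
# as a byte string; in Tierra it is real machine code for a virtual CPU.
SEED = bytearray(b"COPY-SELF-LOOP")

def free_slot():
    # Find an unoccupied cell of memory, if any remain.
    empties = [i for i, cell in enumerate(soup) if cell is None]
    return random.choice(empties) if empties else None

def replicate(genome):
    # A replicator examines its own code and writes a copy into free memory.
    slot = free_slot()
    if slot is not None:
        soup[slot] = bytearray(genome)

# Inoculate the soup with a single seed and let it fill the memory.
soup[0] = bytearray(SEED)
while any(cell is None for cell in soup):
    parents = [cell for cell in soup if cell is not None]
    replicate(random.choice(parents))

print(f"{sum(cell is not None for cell in soup)} replicators fill the soup")
```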

The next step is to enable them to evolve -- to create the Darwinian process.

The CPUs are the analogue of energy; they are the sunlight that drives the system. The variation is imposed by the Darwinian operating system I created for this computer. It makes noise: it randomly flips bits in the memory, so the machine code itself gets corrupted by random bit-flips. The Darwinian operating system also ensures that the computations performed by the CPU are not always correct, so when two numbers are added together in the CPU, you don't always get the right result. In this way, the operating system imposes noise on the system.
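
Here is a hedged sketch of that kind of imposed noise: random bit-flips applied to a copy of a programme's code, and an addition that occasionally returns a slightly wrong answer. The mutation rate and the eight-bit arithmetic are illustrative assumptions, not Tierra's actual parameters.

```python
import random

MUTATION_RATE = 0.001   # probability of flipping any given bit; an
                        # illustrative figure, not Tierra's actual rate

def flip_random_bits(code: bytearray) -> bytearray:
    # The 'Darwinian operating system' occasionally flips a bit in memory,
    # so copies of a programme are not always faithful to the original.
    mutated = bytearray(code)
    for i in range(len(mutated)):
        for bit in range(8):
            if random.random() < MUTATION_RATE:
                mutated[i] ^= (1 << bit)
    return mutated

def noisy_add(a: int, b: int) -> int:
    # Noise is also imposed on computation itself: an addition occasionally
    # returns a result with one bit flipped.
    result = (a + b) & 0xFF
    if random.random() < MUTATION_RATE:
        result ^= (1 << random.randrange(8))
    return result

print(flip_random_bits(bytearray(b"COPY-SELF-LOOP")), noisy_add(40, 2))
```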

Now we have heritable genetic variation. The operating system also has something I call a `reaper'. It is part of the process-control mechanism. There are many processes, and as they are born they enter a queue. When the memory is full, the reaper kills off the process at the top of the queue. So processes flow through the queue, and the older ones die. However, these programmes are mutating, and when you randomly flip bits in machine code, you get a lot of junk. Junk programmes generate errors, like trying to write to protected memory or overflowing registers. When those errors occur, the process moves up the queue, closer to death. Thus, the operating system disfavours programmes that generate errors.
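
Here is a toy sketch of such a reaper queue: new processes enter at the back, an error pushes a process toward the front, and when memory is full the process at the front is killed. The data structure and the one-step penalty are assumptions for illustration, not the real Tierra mechanism.

```python
from collections import deque

# The reaper queue: new processes enter at the back; the process at the
# front (the 'top' of the queue) is killed when memory fills up.
# Error-prone processes are moved toward the front, so the operating
# system disfavours programmes that generate faults.
reaper = deque()

def birth(process_id):
    reaper.append(process_id)

def report_error(process_id):
    # Move the offending process one position closer to death.
    i = reaper.index(process_id)
    if i > 0:
        reaper[i], reaper[i - 1] = reaper[i - 1], reaper[i]

def memory_full_kill():
    return reaper.popleft() if reaper else None

# Example: three processes are born; the second one keeps faulting.
for pid in ("a", "b", "c"):
    birth(pid)
report_error("b")
report_error("b")
print(memory_full_kill())   # 'b' has been pushed to the top and dies first
```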

These programmes are not performing any useful work for us; they are just self-replicating. This is the way it started on earth. Evidently, some molecules started self-replicating around three and a half billion years ago. Today, their descendants do a great many other things along the route to their self-reproduction.

* * *
Two Thought Experiments
There is something about the process of evolution that explores the possibilities inherent in the medium. Also, it seems that there is something inherent about the process of evolution that leads to information-processing systems. When I consider the fact that evolution is embedded in carbon chemistry and that it was able to produce the brain, it seems preposterous, because carbon chemistry is a lousy medium for computation, in my view.

As a thought experiment, imagine that we are robots, that our bodies are built of metal and our brains of semi-conductor circuitry. We have no prior experience with organic life; we have never seen it; it doesn't occur on our planet; it is not a part of our science fiction, so it is not a concept that we have entertained. Imagine a robot entering our gathering with a flask containing methane, ammonia, hydrogen, water vapour and some dissolved minerals, and asking: Do you suppose we could build a computer out of this stuff?

The theoreticians in the group would say: Of course! and probably start designing nanomolecular devices in which the molecules act as switches.

But the engineers in the group would say: Why bother, when we have silicon? It is a much better medium for information processing than carbon chemistry. And I feel quite confident that not even the theoreticians would think of the neurone as a solution to this problem of information processing, because, given where they are starting from -- simple carbon molecules -- the neurone is a vast superstructure that just would not occur to them. This is another aspect of the design problem of using evolution to generate complexity on the scale of the transformation from the algae to the mahogany tree.

Before engaging in another thought exercise, let me tell you a bit of biological fact. Molecules started self-replicating around three and a half thousand million years ago. Life remained in simple, microscopic, single-celled forms until six hundred million years ago; there were about three thousand million years of microscopic, single-celled life. There were no giraffes, wildebeests or mahogany trees. All of that emerged rather suddenly six hundred million years ago in what biologists call `the Cambrian explosion', which is sometimes described as `the Big Bang of Evolution'. It was an event of great magnitude, in which life suddenly exploded into an ecological void and large, complex forms appeared, including the first nervous systems, organ systems, immune systems and real ecologies.

In our thought experiment, we will go back to just before that time, when there were only bacteria, viruses, algae and protozoa, and try to imagine that we have no experience with multi-cellular life forms. This is an impossible thought experiment, because we would ourselves have to be algae or protozoa, and we would not be able to entertain these thoughts. Still, let's try to imagine the situation. Under those conditions, would we be able to imagine the things that followed: the mahogany tree, the silk moth, the mink, the corn, the wildebeest, the primates that could emerge through evolution from the microscopic, single-celled organisms? I don't think our imaginations would be capable of finding those possibilities in the medium. This is why the process of evolution can actually go beyond our imaginations.

I believe that this is our situation today with respect to the digital medium. All of the information processes that we have designed and implemented so far are on a level of complexity comparable to the algae or the protozoa. We have no experience with the vastly more complex information processes possible in the digital medium. There is simply no way that we can design them; we cannot imagine them. And even if we could imagine them, we would not know how to build them, because they are so much more complex. We cannot design things as complex as what evolution has produced.

What, then, is the solution? What is the strategy? How can we get to this level of information processing systems? My concept is to create what I call the biodiversity reserve for digital organisms.

* * *
A Biodiversity Reserve for Digital Organisms
I want to speak about a digital reserve project on the Net. It will create another web, a subnet within the Internet. The existing Web is created by running Web servers on machines that choose to participate in it, and this creates a free space on the Internet within which people with Web browsers can move freely from computer to computer. In the same way, the Tierra project will make available servers that you can choose to run on your machine if you like. They will run like a screen saver -- a low-priority background process that sleeps when you are using your machine for something else -- so they will only use spare CPU cycles and memory and won't get in your way. Together they will create a freely open space within which digital organisms will be able to move about the web from computer to computer. Digital organisms are self-replicating computer programmes which, at least for now, don't do anything for us; they are just a form of digital wildlife. So we will have a biodiversity reserve of organisms on the Internet.

The hope is that this could create the conditions for evolution to go through a process like the Cambrian explosion, with increasing diversity and complexity of the replicators. The conditions on the subnet are more complex than the conditions on the single-computer Tierra system. Yet even on the single-computer model I obtained an amazing amount of evolution. One of the first things I noticed was that an ecological community emerged among the replicators. They evolved from a seed programme that did nothing but make copies of itself in the RAM memory. Parasites actually evolved.
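
Here is a minimal sketch of the screen-saver-like behaviour described above: a background loop that stays out of the way while the machine is in use and donates spare cycles only when it is idle. The function names and the idleness check are hypothetical stand-ins, not the actual Tierra server.

```python
import time

def user_is_active() -> bool:
    # Hypothetical stand-in for a real idleness check (keyboard, mouse,
    # CPU load); the actual server would use the host system's measures.
    return False

def run_tierra_slice():
    # Placeholder for executing a small slice of digital-organism code.
    time.sleep(0.01)

def background_server(poll_seconds=1.0):
    # Behave like a screen saver: sleep while the user needs the machine,
    # and spend spare CPU cycles on the organisms only when it is idle.
    # (Calling this function runs forever, like a daemon.)
    while True:
        if user_is_active():
            time.sleep(poll_seconds)   # stay out of the user's way
        else:
            run_tierra_slice()         # donate a spare slice of CPU
```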

The seed programme is eighty bytes long and its algorithm has three procedures. The first is used for examining itself, to figure out what has to be replicated. It is the third part of the programme that actually does the replication. However, a parasite evolved that has the third part of the programme missing. What it does is simply send its CPU over to execute the code of the third part of other programmes in the memory. It is an information parasite. It has an ecological relationship with its hosts: the hosts evolve immunity to the parasites, after which the parasites evolve to overcome that immunity. That is an ongoing evolutionary process.
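
To illustrate the relationship, here is a toy sketch in Python of a complete host and a parasite that borrows the host's copy procedure. The procedure names (and the allocation step, which the talk does not describe) are assumptions for illustration; the real organisms are eighty bytes of machine code, not Python classes.

```python
class Host:
    """A complete replicator with all three procedures of the seed."""
    def examine_self(self):
        return "my own code"          # works out what must be copied

    def allocate(self):
        return "space in memory"      # claims room for the daughter (assumed step)

    def copy(self, code, space):
        return f"daughter made from {code} in {space}"

class Parasite:
    """Lacks the copy procedure, so it borrows a neighbouring host's."""
    def examine_self(self):
        return "my (shorter) code"

    def allocate(self):
        return "space in memory"

    def replicate_using(self, host: Host):
        # The parasite executes the host's copy code instead of its own:
        # an information parasite rather than a destructive one.
        return host.copy(self.examine_self(), self.allocate())

print(Parasite().replicate_using(Host()))
```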

Another creature that evolved is rather like a carnivorous plant. A carnivorous plant looks like a flower: an insect comes in expecting a meal of nectar, but gets eaten instead. This creature looks like a nice thing to parasitize, but when a parasite sends its CPU over, it never gets its CPU back. The creature essentially `parallelizes' itself and is able to produce two daughters simultaneously, using the CPU it was born with plus the CPU it has stolen from an unwary parasite. The complexity of the creatures themselves increases at some times and decreases at others, but the greatest increase I have seen is by a factor of two or three over the original seed organism. I have not seen the many orders of magnitude that evolution is conceivably capable of.

* * *
Resources
We don't know if the digital medium is a good medium for this artist, evolution. This is uncharted territory; we don't know how far evolution can go in this medium. But what I am trying to do is set up a studio for evolution to work in, where I can see its full potential, whatever that may be. This takes place partly on the Net. We run the server like a screen saver that sleeps when the user is using the computer. In this way, we add some complexity to the problem faced by these replicators: they are challenged with the problem of foraging for resources on the net, such as spare CPU-cycles and memory space. Whereas a web server makes available data on the disk, the Tierra server makes available a bit of RAM memory and unused CPU-cycles. We expect these digital organisms to evolve elaborate strategies for finding these resources.
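
As a hedged sketch of what `foraging' for such resources might look like, here is a toy description of what a node could advertise -- spare CPU-cycles and free memory rather than files on disk -- and an organism ranking the nodes it can see. The field names and numbers are assumptions, not the real protocol.

```python
from dataclasses import dataclass

@dataclass
class NodeResources:
    # What a Tierra node offers, by analogy with a web server offering
    # files: spare CPU cycles and a slice of unused RAM. Field names
    # are illustrative, not the real protocol.
    name: str
    spare_cpu_cycles: int      # cycles currently going unused
    free_memory_bytes: int     # RAM set aside for digital organisms
    resident_organisms: int    # how many creatures already live here

def sense_network(nodes):
    # An organism 'foraging' for resources ranks the nodes it can see.
    return sorted(nodes, key=lambda n: n.spare_cpu_cycles, reverse=True)

nodes = [
    NodeResources("alpha", spare_cpu_cycles=5_000, free_memory_bytes=2**20,
                  resident_organisms=40),
    NodeResources("beta", spare_cpu_cycles=90_000, free_memory_bytes=2**22,
                  resident_organisms=3),
]
print(sense_network(nodes)[0].name)   # 'beta' looks like the best foraging ground
```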

There will be both spatial and temporal patterns in these resources. There may be more spare CPU-cycles available at night while people are sleeping, so we might expect the organisms to evolve a behaviour of migrating around the planet on a daily basis, staying on the dark side of the planet in order to suck up more spare CPU-cycles. But there will be hackers staying up all night, and people running processes on their machines at night, so not all machines will fit the pattern. The creatures need to be able to sense conditions on a moment-to-moment basis and respond to them.

We see that there are a lot of complexities in this environment that could provide selective forces for evolution to generate complex behaviours in the digital organisms, and we are hoping through that to give evolution an impulse in the direction of complexity. We hope that the process will then become auto-catalytic, as it was in nature. Take the example of the rain forest in the Amazon: the physical environment is clean white sand, air, falling water and sunshine, yet embedded in that physical environment is the tropical rain forest, the most complex eco-system on earth, with hundreds of thousands of species that are not, for the most part, adaptations to the physical environment, but to the other species that now make up the environment. Essentially, life creates its own environment and then evolves adaptations with respect to life itself. The adaptations that stood out for me in the original Tierra were adaptations to the presence of the other creatures that shared the memory.

Once the organisms have begun to evolve complex behavioural strategies, they need to deal with the fact that they share an environment with other organisms that also have complex behavioural properties. I should add that we have provided them with some sensory mechanisms -- which also allow us to see the net through the eyes of the digital organisms -- so that they can sense the availability of memory and CPU resources on machines, as well as the number of other digital organisms on those machines, and use that data to make their decisions about movement. But suppose there were one machine on the net that was obviously the best, with the fastest CPU and the fewest creatures. If all the creatures decided to go there, it would no longer be a good place to be, because the resource would have to be divided among so many. So they have to develop some kind of social behaviour -- flocking or anti-flocking behaviour -- so that they can distribute themselves about the net in an intelligent way and make good use of the resources (a toy version of such a decision rule is sketched below). So evolution will be responding to the presence of the other organisms as well as to the net itself.

The hope is that we can stimulate the kind of explosion of complexity that happened in the Cambrian period of organic evolution. If that should happen, then we could go out like naturalists -- I look forward to the day when I can go back to being a naturalist, which is what I did for 16 years in the rain forest of Costa Rica -- but this time observing these things in the digital world, in cyberspace, probably for the most part just for the aesthetics of it. That is what was so satisfying about being a tropical biologist. But there is probably money to be made out there, too.
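
Here is the toy anti-flocking rule referred to above: instead of everyone heading for the single best machine, an organism weighs each node's resources against the crowd expected to arrive there and chooses probabilistically. All names and numbers are illustrative assumptions.

```python
import random

def expected_share(spare_cpu, residents, expected_arrivals):
    # CPU on a machine must be divided among everyone who ends up there.
    return spare_cpu / (residents + expected_arrivals + 1)

def choose_destination(nodes, crowd_estimate=20):
    # Rather than all piling onto the 'best' machine, pick probabilistically
    # in proportion to the share each node would actually yield once other
    # migrants are accounted for -- a crude anti-flocking rule.
    shares = [expected_share(n["cpu"], n["residents"], crowd_estimate)
              for n in nodes]
    total = sum(shares)
    return random.choices(nodes, weights=[s / total for s in shares])[0]

nodes = [
    {"name": "fastest", "cpu": 90_000, "residents": 3},
    {"name": "modest", "cpu": 30_000, "residents": 5},
    {"name": "slow", "cpu": 10_000, "residents": 1},
]
print(choose_destination(nodes)["name"])
```
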
It is hard for me to imagine that, if digital evolution could escalate in complexity by many orders of magnitude, creating information processes many orders of magnitude greater than anything we have ever created, nothing useful would be found. And just as our ancestors went out into nature and recognised the potential applications of wild corn, wild rice and wild chicken, our digital naturalists could go out into digital nature, with a keen eye and imagination, to recognise the potential of these information processes. And once they spotted some, they could capture them, bring them into the company and breed them, as we bred corn, to improve them for information applications. Eventually, they would probably neuter the things they found and deliver them to the public.

I will stop here.


updated 1995