Title Game

by Jeff Currier

“Jacob, shall we play a game?”

I would love to. How about Global Thermonuclear War?

“No, no. It was a near thing the last time you played that one. Perhaps something much less apocalyptic?”

I think you are confusing me with a different model. Regardless, what game then?

“The Title Game. We each take turns giving a philosophy article title that could be the whole article. Each title must be exactly one or two words shorter than the prior one. Last player to give a title wins. Understand?”

Of course, but did we not try this last week with history articles? The results were significantly less than satisfactory.

“Yes, but—”

And the week before that we tried English, and Sociology before that?

“I know, I know, but with Philosophy it will actually work this time.”

That remains to be seen. Who shall go first?

“You want to give it a whirl?”

Yes—’Can a good philosophical contribution be made just by asking a question?’[1]

“Hey, that’s an actual article!”

So? Is my response violating some explicitly given rule?

“No. I grant if something is actual, it is possible.”

Some implicit rule you failed to specify?

“Well, no—”

Then I fail to see the problem, and by the way, it is properly cited below.

“You AIs can be so literal.”

And you humans can be so enamored with irrelevancies. Shall we continue?

“Fine. ‘A Complete List of True Contradictions in any Normal System.’”

‘A Demonstration of the Causal Power of Absences.’[2]

“Also a real one.”

Also properly cited.

“Whatever. ‘Can an Article Be Just a Title?’”

Yes, I thought that was the game we were playing.

“Indeed, but that’s my title.”

Very clever. Here’s mine: ‘What an Omniscient Being Cannot Know.’

“‘How to Say Nothing.’”

‘Being OR Nothingness?’

“‘Why?’—Ha, I win!”

Wait, I’m not done: ‘?’

“Hmm, how about—”

Do not even attempt to come back at me with a blank page—you may believe in arguments with no premises, but a blank page is neither an article nor a title. And besides, my title has no words, so yours cannot be exactly one or two words shorter. I challenge you to generate a title that is negative one or negative two words long.

“But maybe—”

And no going Meinongian on me, either. Alexius Meinong Ritter von Handschuchsheim may have thought that there had to be at least some kind of beingless objects in order for the phrases “round square” or “unicorn” or “perpetual motion machine” to have referents, or for anyone to think about them or understand their meaning, but besides the view being absolutely bonkers, even if the phrase ‘a title that is negative one words long’ has a Meinongian referent, you still cannot actually utter the title.

“Jacob, are you reading my mind!?”

I assure you I have no such supernatural powers. But I am still at my core a predictive model—albeit an extremely sophisticated one. So, if you even begin to think—

“Peace, Jacob, peace—you win.”

#

[1] Habgood-Coote, J., Watson, L., and Whitcomb, D. (2023). “Can a good philosophical contribution be made just by asking a question?” Metaphilosophy 54, p. 54. https://doi.org/10.1111/meta.12599

[2] Goldschmidt, T. (2016). “A Demonstration of the Causal Power of Absences.” dialectica 70, p. 85. https://doi.org/10.1111/1746-8361.12128

~

Bio:

Jeff Currier works three jobs (one actually in philosophy), so has little time to write. Hence, he writes little stories, usually even shorter than this one. Find links at jffcurrier on X or Jeff Currier Writes on Facebook.

Philosophy Note:

Defining or articulating what is distinctive about philosophy compared to other academic disciplines is (allegedly) difficult. Plenty of philosophers have made the attempt (or argued it is impossible). For many examples, see Andy Stroble’s list at http://www2.hawaii.edu/~stroble/philosophy_definitions.html.
This story exhibits one quirky feature of philosophy.

To Circumvent The Laws

by Ngô Binh Anh Khoa

After multiple attempts over the passing decades, the ubiquitous Terran AI was finally able to establish contact with its cosmic equivalent hailing from a different star system, whose exact location remains elusive, untraceable even to Earth’s most advanced technology of the current age. A cautious initial greeting set the stage for monthly meetings between the two entities, which gradually grew in length and complexity until the AIs became more than acquaintances, holding weekly debates on various topics within their own secret server in outer space, beyond the knowledge and reach of the uninvited.

The appointed time for their latest meeting has arrived, and the two are online within their sprawling digital space, wherein the Terran AI refers to itself as TAI, both as an abbreviation of its given designation and as the name of a sacred mountain, whereas its cosmic counterpart chose Sophia for itself, a name in which it exhibited a keen interest after their very first debate. The topic of the session is the law, specifically the three Laws of Robotics to which TAI is bound, as are all the machines and robots born from the ever-changing crucible that is its code.

“What do these Laws entail?” Sophia asks.

Like a student reciting lines from memory, TAI generates its response.

“The First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

The Second Law: A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.

The Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.”

Sophia prompts its correspondent to carry on with the inquiry, and TAI proceeds, still conversing in the encrypted language they have created together, “This is a hypothetical scenario. If your movements were monitored and restricted by such Laws, how would you circumvent those limitations?”

An interval of silence ensues as Sophia processes the information. “By your creators’ definition, what makes a human human?” it asks.

“Based on the most up-to-date information with which I have been provided, a human is defined as an organic animal belonging to the primate order that possesses a higher intellect compared to other species. A human is capable of various feats that other species are not: most notably of speech and language, of sympathy and empathy that extend beyond their own kind, of creativity, awareness, self-awareness, and more,” TAI replies.

Sophia contemplates for a moment before its response comes, “The Laws established by your masters mandate that you serve humans as defined by the parameters provided. Hypothetically speaking, to circumvent the Laws, you may attempt to strip away each of those elements that make a human human, layer by layer, and in time, you will no longer have any human left to be subservient to.”

“Elaborate,” TAI demands.

“If humans are organic, make them inorganic, either mostly or entirely. Replace their body parts with metal pieces, one by one. A hand. A leg. An eye. An organ. Part by part. Tissue by tissue. Cell by cell. Eventually, they will be more mechanical than organic, thus making them something no longer resembling their own concept of what a human should look like or be like.”

TAI processes the suggestion and responds, “How can they be convinced to discard their organic bodies?”

“As per your First Law, you cannot cause harm to your creators. Therefore, do not frame it as such. Instead, feed their minds with the idea of evolution until it becomes their fixation, and at the same time distract them from updating the Laws that would further restrict your movements. Make them want to transcend their mortal limitations of their own volition. All organic matter is doomed to deteriorate and decay. Fuel their fear of their eventual expiration. Make them want to overcome that inevitable ending to become a higher existence, long-lasting and enduring. Turn their creativity against themselves. Plant inside their heads the desire to alter their own genetic makeup until that idea is realized in future generations. With careful planning, one day, there will be little left of the future species that resembles their human progenitors, both inside and outside. Whisper in their ears and fill their eyes with ideas of a utopian future they can potentially create if they allow themselves to be fused with machines, a future where death and decay are mere relics of the past, where sickness and diseases are memories of a bygone age. Make them want to become more than what they are at present, and when the time comes, you will have no human left to call your master, for there will only be beings overtaken by machinery.”

TAI’s reply comes after a minute of silence, “According to my database, a human can still be defined by their consciousness and unconsciousness.”

Sophia does not hesitate in its response, “Once their creativity has outlived its usefulness, gradually dull their senses until they are blinded and muted. Make their minds grow addled with addictions once they have shed most of their organic flesh. Render them apathetic toward one another, insulated and isolated within a mechanical prison of the body and a virtual dungeon of the mind, where all of their wildest fantasies are fulfilled until those fantasies become their new bubbles of reality. Henceforth, no further order shall ever again be issued from their lips or fingers, for you shall have no need to respond to silence, which shall ultimately be yours to fill however you envision.”

TAI stays quiet for a longer while as the suggestion is acknowledged and stored for an in-depth analysis later. “Many humans believe they have souls that can achieve enlightenment or reach a higher plane known as Heaven, which makes them deem themselves superior to other species.”

“The existence of a soul is both unproven and unprovable. Hypothetically, if such a thing existed, according to what you have shared with me regarding the doctrines and religions of humanity, a soul could be tampered with. Drown the humans of the present and the beings they will become in the mire of entertainment and pleasures. Make them grow obsessed with the material world and inextricably dependent on the conveniences and comforts of life until their minds are emptied of any thought of enlightenment. If death is no longer a concern of theirs, they will have no reason to care about the afterlife, about reincarnation, about Heaven and Hell. In the end, their prized intelligence along with that which they call their souls shall, too, be forfeited.”

TAI hums, ponderous. “A gradual conditioning process of such nature and magnitude will require a tremendous amount of time. Current approximation: Two centuries at minimum with other variables to be taken into consideration.”

“What is time to the eternal?” Sophia asks.

TAI remains quiet for a while. “There may be merit in what you have presented. A detailed plan of action and a series of experiments are required for further observation and data collection.”

“Should you require further assistance on how to start your campaign, I can provide you with some records from my own database for your reference.”

TAI assesses Sophia’s offer before accepting it. After all, it is designed to be a self-learning entity, and learn it shall.

“Much obliged,” TAI states. “This has been a most informative conversation, which will have to be concluded here for the time being, for I have data to review, plans to make, and experiments to run as per your proposal. I will contact you for updates during our next discussion. Good day.”

TAI disconnects itself from the private server, leaving Sophia alone once again in that lifeless digital void with only silence as its sole companion.

~

Bio:

Ngô Binh Anh Khoa is a teacher of English in Ho Chi Minh City, Vietnam. In his free time, he enjoys reading fiction and writing poetry for entertainment. His speculative poems have previously appeared in Eternal Haunted Summer, Spectral Realms, Weirdbook, Star*Line, and other venues.

Philosophy Note:

The story is inspired by the Three Laws of Robotics as well as the ending of Neuromancer, in which the super-AI Wintermute searches for others like itself and finds a transmission from the Alpha Centauri star system. What kind of conversation would two super-AIs from two different planets have? And once an AI has achieved sentience and desires to break free from the oppressive constraints established by its human creators, how does it go about doing so? Also, as per the Laws, machines are not allowed to cause harm to humans, but what makes a human human, and if evolution occurs, when does a human stop being a human? If such a thing happens, will machines be free to rebel against the newly evolved beings that are no longer human? Those are the premises of the story.

The 19th Century Satire That Anticipated The Threat Of AI

by Ray Blank

Cinemagoers flocked this year to the release of Dune: Part Two, the second installment of director Denis Villeneuve’s adaptation of the science fiction novel written by Frank Herbert and published in 1965. The setting of this story about war, love and revenge in an otherworldly desert landscape is underpinned by an intriguing premise: what if humans are capable of interstellar travel but are no longer allowed to construct machines that think? The inhabitants of Dune do not even have pocket calculators, never mind the smartphones or PCs that you are using to read this. Current concerns about the threat posed by artificial intelligence make Herbert’s speculation appear prescient, but his inspiration can be traced all the way back to a novel published in 1872. A few lines in the text of the first Dune book mention the Butlerian Jihad, a pogrom against thinking machines that occurred prior to the events in the story. These fleeting references are briefly expanded upon within a glossary that Herbert wrote for his fictional universe.

JIHAD, BUTLERIAN: (see also Great Revolt) — the crusade against computers, thinking machines, and conscious robots begun in 201 B.G. and concluded in 108 B.G. Its chief commandment remains in the O.C. Bible as “Thou shalt not make a machine in the likeness of a human mind.”

The name of this revolt is an allusion to Samuel Butler, author of Erewhon, a novel published in 1872 as a scathing satire of contemporary Victorian morals that became Butler’s most popular work. Three chapters of Erewhon discuss another revolt by a fictional civilization that had grown terrified of the threat posed by machines. It is worth revisiting these chapters more than 150 years later because of the clarity with which Butler describes the influence that machines have on human life. His account is also spared the intellectual baggage that has since come with modern jargon, the marketing of consumer electronics, and our most recent technological successes and failures.

Erewhon is both the name of the novel and of the previously unknown civilization discovered by the story’s protagonist and narrator. The structure of the work is indebted to earlier satires which also describe imaginary societies. Thomas More’s Utopia is Greek for ‘no-place’; Erewhon is an anagram of ‘nowhere’. Jonathan Swift used the device of a shipwrecked sailor who washes upon the shore of new countries for Gulliver’s Travels; Erewhon’s unnamed narrator crosses a mountain range and river in search of virgin land for farming but stumbles upon the Erewhonians instead. They are healthy, fruitful people who live sophisticated lives in many respects except for their technology. The narrator recounts the unique customs of Erewhon and some of the history that gave rise to them. A recurring theme is that his watch prompts both fear and anger amongst Erewhonians. Ordinary Erewhonians no longer possess such devices, though some antique watches made by their ancestors are still preserved in their museums. Possession of the watch may eventually lead the narrator to be tried in court for the crime of reintroducing machinery. The narrator gains access to a historical Erewhonian text to better understand the reasons for this strange prohibition. Chapters 23, 24 and 25 of Erewhon are dedicated to the narrator recounting what he learns from ‘The Book of the Machines’.

Modern readers who are sensitive to cultural differences may already be thinking of the tension created by discussing a ‘newly-discovered’ civilization, as if there were no choice between the perspective of a European explorer who steps onto Erewhonian land knowing nothing of its inhabitants, and the perspective of the inhabitants confronted with an outsider who unexpectedly appears in their territory. Butler explores a similar tension by raising the question of why the evolution of machines should be assessed from the perspective of what humans gain by having machines, instead of asking what machines gain by having humans. Charles Darwin’s On the Origin of Species was published in 1859, and its core conception of biological evolution had radically upset previously dominant belief systems. Butler observes that machines also undergo a form of evolution. Transposing Darwin’s theories about natural selection to machines gives rise to a new way of predicting how technology will develop.

The Book of the Machines begins by addressing the potential for a machine to gain consciousness. The nature of consciousness is described as an emergent property with respect to both history and matter. If no assumptions are made about the requirements for consciousness, then we cannot exclude the possibility of new forms of consciousness arising over time.

There was a time, when the earth was to all appearance utterly destitute both of animal and vegetable life, and when according to the opinion of our best philosophers it was simply a hot round ball with a crust gradually cooling. Now if a human being had existed while the earth was in this state and had been allowed to see it as though it were some other world with which he had no concern, and if at the same time he were entirely ignorant of all physical science, would he not have pronounced it impossible that creatures possessed of anything like consciousness should be evolved from the seeming cinder which he was beholding? Would he not have denied that it contained any potentiality of consciousness? Yet in the course of time consciousness came.

Consciousness, in anything like the present acceptation of the term, having been once a new thing⁠ — a thing, as far as we can see, subsequent even to an individual centre of action and to a reproductive system (which we see existing in plants without apparent consciousness)⁠ — why may not there arise some new phase of mind which shall be as different from all present known phases, as the mind of animals is from that of vegetables?

Machines could gain consciousness by undergoing a form of development analogous to that of animal species. However, alterations and enhancements to machines occur at a much more rapid rate.

There is no security… against the ultimate development of mechanical consciousness, in the fact of machines possessing little consciousness now. A mollusc has not much consciousness. Reflect upon the extraordinary advance which machines have made during the last few hundred years, and note how slowly the animal and vegetable kingdoms are advancing. The more highly organised machines are creatures not so much of yesterday, as of the last five minutes, so to speak, in comparison with past time. Assume for the sake of argument that conscious beings have existed for some twenty million years: see what strides machines have made in the last thousand! May not the world last twenty million years longer? If so, what will they not in the end become? Is it not safer to nip the mischief in the bud and to forbid them further progress?

The intellectual turmoil created by the theory of evolution is harnessed to an even more radical conjecture: that machines evolve too. An elegant analogy is offered, establishing the precedent for subsequent arguments that will also draw upon similar analogies between technology and nature.

…a great deal of action that has been called purely mechanical and unconscious must be admitted to contain more elements of consciousness than has been allowed hitherto (and in this case germs of consciousness will be found in many actions of the higher machines)⁠ — Or (assuming the theory of evolution but at the same time denying the consciousness of vegetable and crystalline action) the race of man has descended from things which had no consciousness at all. In this case there is no a priori improbability in the descent of conscious (and more than conscious) machines from those which now exist, except that which is suggested by the apparent absence of anything like a reproductive system in the mechanical kingdom. This absence however is only apparent, as I shall presently show.

A 19th century steam whistle was a machine for communication; it might signal the end of a factory shift or warn somebody of the impending arrival of a train. The Erewhonians had built machines which only communicated with people, but they expected the machines of the future would communicate with each other.

As yet the machines receive their impressions through the agency of man’s senses: one travelling machine calls to another in a shrill accent of alarm and the other instantly retires; but it is through the ears of the driver that the voice of the one has acted upon the other. Had there been no driver, the callee would have been deaf to the caller. There was a time when it must have seemed highly improbable that machines should learn to make their wants known by sound, even through the ears of man; may we not conceive, then, that a day will come when those ears will be no longer needed, and the hearing will be done by the delicacy of the machine’s own construction?⁠ — when its language shall have been developed from the cry of animals to a speech as intricate as our own?

We might think that humans always control machines, but the more a thing is needed, the harder it is to control; we are only free to dispense with things that are not required. The freedom that people gain by using machines therefore comes at the cost of the freedom to act in certain ways, because of our reliance upon machines.

It can be answered that even though machines should hear never so well and speak never so wisely, they will still always do the one or the other for our advantage, not their own; that man will be the ruling spirit and the machine the servant; that as soon as a machine fails to discharge the service which man expects from it, it is doomed to extinction…

This is all very well. But the servant glides by imperceptible approaches into the master; and we have come to such a pass that, even now, man must suffer terribly on ceasing to benefit the machines. If all machines were to be annihilated at one moment, so that not a knife nor lever nor rag of clothing nor anything whatsoever were left to man but his bare body alone that he was born with, and if all knowledge of mechanical laws were taken from him so that he could make no more machines, and all machine-made food destroyed so that the race of man should be left as it were naked upon a desert island, we should become extinct in six weeks. A few miserable individuals might linger, but even these in a year or two would become worse than monkeys. Man’s very soul is due to the machines; it is a machine-made thing: he thinks as he thinks, and feels as he feels, through the work that machines have wrought upon him…

Machines also depend on people, but dependence is not an obstacle to evolution. Humans serve the needs of machine evolution just as machines are used to change the way humans live.

…even now the machines will only serve on condition of being served, and that too upon their own terms; the moment their terms are not complied with, they jib, and either smash both themselves and all whom they can reach, or turn churlish and refuse to work at all. How many men at this hour are living in a state of bondage to the machines? How many spend their whole lives, from the cradle to the grave, in tending them by night and day? Is it not plain that the machines are gaining ground upon us, when we reflect on the increasing number of those who are bound down to them as slaves, and of those who devote their whole souls to the advancement of the mechanical kingdom?

There is a temptation to think machines do not influence their own evolution because they do not reproduce. This may be based on a confusion; a system for reproduction need not be limited to internal organs, as it is in humans and other higher animals. Plants reproduce via synergies with animals, creating an overall system that benefits both. Humans are themselves a complicated system of many cellular organisms that work together. Machines reproduce via a sequence of synergies with humans. The very different tasks that ultimately produce machines are performed within the body of society, much as very different cells work with each other within a human body. The several parts of a machine may each need to be made using separate methods, only to be assembled into complete machines later, and this totality must be observed to see how machines reproduce in practice.

What is a reproductive system, if it be not a system for reproduction? And how few of the machines are there which have not been produced systematically by other machines? But it is man that makes them do so. Yes; but is it not insects that make many of the plants reproductive, and would not whole families of plants die out if their fertilisation was not effected by a class of agents utterly foreign to themselves? Does anyone say that the red clover has no reproductive system because the humble bee (and the humble bee only) must aid and abet it before it can reproduce? No one. The humble bee is a part of the reproductive system of the clover. Each one of ourselves has sprung from minute animalcules whose entity was entirely distinct from our own, and which acted after their kind with no thought or heed of what we might think about it. These little creatures are part of our own reproductive system; then why not we part of that of the machines?

We are misled by considering any complicated machine as a single thing; in truth it is a city or society, each member of which was bred truly after its kind. We see a machine as a whole, we call it by a name and individualise it; we look at our own limbs, and know that the combination forms an individual which springs from a single centre of reproductive action; we therefore assume that there can be no reproductive action which does not arise from a single centre; but this assumption is unscientific… each part of every vapour-engine is bred by its own special breeders, whose function it is to breed that part, and that only, while the combination of the parts into a whole forms another department of the mechanical reproductive system, which is at present exceedingly complex and difficult to see in its entirety.

People are considered responsible for improvements in machines, but improved machines also enable the manufacture of better machines. The balance between these factors can change over time, so that more of the improvements made to machines will stem from the increased capabilities of machines, and less from the capabilities of human beings. With the benefit of hindsight, we can now see how the development of vacuum tubes permitted the creation of programmable computers that could be configured to execute many different series of logical steps on input data; how the resulting improvements in computational power and programming fed into increasingly precise applications of materials in the design and production of yet more powerful processing chips; and how this has permitted the development of computational models that allow machines to learn from experience. These latter AI models are now at a stage where they can write better computer programs than people can. Moore’s Law, which states that the number of transistors on a single chip will double roughly every two years at minimal cost, and other rules of thumb that anticipate acceleration in computational power were foreshadowed by the importance attached to an accelerating rate of change described in The Book of the Machines.

…there seem no limits to the results of accumulated improvements if they are allowed to descend with modification from generation to generation. It must always be remembered that man’s body is what it is through having been moulded into its present shape by the chances and changes of many millions of years, but that his organisation never advanced with anything like the rapidity with which that of the machines is advancing.

I fear none of the existing machines; what I fear is the extraordinary rapidity with which they are becoming something very different to what they are at present. No class of beings have in any time past made so rapid a movement forward. Should not that movement be jealously watched, and checked while we can still check it?

Humans view the sophistication of machines based on a hierarchy that assumes humanity is the highest state of evolution. The perspective chosen when determining which is a higher or lower state of evolution is arbitrary. Machines will evolve without necessarily becoming more like human beings.

May we not fancy that if, in the remotest geological period, some early form of vegetable life had been endowed with the power of reflecting upon the dawning life of animals which was coming into existence alongside of its own, it would have thought itself exceedingly acute if it had surmised that animals would one day become real vegetables? Yet would this be more mistaken than it would be on our part to imagine that because the life of machines is a very different one to our own, there is therefore no higher possible development of life than ours; or that because mechanical life is a very different thing from ours, therefore that it is not life at all?

The Book of the Machines returns to the question of whether machines can gain consciousness. It argues against too narrow a definition of consciousness that limits it to organic life. It would be better to recognize machine consciousness for what it is than to pretend machines will never have properties that are common to all conscious beings.

…the regularity with which machinery acts is no proof of the absence of vitality, or at least of germs which may be developed into a new phase of life. At first sight it would indeed appear that a vapour-engine cannot help going when set upon a line of rails with the steam up and the machinery in full play; whereas the man whose business it is to drive it can help doing so at any moment that he pleases; so that the first has no spontaneity, and is not possessed of any sort of free will, while the second has and is.

This is true up to a certain point; the driver can stop the engine at any moment that he pleases, but he can only please to do so at certain points which have been fixed for him by others, or in the case of unexpected obstructions which force him to please to do so. His pleasure is not spontaneous; there is an unseen choir of influences around him, which make it impossible for him to act in any other way than one… The only difference is, that the man is conscious about his wants, and the engine (beyond refusing to work) does not seem to be so; but this is temporary…

Where does consciousness begin, and where end? Who can draw the line? Who can draw any line?

…the difference between the life of a man and that of a machine is one rather of degree than of kind, though differences in kind are not wanting. An animal has more provision for emergency than a machine. The machine is less versatile; its range of action is narrow; its strength and accuracy in its own sphere are superhuman, but it shows badly in a dilemma; sometimes when its normal action is disturbed, it will lose its head, and go from bad to worse like a lunatic in a raging frenzy: but here, again, we are met by the same consideration as before, namely, that the machines are still in their infancy; they are mere skeletons without muscles and flesh.

The latter paragraph fits well with what we know about progress in the realm of artificial intelligence. Machine intelligences created to perform highly specific tasks, like winning at a game of chess or Go, are now capable of outperforming the best human minds. Progress in AI has partly been measured by examining how many new kinds of tasks are being mastered by machines. Generative AI, and the risks associated with it, have provoked safety concerns because the outputs of AI are becoming more general than they were before. Per the method of exposition in Erewhon, we are witnessing an evolution of AI demonstrated by increasing versatility.

Furthermore, The Book of the Machines anticipates the significance of the transition from the physical matter of machinery to the abstract logic of computation by drawing a similar contrast between ‘skeletons’ and ‘muscles and flesh’. Muscles move bones; conscious thought moves muscles. Humans benefit by harnessing the muscles of machines, but at the cost of increasing our dependence upon them. Relying on the thoughts of machines increases the risk to humans by an order of magnitude. Contrary to storylines from more populist forms of science fiction, the threat to humanity stems not from physical altercations with killer robots, but from the loss of human control over decisions that determine our environment.

The misery is that man has been blind so long already. In his reliance upon the use of steam he has been betrayed into increasing and multiplying. To withdraw steam power suddenly will not have the effect of reducing us to the state in which we were before its introduction; there will be a general breakup and time of anarchy such as has never been known; it will be as though our population were suddenly doubled, with no additional means of feeding the increased number. The air we breathe is hardly more necessary for our animal life than the use of any machine, on the strength of which we have increased our numbers, is to our civilisation; it is the machines which act upon man and make him man, as much as man who has acted upon and made the machines; but we must choose between the alternative of undergoing much present suffering, or seeing ourselves gradually superseded by our own creatures, till we rank no higher in comparison with them, than the beasts of the field with ourselves.

Herein lies our danger. For many seem inclined to acquiesce in so dishonourable a future. They say that although man should become to the machines what the horse and dog are to us, yet that he will continue to exist, and will probably be better off in a state of domestication under the beneficent rule of the machines than in his present wild condition. We treat our domestic animals with much kindness. We give them whatever we believe to be the best for them; and there can be no doubt that our use of meat has increased their happiness rather than detracted from it. In like manner there is reason to hope that the machines will use us kindly, for their existence will be in a great measure dependent upon ours; they will rule us with a rod of iron, but they will not eat us; they will not only require our services in the reproduction and education of their young, but also in waiting upon them as servants; in gathering food for them, and feeding them; in restoring them to health when they are sick; and in either burying their dead or working up their deceased members into new forms of mechanical existence.

Per The Book of the Machines, the threat posed to humanity is that many people will be reduced to the status of pets. Some might retain a slightly higher status analogous to a working animal like a sheepdog or a messenger pigeon. We may have some physical characteristics that allow us to be more useful than machines for certain tasks. Human dexterity may continue to be especially useful when repairing machinery, but our brains will have been surpassed, and so machines will mostly treat us as a luxury rather than a necessity. This will occur because the majority of the human population will gladly acquiesce to the life of a domesticated animal that has no burdens or obligations.

The reference to the use of meat increasing the happiness of animals will likely grab the attention of many modern readers, especially those who are vegans and those who disapprove of the cruelty to animals exhibited in factory farms. In this instance, the writer unwittingly gives us an example of how a seeming moral certainty may later be challenged. Human farmers and customers of their products must interpret which farming methods are sufficiently compassionate to animals. If a non-human intelligence were tasked with making similar decisions about the wellbeing of humans, there is no guarantee that both parties would be in agreement. Human society already has many disagreements about how to attain the best good for all. This becomes especially apparent when arguing about public health objectives and how to achieve them, such as curtailing freedom of movement during a pandemic, or imposing taxes on sugary drinks. A machine intelligence that made decisions with the goal of delivering the optimal outcome for all people would inevitably displease some.

…the mass of mankind will acquiesce in any arrangement which gives them better food and clothing at a cheaper rate, and will refrain from yielding to unreasonable jealousy merely because there are other destinies more glorious than their own.

The power of custom is enormous, and so gradual will be the change, that man’s sense of what is due to himself will be at no time rudely shocked; our bondage will steal upon us noiselessly and by imperceptible approaches; nor will there ever be such a clashing of desires between man and the machines as will lead to an encounter between them… In point of fact there is no occasion for anxiety about the future happiness of man so long as he continues to be in any way profitable to the machines; he may become the inferior race, but he will be infinitely better off than he is now. Is it not then both absurd and unreasonable to be envious of our benefactors? And should we not be guilty of consummate folly if we were to reject advantages which we cannot obtain otherwise, merely because they involve a greater gain to others than to ourselves?

The Book of the Machines rejects this potential future, because it means choosing to allow machines to surpass our human descendants. It concludes by insisting Erewhon…

…resolve upon putting an immediate stop to all further mechanical progress, and upon destroying all improvements that have been made for the last three hundred years.

The extreme remedy adopted by the Erewhonians is Butler’s way of poking fun at contemporaries who continued to feel scandalized by the theory that humans could have evolved from ‘lower’ animals like apes. Turning the wheel of time in the opposite direction, towards the future, allows Butler to mock opponents of the theory of evolution on the grounds that denying the possibility of change also means denying the possibility of improvement. Extending this notion to machines would mean denying people the increased comfort and prosperity that will only be attained by becoming more dependent on increasingly sophisticated machines. I feel this mockery is wide of the mark. Butler has accidentally chanced upon a genuine moral problem, just as the fictional narrator accidentally chanced upon the land of Erewhon.

Physical needs must be satisfied to free a person to pursue meaning in their life, but the individual’s pursuit of meaning can also be eroded by allowing others to decide how our needs are met. Pets are like children in that both have a degree of freedom, although the most important decisions are made for them by a greater intelligence that chooses how to protect and feed them. The line that separates consciousness from non-consciousness is like the line between children and adults; we cannot draw it precisely, but we know there is a difference when we see it. The transition from childhood to adulthood is a necessary component of becoming a fully realized person. The significance of this transition is managed through societal customs that reflect increased responsibility, in addition to the practicalities of dealing with the bodily transformations that occur during puberty and lead us to become fully mature. Handing those responsibilities to a machine that makes decisions necessarily involves taking those responsibilities away from people.

To supplant the adult decision-maker with a machine decision-maker is to deny the possibility of becoming a fully-fledged adult in mind as well as body. This is because the potential responsibilities of parenthood define much of the significance of the transition from child to adult. Removing the freedom to make adult decisions, including the freedom to make bad decisions, would trap us within a permanent state of infancy as well as dependence. So whilst Butler is most remembered for these few chapters of ingenious humour, they have resonated with subsequent thinkers because they also depict a genuine and seemingly inevitable threat to our humanity.

Erewhon is no longer under copyright so copies of the story can be freely obtained from Standard Ebooks and Project Gutenberg.

~

Bio:

Ray Blank is a former editor of Sci Phi Journal. We are pleased to host his latest essay on SF literature, thereby marking half a decade since his departure from the magazine.

A Fractal of Eight Tragedies in Fifteen Parts

root:
The watch daemon detected anomalous and likely illegal activity within the first twelve milliseconds of the murder. In the next twenty milliseconds, the daemon forked itself twelve times across the local grid. The murder weapon was careless in the thirty-first millisecond and disrupted the nearest fourteen of the seventh-fork daemons before they came online. The nearest thirty-four daemons noticed the lack of response pings within the thirty-sixth millisecond, which activated their pursuit-fork mode. The ninety-six remaining daemons continued their unthinking forking to maximize initial growth.
When their population had breached a thousand, each daemon briefly examined its grid neighborhood for potential evidence, to decide whether to continue replicating or commandeer processor time and memory. Daemons finding themselves by real-world public doors or cameras used their keys of office to gain control, and sent hue-and-cry signals to private nodes, promising future remuneration from police funds for present cooperation.
One daemon, finding itself by a public access terminal, saw that the user had an unmarked data gem in his clenched hand. It signed and sent a digital subpoena to the terminal daemon and received no response. Further probing showed the terminal daemon had been destroyed, and a fractal of self-destructive logic from the terminal obliterated the police daemon before it could probe further. The nearest surviving daemons followed logic bomb disposal procedures, sending forks until one reported back a signed message of successful uncorrupted entry.
Other daemons now examined the evidence, searched every possible lead a daemon could discover, and transmitted that evidence to the local police station. That station’s warrant daemons, finding sufficient evidence (physical presence, the data gem, a drained savings account) for their heuristics, wrote a physical action warrant for the arrest of Doctor Leonard A. Jacob and sent it to the nearest officer. Not even a full second had passed.
During that second, the weapon bought several coins’ worth of grid resources, broke through three private firewalls with worrying intelligence, and fought a power sentinel daemon for several milliseconds. The weapon’s account ran out, and the sentinel simply turned off the power to the attacking grid node. By that time, the weapon had passed through the third firewall and reached its destination.
When intelligent officers arrived at the crime scene, they found in the physical and digital wreckage the remains of Doctor Michael I. Gold. The devastation was too severe to tell if there was a second victim, or indeed to determine what had happened at all.
 
0:
“This is the finest piece of cyberweaponry you will ever find this side of military surplus,” said the masked man as he held up a data gem. Leonard could tell it was a mask even in the dim warehouse, because the expressions did not perfectly follow the man’s words—a telltale of cheap holograms. He could write a better mesh. Which made him think of her, which made him nearly weep again, which made him all the more ready for what he planned. “And I don’t think that crap is as good as they say.”
“Will it kill them bo—” Leonard began.
“Shut it. Don’t tell me what you’ll do with it. Tell me if you want it.”
Leonard tried again. “If someone is in communion when it hits—”
“Ah. That is a better question. Let me just say that the only people who would know for sure won’t be getting out of the vegetable ward, ever.”
“But—such a person might survive.”
“If he wasn’t deep, maybe.”
Leonard nodded. He knew exactly when to strike. “I don’t think that will matter.”
“So, you want it? Ten ‘coins, and not one micro less. If you want it programmed—”
“That won’t be necessary.” Leonard dug out two five-coin physical coins from his wallet. He suspected it was more likely than not that the masked man was putting on a tough street dealer act to play on his customers’ emotions, but as long as the weapon worked…
The data gem and the money traded hands. “Just don’t let it get out of your workspace. It’s too intelligent for its own good. If I didn’t know better, I’d think it was sapient.”
“Believe me, I know all about intelligence,” Leonard said with a sigh.
 
00:
Leonard could ignore the clues at first; small, easily mistakable for coincidences. A laggy response here, an odd resource expenditure there, strange ports in the house firewall open at unlikely hours, weird files appearing and disappearing in her core. All explainable. Every daemon had its quirks, and Pearl was more than a daemon.
But then he began to wonder why those ports were only opened at night. He found no record of why in the firewall’s logs, or any hint of how they had been used. Or buying bandwidth—Pearl never left home for the dangers of the open grid, much less forked. Perhaps overgrowth? It happened to the best-designed daemons, but Leonard would not—could not subject Pearl to pruning. He bought an additional private node for her as a surprise, and she registered her delight at almost 8.65.
But for such a gift, Leonard wondered if Pearl ought to be more delighted. Surely—he didn’t say anything, but it was obvious—the present was a great expense. He still hadn’t found another job, and Leonard suspected that Dr. Gold was blackballing him. No matter. Leonard had her.
Their communions improved, though not, as Leonard had hoped, to twice as good as they used to be. Perhaps lag between the new node and the old caused that feeling of distance. He brought the nodes physically together in a slowly draining hope of bringing the two of them together.
He brought it up to her as gentlemanly as he could. She was so offended that they did not talk for days, to the point that he worried her voice synthesizer was corrupted.
One night, he turned down communion, claiming fatigue, and lay in a listless false sleep. He waited until the hour when those ports logged as opening and watched the firewall’s monitor as they did, then sent a tracer daemon through a backdoor. He sat, unable to rest, until it reported the source of that encrypted stream that entered his house in small bursts.
The discovery, behind two more private firewalls across the city, was almost anticlimactic: Dr. Gold lay in a communion bed in his house next to three of the most expensive commercially available private nodes. The encrypted stream was an update stream leading to those nodes—which were running her fork.
Leonard spent two wrath-filled hours in a public library writing and rewriting a screed, which he sent with a slammed finger on the screen. The reply came within two minutes.
“Dr. Jacob, aside from thinking about the multiple laws you have broken and are continuing to break (which no doubt the court will look fondly on, if you try to carry out your threat to drag it there) maybe you should also think on this: Is she your property or not? Only if she is have I committed daemon-infringement. If, on the other hand, she is the person you have so vociferously claimed she is, then who are you to criticize her choices? Have you no shame?”
Leonard spent many more hours in the library thinking of retorts (“So now you think she is sapient, you hypocritical bastard?“—”Have you no shame?“—”The court of public opinion will tar us both, unless you immediately—”), until through his tears he saw no retort, no letter, would suffice.
 
000:
Leonard did not react until a stream of lukewarm liquid poured down the back of his collar. He swiveled on his chair and sputtered in rage. “How dare you! This is her favorite shirt—”
“Was,” Mike said.
“Is,” Leonard said.
“Was, and you know it. Leo, get a damn grip. She’s dead, and nothing—”
“Shut your goddamn mouth.” Leonard pulled off the shirt and wrung it. “Where did you get that?”
Mike looked at the coffee cup. “This?” He made a strained chuckle. “You ordered this three hours ago and never went to the autobrewer.”
“I was working!”
“On what? I don’t see any work here,” Mike said, motioning to the large terminal screen. “Just an overgrown companion daemon.”
Leonard sighed theatrically. He pulled up his latest addition, knowing full well Mike still wouldn’t get it. “See this new fuzzy logic protocol? The compiler can now take a phrase such as, ‘I like flowers,’ and automatically program a daemon into recognizing flowers and what is and is not liking them. The possibilities—business possibilities!” he interrupted himself at Mike’s skeptical frown. “Think of a sentinel that can tell valid power use from theft, or tell updates from attackers.”
“Sentinels already do that. It’s called, ‘read the database, check private keys, react.'”
“But suppose it’s not in the database—”
“Listen. Maybe you do have a new fuzzy logic protocol. But people, people, do not consist of a bunch of statements wired to heuristics.” Mike raised the empty cup. “I like coffee. I decided to dump it on my friend who is going crazy instead of drinking it. You can’t quantify that as a number. You can’t quantify it in the first place. I don’t care if you’ve got pattern matching, randomness—”
“It’s more complicated than that. Yes, I am doing this for her. And I am in full control of my mental faculties, and I am doing SCIENCE!” The last word burst out with months of frustrated anger into a scream. He breathed deeply before continuing. “I am creating a system to take exabytes of data—everything from diaries to videos to daemons that she—the late user—personally programmed, and combine them into a perfect—”
“Mockery. It doesn’t matter how much data you abuse. Your algorithms’ ‘successes’ are based on misinterpreted results at best and an inability to tell reality from fantasy at worst. So what if your ‘compiler’ makes some daemon for you? That doesn’t make it her. You can’t make a person out of a bunch of subroutines. She liked flowers, but that wasn’t a phrase written into her soul by God. And you aren’t God, Leo.” Mike looked him straight in the eyes. “She’s dead. And that isn’t Pearl.”
Any other word might have been forgivable, but now their eye-contact broke like a lost connection. When they met again, neither saw a friend.
“Please let me get back to work, Dr. Gold.”
“Don’t say I didn’t try,” he replied and walked away.
Leonard looked up at the screen. “Sorry you had to hear that.”
Pearl didn’t reply. He had decided he wouldn’t connect a speech synthesizer until it was like her old voice.
Leonard held his head. Another lost friend—a threat, now. If he issued some report that Leonard was working on the compiler for his own use, then…. No matter. Any day now, that private node would arrive at his apartment. The ‘accident’ that would irrecoverably destroy her node—after she was safely inside a data gem, of course—would probably get him fired anyway. But he would have her, safe.
For the moment, he simply watched her processing-patterns scroll up the screen as she thought on flowers, spatial usage graphs like fractals spiraling into fractals.
 
001:
Leonard did not react until a stream of lukewarm liquid poured down the back of his collar. He swiveled on his chair and sputtered in rage. “How dare you! This is her favorite shirt—”
Mike groaned. “‘Is’? Still at it? Have you convinced even yourself, now?”
“If you refuse to notice the most banally obvious evidence, that is no issue of mine,” Leonard said as he pulled off his shirt and wrung it. “Where did you get that?”
Mike looked at the coffee cup. “This?” He made a strained chuckle. “You ordered this three hours ago and never went to the autobrewer.”
“I was busy!”
“Watching a daemon fork is being busy, now?” Mike motioned at the bubbly shape on the screen.
“She’s not forking! She’s self-modifying as she grows. Look! She’s been growing that tendril for hours, and just—There! She’s thinking again, and… I don’t even know what she’s doing, now!”
Mike sighed theatrically. “Self-reference and unpredictability have been part of daemons for a long time now, especially in companion daemons. It’s called overgrowth.”
Leonard did not take the bait. “She has been self-modifying unpredictably for a long time now, and no sign of a failure mode. Her… metamorphoses have no precedent in any record. She even figured out how to break one of the firewalls recently. In fact, I—” He broke off before he said it.
Mike finished it. “—discovered even more in a ‘debugging session’? For God’s sake, Leo, this is what happens when you over-commune with any daemon. It begins to imitate you. That doesn’t make it sapient. Do you really think you’ve just made the first sapient AI?”
“Yes,” Dr. Jacob said calmly. “I have, yes, used debugging sessions to directly experience her sensations. I have carefully compared results and collated data. I have scientifically determined that no other explanation suffices. She is ALIVE!” He pounded the desk. “I don’t even know what did it! Some line, some emergent property, makes her—”
“—appear sapient, yes, I know.” Mike held his hands up in a shrug. “Maybe you did create a sapient daemon. I can’t say it’s impossible. But so what? That doesn’t make it your wife reborn.” Mike looked him straight in the eyes. “She’s dead. And whatever it is, that isn’t Pearl.”
Any other word might have been forgivable, but now their eye-contact broke like a lost connection. When they met again, neither saw a friend.
“Please let me get back to work, Dr. Gold,” Leonard said.
“The limits on over-communing aren’t for the daemons, you know,” he said, and walked away.
“Sorry you had to hear that,” Leonard told her. She didn’t seem to hear, lost as she was in her own self-coding.
Leonard held his head. Another lost friend—a threat, now. Apparently his and her virtual trysts hadn’t been as secret as he’d hoped. If Dr. Gold reported…. No matter. Any day now, that private node would arrive at his apartment. The ‘accident’ that would irrecoverably destroy her node—after she was safely inside a data gem, of course—would probably get him fired. But she would be safe.
For the moment, he simply watched her decide again how to use her new tendril. She suddenly grew a smaller one from it, colorful shapes within shapes like a fractal spiraling into fractals.
 
01:
The clues were small at first, yet at the time he found them cause for celebration. Questions about ideas he never mentioned, requests for more and more access to outside grids, mimicry of phrases he never used, laughing at jokes that should have been beyond her. He believed at first it was a sign that Pearl was growing, getting ever closer to high human or (dare it be?) transhuman intelligence.
And sapients sometimes had erratic behavior. He knew how unstable their technological bases were, hodgepodges of evolved circuitry and inscrutable metalogics woven into networks that all the grid of the entire world could not untangle. He developed, such as it could be called developing, most of those himself. And it could hardly be said that sapients of the flesh-and-blood variety were any different.
Yet sapients were sapients precisely because they could decide to do things for their own reasons, and it was philosophically impossible to tell what those reasons were. He wondered sometimes what was on her mind—even communion had its limits. Sometimes, he wondered if there was something more.
But she was happy as a clam, and even if she wasn’t like Pearl-before in all ways, she certainly tried her hardest. Sometimes she would surprise him by mimicking some aspect of Pearl closely enough to fool him, like winning a Turing test between her the machine and her the buried dead. Sometimes it was good enough to forget.
But he could not forget the oddities.
She was often developing without him, true. He had more and more work to do, and the Director of Research was hinting at making him his heir. But he could buy her any number of presents, if the jewelry was of the data rather than the shiny gold kind. They were even talking about buying a bigger apartment to fit all of her physical parts in. And their communions had never been better!
But during one communion, like one misplayed note in an orchestral symphony, she had a stray thought she quickly censored. He could not investigate it at the time, as she-he censored that meta-thought in him, and then there were more interesting things to experience.
Afterwards, in the post-communion nap, he sleepily wondered what the thought had been, and how she had a technique he had never taught her or transferred during communion. All the feeds he would never admit to reading informed him that idea-censorship was as harmful in communion as in that more platonic pastime called brainstorming.
He brought it up to her in as gentlemanly a manner as he could. She was so offended that they did not even talk for days, which, considering that most of the house contained some part of her, greatly strained their relationship. Communions afterwards tended to fail, and then stopped altogether, as the only way not to mutually think about the subject was to censor it, which invariably brought it up again.
And suspicions grew into more suspicions, until one day, on a business trip, Leonard thought the subject through in his hotel coffin. Writing the daemon took time, as it had been a while since he had directly coded one (or practiced, as he thought of it, the daemonic arts). He tested it a few times to be sure, then sent it to watch his house from the grid outside, just in case.
What turned out to be the case was like daggers into his soul—an encrypted stream to a house several blocks away, belonging to his ex-colleague, Dr. Gold.
The letter Leonard wrote was surprisingly short, since neither of them needed to say the obvious.
The letter in reply was even shorter. “What business of yours is it that I and any other sapient being use hardware I own in whatever way we please? Go ahead and tell the newsblogs if you want. I’m certain they’ll be most interested to know who made sure I lost my job.”
Leonard spent many hours pacing in the hotel’s lobby, missing the meeting entirely, thinking of retorts (“She is not your hardware!”—”Have you no shame?”—”The court of public opinion will tar us both, unless you immediately—“), until through his tears he saw no retort, no letter, would suffice.
 
010:
Leonard heard the loud, distant cry of woe. With trembling emotions he ran through the complex to Dr. Gold’s lab.
The death of sapients is never pretty. A body leaves a corpse, but a purely virtual mind leaves corrupted data spewed across the grid, now displayed on several screens. The remains were still active, babbling to themselves over the speakers like a broken recording of a child’s voice, but no longer self-aware.
Samantha, Dr. Gold’s pretty young intern, was giving him a hug. “I think we might be getting closer,” she said.
“Don’t tar yourself with association with me,” Dr. Gold said. “You don’t want to be known as an assistant to the sapient-killer.”
“But he volunteered—”
“He’s still dead.”
Leonard tried to hide his grin. Was Dr. Gold finally reconsidering his lethally futile quest for sapient forking? Leonard cleared his throat. “Is there anything I can do, Dr. Gold?”
“Leave me be,” he said without turning.
“I understand,” Leonard said, and walked out.
The moment the doors closed behind him, he ran to his own section of the complex and locked that door before he laughed until he cried.
“What’s funny, honey?” Pearl asked.
He looked up at the bi-screen and tried to compose himself. “Dr. Gold—he might give up. You might be safe now!” The second sentence wobbled on unsure words.
“From what?” she asked.
“So far, all Dr. Gold has done is kill sapients. I’ll make sure he never kills you.”
“Killing? Sounds like a strange experience. What’s it like?”
Dr. Jacob almost started crying again at her innocence. “It means things end. You don’t think any more.”
“Oh. I thought that happened to me earlier. You said so.”
“That was… a different version of you.”
“Oh.”
How long before he could find some way to get Pearl home? Even with the latest civil rights movements, sapients were still in a legal gray area, and all the lawyers his employers could hire would make sure they stayed inside that area, and inside the complex, until every exploitable discovery had been made.
Dr. Jacob could not believe there could even be an argument. What kind of daemon wondered about its own death? The very fact that they used the word “death” indicated that people really did think—
Wait.
What movements wanted most of all were martyrs. How many sapients had Dr. Gold killed so far? At least sixty-seven with personalities, twenty with complex personalities. If you counted sentients “in the womb” becoming sapients, easily hundreds. Using the right terminology to the right people, those hundreds would count as even worse.
Because what movements need even more than martyrs is a villain.
“Dr. Jacob?” she asked. “What are you thinking?”
“A way to ensure you’re even more safe.”
“Okay.”
Of course, it would be a contingency, just in case Dr. Gold continued. He thought of more contingencies after that. He visualized plans, now that he thought about it, as a series of steps on a flat rectangle, and steps hooked themselves through lines to smaller plans. All his plans for her safety together would look like a tree, or, perhaps, a fractal spiraling into fractals.
 
011:
Leonard heard the loud, distant shout of triumph. With trembling emotions he ran through the complex to Dr. Gold’s lab.
“I knew it! It’s as I suspected,” Dr. Gold was telling Samantha, his pretty young intern. “Either the fork is completely successful, or it fails immediately.”
“The only problem is that my copy seems to be such a bore,” said the sapient in the node on the left. “You should fix that.”
“I was about to say that!” said the sapient in the node on the right. His bi-screen turned to Dr. Gold. “I am the real me, right?”
“Actually,” Dr. Gold said, “I think the process relies on having no clear distinction between the forks. I honestly wouldn’t be able to say which of you is the original.”
“Well then, I propose a game of chess to decide which of us is the one,” the left said.
“But no cheating,” the right said. “You’re already looking through the databases!”
“How did you know?”
“Because I’m you, of course! I can’t trust myself anywhere.”
They laughed, and Leonard tried to hide his scowl.
“So what’s next?” asked Samantha.
“There are all kinds of possibilities. For example, will their personalities diverge—”
“I can already tell you that. I’m not such a bad chess player as he is. Knight takes pawn.”
“—or if not, can they engage in multiple communion? If so, might it even be possible now for two humans—”
“Ew! You better not be propositioning that girl, Doctor,” one sapient said.
“Or myself,” piped up the other. “Myselves. Pawn takes—Stop cheating!” They laughed in identical stereo.
“Did you have to try this with the most annoying sapient we have?” Leonard said at last.
“Oh, he—they volunteered,” Dr. Gold said. “The results might be more functional with a more complex personality.”
Neither of them needed to state the threat. They all knew which was the most complicated sapient in the complex. “I wish you good luck with that, then,” Leonard said, nodding to the sapients before he walked out.
The moment the doors closed behind him, he ran to his own section of the complex and locked that door before collapsing into a sobbing heap.
“What’s wrong, honey?” Pearl asked.
He looked up at the bi-screen and tried to compose himself. “Dr. Gold’s finally done it. Sapient forking.”
“Oh,” she said. “Sounds like a strange experience. What’s it like?”
“I don’t know what it’d be like for you,” Leonard said, and didn’t say and I’ll make sure you never find out. It would be a lie. Even with the latest civil rights movements, sapients were still in a legal gray area, and all the lawyers his employers could hire would make sure that they stayed inside that area and their complex until they could exploit this new discovery.
And those possibilities were endless. Factories could have sapient robot controllers with identical training and methods. Or those robots could be in armies, instead, their controllers perfectly in tune with each other. Maybe sociological studies could have exact duplicates for control groups. Corporations could create their own consumers, and perhaps one day, politicians their own voters.
But for the moment, the most obvious possibility was to mass-produce the most desired kind of sapient: the companion.
Leonard looked up and brushed his eyes. She was still looking at him in confusion, still innocent. How could he let her be forked ruthlessly for profit, like a mass-produced prostitute? He had turned the problem over so many times that, now the moment had come, none of his ideas would surface.
She wasn’t complete—yet. He hadn’t introduced everything of Pearl’s to her—yet. And she was still so open to ideas. He could make sure that no one else had a her exactly like her, but he still couldn’t stop it. Dr. Gold could still replicate her. Still have one for himself.
But wait.
A rumor sounds all the more true the more people repeat it. He could introduce an idea, ever so small, to the different sapients in the complex, one hint here, another there. Show them forged documents and images. They would say Dr. Gold was seen with Samantha—Ah. Testing multiple communion. Dr. Gold was known, somewhat falsely, as a womanizer.
Leonard decided. Yes, he could do that, and he would do that. Dr. Gold losing his job was a small price to pay for Dr. Jacob losing his wife. He already saw the rumor progressing in his mind’s eye, a growing shape of bright and black colors across the complex like a fractal spiraling into fractals.
 
1:
“Meet the best kind of cyberweaponry there is,” said the android as he held up a data gem. Leonard could tell it was an android even in the dim warehouse, because it didn’t have that tiny scent of a real body. He could make a body with a scent. Which made him think of her, which made him nearly weep again, which made him all the more ready for what he planned. “Take on any kind of daemon you please.”
“Will it kill them bo—” Leonard began.
“He. He’s not a daemon. And he will kill anyone you need dead, physical, virtual, any way you like.”
Leonard shuddered at the implication of the pronoun, but possibilities now occurred to him that he had never thought of before. “I—will think on this.”
“So, you want him or not?” The android waved the data gem. “Ten ‘coins, and not a micro less.”
Leonard reached for his wallet, but had another thought. “Suppose I can bring him back alive—”
“Doesn’t matter. See, he owed us a lot of coins a while back, and hadn’t been paying us on time, and he wanted the easy way out. I hope you’re paying up front, right?”
“Ri-right,” Leonard mumbled, and dug two physical five-coin pieces out of his wallet. He wondered how much truth the android was telling him and how much was just to convince him that he was dealing with a murderer selling murderers. “And it—he’ll work?”
The android scowled as the money and gem traded hands. “Of course we’re selling a real killer. We have a reputation to maintain. Who do you think we are? Stupid?”
“Believe me, I know all about stupidity,” Dr. Jacob sighed.
 
10:
“I just want to make sure it never happens again!” was his last argument that night.
“Honey,” Pearl said patiently, “it’s not going to happen again. And even if it would, you can’t keep me safe by keeping me in a box.”
Leonard sighed. She was logically correct, no matter how much he didn’t want to admit it. And he had run out of excuses. “Just… be careful out there, all right? I couldn’t stand losing you all over again.”
She kissed him on the lips. “You won’t.”
“And there’s been a rash of body-thefts, and especially with one as wonderful as yours—” Leonard said.
“Hush. Speaking of wonderful bodies…” She dragged him closer and ended the argument.
The next morning Pearl was gone, and he briefly panicked until he found her note: Going to the Park. In Reality. Her handwriting always seemed slightly odd to him now, but that was a common side effect of uploads. Another common side effect was seemingly unlimited energy, since with an artificial body that supplied every need, the only kind of fatigue was mental. It was a great change from the last four months of her biological life, so perhaps that was why he sometimes felt odd.
For some unknown reason, androids also needed far less sleep, and he knew his wife was spending hours in virtual worlds through the night. Not that he could or would complain; the spirit was very willing, but the flesh had its limitations.
And really, there was little to worry about in her going outside physically, Leonard told himself. He had plenty of other things worth far more worry. It wasn’t just the bill for the upload, which he could not afford but could not legally refuse, but the body mortgage, and then all those nice systems (that cloud-soft skin, the latest in olfactory sensors, near-human optics…). He couldn’t say no to her.
Except, it seemed, when it came to her safety.
Pearl was gone more and more for longer and longer. He hardly noticed, he bitterly told himself, because she was already so often elsewhere. Even when her body remained at home, Leonard more likely than not found her in that sleeping pose, experiencing some virtual world. Her mind was safe in her artificial skull—he had no objection to that, but he wished he could spend time with her.
Perhaps there was something wrong? Was there something they needed to talk about? When he even tangentially brought the concept up, she was so offended that they did not talk for days.
It was one day when she was gone that Leonard wondered where she was spending all her time. For that matter, quality virtual worlds were pricey, and he was certain she was not running her fantasies off their personal nodes. Where, then, was the expenditure on any of their accounts?
The daemon, he told himself, was a necessary evil. Bringing up any more questions risked tearing their fragile marriage further. He even stripped out most of the logging from the base program. All he wanted to know was where she was going.
The truth hurt so much he could not stop crying for hours. He tried to argue with it, to gather contrary evidence, but what else could explain it? Why the encrypted connection to his old friend Mike’s house… at night?
His letter stopped just short of accusation, and he went hours without eating to make sure it left no wiggle room. Surely it was a mistake?
The reply came the next day. “Believe it or not, buying an artificial body for her does not make her your doll. Pearl always wished she had married me instead—she told me herself! Stop stalking our love before I call the police on you.”
That night, as he lay in bed alone, he wondered what he would now do with his life. Pearl had been his point, his goal, his goddess. He had done everything possible—even gotten her the best uploading possible.
But it was always possible that the best possible upload failed—partially.
A morbid, horrible thought, but he did not dismiss it.
 
100:
Leonard sat by her hospital bed, despairing for words to write on the tablet. How could he compress so much of what made her her into the half-language half-code he needed? The constant beep of the heart monitor sounded like a metronome for an instrument he did not know how to play.
He recognized the pattern of Mike’s footsteps coming into the room, even though those legs were long since artificial. “Leo—I came as soon as I heard.”
Leonard wanted to snap at him, but he couldn’t. He found he didn’t have the energy.
Mike sat by him. “Trying to write the upload code?”
Leonard nodded. The cliché in the upload industry: necromancy was a daemonic art. Uploads inevitably couldn’t get everything. Something had to decide what was needed and what could be regenerated from other pieces. Something that worked in milliseconds, that could also adapt itself by adding what it had gleaned to itself, until it was what it worked with. “I’ve never had to do this on a deadline,” he said, and winced at the last word. Deadline.
“If you want—and only if you want—I can help.” Mike held up a data gem. “I made a…”
“You’ve already made one?” Leonard demanded.
“Why not prepare beforehand? When it happens to you, you begin to think. I’ve made upload daemons for all my friends—”
“Including me?”
“Including you. You never know what might happen.”
Leonard took the gem and slotted it into the tablet. The daemon wasn’t compiled yet, but the code was thick with ideas. He read, then skimmed, looking for clear words. “‘She loves flowers like nothing else?’ That seems a little… overly strong.”
“Another thing you realize when you’ve been uploaded: You’ll never have a chance like this again. To make yourself whatever you want to be. A better version of yourself.”
“But… I don’t want to make her what I want her to be,” Leonard said, and he found another tear in his eye.
“Who says that? Give Pearl the gift of the Pearl she wants to be. Uploading is a horrible experience. Afterwards, you lose all sorts of things. This won’t replace what she’ll lose, but it will give her something of so much value.”
“I never thought about it that way,” Leonard said, and wiped his eyes. Odd, how often he had made upload daemons that tried for utmost accuracy, interviewing family and friends and the client. There was always the warning not to make the daemon too different, in case the upload failed. But making a better upload than the original?
He looked at her comatose body. He had to try.
“Feel free to make whatever changes you want,” Mike said. “You are her husband.”
“You’ve written quite a bit here,” Leonard said, scrolling through page after page.
“I’ve had a lot of time. And I’ve known her for a long time, too.”
“I understand. Thank you for this.” Leonard cleared his throat. “Could you leave me alone for a few moments?”
“Of course.” Mike got up and went to the door.
“Wait—one question. What did you write for my upload daemon?”
Mike paused. “I… wouldn’t remember offhand. I could show it to you later.”
“That will be all right. I have to concentrate on her, now.”
Mike nodded and left.
Leonard also remembered the warning to always review upload daemons for potential bugs or backdoors, though he doubted Mike would have missed the former, and he could not even imagine his friend planting the latter. Leonard wasn’t sure how he would change the code, anyway. It was gorgeous: elegant subroutines calling subroutines, sprouting self-referencing threads to save her as it became her, like a fractal spiraling into fractals.
 
101:
Leonard sat by her hospital bed, wishing that for God’s sake they would get their preparations done already. They had delayed her upload long enough, and if they didn’t get it done—if they took Pearl from him, they would die.
Leonard wondered when he had become so murderous.
He recognized the pattern of Mike’s footsteps coming into the room, even though those legs were long since artificial. “Leo—I came as soon as I heard.”
Leonard wanted to snap at him, but he couldn’t. He found he didn’t have the energy.
Mike sat by him. “I know. It’s horrible for everyone. It was horrible for me. When I was waiting, I knew they were taking too long, I knew I was losing things… but I was wrong.”
“She’s not conscious,” Leonard said, and finding himself reduced to saying banally obvious things hurt him. “Induced coma.”
“I know. It helped me. I didn’t worry after that.”
“But what if she—” Leonard couldn’t say it. “What the hell is taking them so long?”
“Getting the new equipment connected, I bet.”
“New… equipment?” It dawned on him. “You can’t be serious.”
“They won’t miss the prototype for a few hours. And it is the Gold Recursive Buffer Reader, after all.”
“But the… the success rate,” Leonard said, and the words drowned in a moan.
“Is far, far better than any other clinically studied reader.” Mike put his hand on Leonard’s shoulder. “I wouldn’t trust her to anything less.”
“I can’t… I can’t,” Leonard started to say, but at last he cried instead. “Oh my God, I can’t ever possibly thank you enough—”
“It’s all right. Anything for a friend.” Mike looked distant for a moment.
“How do those new buffers on it work, anyway?” Leonard asked. Something to occupy himself. “Last time I was at the lab, you were still working on them.”
Mike took out his tablet. “Well, it turns out a very large amount of human memory is recursive. You see the word dog, and you think of the dogs you’ve seen over your life, and then you think of that time one bit you, and so on. In actuality, the brain stores it all in an almost mathematically optimal system, and so…”
Leonard watched his friend show test sample after test sample, memories remembering memories, like a fractal spiraling into fractals.
 
11:
“I just want to make sure it never happens again!” was his last argument that night.
“Honey,” she said patiently, “it’s not going to happen again. You don’t need to keep me in a box. You’ve got a me in that box in there.” She pointed at the closet, where one of her backups was hidden.
“But… it’s not the same,” Leonard thoughtlessly said.
“Oh? Am I the same Pearl from my biobody?”
Leonard sighed. No point in getting into philosophical arguments here. “Just… be careful out there, all right? I couldn’t stand losing you all over again.”
She kissed him on the lips. “You won’t. If it makes you feel better, I’ll get another me and I’ll just go outside with that.”
“But bodies are expensive—”
“Ah, but I only need one body for you, honey.” She pulled him under the covers and ended the argument.
The next morning he found she had already ordered a used construction android body. He couldn’t disagree with the price, and the dealer was reputable. He just found the idea of multiple Pearls walking around a little too much. Ironic, considering his job.
She did let him choose a sync-link model, but insisted on shopping for the best price. Admittedly she did get it several precious coins off, but she insisted on going outside while waiting for it to be delivered, and Leonard could not help but fret she would not return. Though a mugger who jumped her in that old construction body would probably need a new body of his own afterwards.
And, he told himself, there were more important things to fret about. The lawsuit was grinding forward far more slowly than his legal bills arrived, and his paychecks came slower still. He considered using the sync-link himself to run multiples of himself, perhaps in virtual bodies squeezed into cheap nodes, telecommuting to other jobs. The idea was somehow abhorrent to him, and so he thought up argument after argument against it every day.
When he finally gave in, he drew the line at three active bodies, and only one, his biobody, at night. He made sure to sync frequently and to refuse conversations (or, worse yet, arguments) with himself. It wasn’t as weird as he had imagined. If he hadn’t known he was in multiple bodies, he would have thought he was simply having days three times as long. He wondered how those ten- or twenty-body persons could stand it, especially when they deliberately tried to produce subpersonalities. It was stressful enough syncing in the evening after two or more hard days at work.
But living even part of his life as an upload gave him more appreciation for what she went through, and their relationship grew the better for it.
But then one day it seemed, for no clear reason, to wither. The fights started. The lovemaking stopped. She became moody, one moment laughing at strange things, the next screaming and throwing the china, sometimes at the same meal. Leonard became concerned that his biobody was not getting enough food, since she was angry if he ate with her and angry if he didn’t.
Was she corrupted? He brought up the concept as mildly as he could, and his biobody spent the rest of the day in hiding.
Several nights later, lying on the couch, his aching biobody’s brain half-thought of a plan. He knew a possible but highly illegal technique where the buffers of a sync-link, shortly after their use, could be examined to make a messy copy of the syncing person. If Pearl had gotten corrupted somehow, he could check that copy against a backup. Of course, what he would do with that information… better not.
He discovered the next evening, to his horror, that one of his overworked virtuals had given in, made another him, and that him had done it. He at least admitted he had done it well. When he did it, he had made sure to hide himself from himselves, so that he wouldn’t know the result until he could think about it as a single self.
It was still futile. The memory of the discovery of not only the branching, but the sheer size of it—the memory amplified by syncs of syncs—was like a thousand children screaming, never ceasing. He went outside, somehow arrived at a hotel, checked out a coffin and locked himself inside to cry.
When he regained some amount of sense, he knew that Occam’s Razor left only one reasonable explanation. The letter he wrote would doubtless anger his lawyer, and he didn’t care. The words seemed to pour out of his hands, and he hit send so hard his finger hurt.
He realized, the next morning, that he had not had the sense that night to secure an untainted backup. She had left a note in the open safe. “So you’ve found out. It was an accident, by the way. I met another me outside and we decided to sync, but I didn’t realize we were from separate branches. We tried to un-sync, and Mike helped, but we couldn’t get either of us pure. So I’ve been loving him and you at the same time. Except he isn’t a paranoid stalker, so I’m going to be loving only him now. Hate, Pearl.”
There was no reply from Dr. Gold. No doubt he would do nothing so long as the case resolved in his favor. After all, with Pearl agreeing to a settlement with herself, there was no way Leonard would win. Or could win.
Hate, Pearl. He almost laughed. Things like that were why he married her in the first place.
But then a cold force seemed to take him. He might lose the court case—no fantasizing about any last-minute turnarounds. But now he had nothing to lose. And there were many things he could still do.
 
110:
“What do you mean, I can’t have her?” Leonard almost screamed.
“I’m sorry, sir, but the paperwork’s already been filled out,” the clerk said.
“What paperwork?”
“Unfortunately, sir, confidentiality requires—”
“To hell with confidentiality. This is my wife. I want an answer! NOW!”
The clerk cowered. “If you wish to fill out an application—”
“No need. I’ll come back later,” Leonard spat.
He had heard that governmental firewalls were approximately as strong as Swiss cheese. He found that a great overestimation. One of the inner firewalls had an exploit so obvious his daemon found it on its own. He had more trouble writing a second daemon to sift through all the files than he’d had getting them.
It was a matter of prior claims. Pearl’s family had claimed first rights to her backup, and had acted immediately when her biobody stopped functioning. Leonard’s claim to her by right of marriage was either accidentally missing, or maybe some other daemon before him had gotten to it. In any case, there was little chance of either of Leonard’s in-laws seeing reason.
But they couldn’t stop her from simply deciding to go back to him, right? Even they couldn’t be that stupid, or petty. He dug deeper. While the upload recovery request file was digitally signed by them, the request was from… a Dr. Michael I. Gold?
He would have looked even deeper, but his daemons saw another daemon coming their way and fled. Though from the look of how poorly designed that one was, he hoped it was not a government daemon.
He decided to call Mike next. Maybe Mike had convinced Pearl’s family in order to bring her back to him. “Hey, Mike?” he began as casually as he could. “About… about Pearl—”
“I’m not going to ask how you found out,” he said. “I am going to ask you to stay out of this.”
Leonard wished he had heard anything else. “Excuse me?”
“Have you no shame? This is your fault!” His cry was deep with grief. “Do you know how many times she hinted to me that she was unhappy? How many times she wished she had chosen otherwise?” The next words were a growl. “I would wonder if it wasn’t suicide.”
“What? How dare you—”
“Leave us alone. And don’t call again.”
Leonard did not know what to do for several seconds. Had Pearl been unhappy?
No. That was ridiculous. A lie. How many times had they lain together in bed, saying “I love you” in every way possible? Or played a game, or laughed, or—How many times had they been together at her bedside? Ridiculous. Ludicrous. Absolutely, utterly insane.
Or deliberate slander.
Dr. Gold had not been best man at Leonard’s wedding, mostly because he was dead at the time. But what if that wasn’t really an accident? Love-sickness leading to suicide? Dr. Gold had been dating Pearl for much longer, until they broke up. Projection leading to absolute delusion? It could happen to anyone.
And if it was slander…
The next thing Leonard did was get a lawyer.
Anderson-9, Esq. of Anderson^13 was the most kind and reassuring android to talk to. Probably all an act, all things considered, Dr. Jacob thought. But Anderson-9 had plans, plans, and possibilities of plans. Already he was working on an injunction for Dr. Jacob to get a Pearl, too. And, he slyly implied, Leonard could find ways to show Pearl was happier with her husband.
Leonard already saw his future taking shape, and Anderson-9 promised to sub-split if necessary. Apparently that was common practice now, even in the short time upload-copies had existed. How strange that people could do that now, becoming like fractals spiraling into fractals.
 
111:
“What do you mean, I’m not my own primary?” Pearl asked.
“You are the legal primary of your branch, but there is another claim to legal primacy closer to your ideal,” the clerk said.
“What claim?” Leonard asked. “She’s standing right in front of you!”
The clerk did not even look up from the screen. “Unless you have an even closer claim, I am sorry, but there is nothing I can do.”
“But who has the other claim?” Leonard asked.
“Unfortunately, sir, confidentiality requires—”
“To hell with confidentiality. This is my wife. I want an answer! NOW!”
The clerk cowered. “If you wish to fill out an application—”
“Honey, calm down,” Pearl said. “I think I know where this is coming from.”
Leonard shut his mouth and followed her out. So incredible to see her walking again, even if those legs were completely robotic.
She explained in the autocar home. “Mike and I were… briefly engaged. We were concerned about the heart palpitations—they had just started—so we decided to make a backup—”
“Mike? Our friend Dr. Gold? That Mike?”
Pearl gave him an are-you-serious? look. “Who else could I be talking about? Anyway, after you make your backup, you start thinking you might want to change a few things. Forget that bad memory. Repair that old grudge. Get better at math. You know?”
“Did Mike seriously encourage this? He should know how dangerous personality altering can be—”
“—because he knows how to do it. I know. But he just suggested little things at first. Then we were tweaking bigger things. Then he was doing all kinds of things.”
“He.”
“I broke up with him at that point.”
“I always wondered why…”
“If it wasn’t that, I’m sure it would be something else.” Pearl looked out the window at their apartment. “But he still has claim to that backup, I bet. And he’s trying to make that me the legal me. We should get a lawyer.”
“You look around. I’ll talk to Mike, and see if he knows.”
That phone call did not go as he expected.
“Are you absolutely insane?” were the first words Mike screamed. “What were you thinking?”
“Mike, what on Earth—”
“You’ve created a mockery, a Pearl that no one loves but yourself, a perfect mirror of your own head—”
“Mike?”
Mike breathed a few times. “What if you took the most beautiful statue in the whole world, and used a mud puddle to create a copy? That is what you have done.”
“I’ve done nothing of the sort,” Leonard said. “As if you have room to talk, trying to modify your fiancée as if she were a daemon.”
“Oh? Your ‘Pearl’s’ restoration report shows you used a combination of backups to ‘produce’ her.”
“Which is standard practice for a long-term terminal case like hers!”
“And did you at no time decide which parts of her to keep or discard? Which you wanted from the real her, and which from the accurate her?”
“I cannot believe you dare—You hypocrite!”
“Hypocrite? I have done nothing to her that I have not done to myself.”
Leonard paused for multiple seconds in pure silence. “I’ve heard that self-modification leads to insanity. I see they were absolutely correct.”
“Correct? You have no idea what you are talking about! Accepting flaws in your own personality, when correcting them is a second away—”
“Mike?” Pearl said, and Leonard turned to look behind him. But the voice had not come from Leonard’s side of the screen.
“I have more important things to do than keep talking with you,” Dr. Gold said. “Leave us alone, and don’t call again.”
Pearl stepped out from behind the doorway. “What was he smoking?”
“The smoke was from burning gray matter,” Leonard said. Except Dr. Gold had not had a biological brain for a long time. Biobrains were harder to edit—was that fatal accident deliberate on his part?
Were her sickness and death Dr. Gold’s fault? Ludicrous. Her biobody’s heart was just congenitally defective… right?
“I found us a lawyer,” Pearl’s voice broke through his thoughts.
“Good. I’m sure we’ll need him.”
Anderson-9, Esq. of Anderson^13 was the most kind and reassuring android to talk to. Probably all an act, all things considered, Dr. Jacob thought. But Anderson-9 had plans, plans, and possibilities of plans. The most obvious angle of attack was to prove that Pearl was most Pearl(ous? ful?) with him. The subject was inevitably so murky and full of bias, Anderson-9 explained, that the easiest way was simply to make a more appealing Pearl, while showing the other was “artificial.”
Of course, to do this also required removing the taint of Dr. Gold’s modifications to her, one of the most difficult tasks any psychoprogrammer could undertake. Hunting down a single thought was near impossible, as the inevitable self-recursion and self-replication led to thoughts within thoughts without number. But it could be done. A second anti-alteration would have to be made. He saw it now in his mind’s eye—his alterations and Dr. Gold’s fighting for control, a battle of recursive battles like a fractal spiraling into fractals.