Rudy Rucker
[Cover of *Software*]

For the Transhumanist


I can't remember whether I read a chapter first or my husband did, the night I fell in love with Software. Either way, it was my chapter to read. The lights were already out, and the baby was asleep in her bassinet.

Earlier in the story, Sta-hi Mooney (a stoner and one of the protagonists) had been kidnapped by a gang called the Little Kidders. They were a mongrel group, but all wore brightly colored shoes. Sta-hi was trapped with his head sticking out of a hole in a table, listening to the gang members discuss how much they enjoyed eating brains. It was uncomfortable to read out loud, but I kept going. If I had been reading to myself, maybe it wouldn't have been such a shock; I might have paused and processed what was happening. It might not have affected me so much if the words had only flowed through my own head, but these were flowing out of my mouth. Through me, one of the Little Kidders was saying how much they enjoyed the moment when they scooped out a victim's speech center.

Sta-hi escapes, and later the kidnapping incident is mentioned to a bopper (AI/robot) who is completely dismissive of human moral concerns. The bopper explains that embedded among the Little Kidders was another bopper with the capability to digest brains and "tape" them: in other words, to upload the contents of a brain through the appearance of ingestion. Bopper morality has nothing to say about cannibalism. Instead, the fact that the brain was being uploaded makes the act morally positive, because that person can now be merged into one of the Big Boppers and become closer to the One (the bopper god).

What struck me about that moment was how logical the bopper's position was, and how contrary to my visceral reaction to the content. I had to stop and consider it. At that point, it was still easy to come down on Sta-hi's side: there was no consent, and parts of the brain would inevitably be lost as the humans ingested brain matter instead of the bopper taping it. But I had to consider, from the perspective of a machine that sees humans as biological machines, was this act moral for one machine to do to another? Why do I think it's moral for one machine to non-consensually take apart another and absorb its programming, but not for people? Do I think that? What if it is consensual?

Rucker doesn't attempt to present any of this as an answer, or to convince the reader that the boppers are correct or incorrect. He presents the issue fairly. Is my "soul" my continuity of consciousness? Is it the brain patterns that make me *me*? Can it be transferred? If it is transferred and a copy has been made, do I still exist? Am I still me? Is immortality worth trusting an alien being with everything that I am? If the essence of *me* can be altered or subjugated without my consent or knowledge, am I still me? Every question is left hanging in the air, for the reader to think about rather than decide on.

I am a sucker for transhumanist themes, maybe because I'm both eager and afraid that such technology might arrive during my lifetime. I want to be immortal, but I'm afraid that I would only be creating another me while this one dies anyway. I am curious to see how all of our philosophizing and speculation holds up once we know the answer (if such an answer can be known).