I was reading up on Transhumanism because I thought it would make an interesting blog topic. Mind officially blown. I decided that it’s way too intense and complicated for it to be broken down into one of my random musings. (In other words, I am feeling too lazy to make the effort.)
But within that topic I came across the idea of Posthumanism, and it made me muse, indeed. The website whatistranshumanism.org describes it like this:
“Many transhumanists wish to follow life paths which would, sooner or later, require growing into posthuman persons: they yearn to reach intellectual heights as far above any current human genius as humans are above other primates; to be resistant to disease and impervious to aging; to have unlimited youth and vigor; to exercise control over their own desires, moods, and mental states; to be able to avoid feeling tired, hateful, or irritated about petty things; to have an increased capacity for pleasure, love, artistic appreciation, and serenity; to experience novel states of consciousness that current human brains cannot access. It seems likely that the simple fact of living an indefinitely long, healthy, active life would take anyone to posthumanity if they went on accumulating memories, skills, and intelligence.
“Posthumans could be completely synthetic artificial intelligences, or they could be enhanced uploads, or they could be the result of making many smaller but cumulatively profound augmentations to a biological human. The latter alternative would probably require either the redesign of the human organism using advanced nanotechnology or its radical enhancement using some combination of technologies such as genetic engineering, psychopharmacology, anti-aging therapies, neural interfaces, advanced information management tools, memory enhancing drugs, wearable computers, and cognitive techniques.
“It is difficult for us to imagine what it would be like to be a posthuman person. Posthumans may have experiences and concerns that we cannot fathom, thoughts that cannot fit into the three-pound lumps of neural tissue that we use for thinking. Some posthumans may find it advantageous to jettison their bodies altogether and live as information patterns on vast super-fast computer networks. Their minds may be not only more powerful than ours but may also employ different cognitive architectures or include new sensory modalities that enable greater participation in their virtual reality settings. Posthuman minds might be able to share memories and experiences directly, greatly increasing the efficiency, quality, and modes in which posthumans could communicate with each other. The boundaries between posthuman minds may not be as sharply defined as those between humans.”
Okay, so is anyone else a little freaked out by this concept? Yes, it would be nice to have an enhanced capacity for learning, and who wouldn’t want a little extra vigor? But I really don’t want to live forever. I think that would become tedious and depressing. If I couldn’t count on an expiration date, I’d take everything for granted and not appreciate or value anything. I would procrastinate even more than I already do. Nothing would be precious. It would all feel inevitable.
I wouldn’t mind not feeling “tired, hateful, or irritated about petty things,” but I’m not so sure I’d want to be able to control my desires or mental state completely. Everything would become predictable. There’d be no surprises and nothing to get excited about. What would be the point?
And do I really want to risk augmentation? Too much could go wrong. Not only that, but would I want to live in such a superior state that I could no longer relate to humanity? I would hate to view people as mere primates. And while I might be able to communicate more effectively with my fellow posthumans, I would cease to be able to communicate with anyone else, and that would be tragic. And I genuinely believe that the most valuable sign of intelligence is the ability to get your point across to anyone, regardless of their IQ.
And then there’s the fact that certain people, if given these enhanced powers, would not use them for good. And because they would be so far ahead of us mere mortals, there would be little, if anything, we could do about it. That scares me.
While I can’t predict the future, and I’m sure that there are things around the corner that I can’t even begin to imagine, one thing is for certain: I wouldn’t want to meet a posthuman in a dark alley, or anywhere else.
6 thoughts on “Spare Me Posthumanity”
About 135 degrees away on this, I am. I can think of a lot of things that need improving, and I’ve never felt like I relate to people much of the time anyway… Immortality is problematic, but ridding ourselves of nature’s various frailties while mortal can only be good. I and everyone else have a pile of physical complaints that we do not deserve. Why should eliminating them be any different from heating the house during a cold spell, or going to the doctor for those things that doctors can fix now?
As for the predictability of a life completely in one’s control, I would be willing to just up the percentage of control some. Over emotions, a bit; I wouldn’t want to become so serene that I’d not care if a real danger came up. Yes, it needs a lot of thought. Some years back I read a story about a man who went on a quest for immortality, and the lessons he learned along the way caused him, when he reached his goal, to decide to make life here really worth prolonging.
But increased intelligence and health would help with that. I am not sure that I would risk a lot of unknown procedures to get there, but still…
It’s a cost/benefit analysis for the ages. Health issues are horrible, no doubt. But I’m sad to say that there are people (particularly in politics) that I’m looking forward to seeing age out of this mortal coil and leave things to people who are hopefully more sane and less destructive. Would we want Charles Manson forever in his prime? Would he, perhaps, have controlled his emotions by keeping them permanently in the rage zone? I know that’s an extreme example, but playing God does have consequences that we can’t always predict.
The idea of ascension to a higher plane has been the subject of many an author and film director. I expect achieving such a level would allow leaving the folly of humanity behind to explore greater interests than the basic drives that make people both ridiculously predictable and unpredictable.
I just think that all the unknown factors, all the complex moving parts, make this a risk not worth taking. Maybe I’d let a million or so people go first and work all the kinks out.
I’d be worried about which ones get to go. Some, as mentioned above, I’d be glad to see no more of… Someone put it well: the future is here, it just isn’t distributed evenly.