Since I've never had an opportunity to explain anything to her before (she is very learned) and was sensing this would be the only chance I'd ever get, I magnanimously gave her the gist of transhumanism as I've been made to understand it: that as our technology and knowledge continue to evolve, it is inevitable and desirable that humanity integrates its technology with its physiology, creating people who are more intelligent, durable, capable, and ultimately happier. "Everyone gets smartphones in their brains and we live forever," is how I may have summed it up.
Her gut reaction was NO; just no.
When I asked her why, she admitted that her rejection of this vision of the future was purely reflexive. She couldn't cite any substantive reasons -- ethical, spiritual, or otherwise -- as to why she found the idea so appalling. (Granted, she'd had only twenty seconds to think it over.)
When we find ourselves upset by some argument or new idea, it's helpful to ask why instead of just leaving it at fuck that. Once we've gone beyond an emotional reaction and can put it in context, we can respond to the argument with sounder reasoning and a greater degree of intellectual honesty. I sympathize with my colleague's antipathy, but the stuff I've been reading and writing lately has compelled me to try and unpack my own anxiety towards the prospect of a humanity whose life is subcutaneously interwoven with its technology. My viewpoints might be evolving.
As for the cause of our anxiety, I've come up with two speculations.
One. Our reflexive abhorrence (hers and mine) of the assertion that humanity's march toward the Singularity is not only well-advised but inevitable has less to do with principles than with stomach-level trepidation at the notion of systemic changes in human life (our lives) occurring at such a magnitude as to cast doubt on our whole conception of "humanity."
This raises the question: what is it we're talking about when we talk about "humanity" or the concept of "human"?
Most of us have a general idea: the people whom we know (personally and in the abstract, and including ourselves) share a set of characteristics, and it is by some arrangement of these characteristics that we define what is fundamentally "human." This definition is important to us. Our abstract landscape is a human landscape. Our concerns are human concerns; our passions and fears are human passions and human fears. (We are predisposed towards the anthropic fallacy -- and by the logic of natural selection, we are properly so.) We order our world by way of our conception of humanity (and perhaps also of "the human condition"); and so we might be profoundly disturbed if we imagine that all of a sudden -- and we will imagine it is all of a sudden, crashing down at once without the intervention of the staggered millennia, centuries, or years over which any epochal changes in a culture (or species) must encroach -- that all of a sudden, ourselves and everyone we know are made to be completely different, without anyone consulting us or asking our consent before the switch was thrown.
Any proper definition of what an organism is must not exclude what that organism does -- the full extent of what it does. (So if we claim to "know" an organism, what we are claiming is a simplified but fairly thorough abstraction.)
With most organisms, there is a fairly close correspondence between a species's exhibited behavior and how we might reasonably expect it to behave, given its physiology. (Anything we can't guess about its behavior must be laid to the unquantifiable: the obscure and unobservable aspects of its physiology.) But a knowledgeable biologist can, for instance, look at a dinosaur skeleton and extrapolate, by way of deduction, many sound inferences as to how this extinct organism likely behaved. An apatosaurus has a small brain cavity, a long neck, and dull teeth: an apatosaurus probably had relatively low intelligence and subsisted on leaves from tall trees. That's a simple example, but the basic principle holds true throughout most of the animal kingdom.
But a human being is an upright, mostly hairless primate with a large brain and opposable thumbs on its forelimbs. It does not necessarily follow from the bare facts of its anatomy that human beings are organisms that drive cars. Build prisons and diesel engines. Fish and grow food. Read books. Sit up all night watching television. Talk on the phone. Play tennis. Take photographs. Strap dynamite to themselves and blow themselves up. Ride horses. Build model ships in bottles. Heat up frozen pizzas in the oven before putting them in the microwave. Suck helium out of balloons to heighten the pitch of their voices and amuse their friends.
However interminable the range of any given terrestrial organism's general locus of behavior, a modern human being's potential actions constitute a higher degree of infinitude. We can attribute this fact to humanity's high quotient of the characteristic commonly referred to as "intelligence" (which is a tricky word), but it is also the result of the "modern" qualifier. Humanity's history is now more cultural than it is genetic or physiological -- which is self-evident, given that technology must be counted as an aspect of culture.
Humanity has uniquely altered its environment to such an extent as to systemically alter human behavior -- and over a relatively brief period of time.
We are products of our environment. Different kinds of circumstances build different kinds of people. A generation of human beings born and bred within a culture that has access to stone tools, wheels, and knowledge about firemaking and irrigation will behave much differently than a generation born and bred in an environment where these things are unavailable. Depending on which of humanity's epochs we are considering in our definition of "human," we might already have to consider ourselves transhuman.
We've already remade our world; in doing so, we've remade ourselves. It is an ongoing process.
Another consideration: many of the attributes we might call "intrinsically human" are contingencies of our species' history, even those that are virtually ubiquitous. Where our cultural history is concerned -- and again, humanity is at a stage where culture brings far more to bear on behavior than genetic characteristics -- the situation is much more susceptible to change, and changes can occur with a sweeping speed that far outpaces the creaking gears of biological evolution. Our definition of "human" -- unless we consider a supernatural possibility or only permit the most general and nebulous qualities -- must depend disproportionately on cultural contingencies. The very concept of "exoticism" attests to the radical impact of cultural factors on what we consider to be "normal" or "natural." (And we can probably bet that most of the qualities we've elected to represent what is most basically "human" will coincide with our notions of what is normal and natural.)
So: much of what we define as "human" must be arbitrary. Ergo, any arguments regarding transhumanism (whether in opposition or advocacy) citing "human nature" or "the human spirit" must be commensurately flimsy.
But then there's the second objection: that this is different; that although cultural changes have tremendously changed human behavior, none have thus far affected the constitution of our human meat. The thought of a wetware future warping what's under our skin can be as appalling as the aforementioned scenario of the human "soul" undergoing an invasive electronic implant, if not more so. To someone who isn't already sold on transhumanism, the old "computer chip in the brain" trope doesn't connote much in the way of desirability, no matter how smooth the sales pitch.
But these objections are, again, purely cultural. Cultures change in time; what is taboo for me will be embraced by my grandchildren. Case in point: the widespread public support for the legalization of gay marriage, which would have been politically unmentionable (let alone feasible) a century ago.
(Still, cultural objections are perfectly valid -- during a given moment. Just don't count on "accepted" reasoning to stay on your side for very long. You must also acknowledge that your objections probably spring more from opinion than reason.)
Besides: we are already modifying our bodies. Eyeglasses: an optical enhancement tool that allows a human being to correct a biological deficiency. Vaccinations: an immunological priming of one's cells to prevent disease. A pacemaker: a mechanical implant to improve the performance of an (otherwise) irreparable and indispensable organ.
Neural implants are not too tremendous a leap forward from this.
Again: I don't think a rejection of transhumanism on a sentimental appeal to "human nature" is intellectually sound or even honest, and biological modification of human beings is already something we've been doing, albeit on a low-tech level.
But I remain a skeptic.
From Beyond ONE.
While we can't make any effective objections to the transhumanist philosophy or the eschatology of the Singularity on behalf of the human spirit, we can make them on behalf of the human species and the tenability of its existence on this planet.
I've gathered that transhumanists tend to be fervent believers in progress. Humanity will invariably improve as its technology improves. We entrust our salvation to science and technology. It's a good thing for today's futurists that we're fresh out of World War I veterans who might tell us about a similar age that made similar noises, and about how its myths of progress were literally exploded. (The situation today, of course, is vastly different, but it is only responsible to examine the past when considering the future.)
My faith in science and technology is not unalloyed. I fear we've already created a humanity that's been priming itself for a catastrophic meltdown.
If I have a broad objection to futurists and technologists, it's to the assertion (whether explicit or implicit) that technology, developed and proliferated for its own sake (or at some providential whim of the "invisible hand"), is our stairway to salvation. Whatever the market wants to sell will be sold; whatever the engineers can build, the engineers will build; and this is how it should be. (Guns that can be made on 3D printers! Great! Whatever! Science! What's the worst that could happen!? But I digress.)
We've already been winging it under the banner of "rational self-interest," and the humanity into which we've redesigned ourselves is, within the planet's ecology, like a bomb on a bus (to borrow a phrase from Jack Collom).
Speculations about a transhuman tomorrow imply a future in which we use even more energy than now; I want to know how it will be generated after the oil runs out. They assume a future in which humanity will have enough to eat, despite the effects of climate change and soil degradation; I want to know where the food will be coming from. They assume a future in which nations aren't fighting drone wars over dwindling natural resources; I'd like an assurance that we've got some plan in place, or are thinking about getting a plan in place, and I want to know what that plan is.
The question shouldn't be about what kind of technologies we can develop or will develop. The question should be: what path forward is in humanity's best interest, and how can science and technology help us toward that path? (I hope and pray humanity is approaching an epiphany during which it realizes it has grown too massive and too complex to safely improvise for much longer.)
Human beings are products of their environments. What sort of humanity would we like to succeed us? What kind of world do we think is best for them to inherit? These are reciprocal questions.
Obviously I benefit from modern technology. I'm not claiming I don't. But if we're talking about what developments will lead to the best future for humanity, sustainability must be our first consideration. As it is, the "advanced" world civilization we've haphazardly constructed on science and technology is running on borrowed time. I don't mean to act the doomsayer, but again: like a bomb on a bus. Are we certain the solution is to keep on racing in the same direction that brought us near to the cliff?
If transhuman technology presents itself as the solution (or part of it), great. Sign me up. Otherwise, I'm inclined to put optimistic preaching about transhumanism and the Singularity on level with Christian talk about the Second Coming (don't worry, the scientists have got the air/food/water/energy thing covered, or they will anyway, so just whatever, we'll all live forever, it'll be great): both represent an obstinate faith in the certainty that divine intervention (whether by the hand of Jesus or Science) will arrive right on schedule to renew the world, and they distract us from more pressing concerns: namely, redesigning human civilization such that it is able to withstand the potentially cataclysmic shocks looming on the horizon without lapsing into a new dark age, screwing over the poor, or miring itself in international wars over natural resources.
(As always: if you think I am misunderstanding an issue or an argument, please do correct me.)