The “New Biology” and “The Self”

A couple of weeks ago I posted some musings about “the self” in anticipation of being on a panel with Steven Pinker (author of The Blank Slate and The Stuff of Thought) and Noga Arikha (author of Passions and Tempers: A History of the Humours) at Tufts University. The panel, convened by Jonathan Wilson, was titled “The New Biology and the Self,” and what follows was my contribution. The graduate student referred to is Monica Chau of Emory University.

I told a very smart neurobiology graduate student named Monica yesterday that I’d been asked to speak on “The New Biology and the Self.” She said, “What’s the new biology?” I said, “I don’t know, but that’s the least of my problems. What’s the self?” Why I don’t know the answer to that one is a fair question, but let’s just say it’s not because I haven’t been thinking about it since middle school.

I said to Monica, “Well, you probably use the word ‘myself,’ what do you mean?” She said, “It’s the part of me that’s unique, that no one else has. It’s also my consciousness, my private thoughts, my identity.” I thought that was about as good a definition as I could offer, although I added that the self is what you alone see, a kind of parallax shift from what others see and even measure, but what subjectivity allows only you to access.

Nevertheless, as Leopold Bloom muses in Joyce’s Ulysses, we do try rather desperately to “see ourselves as others see us”; we are intersubjective from childhood, and so every new outside measure of who we are has the power to change that parallax view.

As to my notion of “the new biology,” it has to include at least four things.

First, especially in this year of Darwin’s bicentennial and the 150th anniversary of On the Origin of Species, it has to include the new evolutionary biology.

Second, it has to involve the new genomics, including personal genomics and ethnic tracing.

Third, it must take account of the revolution in brain imaging.

And, finally, it has to acknowledge the transformative and accelerating power of enhancement drugs.

So please indulge me while I try to figure out how these four kinds of facts might, for better and for worse, be changing our ineffable, our nearly unutterable, subjectivities.

From the time I ceased to be religious at 17, evolution has been my overarching narrative of origins, and to say it affects my sense of myself is putting it mildly. Around the time of Thomas Huxley’s famous debate with Bishop Samuel Wilberforce at Oxford, the wife of the Bishop of Worcester is supposed to have said, “Descended from apes! My dear, let us hope it is not so, but if it is, let us pray that it does not become generally known.”

This marvelous reflection declares a profound fear of the consequences of this particular piece of objective knowledge for the self writ large, the selves, as it were, of millions. I agree with her intuition that the consequences are momentous, but not that they are dire. Her first fond hope, alas, was not realized; it is indeed true. But her second hope, a prayer, remains alive, at least in the United States, where more than half of our fellow citizens reject this important fact.

But what’s new?

Too many things to list them all, but I will mention one. The October 2nd issue of Science contained 11 articles in which the great paleontologist Tim White and his many colleagues described in detail the fossil species Ardipithecus ramidus, unearthed in hundreds of specimens over two decades.

This species, interred by nature for over 4 million years, has now had its eternal rest disturbed, but it is taking revenge by disturbing our complacency. Walking upright on the ground, but with apelike feet for clambering in the trees, it is the clearest example of the transition we have so far, a million years before the justly famous “Lucy” species. The bishop’s wife, if she is listening, has never been more chagrined.

But the news doesn’t end there. Ardipithecus males are very similar to females—around the same size and with small canine teeth. This means they were not competing very fiercely for mating opportunities. In the light of neo-Darwinian models of behavior that have made many people think differently of late, this creates a new view of our origins.

It rules out, for example, an ancestor just like the chimpanzee, a species in which males are brutal to females and sometimes fatally violent toward one another. It suggests instead that we arose from a species with less violence and more shared parental care of the young, which is looking more and more like a key to our evolution.

I don’t know how this will ultimately be resolved. Will the answer affect your view of yourself? I have to say it will affect mine. And in this case the “new” biology is more than 4 million years old.

Which brings me to a slight disagreement with Monica. While the self entails a uniquely private viewpoint, the mind it sees is not unique. Not just since Darwin but since Linnaeus, we have known that we share some things in common with all human beings, some with all apes, some with all mammals, and so on.

Part of the new genomics, different enough from genetics to deserve its new name, has been a clear confirmation of these shared heritages. But I used to say we were 99 percent chimpanzee, and I can’t any longer. Why? Because the past decade has revealed that much of what we called junk DNA is not junk at all, but is making RNAs that have crucial regulatory functions.

We have no idea as yet how much our regulatory RNA differs from that of apes, but we already know that these regulatory sequences have evolved exceptionally rapidly, especially in the brain, where the complex, poorly understood regulatory hierarchy of our genomes controls the course of development.

Still, it is clear that the genome is widely shared among all humans, and that is why there are many universals of culture and mind. Cultural anthropologists like to stress the differences, but my two years among the !Kung Bushmen of Botswana made me think long and hard about the similarities. Modern humans arose in Africa over a hundred thousand years ago and spread throughout the world; we are one species, with different cultures but profoundly similar minds, and similar selves.

But that doesn’t mean there are no important differences, and that’s where personal genomics comes in. Steve has explored his own genes, and we all will have that option increasingly in the future. I learned in medical school and in life that few things separate people as illness does, and specific illnesses create exclusive clubs of shared subjective experience that almost entail new selves for the members.

What happens when you find out that you may be destined to join one of those clubs in the distant future? Well, if the club is one you can easily do something about—say, the Type 2 diabetes club—your self becomes an agent in changing your destiny. When there is little you can do—Alzheimer’s, for example—other than doing away with yourself or buying long-term care insurance (if you can still get it), does your self now include passivity and victimhood? I don’t know.

Not yet, but someday soon perhaps, personal genomics will tell us something about the things Noga studies—whether we are phlegmatic, say, or choleric, although we’ll use different words. What will this tell us? If your life has told you that you are a timid person, what will genetics add? A greater sense of calm about it? A new passivity in the face of something that might otherwise be changed?

What if you find you have a genetic tendency to exploit others? Will this lead you to want to change, or to justify what you do? The genomic self will raise many new questions.

It will also answer some. In between the selves we share with the species and the selves no one shares, there are intermediate sharings: identical twins, families, ethnic groups, and cultures. Finding ethnic origins in the genes is very important to some people, for example for some African-Americans. When you have had your origins torn away from you by oppressors, you may especially need to explore them. Listening to Henry Louis Gates, Oprah Winfrey, and others talk about what they found and how it affected them, it is impossible not to be moved.

But although Gates and, say, the rapper Diddy probably share ethnic genes, they belong to different cultures, and I suspect this is at least as important in their self-definition.

And this can easily go too far. The other day a brilliant lawyer who should have known better said to me half-jokingly, “What if I find out my wife is not Jewish?” I know them well, and she is Jewish, period, no matter what the genes say. Yet it is clear that such information could somehow change their view of themselves.

We will not be in complete control of how we react to these and other biological revelations. Which brings me to brain imaging.

This, I would argue, is at least as important a biological revolution as genomics, but it requires an even subtler philosophical approach. Let me give you two hot-off-the-press examples.

Steve is a coauthor of a beautiful new study of language and the brain, published in Science on October 16th. It uses a combination of functional magnetic resonance imaging and recording from electrodes in the brain to parse the exact ways in which Broca’s speech area sets up a meaningful utterance.

Without going into detail, I will say that this is an extremely important study that begins to take the mystery out of language generation and, in my view, supports the long-standing claim that the human brain is uniquely, and in a modular way, adapted for language. For myself, it strengthens my sense of separation from the apes, and my shared biological heritage with all humankind.

The other study, equally beautiful, published in Nature one day earlier, provides a remarkable contrast. It is called “An Anatomical Signature for Literacy,” and it proves the profound power of human agency over the brain. It begins:

After decades spent fighting, members of the guerrilla forces have begun re-integrating into the mainstream of Colombian society, introducing a sizeable population of illiterate adults who have no formal education. Upon putting down their weapons and returning to society, some had the opportunity to learn to read for the first time in their early twenties, providing the perfect natural situation for experiments investigating structural brain differences associated with the acquisition of literacy…

The study, done in Bogotá and at the Basque Center on Cognition, Brain and Language in San Sebastián, showed that learning to read, even in adulthood, specifically increases the anatomical connection between the two halves of the brain, in areas linking vision and language.

The young former guerrillas decided to put down their weapons, learned to read, and changed the anatomy of their brains. It is hard to imagine a better case for the ability of the subjective self to change the objectively visible one. The study is one of many warnings to those who may too readily conclude that if something is seen in the brain, it must be causing what is seen in the mind.

Not so. It is only a correlation, another set of data to be meticulously compared with those of thought and behavior, leaving the task of discerning causality as difficult as ever, sometimes more so. Yet it is very, very important data, and it can certainly change our sense of ourselves—especially if we are among those who think of the mind as something separate from the brain.

Finally, I can’t think about “the new biology and the self” without thinking about the medicines we use today to shape and change ourselves. To cosmetic surgery—rapidly increasing for both sexes—we have added cosmetic endocrinology and most importantly cosmetic pharmacology.

For a child considered “too small” (usually a boy) we have the option of growth hormone; for one “too tall” (usually a girl), hormones that bring on puberty. But in considering the implications for the self, we want to look at adults who choose for themselves: the older man or woman who takes testosterone to restore or enhance sexual drive, or growth hormone to increase strength and energy.

Do they become subjectively younger as they become more sexual or energetic? They certainly seem to feel better about themselves.

Surgery clearly has ambiguous consequences. People who have facelifts may wear turtleneck sweaters because they feel self-conscious about their necks; they live in a personal space between natural aging and the more or less successful pretense of youth. A friend of mine had a nose job in her teens; she certainly adjusted to being seen as beautiful, but she also said she always felt like an impostor.

Is this also the case with psychoactive drugs? Antidepressants like Prozac have lifted the moods and calmed the anxiety of many, and because the side effects of the new-generation drugs are small, most people who take them just feel “normal.” Only the daily ritual of pill taking may bring to mind the thought that they are not quite themselves. But generally they like the new selves very well, thank you.

Stimulants like Adderall and Ritalin, we know, are no longer child’s play; adults by the millions modulate their own attention, and their cognitive performance, more or less at will. Modafinil regulates the sleep-wake cycle, enhancing alertness, and drugs used in Alzheimer’s, like Namenda or Aricept, may find uses as cognitive enhancers for “normal” people.

Creative people with bipolar disorders often titrate their own lithium or valproate to try to achieve inspiration without mania. And millions of men, erectilely dysfunctional or not, use Viagra and similar agents to make sex work on call, sometimes exploiting women in the process. We are told the drugs are not aphrodisiacs, but getting a good erection is, so you figure out what they really are.

Does the man who takes daily Cialis have a more normal-feeling sexual self than the one who must take the pill each time he and his partner feel ready? What of the woman who takes the pill off-label? Does her self get a delicious added feeling of wickedness along with the lubrication and clitoral erection?

Stay tuned for the answers, which will not be easy ones. Just over the horizon are countless other choices—cognitive enhancers, true aphrodisiacs, energizers, serenity agents, and more. New technologies of drug production and screening guarantee it.

We can perhaps take comfort in the fact that humans have used herbal mind-altering substances since time immemorial, to enhance everything from visions and trances to meditation and sleep. Traditional cultures have managed to integrate their drugs with their sense of self, and, I suppose, so can we. But we will have an unprecedented array of options at hand.

Some bioethicists insist that we not use them, that we should be satisfied with whatever endowments we have. I don’t agree.

What really counts in all this is agency. If you take our evolutionary past, or the facts of your own genome, as an excuse to be passive, greedy, or violent, that is one kind of consequence for the self; if you insist on saying “It wasn’t me, it was my brain,” or “it was my genes,” then I suppose the new biology trumps the self. But that kind of choice is more a consequence of who you were going into it than of who it really made you.

If, on the other hand, you use the information about these things, or the medicines your doctor is willing to give you, to enhance your range of choices, and perhaps strongly go against the grain of what your tendencies may be, that is different.

If you refuse to know these facts, or completely reject drugs, that’s fine too, but you will be living in a world where you have to make those choices, and where many people around you are making different ones; even this fact must change your concept of self.

In the end we must hope that the main result will be an enhanced human agency, at both the individual and the species level. And we can hope against hope to believe in the old saying, “The truth shall make you free.”
