CGI Commencement Address

1:00 PM; Sunday, September 28, 2008; Los Angeles, California; USA

Graduation is a time of transition from being a full- or part-time student to having a career path that may last for the rest of your life. So you'd better calibrate yourself and make sure that you've made the right choices. This is a time when you ought to step back and take a look at your trajectory through life, to see where you might end up and whether you'll be able to say at the end, "It was all worthwhile because I personally made a difference."

Let me persuade you with the logic of the following argument. But first some personal biographical background...

Back in the Fall of 1954, when I was still a Freshman in high school, I grappled with three fundamental philosophical questions that set the tone for the rest of my life. In plain English, these questions were...

1. Who am I?

2. Where am I?

3. Where am I going?

All of these questions have logical, metaphysical (ontological, cosmological, and teleological), epistemological, ethical, and aesthetic counterparts, but I didn't have the vocabulary at the time to figure all that out. I answered these questions as best I could by raising two more specific surrogate questions (or problems) that I felt needed to be worked on before I could answer them...

1. How does the brain work? and

2. Why do we get old and die?

More specifically, the first question really asks: how is it that humans are special, compared to the other animals in the zoo (or in the wild), with regard to their rational thought and linguistic skills, when, superficially, all mammalian brains look the same on autopsy except for their size? The second question was stuck in just in case I were to run out of time before answering the first one and getting back to the original three questions.

This first surrogate question led me down a path not of neurophysiology, as one might expect, but of Electrical Engineering (John von Neumann), Computer Science (Alan Perlis), and Artificial Intelligence, since simulating a person on a computer seemed easier than figuring out where consciousness resided in the brain. I asked: can one really program a computer like the HAL-9000 in the Arthur Clarke/Stanley Kubrick movie "2001: A Space Odyssey" (1968) that thinks, talks, and acts just like a real person (i.e., is able to pass the "Turing Test")? (BTW, the term AI was coined in 1956 by my thesis advisor, Prof. Herbert Simon of Carnegie Mellon University, a Nobel Prize winner in Economics, and four others at a conference at Dartmouth College, where Prof. John McCarthy was teaching at the time. Allen Newell, Marvin Minsky, and Arthur Samuel were the others.) BTW, my thesis work involved computational linguistics (1967).

I worked on this first surrogate question for 20 years, from 1955 to 1975, at which point I switched over to the second surrogate question, because the first appeared to be too hard to finish in one lifetime. So, I prepared to understand, and possibly intervene in, the aging process by going to medical school! And I've been a gerontologist (not a geriatrician), so to speak, ever since. BTW, my current research effort is to sequence the DNA of Supercentenarians and find out what genes they have in common.

In the intervening years, I've learned a few things here and there, about how to go about answering these original questions with the help of geniuses like astronomer Carl Sagan and physicist Stephen Hawking. But I'm still dissatisfied with my tentative answers.

To provide some perspective on our original three questions, I would like to introduce five (counter-anthropomorphic) revolutions in the intellectual history of mankind (counter-anthropomorphic in that they defy the human conceit that always puts us at the center of things, "as God intended"). Each revolution in human thought puts us in a successively lesser status in the grand scheme of things, as it were...

Throughout antiquity, from caveman times (200,000 years ago in sub-Saharan Africa) and the origin of recorded history (~5,000 years ago) to the beginning of Western Civilization (Athens, Greece; ~2,500 years ago), Homo sapiens was characterized by magical thinking. Ordinary men assumed that the Earth was flat (essentially a large disk floating on an infinite ocean but fixed in place by pillars [or maybe four huge elephants standing on the back of one giant turtle {"it's turtles all the way down," according to a woman reprimanding Bertrand Russell for his unorthodox views of recursion!}]). A stationary hemispherical dome sat above the Earth -- called the firmament -- in which the stars were embedded. The stars rotated relatively quickly in their outer sphere, while the five visible planets, the Sun, and the Moon dwelt within smaller spheres somewhere in between.

1. The Aristotelian Revolution (330 BC); A Spherical Earth and a Geocentric system. Aristotle argued for a spherical Earth based on observations of lunar eclipses; Eratosthenes later calculated the Earth's circumference from measurements of ground shadows at different locations as 252,000 stades (~40,000 km), and this turned out to be nearly correct!

A spherical Earth was quite a stretch of the ancient human imagination, but once adopted, this geocentric model was refined by the Alexandrian astronomer Ptolemy to include dozens of celestial spheres for all the known heavenly bodies (150 AD). Epicycles -- already present in Ptolemy's model -- were multiplied in the Middle Ages to account for discrepancies between model predictions and actual observations by careful astronomers. Contrary to folklore, by the time of Columbus all educated people knew that the Earth was round; falling off the edge into the mouth of a dragon was a myth only for the superstitious.
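The ancient circumference calculation rests on nothing more than proportional reasoning, and it can be checked in a few lines. This is a minimal sketch using the classically reported figures attributed to Eratosthenes (a shadow-angle difference of about 7.2 degrees between two cities roughly 5,000 stades apart); the variable names are mine:

```python
# At the summer solstice the Sun was directly overhead at one city
# (no shadow), while a vertical rod at a second city, ~5,000 stades
# due north, cast a shadow at about 7.2 degrees -- 1/50 of a circle.
ANGLE_DIFF_DEG = 7.2        # reported shadow-angle difference
DISTANCE_STADES = 5_000     # reported distance between the two cities

# If that arc subtends 7.2 of 360 degrees, the full circle is:
circumference = (360.0 / ANGLE_DIFF_DEG) * DISTANCE_STADES
print(int(circumference))   # 250000 -- close to the reported 252,000 stades
```

The same proportion works with any consistent units; the ancient uncertainty was entirely in measuring the angle and the overland distance, not in the geometry.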

2. The Copernican Revolution (1514 AD); A Heliocentric Model of the solar system; Men were no longer located at the center of the universe. We are the ones that go around the Sun, and not conversely.

A heliocentric solar system was subsequently adopted by Galileo, but this belief nearly cost him his life when he was tried for heresy by the Roman Inquisition. He challenged the central dogma, claiming that the Earth actually moved while the Sun was (relatively) stationary (1633 AD).

3. The Darwinian Revolution (1859); Men were not created by God first, with animals arriving on the scene secondarily. In fact, we and the other animals descended from a common ancestor through a process called evolution.

4. The Freudian Revolution (1923); Men were not even possessed of a fully rational mind, but one divided into Id, Ego, and Superego.

5. The Watsonian Revolution (2000); With the sequencing of the human genome we can now start to read "the book of life" (Francis Collins, Craig Venter, Eric Lander, and Leroy Hood). Synthetic Biology will be the major application of this knowledge that will someday lead to a cure for all chronic diseases, possibly through the use of stem-cell therapy.

Notice that I deliberately did not include in my list events like the Wright Brothers flight at Kitty Hawk (1903), the invention of the atomic bomb (1945), or landing of men on the Moon (1969), although these were all significant events of the last century.

So we should now go back and review where we are in answering our three fundamental questions. Sadly, the rate of progress in answering these sorts of questions even by genius-level humans (Aristotle, Leonardo da Vinci, Galileo, Newton, Darwin, and Einstein are some of my favorites) leads me to conclude that I won't get to satisfactory answers in my lifetime. The time remaining is simply too short. (See the book Year Million, edited by Damien Broderick.)

That means that I need to devote my remaining years to solving the problem of aging before I do anything else (or die trying). And that's what I've tried to do for the last 15 years or so. I still need several more lifetimes before I'm done. I've also come to the conclusion that I can't do it alone. I need your help! If you share my dilemma, go to our GRG.ORG website and start reading to find out more specifically what you can do. You can make the engineered conquest of aging and disease your life-long quest, as I have, until we finally figure it out. But, realistically, when could we do this, if ever?

According to futurist Ray Kurzweil, reasoning from Moore's Law (for computer chips), the "Singularity" is supposed to arrive around the year 2038 -- the point at which the rate of scientific progress becomes hyper-exponential, men merge with machines, and nothing past that date can be predicted by modern reckoning (like passing into a 'black hole').
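Extrapolations like Kurzweil's rest on simple compound doubling. Here is a minimal sketch of Moore's Law as a doubling rule; the 2-year doubling period is an assumption (one common statement of the law -- the exact period is debated), and the 30-year span is simply 2008 to 2038:

```python
# Moore's Law as a doubling rule (a 2-year doubling period is assumed;
# the exact period is debated and the law is empirical, not physical).
YEARS = 30                 # roughly the span from 2008 to 2038
DOUBLING_PERIOD = 2        # assumed years per doubling

doublings = YEARS // DOUBLING_PERIOD
growth_factor = 2 ** doublings
print(growth_factor)       # 32768 -- a ~33,000-fold increase in 30 years
```

Sustained doubling is what makes such forecasts feel discontinuous: the last few doublings dwarf everything that came before.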

But if we're all looking forward to the arrival of this Singularity, we have to make sure that we don't die along the way. After all, that hypothetical date is still 30 years off. As my colleague Dr. Aubrey de Grey predicts, we need to achieve an "escape velocity" in which we gain at least one year of additional life expectancy for each year that passes. That may come in only 20 years.
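The "escape velocity" idea can be captured in a toy model. This is a minimal sketch with purely illustrative numbers; the function name and parameters are mine, not Dr. de Grey's:

```python
# A toy model of "longevity escape velocity" (all numbers illustrative).
# Each calendar year, one year of life elapses while medical progress
# adds `gain_per_year` years back to your remaining life expectancy.
def years_remaining(initial_remaining, gain_per_year, horizon):
    remaining = initial_remaining
    trajectory = []
    for _ in range(horizon):
        remaining = remaining - 1 + gain_per_year
        trajectory.append(remaining)
    return trajectory

# Below escape velocity (gain < 1 year/year), the clock still runs out:
print(years_remaining(30, 0.5, 5))  # [29.5, 29.0, 28.5, 28.0, 27.5]
# At or above escape velocity (gain >= 1 year/year), it never does:
print(years_remaining(30, 2, 5))    # [31, 32, 33, 34, 35]
```

The threshold behavior is the whole point: a gain of 0.99 years per year only postpones the end, while a gain of 1.0 or more postpones it indefinitely.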

So, there's still hope for those of you who are relatively young. I envy you who are in your 20's today, since you will almost certainly live to see this new world come to pass. I might not!

In conclusion, even if you don't agree with whatever I've said before, I would like to leave you with a major concern that I have for the future of our country...

As Prof. Richard Dawkins, the Oxford University biologist, has said, "We can distinguish at least two ways of looking at the world: (1) Magical thinking or superstition on the one hand and (2) The rigor of logic, scientific observation, and the gathering of evidence on the other -- that is, the faculty of reason." Reason and a respect for evidence are precious commodities -- the source of our human progress over the last 500 years. Through the use of reason, science and technology have given us advances in medicine and public health, including the eradication of many infectious diseases with vaccines and antibiotics, C-sections with general anesthesia, type-matched blood transfusions, CT/MRI scans, and a 50 percent increase in average life expectancy in the US over the last century (from about 49 to 75 years). Think about that. As Sir Arthur Clarke, who died last March, said, "Any sufficiently advanced technology is indistinguishable from magic." Reason has served to safeguard us against religious fundamentalists and charlatans who would profit from concealing the truth.

Yet today an epidemic of irrational thought is running rampant in our society (New-Age mystics, astrologers, Tarot-card readers, palmists, people who will read your aura or your tea leaves, speaking-in-tongues [glossolalia in Pentecostalism], and what have you). I assert that irrational thought is not harmless. Alchemy, phrenology, Ouija boards, claims of UFO abductions by aliens in the night, crop circles, dowsing, the Loch Ness Monster, Big Foot, ghosts, witches, warlocks, goblins, the Easter Bunny, and Santa Claus can be relatively harmless. But when we teach Creationism to school children as part of the academic curriculum; flock to witch doctors or spiritualists (Voodoo, Santeria [animal sacrifice], Macumba) to heal our loved ones; use homeopathy or moxibustion; call on psychic surgeons; employ a professional medium in a seance to communicate with the dearly departed (our late relatives); perform ritual sacrifices of virginal maidens (the Aztecs in Mexico); burn heretics at the stake (as during the Spanish Inquisition); interrogate military prisoners using "extraordinary rendition"; or employ forms of torture like "waterboarding," it can profoundly undermine the ethical basis of our Western Civilization. Beliefs in Telepathy (mind reading), Precognition (forecasting the future), Clairvoyance (Extra-Sensory Perception [ESP]), Psychokinesis (bending spoons or stopping/starting clocks without touching them), or other forms of parapsychological intervention, such as remote group prayer for infertile women to get pregnant at a higher rate than normal, are a dangerous tendency.

Our citizens too often confuse simple cause-and-effect relationships -- as when our leaders take credit for successes that might be attributable to causes other than their own interventions, which, under objective scrutiny, might even have been counterproductive, like the troop surge in Iraq. Leaders conceal the true costs of their decisions in treasure and blood, or employ 'denial' to make our citizens feel "warm and fuzzy," while they surreptitiously support their well-connected friends or abrogate nuclear-weapon non-proliferation treaties to express displeasure with Russian intervention in Georgia (talk about shooting yourself in the foot!).

In ancient times, superstition was the best that humans could do; before modern supercomputer modeling, we were largely helpless in the face of seemingly random phenomena like earthquakes/tsunamis, hurricanes/typhoons, cyclones/tornadoes, volcanoes, forest fires, solar-radiation flares, ice ages, global warming, floods, droughts, famine, pandemic plagues/pestilence, and other forms of seemingly capricious adversity. We are still helpless whenever we lose our utilities -- from electricity and telephone to communication satellites or refineries -- but at least we know why.

Also, our modern society has an insatiable fascination with the private lives of rock stars, movie stars, and TV-series personalities. Hollywood celebrities seek publicity, and the paparazzi and tabloid newspapers graciously indulge them with gossip at industrial strength. But to what logical end?

To this graduating class I say: use reason. And if you have clients in your practice, students in your classrooms, or children in your families, educate them in the use of critical thinking and teach them to reject superstition in their own lives. The future success of our society depends on it.

Whether one believes our human condition derives from an intentional act of God or else appeared somehow as a subtle, emergent property of random Darwinian evolution, it is evident that the predicament of human mortality was thrust upon us without our consent. So how shall we deal with this tragic phenomenon we call "death" in the future, assuming that we will have real choices available for medical intervention? I hope we choose life.

You should be aware that for the first time in human history, stem-cell biology -- in conjunction with nanotechnology, still to be perfected -- will provide us with the option to choose which of the following models of the human condition is the best for us.

(1) In Leviathan, Thomas Hobbes (1651) described the life of man as
"solitary, poor, nasty, brutish, and short," while

(2) In Hamlet, William Shakespeare (1601) wrote,
"What a piece of work is man! How noble in reason! How infinite in faculty! In form and moving how express and admirable! In action how like an angel! In apprehension how like a god! The beauty of the world! The paragon of animals!"

Hopefully, we will choose the second model rather than the first.

-- L. Stephen Coles, September 26, 2008; Los Angeles, California; USA