Sunday, July 16, 2017
How to live with death, the 6 psychological flaws that keep the talented from achieving greatness, a stunning piece on the total solar eclipse of 1869
How to live with death, a remarkable account of the total solar eclipse of 1869, the founding father of neuroscience on the 6 psychological flaws that keep the talented from achieving greatness, and more.
Hello, Terry Travers! This is the weekly email digest of brainpickings.org by Maria Popova. If you missed last week's edition – Tolstoy on love, Ursula K. Le Guin on the magic of real human communication, Kafka on music and why we make art – you can catch up right here. And if you're enjoying this newsletter, please consider supporting my labor of love with a donation – each month, I spend hundreds of hours and tremendous resources on it, and every little bit of support helps enormously.
Our lifelong struggle to learn how to live is inseparable from two facts only: that of our mortality and that of our dread of it, dread with an edge of denial. Half a millennium ago — a swath of time strewn with the lives and deaths of everyone who came before us — Montaigne captured this paradox in his magnificent meditation on death and the art of living: "To lament that we shall not be alive a hundred years hence, is the same folly as to be sorry we were not alive a hundred years ago." Centuries later, John Updike — a mind closer to our own time but now swept by mortality to the same nonexistence as Montaigne — echoed the sentiment when he wrote: "Each day, we wake slightly altered, and the person we were yesterday is dead, so why… be afraid of death, when death comes all the time?"
How to live with what lies behind that perennial "why" is what British psychoanalyst Adam Phillips examines in Darwin's Worms: On Life Stories and Death Stories (public library) — a rather unusual and insightful reflection on mortality, suffering, and the redemptions of living through the dual lens of the lives of two cultural titans who have shaped the modern understanding of life from very different but, as Phillips demonstrates, powerfully complementary angles: Charles Darwin and Sigmund Freud.
For Freud, as for Darwin, there is not just the right amount of suffering in any conventionally moral sense of right: for who could ever condone suffering? But there is a necessary amount. Our instincts, at once the source of our suffering and of our satisfaction, ensure the survival of the species and the death of the individual.
The amount of suffering in the world is not something added on; it is integral to the world, of a piece with our life in nature. This is one of the things that Freud and Darwin take for granted. But it is one thing not to believe in redemption — in saving graces, or supernatural solutions — and quite another not to believe in justice. So the question that haunts their writing is: how does one take justice seriously if one takes nature seriously?
Darwin, to be sure, had his own profound confrontation with suffering in his beloved daughter Annie's death just as he was beginning to tell the story of life itself. After two generational revolutions of the cycle of life, Freud made our relationship to death a centerpiece of understanding our trials of living. With an eye to these parallel legacies, Phillips writes:
If death was at once final and unavoidable, it was also a kind of positive or negative ideal; it was either what we most desired, or what, for the time being, had to be avoided at all costs. For both Darwin and Freud, in other words, death was an organizing principle; as though people were the animals that were haunted by their own and other people's absences… Modern lives, unconsoled by religious belief, could be consumed by the experience of loss.
So what else could a life be now but a grief-stricken project, a desperate attempt to make grief itself somehow redemptive, a source of secular wisdom? Now that all modern therapies are forms of bereavement counselling, it is important that we don't lose our sense of the larger history of our grief. It was not life after death that Darwin and Freud speculated about, but life with death: its personal and trans-generational history.
Redemption — being saved from something or other — has been such an addictive idea because there must always be a question, somewhere in our minds, about what we might gain from descriptions and experiences of loss. And the fact of our own death, of course, is always going to be a paradoxical kind of loss (at once ours and not ours). But the enigma of loss — looked at from the individual's and, as it were, from nature's point of view — was what haunted Darwin and Freud. As though we can't stop speaking the language of regret; as though our lives are tailed by disappointment and grief, and this in itself is a mystery. After all, nothing else in nature seems quite so grief-stricken, or impressed by its own dismay.
Well before twentieth-century physics illuminated the impartiality of the universe, Darwin and Freud planted the seed for rendering the notion of suffering — that supreme species of disappointment at the collision between human desires and reality — irrelevant against the vast backdrop of nature, inherently indifferent to our hopes and fears. Phillips writes:
Darwin and Freud showed us the ways in which it was misleading to think of nature as being on our side. Not because nature was base or sinful, but because nature didn't take sides, only we did. Nature, in this new version, was neither for us nor against us, because nature (unlike God, or the gods) was not that kind of thing. Some of us may flourish, but there was nothing now that could promise, or underwrite, or predict, a successful life. Indeed, what it was that made a life good, what it was about our lives that we should value, had become bewildering. The traditional aims of survival and happiness, redescribed by Darwin and Freud, were now to be pursued in a natural setting. And nature seemed to have laws but not intentions, or a sense of responsibility; it seemed to go its own unruly, sometimes discernibly law-bound, way despite us (if nature was gendered as a mother, she was difficult to entrust ourselves to; and if we could love a mother like this, what kind of creatures were we?). And though we were evidently simply parts of nature — nature through and through — what nature seemed to be like could be quite at odds with what or who we thought we were like.
Nature, apparently organized but not designed, did not have what we could call a mind of its own, something akin to human intelligence. Nor does nature have a project for us; it cannot tell us what to do, only we can. It doesn't bear us in mind because it doesn't have a mind… And what we called our minds were natural products, of a piece with our bodies. So we couldn't try to be more or less natural — closer to nature, or keeping our distance from it — because we were of nature.
If, once, we could think of ourselves as (sinful) animals aspiring to be more God-like, now we can wonder what, as animals without sin (though more than capable of doing harm), we might aspire to.
Trumpeted by the press as "the great eclipse of the nineteenth century," the total solar eclipse of August 7, 1869 was the world's first astronomical event marketed as popular entertainment — not merely a pinnacle of excitement for the scientific community, but a celestial spectator sport for laypeople. Smoked, stained, or varnished glass sold faster than any other item in American stores that summer. Small towns on the eclipse path, which stretched from Alaska to the mid-Atlantic coast, experienced an unprecedented surge of tourism. Newspapers offered extensive coverage of the event in the weeks leading up to it, crowned by front-page headlines the morning after the eclipse.
But by far the most exquisite and original piece about the cosmic marvel came two months later from the inimitable Maria Mitchell (August 1, 1818–June 28, 1889) — America's first female astronomer, who paved the way for women in science and became the first woman hired by the United States federal government for a "specialized nondomestic skill" in her capacity as "computer of Venus" — a one-woman GPS guiding sailors around the world.
Total solar eclipse of 1869. Photograph by Benjamin Peirce, Harvard University.
Writing in the October issue of Hours at Home, Scribner's first magazine, Mitchell reported on the eclipse expedition she had led to Burlington, Iowa. (Nearly a decade later, she would draw on this experience and a subsequent one in her tips on how to watch a solar eclipse.) Fusing rigorous insight into the science of eclipses with a poetic account of this uncommonly transcendent encounter with the universe, Mitchell ends her anonymous essay with a clever rhetorical twist that throws a grenade into society's most foundational assumptions about science, gender, and human nature.
She begins by framing the enormity of the fanfare surrounding the event, summoning the throngs "ticketed to totality":
Every known astronomer received courteous invitations into the shadow. Every astronomer, professional or amateur, prepared to go. The observatories must have been left undirected; the mathematical chairs of the colleges must have been empty, and, judging from the crowded condition of hotels within the darkness, Saratoga and Newport must have felt the different set of the travelling current… In the halls of the hotels we saw meetings between friends long separated, and heard joyous exclamations as grayhaired men met and shook hands and laughed, that neither could recognize in the middle-aged other the youth whom he had left, and whom he had since known only through scientific journals.
Leading a team of six young scientists, Mitchell arrived at Burlington College on August 4, "too late to attempt any work that day," and was engulfed in inclement weather for the next two days, which gloomed from cloudy to "rainy, rainy all day." When the third day broke "as beautiful as morning could be," they began setting up their instruments. Writing deliberately in the masculine, she describes the technical setup:
In preparing for an observation of time, the astronomer gives himself every possible facility. He ascertains to a tenth of a second the condition of his chronometer, not only how fast or how slow it is, but how much that fastness or that slowness varies from hour to hour. He notes exactly the second and part of a second when the expected event should arrive; and a short time before that he places himself at the telescope.
Diagram of a total solar eclipse (Atlas of Astronomy, 1869)
Mitchell then pivots smoothly from this matter-of-fact reportage to a lyrical account of the pinnacle of the experience — the sight of the corona and its attendant otherworldly light:
There were some seconds of breathless suspense, and then the inky blackness appeared on the burning limb of the sun. All honor to my assistant, whose uniform count on and on, with unwavering voice, steadied my nerves! That for which we had travelled fifteen hundred miles had really come. We watched the movement of the moon's black disk across the less black spots on the sun's disk, and we looked for the peculiarities which other observers of partial eclipses had known. The colored glasses of our telescope were several, arranged on a circular plate, so that we could slip a green one before the eye, change it for a red one or a yellow one, or, if we wished to look with the eye unprotected, a vacant space could be found in the circumference. In the course of the hour, from the beginning of the eclipse to total phase, this was readily done. I fancied that an orange hue suited my eye best, and kept that in place intending to slip it aside and receive the full light when the darkness came on. As the moon moved on, the crescent sun became a narrower and narrower golden curve of light, and as it seemed to break up into brilliant lines and points, we knew that the total phase was only a few seconds off.
Light clouds had for some time seemed to drift toward the sun; the Mississippi assumed a leaden hue; a sickly green spread over the landscape; Venus shone brightly on one side of the sun, Mercury on the other; Arcturus was gleaming overhead, Saturn was rising in the east; the neighboring cattle began to low; the birds uttered a painful cry; fireflies twinkled in the foliage, and when the last ray of light was extinguished, a wave of sound came up from the villages below, the mingling of the subdued voices of the multitude.
Instantly the corona burst forth, a glory indeed! It encircled the sun with a soft light, and it sent off streamers for millions of miles into space!
On looking through the glass, two rosy prominences were seen on the right of the sun's disk, perhaps one-twentieth of the diameter of the moon, having the shape of the half-blown morning-glory. I found myself continually likening almost all these appearances to flowers, possibly from the exquisite delicacy of the tints. They were not wholly rosy, but of a variegated pink and white, with a mingling of violet.
Corona of the total solar eclipse of 1869, as seen through a four-inch telescope. Lithograph by J. Bien and J. F. Gedney.
Here, in a supreme testament to the subjectivity of individual human consciousness — the sole tool we have for probing the objective truths of the universe — Mitchell reminds us that no two perceptual experiences of the same phenomenon are alike. (My blue is never your blue.) A century and a half before modern neuroscience coined the notion of the "qualia" that shape our experience of the world, she writes:
Any correct observation of color is, however, impossible. Beside the different perception of the eye, in its normal state, the retina cannot instantly lose the effect of the colored glass. I had just left an orange glass, and was quite insensible to that color; while one of our party who had been using a green glass declares the protuberances to be orange-red.
In a parallel sentiment, with an eye to the representatives of various disciplines observing the eclipse — photography, spectrography, astronomy — she offers a reflection on the nature of knowledge itself:
No one person can give an account of this eclipse, but the speciality of each is the bit of mosaic which he contributes to the whole.
"Four Views of the Solar Eclipse, August 1869" by John Adams Whipple
As I ran my glance along the limb of the moon I saw another protuberance much larger than the former ones, very nearly at the vertex, increasing rapidly. It seemed to be brought into light as the moon moved on; and yet, billowy in shape and mottled in color, it appeared to have, or possibly it had, a motion within itself. Next there leapt out on the left of the moon two more flower-shaped and flower-tinted creations. Twice, as I was looking at these, a flickering light caught my eye, as if from the moon's centre; another strangely shaped figure rushed out as if from behind the moon, and instantly the sun came forth. All nature rejoiced, and much as we needed more time, we rejoiced with Nature, and felt that we loved the light… The darkness was neither that of twilight nor of moonlight.
Now here is the ingenious twist: The attentive reader would notice that, throughout her account of the phenomenon, Mitchell is deliberately using the gender-neutral "we" to refer to her team of scientists and doesn't once use gender pronouns in her first-person narrative. A century before Ursula K. Le Guin so brilliantly unsexed the universal pronoun, Mitchell's choice inclines her reader to the assumption, standard in her era and still lamentably common in ours, that "scientist" defaults to maleness (even though the word itself had been coined for a woman thirty-five years earlier). Against that backdrop of implicit assumption, Mitchell draws on a Wordsworth verse and concludes her anonymous essay with a dramatic revelation surprising, perhaps even shocking, to her reader:
[The British astronomer Charles] Piazzi Smyth says: "The effect of a total eclipse on the minds of men is so overpowering, that if they have never seen it before they forget their appointed tasks, and will look around during the few seconds of obscuration to witness the scene." Other astronomers have said the same. My assistants, a party of young students, would not have turned from the narrow line of observation assigned to them if the earth had quaked beneath them. They would have said
— "by the storms of circumstance unshaken
And subject neither to eclipse nor wane,
Duty exists."
Was it because they were women?
In 1869, what a way to turn the magazine page into a drum on which to play such a zinging rimshot sting.
"Principles are good and worth the effort only when they develop into deeds," Van Gogh wrote to his brother in a beautiful letter about talking vs. doing and the human pursuit of greatness. "The great doesn't happen through impulse alone, and is a succession of little things that are brought together." But what stands between the impulse for greatness and the doing of the "little things" out of which success is woven?
That is what the Spanish pathologist and neuroscientist Santiago Ramón y Cajal — the founding father of modern neuroscience — examines in Advice for a Young Investigator (public library). Although Cajal's counsel is aimed at young scientists, it is replete with wisdom that applies as much to science as it does to any other intellectually and creatively ambitious endeavor — nowhere more so than in one of the pieces in the volume, titled "Diseases of the Will," presenting a taxonomy of the "ethical weaknesses and intellectual poverty" that keep even the most gifted young people from ascending to greatness.
It should be noted that Cajal addresses his advice to young men, on the presumption that scientists are male — proof that even the most visionary geniuses are still products of their time and place, and can't fully escape the limitations and biases of their respective era, or as Virginia Woolf memorably put it in Orlando, "It is probable that the human spirit has its place in time assigned to it." (Lest we forget, although the word "scientist" had been coined for a woman half a century earlier, women were not yet able to vote and were decades away from being admitted into European universities, so scientists in the strict academic sense were indeed exclusively male in Cajal's culture.) Still, when stripped of its genderedness, his advice remains immensely psychologically insightful, offering a timeless corrective for the pitfalls that keep talent and drive from manifesting into greatness, not only in science but in any field.
Considering the all too pervasive paradox of creative people "who are wonderfully talented and full of energy and initiative [but] who never produce any original work and almost never write anything," Cajal divides them into six classes according to the "diseases of the will" afflicting them — contemplators, bibliophiles and polyglots, megalomaniacs, instrument addicts, misfits, and theorists.
He examines the superficiality driving the "particularly morbid variety" of the first type:
[Contemplators] love the study of nature but only for its aesthetic qualities — the sublime spectacles, the beautiful forms, the splendid colors, and the graceful structures.
With an eye to his own chosen field of histology, which he revolutionized by using beauty to illuminate the workings of the brain, Cajal notes that a contemplator will master the finest artistic techniques "without ever feeling the slightest temptation to apply them to a new problem, or to the solution of a hotly contested issue." He adds:
[Contemplators] are as likable for their juvenile enthusiasm and piquant and winning speech as they are ineffective in making any real scientific progress.
More than a century before Tom Wolfe's admonition against the rise of the pseudo-intellectual, Cajal treats with special disdain the bibliophiles and polyglots — those who use erudition not as a tool of furthering humanity's enlightenment but as a personal intellectual ornament of pretension and vanity. He diagnoses this particular "disease of the will":
The symptoms of this disease include encyclopedic tendencies; the mastery of numerous languages, some totally useless; exclusive subscription to highly specialized journals; the acquisition of all the latest books to appear in the bookseller's showcases; assiduous reading of everything that is important to know, especially when it interests very few; unconquerable laziness where writing is concerned; and an aversion to the seminar and laboratory.
In a passage that calls to mind Portlandia's irrepressibly hilarious "Did You Read It?" sketch, he writes:
Naturally, our bookworm lives in and for his library, which is monumental and overflowing. There he receives his following, charming them with pleasant, sparkling, and varied conversation — usually begun with a question something like: "Have you read So-and-so's book? (An American, German, Russian, or Scandinavian name is inserted here.) Are you acquainted with Such-and-such's surprising theory?" And without listening to the reply, the erudite one expounds with warm eloquence some wild and audacious proposal with no basis in reality and endurable only in the context of a chat about spiritual matters.
Cajal examines the central snag of these vain pseudo-scholars:
Discussing everything — squandering and misusing their keen intellects — these indolent men of science ignore a very simple and very human fact… They seem only vaguely aware at best of the well-known platitude that erudition has very little value when it does not reflect the preparation and results of sustained personal achievement. All of the bibliophile's fondest hopes are concentrated on projecting an image of genius infused with culture. He never stops to think that only the most inspired effort can liberate the scholar from oblivion and injustice.
Three decades before John Cowper Powys's incisive dichotomy between being educated and being cultured, Cajal is careful to affirm the indisputable value of learnedness put to fertile use — something categorically different from erudition as a personal conceit:
No one would deny the fact that he who knows and acts is the one who counts, not he who knows and falls asleep. We render a tribute of respect to those who add original work to a library, and withhold it from those who carry a library around in their head. If one is to become a mere phonograph, it is hardly worth the effort of complicating cerebral organization with study and reflection. Our neurons must be used for more substantial things. Not only to know but also to transform knowledge; not only to experience but also to construct.
The eloquent fount of erudition may undoubtedly receive enthusiastic plaudits throughout life in the warm intimacy of social gatherings, but he waits in vain for acclamation from the great theater of the world. The wise man's public lives far away, or does not yet exist; it reads instead of listens; it is so austere and correct that recognition with gratitude and respect is only extended to new facts that are placed in circulation on the cultural market.
Next come the megalomaniacs, who may be talented and motivated, but are bedeviled by a deadly overconfidence that ultimately renders them careless and unrigorous in their work. Cajal writes:
People with this type of failure are characterized by noble and winning traits. They study a great deal, but love personal activities as well. They worship action and have mastered the techniques needed for their research. They are filled with sincere patriotism and long for the personal and national fame that comes with admirable conquests.
Yet their eagerness is rendered sterile by a fatal flaw. While they are confirmed gradualists in theory, they turn out to rely on luck in practice. As if believing in miracles, they want to start their careers with an extraordinary achievement. Perhaps they recall that Hertz, Mayer, Schwann, Roentgen, and Curie began their scientific careers with a great discovery, and aspire to jump from foot soldier to general in their first battle. They end up spending their lives planning and plotting, constructing and correcting, always submerged in feverish activity, always revising, hatching the great embryonic work — the outstanding, sweeping contribution. And, as the years go by, expectation fades, rivals whisper, and friends stretch their imaginations to justify the great man's silence. Meanwhile, important monographs are raining down abroad on the subjects they have so painstakingly explored, fondled, and worn to a thread.
Cajal reflects on the only remedy for the megalomaniac's main stumbling block:
All of this happens because when they started out these men did not follow with humility and modesty a law of nature that is the essence of good sense: Tackle small problems first, so that if success smiles and strength increases one may then undertake the great feats of investigation.
He considers a special class of megalomaniac — the serial ideator who always fails to reach the stage of execution and whose rampant dreaming chronically falls short of doing. (This type, it occurs to me, has an analog in love — the serial besotter, who thrives on the thrill of infatuation, but crumbles as soon as the fantasy of the beloved becomes a real relationship teeming with imperfection and the often toilsome work of love.) Cajal writes:
The dreamers who are reminiscent of the conversationalists of old might be seen as a variety of megalomaniac. They are easily distinguished by their effervescence and by a profusion of ideas and plans of attack. Their optimistic eyes see everything through rose-colored glasses. They are confident that, once accepted, fruits of their initiative will open broad horizons in science, and yield invaluable practical results as well. There is only one minor drawback, which is deplorable — none of their undertakings are ever completed. All come to an untimely end, sometimes through lack of resources, and sometimes through lack of a proper environment, but usually because there were not enough able assistants to carry out the great work, or because certain organizations or governments were not sufficiently civilized and enlightened to encourage and fund it.
The truth is that dreamers do not work hard enough; they lack perseverance.
He turns to the instrument addicts next — a class particularly prominent in our present culture of techno-fetishism. In a sentiment that applies with astonishing precision to today's legions of failed serial entrepreneurs — the foundering founders who have fetishized the glitzy sleekness of an invention, be it a gadget or an app, over its core conceptual value proposition — Cajal writes:
This rather unimportant variety of ineffectualist can be recognized immediately by a sort of fetishistic worship of research instruments. They are as fascinated by the gleam of metal as the lark is with its own reflection in a mirror.
Cold-hearted instrument addicts cannot make themselves useful. They suffer from an almost incurable disease, especially when it is associated (as it commonly is) with a distinctive moral condition that is rarely admitted — a selfish and disagreeable obsession with preventing others from working because they personally do not know how, or don't want, to work.
Next, Cajal turns to the misfit — though I suspect the word could have been translated better, for he doesn't mean the visionary nonconformist who propels society forward but the person who has ended up in a vocation or environment ill-fitted to their inherent talents, thwarting them from reaching their potential. He writes:
Instead of being abnormal, misfits are simply unfortunate individuals who have had work unsuited to their natural aptitudes imposed on them by adverse circumstances. When everything is said and done, however, these failures still fall in the category of abulics because they lack the energy to change their course, and in the end fail to reconcile calling and profession.
It appears to us that misfits are hopelessly ill. On the other hand, this certainly does not apply to the young men whose course has been swayed by family pressure or the tyrannies of their social environment, and who thus find themselves bound to a line of work by force. With their minds still flexible, they would do well to change course as soon as favorable winds blow. Even those toiling in a branch of science they do not enjoy — living as if banished from the beloved country of their ideals — can redeem themselves and work productively. They must generate the determination to reach for lofty goals, to seek an agreeable line of work — which suits their talents — that they can do well and to which they can devote a great deal of energy. Is there any branch of science that lacks at least one delightful oasis where one's intellect can find useful employment and complete satisfaction?
Next come the theorists. Marked by "a certain flaunting of intellectual superiority that is only pardoned in the savant renowned for a long series of true discoveries," the theorist becomes so besotted with her ideas and hypotheses that she shrinks from testing them against reality and instead continually narrows her lens to factor in only what supports her theories. Cajal writes:
There are highly cultivated, wonderfully endowed minds whose wills suffer from a particular form of lethargy, which is all the more serious because it is not apparent to them and is usually not thought of as being particularly important. Its undeniable symptoms include a facility for exposition, a creative and restless imagination, an aversion to the laboratory, and an indomitable dislike for concrete science and seemingly unimportant data. They claim to view things on a grand scale; they live in the clouds. They prefer the book to the monograph, brilliant and audacious hypotheses to classic but sound concepts. When faced with a difficult problem, they feel an irresistible urge to formulate a theory rather than to question nature. As soon as they happen to notice a slight, half-hidden, analogy between two phenomena, or succeed in fitting some new data or other into the framework of a general theory — whether true or false — they dance for joy and genuinely believe that they are the most admirable of reformers. The method is legitimate in principle, but they abuse it by falling into the pit of viewing things from a single perspective. The essential thing for them is the beauty of the concept. It matters very little whether the concept itself is based on thin air, so long as it is beautiful and ingenious, well-thought-out and symmetrical.
Exclaiming that "so many apparently immutable doctrines have fallen," Cajal summarizes this particular pitfall rather bluntly:
Basically, the theorist is a lazy person masquerading as a diligent one. He unconsciously obeys the law of minimum effort because it is easier to fashion a theory than to discover a phenomenon.
Cajal takes care to note that while hypotheses have their use "as inspiration during the planning stage of an investigation, and for stimulating new fields of investigation," the theorist's mistake is a blind attachment to her theories not as a means to truth but as an end of intellectual labor:
One must distinguish between working hypotheses … and scientific theories. The hypothesis is an interpretative questioning of nature. It is an integral part of the investigation because it forms the initial phase, the virtually required antecedent. But to speculate continuously — to theorize just for its own sake, without arriving at an objective analysis of phenomena — is to lose oneself in a kind of philosophical idealism without a solid foundation, to turn one's back on reality.
Let us emphasize again this obvious conclusion: a scholar's positive contribution is measured by the sum of the original data that he contributes. Hypotheses come and go but data remain. Theories desert us, while data defend us. They are our true resources, our real estate, and our best pedigree. In the eternal shifting of things, only they will save us from the ravages of time and from the forgetfulness or injustice of men. To risk everything on the success of one idea is to forget that every fifteen or twenty years theories are replaced or revised. So many apparently conclusive theories in physics, chemistry, geology, and biology have collapsed in the last few decades! On the other hand, the well-established facts of anatomy and physiology and of chemistry and geology, and the laws and equations of astronomy and physics remain — immutable and defying criticism.