When Illness Goes Public: Celebrity Patients and How We Look at Medicine by Barron H. Lerner
Reviewed by Sherwin B. Nuland
The New Republic Online
If there was a single theme that united the various forms of social unrest characterizing the late 1960s, it surely was the demand for individual self-determination. If the placid decade preceding those turbulent years concealed a growing discontent with the long heritage of a paternalistic and accepting society, no one seems to have noticed. The sudden demands for change required the coming of age of a new generation, its members literally and figuratively shouting out their antagonism at established indifference to inequities no longer to be tolerated, even if it took violent struggle (or so some of them insisted) to bring about the necessary transformations. Like revolutionaries of every past era, the young radicals were intent on loosing themselves from norms that they believed to be inhibiting the free expression of their true selves.
Plenty of elders responded with alarm and even disdain, and supported attempts to quash the nascent movement, sometimes with tactics as aggressive as those used by the radicalized youths themselves. But others listened to their message and thought deeply about it, perhaps because some among them had long been dissatisfied with certain social shortcomings and were suddenly finding themselves emboldened to speak out and take action. Still others so motivated may simply have been energized by the process that was given the name "consciousness-raising." Being made consciously aware for the first time that certain of the ways in which society functioned were inherently unfair, plenty of well-meaning people found themselves puzzled over why they had never confronted or even considered the contentious issues that their children were rudely bringing to their attention.
It took plenty of loud urging, mass protests, and sometimes disorderly agitation by the radicalized activists to shift the center. But the center did shift. The civil rights movement, the women's movement, the gay liberation movement -- all owe a great deal to the social and intellectual explosiveness of the 1960s. Though the men and women most affected by the long-standing injustices were also the most active, others joined in. Slowly at first -- and in many ways too slowly, even today -- important changes began to take place; and, though much remains to be done, far more has been accomplished than might have been predicted at the time, by social conservatives or by outright opponents of the needed reforms.
There was one area in which the victims of the inequities consisted not only of certain large segments of society, but, in fact, of everyone. This area was medicine. Since its beginnings some two and a half millennia ago, Western medicine -- which would grow into the scientific medicine of the past few centuries -- had been characterized by a relationship between doctor and patient that might be called beneficent paternalism, in which the physician determined what was best for the patient, who was expected to comply with all recommendations. It was understood that such decisions were to be made with only the interests of the patient in mind, and were to follow principles of deontology, or the obligation to do the right and moral thing. These obligations were most famously spelled out in declarations such as the Oath of Hippocrates and the Prayer of Maimonides, but they appear as well in many codes of ethics, ancient and modern. Underlying all of this was the unstated compact by which all stakeholders were joined, namely that everyone tacitly assented to the unwritten proposition that doctors knew what they were doing.
None of this was ever overtly questioned, either by the medical profession or by the laity. Despite a low-grade level of chronic grumbling on both sides -- and occasionally having attention called to certain inadequacies, through the writings and comments of leaders of medicine -- all agreed to agree that the system worked well, to the benefit of patients, doctors, and the general welfare. Enter the 1960s and the self-determination movement. For the first time in their long history of authority, the leaders of medicine were being asked -- or more precisely, told -- to re-evaluate their basic premise of authority, their presumption of effectiveness, and even their benevolence. The same quality of discontent with long-established tradition that was being brought to bear on racial and sexual inequality exploded in the midst of the serene therapeutic companionship that formed the image of ideal medical care.
But unlike the other arenas in which change was being demanded, the field of medicine was in itself already changing rapidly, in ways that magnified the need for new models of functioning and added urgency to it. An abundance of remarkable discoveries were pouring out of American laboratories in particular, but also from certain other scientifically advanced nations, and these discoveries were revolutionizing diagnosis and therapy. The new knowledge and technologies brought not only new understanding, but also new problems. Unanticipated complexities were raining down upon the process of clinical decision-making. Choices were wider and much harder to make than they had been in the past, and not infrequently they involved the moral and spiritual values not only of the patient but of the doctor as well, and even of the entire fabric of society. This was especially true in issues arising around terminal illness, in which quality of life, definitions of death, and the very uses of certain of the innovations were increasingly coming under scrutiny.
The greatest of the advances bringing the discoveries of twentieth-century science to the bedside had been the introduction of antibiotics, after penicillin came into general use during the final phases of World War II. But matters really went into high gear in the 1960s, with the rapid development of university medical centers, the massive growth of the pharmaceutical industry, and the availability of huge governmental grants for biomedical research. It was this triumvirate of university laboratories and clinics, industry-sponsored research, and federal involvement that led to the present happy situation in which blizzards of new discoveries reach swift usefulness in the care of the sick, and can be evaluated by ever-improving modern statistical methodologies. Not surprisingly, the appearance of the notion of a patient's right to make informed decisions about the course of his or her own care served as a spur to achievement, as individuals, advocacy groups, and the knowledgeable public began to make demands not only on their doctors and the profession as a whole, but also on government to support research and clinical applications.
All of this, as I say, began concomitantly with the rise of the self-determination movement. Vast and rapid social change was taking place at the same time as vast and rapid scientific change, and their interplay created a complex mélange of reciprocal influences affecting both of them in ways still not completely understood.
Philosophers, sociologists, theologians, and lawyers came to realize that they had much to contribute to the discussion. As the 1960s merged into the 1970s, they weighed in on one issue after another, writing articles and books that revealed such a depth of comprehension of medical problems that the profession could no longer ignore their contributions to the discourse about appropriateness of care and the patterns by which decisions were made. And thus was born the modern field of "bioethics," a word that came into being in the late 1960s to encompass the full range of quandaries that arise from scientific discovery. The purpose of bioethics, as articulated by one of its founders, the philosopher and historian Albert Jonsen, is to "designate a vision of a world in which scientific advances were linked to human and environmental values in an evolutionary progress toward human community."
"Progress toward human community" is -- among other things -- progress toward decision-making that is less in the hands of scientists and physicians alone and progressively more in the hands of the affected communities and individuals. The most obvious casualty of such an approach to medical care was the old tradition of paternalism. Being responsive to the demand for self-determination, bioethics is a natural outgrowth of the events of the 1960s. Not unexpectedly, it brought with it the principle of paternalism's opposite: autonomy, and the notion that a man or woman seeking medical care is the man or woman best qualified to make decisions that affect the nature and extent of that care -- not one's doctor, but oneself. On the part of the patient, this involves becoming well informed by whatever means necessary; on the part of the doctor, it means sharing all available information, including the benefits as well as the risks and potential for harm inherent in every intervention. Not unexpectedly, autonomy soon became the central focus of the rapidly growing field of bioethics.
Barron H. Lerner's book centers on the transformative events affecting medical practice that came into focus in and around the late 1960s, and on the consequences of the new medical ethics. The story that Lerner tells is of the laity's increasing involvement in matters traditionally left to the profession, including the erosion of the public's certainty of medicine's benevolence and scientific infallibility. Ultimately, the book issues an appeal of sorts -- to both doctors and patients -- hoping to ensure that a well-informed public will continue to make reasonable demands on scientists and physicians that can only result in "progress toward human community."
Lerner is a medical historian whose previous book, The Breast Cancer Wars, won the Welch Medal, the highest literary honor bestowed by his colleagues in the American Association for the History of Medicine. But he is also a working physician, and his very readable new book is suffused with insights that come from years at the bedsides of the sick, dealing with the real problems of real people living in real time, where life-determining decisions must often be made under conditions of intense pressure and with incomplete information. In such circumstances, the deliberated and sometimes theoretical formulations of bioethical thinking must sometimes be temporarily abandoned in the interests of immediate action and the resolution of life-threatening problems. Lerner appreciates the individuality that every patient and every disease bring to the therapeutic encounter, and he sees such idealized benchmarks as evidence-based medicine and the randomized controlled trial for what they are: enormously useful tools as criteria of the highest standard of practice, and yet necessarily limited by the uniqueness of the disease process in each man, woman, or child who falls victim to illness, and limited also by the circumstances in which the illness takes place. "The larger point," he makes clear in a sentence that summarizes the underlying truth of clinical decision-making, "is that the care of all patients...is based on percentages and hunches, backed up, when possible, by the medical literature."
Lerner's engrossing book is, in fact, about uniqueness. The thirteen people described in the development of his thesis are unique not only because of the disease processes that brought them to medical attention, but also because of who they were. Each was a celebrity, either because he or she had been famous before getting sick or because sickness thrust him or her into the limelight, by focusing the public eye on the pathology of which he or she would become a symbol. Such luminaries as John Foster Dulles and Rita Hayworth share Lerner's pages with previously unknown figures such as Barney Clark and Libby Zion. The historian-physician takes the reader chronologically through the series of celebrities, from Lou Gehrig's first symptoms in 1938 to the most recent report on the condition of Lorenzo Odone in 2006. The lessons to be found in their stories are of several sorts, but they all intertwine into the message that Lerner intends us to carry away after his last paragraph has been read. That ultimate message is about autonomy: its values, its limits, and its occasional dangers.
To read about these celebrities is to consider the emergence of a group of factors that nowadays are so commonplace that they have come to be expected, such as public knowledge, and perhaps the deliberate publicization, of the details of an eminent person's diagnosis and even symptoms; the personal involvement of the celebrity in the movement to call attention to the disease and to find ways to treat it; the rallying to the cause by others affected by the disease, and the formation of advocacy groups of which the celebrity is the focus; and the consequent challenging of the medical profession, not only to find new methods of treatment, but to become involved in strategies advanced by the celebrity's personal efforts, sometimes including his or her own ferreting out of non-standard approaches. None of these factors could have appeared did we not live in a culture that celebrates fame. We tend to see something of ourselves -- or of our fantasies -- reflected in the lives of prominent people who capture the public imagination. Sometimes the celebrity represents the best we dream of being; in some deep, ever-hopeful compartment of the mind, he or she epitomizes what we believe we can yet become, or what we might have become had circumstances been just a bit different. And sometimes a celebrity shows us the basest qualities of which we fear ourselves capable, and becomes a focus of scorn that enables the fiction in which we need to believe: the fiction of our own moral superiority.
Of course, neither of these extremes exhausts the list of motives for our culture's fascination with the activities of public figures. But whatever its source, identification with another seems always to lie close beneath the surface of the scrutiny, the judgment, and even the imitation of the prominent that sometimes occurs. It is this sense of identification, unacknowledged or even unrecognized as it often is, that accounts for the lessons -- the good and the bad -- that we take from our images of such people. We can imitate them to our profit, or to our peril, or to both at the same time.
The cult of celebrity requires some degree of myth-making (or "framing," as scholars of journalism like to call it) in order to work its effects on us, even if the myth is as bogus as George Washington's constant virtue or Richard Nixon's constant vice. Though we search for the reality at its core, we nevertheless respond to the general framework of the fully developed myth, because it serves some need within us. Whether we are drawn to the lives of celebrities because they inspire us or because they are cautionary tales (or both), what we are likely to see in each instance is an aspect or a possibility within ourselves.
Celebrities stand out as exceptions to the general culture at the same time as they exemplify it. Cultural saints and cultural scoundrels alike, they reflect more than they determine the values of the times in which they live. In some respect they are us, and something in our innermost selves knows it. And it is in their being us that their influence lies. For every man-on-the-street who has benefited from the increased self-determination that some celebrities have exemplified, there is another who has been led astray by it. In the case of disease, for example, for every man or woman who has been helped to see perceptively into an illness and become active in its amelioration, there is another who has done himself or herself immeasurable harm by emulating a famous someone who has sought out unproven, unprovable, and ultimately dangerous recourses. For every Morris Abram, whose example, as described below, led others to an informed pursuit of scientific medicine, there has been a Steve McQueen, whose example led others -- many thousands of them, in his case -- to the quackery of dubious healers serving up dubious promises of certain cure. Lerner's book shows us both sides of the cult of celebrity, but its message is clearly optimistic. As in all other matters in which celebrities are used as models, the publicization of their illnesses has reflected the tenor of the times, and the tenor of our time -- at least where medicine is concerned -- favors informed and intelligent self-involvement. In other arenas, however, the crackpots have too often had their way.
These various strands of Lerner's meticulously documented narrative come together step by step to tell the story of the old paternalism, the evolving ethos of resistance to it, and the rise of that new phenomenon in medicine: the active involvement of patients in their own care and the decisions that determine how it will be carried out. In each case after the change took hold in the early 1970s, the function of the celebrity was to be a kind of standard-bearer who, by his or her public eminence, called attention not only to the disease but also to the necessity that those suffering from it take a decisive role in their individual therapy.
It is no coincidence that the first of Lerner's evolving examples of public acknowledgment of an illness being endured by a high-profile figure -- and of its being used to further the twin causes of self-determination and medical progress -- does not take full form until 1973, only a few years after the field of bioethics was born, and with it the notion of autonomy. In that year, the renowned civil rights lawyer Morris Abram was found to have acute myelocytic leukemia, with a prognosis of six months. Abram immediately set about to discover as much as he could about the pathology of the disease and then went on to exert a strong and soon well-publicized influence on the course of his own successful therapy. "The era of the activist patient was dawning," writes Lerner, "and sick celebrities like Abram were leading by example." If Abram and the later exemplars of Lerner's lesson could do it, then so could ordinary people.
The chance had been lost again and again -- in the 1930s, when Lou Gehrig was revealed to be suffering from amyotrophic lateral sclerosis, and as late as 1969, when the Chicago Bears football player Brian Piccolo was diagnosed with lung cancer. But with Abram, the time had arrived, scientifically and culturally. "Beginning in the 1970s," Lerner writes, "the stories evolved to incorporate some element of opposition to, or at least questioning of, the medical profession, the Food and Drug Administration, and other representatives of 'the establishment.' The implication was that even celebrity patients could not assume that everything possible was being done to cure them."
But the publicization of celebrity illness served yet another function, as significant as its ability to mobilize patients' involvement in the details of treatment. "More than providing good practical advice, ill celebrities have delivered something else: stories of tenacity, inspiration, and hope in a time of crisis," Lerner writes. In order to accomplish this, it was necessary that they tell their stories in such a way as to have unalloyed meaning for those who might be helped by them. In other words, it was necessary to "spin," or perhaps merely to "frame," the narrative to provide the desired effect. Real life often does not play itself out precisely in the way it must be made to seem if it is to be sufficiently inspirational; celebrities and those who told their stories allowed themselves leeway to alter details just enough to deliver the desired message, and it was this spun message that became part of the public consciousness. "Overwhelmingly, the stories that became fixed were those that had a particular cultural resonance, those that somehow provided meaning to the public. Just as celebrities could define popular tastes in clothes, food, and style, so, too, could they demonstrate how to be a successful patient and, increasingly over time, a successful disease activist."
The quality and the quantity of the various forms and degrees of spin have varied. In general, most have provided inspiring messages so close to the reality of the actual events that they continue to serve as beacons leading others toward meaningful improvements in their own care. After Abram had succeeded in hectoring his physicians to give him the novel combination of chemotherapy and immunotherapy that he believed had saved his life, he was interviewed by a staff writer for the New York Times, resulting in an article that read like "a cohesive narrative of what had been an extremely complicated and rocky illness," making the efforts toward final resolution seem far smoother than they had actually been. When he later wrote The Day Is Short, a book about his ordeal, Abram "crafted an account of his experiences...that had a coherence and trajectory that had never existed. Although he was surely trying to be as truthful as possible, the outcome of the story necessarily affected how it was told." The article and the book accurately described the details of Abram's odyssey toward cure, but they placed so much emphasis on personal determination that a reader could hardly be blamed for coming away certain of an underlying message: that willpower and optimism, rather than hard slogging on the part of his doctors and himself, were the deciding elements in reversing the course of organic pathology.
Lerner rightly warns that "this logic is, of course, a fallacy. Since dead leukemia patients are not alive to describe how they fought their diseases, it seems, in retrospect, that only survivors had enough willpower to live." As every clinical physician can attest, for every man or woman who has "beaten" a disease seemingly by the sheer will to do it, there are plenty of others who have succumbed, despite an equal or even greater degree of optimism and determination. In the final analysis, the reason for Abram's survival cannot be known, though it was most likely the new chemotherapy regimen chosen by his doctors, even if some efficacy of the immunological treatments cannot be ruled out. But the relatively small degree of spin that he and others applied to his story did not alter its basic truth, namely that his own initiatives made the difference: becoming informed about the latest research; persevering in seeking the best possible physicians; taking an active role in making medical decisions. And so the experience of one well-known and well-connected patient had a positive impact on the decisions of many thousands of others not as privileged.
At the other end of the spectrum of spin -- and here the spin might be called extreme -- falls the story that has become known by the name of the Hollywood movie made about it, Lorenzo's Oil. Lorenzo Odone was born with a rare genetic neurological disorder -- adrenoleukodystrophy, or ALD -- thought to be caused by a buildup of compounds called saturated very-long-chain fatty acids (VLCFAs). The disease, which affects only males, is usually diagnosed between the ages of five and ten, and gradually results in paralysis, deafness, and the inability to speak or swallow. Lorenzo's highly educated parents, Michaela and Augusto, made themselves experts on the most minute biological details of the disease process and on the various laboratory and clinical efforts being made to influence its course, wherever on the globe they were to be found.
By a combination of diligent library study, the seeking out of scientists who were doing research on aspects of the sometimes obscure biochemistry, and their own methods of dietary experimentation on their son and his maternal aunt -- who was a healthy carrier of the disease, but with elevated blood levels of VLCFAs -- they determined that the ingestion of a combination of oleic and erucic acid, which came to be called "Lorenzo's oil," dramatically lowered the amount of the toxic material. They wrote a series of articles about the phenomenon and soon became public figures, covered by the international press and receiving various awards. During all this time, they were working with several of the relatively few experts on the treatment of the disease, sometimes compatibly and sometimes with a degree of contentiousness largely the result of the friction between the caution of a scientist and the sense of urgency of a parent.
In 1990, the Australian film director George Miller suggested to the Odones that a movie be made about their quest. From the beginning, Miller intended to construct Lorenzo's Oil in the form of the sort of myth described by Joseph Campbell, in which the final triumph of the hero or heroes -- in this case the Odones, played by Susan Sarandon and Nick Nolte -- takes place after many setbacks. The movie contains crucial inaccuracies; it distorts the characters and motives of the physicians and certain of the parents of ALD children; and it appears to make claims for Lorenzo's treatment not completely justified by the reality of its degree of effectiveness. Moreover, it glorifies a level of education, personal research, and in-your-face activism of which few patients or families are capable, raising the bar of expectations beyond what is possible for any but a very few.
What some viewers, especially knowledgeable physicians and affected relatives, have found most egregious is the movie's implication that the oil, though it became available too late to reverse Lorenzo's deterioration, had allowed many boys to lead healthy lives. Among the multiple complaints about the production made by scientists and others was the soon-to-be well-known fact that between 40 percent and 50 percent of boys born with the ALD gene either develop a mild form of the disease as adults or never manifest any symptoms at all. Though the film's action concludes with a scene of dozens of healthy ALD boys running, swimming, and playing baseball, there is no way of knowing whether their treatment with Lorenzo's oil had an iota of effect on their conditions. There has never been a controlled clinical trial of the oil, nor is there likely to be one.
Still, the movie and the many journalistic stories of the Odones' long journey ultimately served the same purpose -- though admittedly in an extreme and distorted way -- as did the far more accurate publicizing of the illnesses of other celebrities in Lerner's book. No longer could patients and their families sit back, expecting competent medical care. They had to become active participants, even scouring the globe for experts and possible treatments. "The real value of the movie," Augusto Odone said, "has been to show people that in cases where you have a disease in the family or yourself, you have to be proactive -- don't wait for doctors to tell you what the remedies are."
Albert Jonsen, quoted earlier, published a book in 1990 called The New Medicine and the Old Ethics, a thoughtful and perceptive disquisition on a dilemma that has sought resolution since the beginnings of Western medicine: the never-ending attempt to bring harmony to the competing claims of altruism and self-interest so that both patient and doctor are benefited. A difficulty for anyone, the conflict is writ large in the moral life of doctors. The new medicine that appeared in the 1960s, and the new autonomy that followed soon after, changed the face of the relationship between physician and patient -- they "raised the consciousness" of both groups, of all the stakeholders. Working together, though in rather different frames of mind, scientific discovery and the self-determination movement brought about the assertiveness and the advocacy ably documented by Barron Lerner, and so contributed to medicine's ability to solve many more clinical and ethical problems than ever before.
That autonomy and assertiveness have limits beyond which they become interference and are ultimately counterproductive or at least frustrating, both to medical science and to patients, cannot be doubted, as is shown by the case of Lorenzo's oil. But that they can benefit individuals, large groups of patients, and the scientific enterprise in general has now been established beyond question. As always in the uncertain art that we call healing, patients and doctors must think less in terms of autonomy or paternalism and more in terms of understanding one another, as they make their way together through the thickets of disease and its therapies. It is a virtue of Lerner's fine book that it helps us to understand the multifarious complexities that enter into the healing of the sick, and the ways in which some of them have been addressed. Not the least of those ways, as the author puts it in his subtitle, are those influenced for the good by "celebrity patients and how we look at medicine." As in every other aspect of public life and the affairs of humankind, a more open society is a healthier society.