"Original and provocative. … Gullette … challenges the belief that decline is the truth of aging."—Marilyn Gardner, Christian Science Monitor

"This is one of the most compelling, well-written, and reasonably priced social science books to come along in some time. … Gullette lays out the case for revisiting and transforming current perceptions of aging. She examines contemporary American culture's attitudes about age, observing that our culture has bought into the notion of aging as a woeful wait for death, characterized by unattractive physical decline (witness the popularity of Botox and cosmetic surgery) and social disdain. With a sophisticated blend of scholarship and examples drawn from popular culture, Gullette issues a resounding call for a new way of looking at the progression of life."—Library Journal

"The word 'age' in contemporary parlance often means nothing more than the evaporation of youth and the onset of inevitable, ghastly decay. Gullette, author of an award-winning study of age defiance in popular culture, is disturbed not just by the reductiveness of this idea, but by the 'anomalies in our celebratory age ideology' as well. Her ambitious examination of the forces behind various age norms calls for profound changes in the way we think about age, both socially and culturally. … This complex book is an important resource for anyone who wants to think seriously about the way personal and cultural time lines can, or should, interact."—Publishers Weekly

"We haven't done justice to age in the popular press. Margaret Gullette may change that. You can't read this ground-breaking book without realizing that 'age could be different.' It will be a more mature country that takes note of so important a voice, giving hope that our culture may yet value wrinkles—the face's road map of experience—accumulated from smiles, tears, and the hard-won wisdom of the body."—Bill Moyers
An excerpt from Aged by Culture
Margaret Morganroth Gullette

Chapter One

in the science museum

At the Boston Museum of Science, one exhibit in particular attracted long lines of children: "Face Aging." Access to the open booth was forbidden to people over fifteen, so I watched from just outside. After standing for long periods with remarkable patience, the youngsters sat down inside under bright illumination, faced forward trustingly—"frontality also implies in the most vivid way the subject's cooperation"—and had their portrait taken by an automatic camera. After another wait, their digitized bust appeared on a TV monitor. Then, tapping a button like a VCR remote, each child could rapidly call up simulations of what she or he would look like at one-year intervals up to age sixty-nine. Flipped as fast as a Victorian zoetrope, the stills became a "movie." In seconds the computer added grotesque pouches, reddish skin, and blotches to their familiar features; the faces became elongated and then wider and then saggy; lines became more heavily rutted. Boys lost hair. Hair turned gray. The heads of both boys and girls grew and then shrank. The children were almost uniformly shaken. One eight-year-old girl in the hearing of a Boston Globe reporter moaned, "I don't want to get old!" While viewing the show, gerontologist Richard Griffin heard a boy "looking as if he had tasted something bad" say about another child's facial changes, "He's disgusting at forty-two." The teenagers, most of them white, were solitary or in small groups made up of age peers; they were on their own. But having a mother nearby didn't always cushion the shock. One woman with daughters told a son, "The girls say you're getting ugly." To another son who used the button to ride the years backward she said, "That's when you look the best—as a little boy." Nobody stayed in the booth long.
Anyone could have stopped punching the button altogether at any age or lingered longer at a particular age (or gone backward): two girls I saw stopped at fifty-five and sixty-two. But most swept through the changes of their putative face-course to the bitter end. They came out preoccupied, distracted; some giggling recklessly, most edging away fast, not wanting to talk about the experience, not knowing what had happened to them in there. Afterward they fled. This was the booth in the "Secrets of Aging" show that enticed the kids during the spring, summer, and fall of 2000. Everything promised them scientific "truth"—their location in a "Museum of Science" and the prestigious array of complex and nonhuman technologies involved: the robot eye with no human behind it, the computer-driven graphics, the "interactive" button that produced the same aging effect forward or backward, invariably. And children are deeply curious about their life course, that mystery where your particularity scrunches up against unknown laws. As Virginia Woolf says, "If you are young, the future lies upon the present, like a piece of glass, making it tremble and quiver." Invariability was implied in the title of the exhibit too: "This is the way all faces age." When I interviewed the children exiting, I asked, "What did you learn?" The answer, in short, was, "I don't want to get old." They had nothing to add. "Do you think that's really the way you will look?" I continued. The question seemed to astonish them. Hadn't that mirror just shown them how they would really look? Its thick glass gave them what they believed was information. As children, they had no sense of how to discount visual "evidence." Unlike them, I was old enough to associate the paraphernalia of booths with bureaucracy, know that "passport photo" is a byword for ugliness, and possess a Department of Motor Vehicles' distortion of my physiognomy. 
Only one boy, already fairly tall and with some width of chest (he might have snuck in) said, smirking, "I got bigger." Of all those interviewed, only he seemed satisfied. Disgust was otherwise unisex. But gender played out as perceptive resistance. "Well, it made all of us really red," one skeptical white girl of maybe twelve observed to me. Only she expressed the tiniest suspicion about the crystal ball's veracity.
I am neither a journalist nor a gerontologist but a writer and cultural critic who studies age issues—call me an age critic. Reading about the exhibit had made me naively curious about my own "truth": I had really wanted to go in and see myself age. I was disappointed at being excluded. Then it slyly occurred to me that if I sent the programmers a photo of me at eight or ten, they could use their procedures to depict a "me" at my age, which was fifty-nine. I could compare their semblance with photos of the real me, taken recently, in available light, in different moods, by different people. Although I say "the real me," I scarcely look the same in any two. (As the theorist and photographer Jo Spence points out, "Two cameras standing side by side could take totally different pictures of the same moment.") Would the "I" of their simulation be recognizably the same as the person portrayed in any of my other photos? That was my motive for the first call I made, to Core Digital Pictures in Toronto. I was also skeptical about the predictive power of the software they had used in the exhibit. Because I know beautiful people in their sixties, I had been surprised that no one on the monitor looked good, aging, to my willing eyes. No one looked better as they reached midlife, although youngsters can improve considerably with age, acquiring more harmoniously related features. The bogus faces had none of the qualities one might expect: drama, humor, intelligence, character. (These observations should have been a dead giveaway of what my sleuthing was about to discover.) Ron Estey is the project manager at Core Digital responsible for the program. Estey's answer to my request was that they could not use my childhood photograph as their starting point. A subject has to be well lit, without expression, posed frontally against a black background, and young. The software has to recognize the chin, brow, ears, and so on, in order to operate.
The increasing facial redness as the children aged was just an accident: it had to do with the original colorization of the Kodak film they used. The blotches were also unintentional: they developed from marks the kids already had on their faces: freckles, moles, pimples. Core Digital had added the wrinkles, swags, and grayness. When I asked Estey how Core Digital had conceptualized "aging," he wouldn't say exactly. The software was proprietary. They had started with a photo of an eight-year-old girl; the algorithm was intended to create a "believable" image of her at sixty-something. When they constructed the oldest image, they asked their staff if it was credible. The responses made them add more "age effects." These required arbitrary decisions: at what age did they start sagging the line under the chin? By what age do ears droop? As someone who worked on designing the booth observed, "The age is an estimate. Their sixty-five looks like seventy-five or eighty to me." Only one component was scientific: they had worked with a cranial surgeon to identify the actual age-related growth in bones that produces longer noses and head shapes. For other potential clients—the FBI and forensic experts seeking people who have disappeared—they were adding some scientific components so that eventually they might be able to work from snapshots or group photos. But at present, Estey said, they could make no claim to be scientific or "rigorous." There were too many other circumstances the program did not account for, he added: gaining weight, having children, smoking, getting a disease. "It was only an entertainment," he explained. Core Digital does TV animation. This technology is typically deployed for cartoons and wizard ghoulishness in films, doctoring fashion photos so models look yet more emaciated, and falsifying historical documents. "We streamed together six or seven different ideas. We're a special effects studio."
The cyber-fi booth has been carted around the country, carried along in the "Secrets of Aging" show from one pedagogical site to another, like the smallpox germs secreted in blankets that colonists gave to Native Americans. After Boston, the exhibit went to L.A., and the cutoff age dropped to ten. But what exactly is going wrong in there, aside from the fact that Core Digital's secret is "morphing" and not "aging," or that the gizmo automatically uglifies and passes off its squint as truth? The software engineers hadn't asked, "What's the algorithm for making people look more beautiful, expressive, or individual as they grow up?" because they worked from our culture's preexisting notions of decline—skin, hair, outline. Caricature ruled. Decline overrode even the quirkinesses that never change: the shape of your eyebrows, the bow of your mouth. Do an experiment with two of your own photos, taken as far apart in time as possible: Try to look for likeness. Not just the genetic but the life-historical kinds: that uplifted look of adoring inquiry you acquired at four, the bravado adopted at fourteen, the resolute jaw of thirty. Our faces and bodies—as I argue in chapter 9—are historical repositories. If your mind balks at detecting resemblances thirty-five years apart, start with a modest ten-year spread. This is a cultural test. If you have difficulties passing it, you have been successfully trained—taught to notice only the bad differences between yourself younger and now, not the similarities, or the improvements. If we mean by ideology a system that socializes us into certain beliefs and ways of speaking about what it means to be "human," while suppressing alternatives, it is useful to call this training "age ideology." At the time I didn't know precisely what worries I felt for those children in the museum. It was also hard at first to specify what their experience had to do with the vast shadowy context of American age culture. Was the problem "ageism"? 
The term has many meanings. In the narrow sense of stimulating blind prejudice against old people (Dr. Robert Butler's original definition) among the impressionable, the effect was probably not that. The title of the exhibit invited kids to "face aging" not "face the elderly." The experience might in fact lead them to think, in contrast to what they'd just seen, "Grandpa doesn't look nearly that purplish" and "Aunt Flo is as handsome as everyone says." Nor, thankfully, did this exhibit reinforce the medieval memento mori, "Remember you must die," deployed to clients by the fitness and pharmaceuticals industries in every allusion to longevity. (While they were utilizing digital animation, Core Digital could have shown the kids what they'd allegedly look like at 140, the span telomerists promise us, and really have produced a frisson.) But there's another ageism, that "old age is a problem": this has become so intransigent a formulation even among alleged anti-ageists that children exposed to dominant cultures in the West probably overhear quite a lot of it. Depending upon their age, their exposure to the media, their family's subculture, and their parents' relation to their own parents, many children no doubt came into the museum holding some preexisting negative views about old age. These the exhibit reinforced. But the real trouble with the booth is not that it reinforces society's negative associations with old people. It is important to underscore that the grotesque big red face on the monitor is allegedly me myself. Forget others; in the United States aging is about me and me alone. "The future" is privatized. The first photo each child saw established the monitor as a mirror—and belief in cultural mirrors has devastating consequences in our hypervisual culture. Susan Sontag once observed, "The camera has ended by effecting a tremendous promotion of the value of appearances." 
Appearance and selfhood, increasingly, are stickily twined, so that your appearance (minus your expressions) is your self. The crude algorithm of the exhibit was modeled on a dominant cultural assumption: that the body declines as if with no cultural intervention. (Everyone forgot that Core Digital intervened.) This "fact" overwhelms the thousand other qualities or practices that are also you, even as young as fifteen—your SAT score, your concern for Afghani kids, your passion for Narnia or Jane Eyre—which together over time trace an individual life. Tapping a button, the kids pressed ahead through the counterfeits as fast as if they were playing Nintendo and the goal were to knock off the ages. Speed-up. (Sontag warned us in 1974 about "faster and faster seeing.") There was only one "special effect" to get, and they rushed to discover it. Aging equals decline, a devastating formula. One person, the man who built the booth, had been concerned about making aging artificially fearful for his audience. Gary Renaud, the manager of Exhibitors, Inc., had been cautioned by the maker of another face-focused technique about potential harm to young viewers. As a result, he said, he installed a monitor outside, where the children lined up. It showed photos of other kids as they were undergoing the special-effects distortions. His idea was that if a child found the preview too scary, she or he would drop out of the queue. Unlikely. Face-aging wasn't about them until they got inside, and then suddenly it was. Moreover, children as consumers of movies are taught to toughen themselves. Where the box office is concerned, they learn early on that avoidance of unpleasantness looks "wimpy" or "girlish." The entire R-17 system teaches that as you get older, you should be able to bear increasingly dreadful visual horrors. The outside monitor thus served less as a warning than a taunt. The very pun in "Face Aging!" dares you to go in the booth. 
There the computer screen tells you an authoritative story. In a few seconds postmodern techno-fragmentation click-clicks you through your years. If you paid a quarter for this in an arcade, the whole thing might be less troubling. But this is no game; it's "a copying machine" certified by the Science Museum Exhibit Collaborative. The spectacle offers a prophecy about your own appearance that makes human aging entirely bodily, predictable, and inescapably awful. For its concepts of decline, Core Digital drew on the "before" photos of cosmetic-surgery candidates familiar to all of us, as well as on other North American caricatures of the midlife. The children have no reason not to believe what they were shown. The exhibit performed an experiment on them, cognitively and emotionally. They were led to personalize—"internalize" is the term used by psychology and cultural studies—the wordless message "Aging is terrible." This will become for them one of those "group-defining stories that go underground as cognition where they serve as mental equipment for the interpretation of events." It is a powerful psy-fi operation. Most of us have undergone it, but few so young.
When I described the Museum of Science exhibit to photographer Vaughn Sills, she said she had seen something like it in the Musée d'Art Moderne de la Ville de Paris. Hans-Peter Feldmann's show, also in 2000, started with the photograph of an eight-week-old named Felina; next to it was a picture of a one-year-old, with its name and age, and then an image of a baby of two, and so on. The photos continued around the gallery; there were a full one hundred. The walk around "A Century" ("Un Siècle" was the name of the show) was slowed down because the photographs represented a hundred different faces, men as well as women, of various ethnicities, mostly but not all white, in distinctive settings. Individuals, they were differentially attractive or vibrant or strong or personable at various ages. As Sills came toward her own current age, she was aware of becoming more involved. She was able to pick and choose people who more closely matched her imagined future: "This is what I might be like." By the end of that meditative walk, Sills said, she had gained an impressive sense of change (and obviously the people represented at the end appeared older than the sixty-nine, or eighty, of "Face Aging"). Un Siècle wasn't depressing; it wasn't upsetting. Sills was utterly engrossed and imaginatively stimulated. Aging through a particular century, the twentieth. The figures inhabited backgrounds filled with furniture, photos, or other personal items, some of which carried period and historic associations, of events witnessed or memories accumulated over time. Such densely textured impressions might be complicated by other ideas—longevity, certainly; perhaps the striking and oddly reassuring impression that so much of life is now spent being what the cult of youth calls "old." To Sills the series was not even predictably chronological. From the artists' array, a woman at, say, eighty-three might look younger than another person at sixty-six. 
Indeed, Sills said, this happened quite often, at many of the ages shown. At any given age, "What next?" was not a foregone conclusion. "The dominant fiction of chronological aging ... plots our lives in continually increasing numbers," critic Mary Russo notes. On the calm surface, chronology is a bureaucratic convenience and a motive for annual potlatches of celebration. But the media increasingly exploit these automatic sequences for their associated story of decline. In Esquire in 2001 a man published head shots of his wife from age thirty-one to fifty-five for an issue on "Women and Aging," a spread that was unlikely to raise the median age of the trophy wife. The net effect of such sequences is to confuse the autobiography of anyone's unique mind-body with a universal arithmetic series supposedly etched on the body. In Paris, Hans-Peter Feldmann, putting his photos side by side in numerical order, nevertheless avoided the traps. Using chronology, he disrupted linearity. The show avoided telling a simple decline story about the external signs of aging, which—as age critic Mike Hepworth points out—do not usually cause physical discomfort nor seriously affect bodily functions. Feldmann probably did not know that the literary doctors of the nineteenth century used to declare, in effect, "In no one thing do people differ more than in their aging." Yet wordlessly he had produced that idea. "The world is full of diversity and noncomparability and age is another fascinating difference" might be another lesson to carry away from these hundred images. Probably the same children who scattered in fright in Boston could have promenaded through the Paris show at their own self-directed and responsive pace, changed into curious agents by its impersonality of reference to themselves, its dense reference to other worlds, its invitation to imaginative identification with others at any age.
Photographs are thought to need text in order to channel meaning through the maelstrom of possible interpretations. The Boston and Paris exhibits suggest otherwise: that even without text, visual sequences always have life narratives secretly embedded in them. Such narratives declare the meaning of the passing of life time, not day by day but on a big scale. Since they help us tell our own stories, about the value of our own lives, the burning question is, what genre of story about aging gets wrapped in the narrative? Jo Spence, an English pioneer in "phototherapy" using family albums, offers yet another way of constructing a life narrative—by yourself, autobiographically. "Get together all the pictures of yourself which you can find," she instructed her college-age students. "Now sort out one single picture for each year of your life and then lay them out in a line on the floor, starting with the earliest year. Play about with them until you are satisfied that you have the ones which mean the most to you." Spence suggested writing about the meaningful ones rather than a silent inward telling. This is life review for people as young as eighteen, or even younger. It is cued by images but not limited to the physical self that others see, which some theorists call the "specular body." It's important to get beyond face and "figure"—which are so vulnerable to our culture's hypercritical age gaze. The story you tell is your artifact, private and memory based, your truth. Only you can decide which states of mind or age-selves have been significant to you, which events or people linger in your mind. (In chapter 7 I describe further how people do this saving kind of life storytelling, and in chapter 8 I propose an even more resistant telling that I call "age autobiography.") Your latest self might have changed its opinion of an earlier self: You might have hated the way you looked at twelve but now see how touching you were. Not all ages will matter equally.
You might skip several years, not because you looked crummy then but because that period lacked narrative particularity. Although Spence's project too starts out as a linear effort, the addition of your inner storytelling breaks the strictly metronomic sequence. The nonvisual memories add rich text. The closer you get to your current age, the fuller your memories may be. Aging becomes maturation, change, history—more complex than a simple minus or a plus. At last.
What these three ways of representing age visually have in common is that they are all narratives of aging. But the stories they offer are quite different—in technique, in affect, in ethical and psychological effects. At their best, as Spence's phototherapy suggests, narratives of aging are personal and produced by the owner of the life in question, can be told at any age, minimize mere appearance, and encourage the teller to include ever thicker layers of what it has meant to be an embodied psyche, in culture, over time. At their worst, as in "Face Aging," the narratives are fatally flat: prospective and phony, solipsistic, body-obsessed, pseudo-universal and context denying, cognitively inhibiting, and anxiety producing. The meanings of age and aging are conveyed in large part through the moral and psychological implications of the narrative ideas we have been inserting into our heads, starting when we were very young indeed. Artistic and technological products, like the stories we ordinarily tell ourselves and one another, are permeated by the preexisting inventions of culture. It matters whether a given society—the United States, in this book—permits dense, interesting, encouraging narratives about aging, and for how long in the life course, and whether those ideas are dominant or merely subordinate or resistant. Our age narratives become our virtual realities. Certainly, whichever accounts you and I find ourselves living with and seeing the world through make a fundamental difference to the quality of our lives, starting with our willingness or reluctance, at any age, to grow older. Decline is a metaphor as hard to contain as dye. Once it has tinged our expectations of the future (sensations, rewards, status, power, voice) with peril, it tends to stain our experiences, our views of others, our explanatory systems, and then our retrospective judgments. Once I feel I am at risk, the collective future can shrink to the fantasized autobiography of the Aging Me. 
It is decline ideology above all, I would argue, that makes each life span "a distinctive and enclosed trajectory, picked out from other surrounding events," as sociologist Anthony Giddens brilliantly describes it. One of decline's saddest ego-centripetal effects is to obscure anything suffered by those adjacent to us, in the polity and across the globe. The only history that matters is that of our times. Decline then squeezes the life span further, into an inflexible, biological, individual arc. The three versions of aging outlined above provide alternatives but barely begin to explore all the imaginative possibilities there could be if we in the audience explicitly critiqued age ideology and demanded more care—and a bigger share—in representation. Despite the impression given by the freaky time machine, "aging" even at the merely visual level cannot have a single, invariable, universal, and ahistorical meaning. "We understand meaning not as a natural but as an arbitrary act—the intervention of ideology into language," Stuart Hall succinctly suggests. If you have been waiting—perhaps impatiently—for me to argue against the belief that decline is the truth of aging, that is just what, by starting with my three narratives of aging, I am doing.

on socializing children into aging

I don't like to trade in "secrets," a blatant marketing technique, but the basic secret of age was revealed decades ago by the very first works that could be called "age studies": Human beings are aged by culture. It's a simple anthropological idea, but it ramifies marvelously. The process takes various forms in different subcultures, as those know who have envied, say, Chinese-born Americans or Jews because "they revere old age." Now the life-enhancing age ideologies of our subcultures are themselves under siege.
What matters to those exposed to dominant American age ideology—whether in the United States or abroad, among the cosmopolitan elites that receive it "along with Cable News Network, Coca Cola and Visa credit cards"—is that underneath its boastful surface it is surreptitiously telling much more dire and muddled stories. American children live unavoidably in relation to our culture's constructs of "age" or "aging," which rocket around them even if they never enter a science museum. The audiences for "Secrets of Aging," young as they were, had had earlier encounters with items of age lore and the aging narratives implicit in them. The first time might have been when they heard, "Happy birthday to you!" or "She must be almost three now!" or "You're not old enough for that yet!" (Interesting how emphatically age lore gets presented: does it typically have exclamation marks attached? Only when children are its objects?) How the young get socialized into a particular age culture is a fascinating, necessary subject for the field I call age studies. Age socialization must be bewildering. As with other social learning, children have to put together "uncoordinated pieces of information," in the words of two faculty members of the intriguingly named Department of Psychological Development and Socialization at the University of Padua. Neuropsychologist Merlin Donald warns, "to a human child, adult culture must be revealed only gradually, layer upon layer, with extensive mentoring." We know more about how children learn gender. The gender they acquire is usually like that of one parent, and so a parental "we" includes them. "Aging" too could be seen as a continuum along which they glide, to eventually join the adult "we." Having an "age," when separated off by itself, is more puzzling, because subjectively children feel stuck so long at one lowly state quite distinct from the adult: their age changes but their stage of life seems static. 
Children collect contradictory age-tinged language and revelations about older ages and about getting older in general, often without guidance, from peers, from overhearing adults, from ill-informed educators, haphazard reading, or, more and more, via the mass media. "You go from tree to tree: there is as yet no forest," as a John Berger character says. In all cultures, literate or illiterate, linguists tell us, children are exposed, early and unconsciously, to "the accepted story-structural forms," and learning them—being taught them—is an important activity. These stories, whatever else they do, tell the meaning of time passing. Decline narrative—whether found in a science museum or Esquire—is one of these explanations: a road through the forest. This book might inspire readers to go hunting for other story forms of aging—other time machines. They are everywhere. We need to think soberly through all the implications of these new facts about being aged by culture. At the millennium, "Face Aging" signals that decline's flawed and injurious ideology is no longer surreptitious, that decline is an authoritative narrative, and that the already early age of the target audience is dropping: the age considered appropriate for acquiring information about decline and internalizing it is now under ten.
"Since the audience is so young," I inquired of Estey, "why not have stopped the computer program at twenty or twenty-five or thirty?" Indeed, one mother I watched, who had a child of about six or seven on her lap, intuitively stopped the button at age fifteen. "You look so much like your father," she said approvingly. That was a span long enough to give her son an idea of bodily change, she must have thought, and one that he might regard as positive. (In fact, he squirmed away.) But no one organizing the exhibit had thought of stopping the projection of "aging" that young. The client—the Boston Museum of Science—wanted the simulation for an aging show, in a world where "aging" means old age and a bodily fate, not a choice of narratives about time that children must make sense of. Other sections of "Secrets of Aging" represented the ages neutrally (the "universal design" and Tai Chi booths) and positively, or suggested multiple (bodily) narratives. But presenting decline (in the most alluring exhibit) as the truth had the effect of overriding any counterpossibility that age/aging is constructed by culture and could therefore be critiqued and reconstructed. (Even a science museum can inhibit age consciousness.) Although the project manager at the museum, Jan Crocker, had gathered a panel of gerontologists to assist with the concepts and a panel of evaluators, none of them stopped to consider whether this youngster-attracting gimmick might convey a shocking and unwanted message. I'm not saying that they themselves believe decline should be the acceptable life-course narrative for the young. My analysis even so far suggests, if anything, that the ways of conveying meaning about age are not easy to understand or to deal with, given our existing tools. But age ideology has forced on us a question that would formerly have been unthinkable: "Should decline now be acceptable as a life-course narrative taught to the young?" 
Instead of simply saying no, I want to turn to the question of how children might be harmed by certain kinds of age narrative. For children younger than fifteen, a decade might be about as much personal physical change as they can comfortably assimilate. Studies on their psychological responses to broader kinds of changes suggest that confronting even a decade might be too much. One inference I draw from this research is that the experiment in the booth undermined the children's precious sense of "self-continuity." According to cross-cultural developmentalists, the conservation of self despite change is a "condition of any coherent conception of selfhood and consequently, of any collective moral order." Adolescents who cannot find a warrant for believing they are "connected to their own prospective futures" may become suicidal, according to Michael Chandler and his colleagues, who studied First Nations Canadian teens exposed to dramatic cultural loss and apparently certain personal decline. Children are conservatively present-minded in this imaginative realm of aging, or only moderately future-oriented. They're not postmodernists, playfully changing identities. One rare cross-cultural study in four advanced societies found that most thirteen- and fifteen-year-olds prefer their own age as "the best time to be alive." (With therapy her goal, Jo Spence intuited that it was best even for college-age students to focus on their own life courses no farther than the age they had attained. For imagining other/future identities, she counseled dress-up.) Children exposed merely to stories of change often stand pat for continuity and dig in or regress when subjected to too much projected difference. The youngest argue for self-sameness on the basis of an essential durable quality: for example, "My eyes are blue," "I always play with ponies." As they age, they develop more sophisticated "guarantees" that "different installments of their identity" are related. 
Finally, Chandler and his associates found, these explanations are narrative. In the Boston museum, the boy with some chest who liked seeing a "strong" self presumably viewed his future body not as different but as by and large the same, with change characterized more by gain than loss. (He may have stopped watching when the shrinking began.) The other children—except perhaps the girl with the presence of mind to notice redness, my first heroine of resistance—were stricken by more negative change over more future time than they could imaginatively handle. Although their presence in a science museum hints that they are educationally privileged youngsters, they are being prepared, like the socially excluded and suicidal Native Americans in Canada, to relate decline to aging-into-adulthood and to expect it.

fantasizing the future: decline versus progress

The children's inarticulate experience of decline in the museum somehow had to be fit into an evolving mental construct—although "construct" is too static a term—that I want to call an "age identity." Even children under fifteen probably have a couple. Age identity comprehends each person's collection of "information" about age and aging in general and stories about their own age and aging in particular, made less random but not necessarily less perplexing by the aging narratives they have come across, since some narratives serve as evidence of their implicit theories and desired outcomes and match their experiences while others do not. Age identity as I conceptualize it keeps a moving balance sheet, evaluating what aging—or, more typically where children are concerned, "growth"—has so far brought the self or its subidentities, as well as guessing what it is likely to bring in relation to what the dominant culture and the child's family and subculture say "the life course" is supposed to bring. The term suggests there is one and only one life course, as universal in its process as the biologized body.
One of the most important vehicles for organizing this "knowledge" may be prospective age narrative. If looking backward is unreliable, as the memoir critics never tire of telling us, looking forward might be considered preposterous. H. G. Wells's classic text of time travel carried his reader ahead into the year 802,701. Our culture's narrative prophecies take us no less fantastically along our putative trajectory. Prospective narrative is now available for all ages. (Novelists writing about ages they have no experience of yet are among the unseen producers of this psy-fi genre.) But children have even more limited experiences of temporality than the rest of us. I started this book with the monitor in the booth above all because it is such a startling example of forecasting: a wreck foretold about each and every tender body. Prospective age narrative in a normal American childhood is mostly about "progress," not decline. Spence told her students the basic story-structural form: they "had gone through various phases, changing all the time, and were in a state of dynamic progression." Progress narrative has weaknesses, but its intent is not menacing or hostile. It doesn't come to the young in a single dreadful flash of affect (as decline can be learned) but in a surprisingly coordinated humdrum and reassuring way, through myriad comments, stories, and nonverbal practices, in family, social, religious, and institutional contexts. Many people will recognize something like the following reminiscence.

    I remember a dinnertime when it was Christmas.…Sometimes there would be other kids over and we would have a little table in the corner where we would eat, the kids, while the adults had the big table to themselves. And I remember that being a specially nice time. A lot of laughing, and cutting, and scraping, and eating and talking and all those kinds of good things.

My family too practiced this holiday division between adults and children.
Other people may have hated it or learned something else from it than I did. It taught me that I would grow up eventually—all in good time—to find myself at the big table, with my cousins also grown up, sharing it with my parents and my aunts and uncles (in my mental picture they were unchanged: robust, dazzling talkers, leaning forward in one another's faces, intensely gossipy and political). Was this my first experience of age hierarchy? In any case, age hierarchy seemed appealing. My husband's first age-memory, still fresh from when he was six, is of stroking his chin like a thoughtful grownup and being mocked by an aunt. His introduction to age hierarchy had a stick in it (ridicule at precociously presuming to act older) as well as a carrot (an image of impressive masculine maturity). Both of us were learning that life could be a progress. The kind of prospective age narrative each of us internalizes in childhood can be foundational. It may be no more than one sentence deep and yet as salient as Pike's Peak. When I was about five, I cut my knee on some metal sticking out of our new used car. The wounds bled profusely and my father held the cuts closed with his two hands until they stopped; I didn't get stitches. My mother used to say, about the two scars, "They'll be gone before you're married." Today's readers—trained to think of auto/biography sociologically, up to a point—are likely to note first off the way compulsory heterosexuality is built into my mother's prophecy, because sexuality is a main narrative we are currently being trained to read for. But in this book we are in training to read differently, in an unaccustomed way, for the age and aging that are ubiquitous. In this context, the basic notion that my mother conveyed was about healing through time. The body (married or not) would always heal. By itself, without constant scrutiny and tinkering: health was its default. 
The message she read off her crystal ball was, "Be calm; don't worry; all in good time." This optimism comes, she thinks, from her father, a first-generation Jewish immigrant ironworker who put his daughters through college in the Depression. The postwar boom assisted: That battered car was our first. In this and many other powerful normal sentences and anecdotes, my mother invented for me a trustful progress story about "the life course." The message went bone deep; it has functioned in all my recoveries, lifelong. It gave me a bias toward aging that had no sentimentality in it. The two scars in fact never disappeared; I can see them now. But it didn't matter that she was wrong. Her soothsaying—coming from a parent, it was true aging knowledge—strengthened me. It may have enabled me to become an age critic. It may be part of what made me bold enough to identify myself with the term "age studies." "Progress" is defined here as beginning in a personal relationship to time and aging, a willingness to get on the life course as on a train, for a lifelong journey, and an anticipation of staying on the Patagonian Local because the future seems worth it. Progress is latent in all developmental metaphors of psychological, physical, and moral "growth," as critical psychologist John M. Broughton has pointed out. Since my first book, Safe at Last in the Middle Years (1988), "progress narrative" is also the capacious term I have been using for stories in which the implicit meanings of aging run from survival, resilience, recovery, and development, all the way up to collective resistance to decline forces. The genre has been an influential engine of soul-making since the French Revolution, from Goethe's Wilhelm Meister's Wanderjahre to Toni Morrison's Beloved. Progress novels can be found in children's literature, the so-called coming-of-age story (of aging-into-adulthood), or in stories about aging-into-the-midlife or aging-into-old-age. 
Their implicit effect is to prove, through the readers' experience of fictional time passing, that faith in aging is justifiable. In Declining to Decline (1997), I tried to say what made the form culturally valuable and how writers could make it less vulnerable to criticism on the grounds of what it ignores. Progress is implicit in the age grading that children learn by entering school and by continuing through the education system, in the merit badges of scouting, and in other institutions in which status is conferred as if it depended primarily on aging. The Massachusetts Comprehensive Assessment System exam in 2001 offered this essay topic to seventh graders: "Age has a funny way of making changes. It is probably easy for you to look back and see that you and your friends have made some major changes [friends, teachers, interests, are suggested] since you left the elementary grades." The implicitly benevolent age-system that children are taught to rely on could be called "seniority." In the twentieth century, if not before, "growth" and "progress" became America's semiofficial life narrative for children. In normally protected circumstances, where these exist, children absorb into their age identity an increasing sense of control over things, authority over juniors and over themselves, and trust in the future. In adolescence, many convert the trend of predictable improvements in selfhood—"Next year you'll be riding a bike"—into positive expectations of the next stage, adulthood. Even dropping out of school to get a low-wage job and prematurely rushing into maternity may display a naive faith in the restorative power of aging-a-little-bit-further-along. Children under fifteen nevertheless have weak equipment with which to counter the feeling that horrified helpless regret is the only response to "aging," especially when decline narrative is imbued with medico-scientific certainty. Their age identity possesses too few records, and they know it. 
The "Face Aging" exhibit abused them spiritually, in the most delicate part of our age identity, the concept of the personal meaning of time. We're used to thinking of child abuse in other terms: malnourishment; inferior schooling; coercing poor children into labor or sexual slavery; inducing rich children to become premature consumers, mother naggers, and dysfunctional pleasure seekers. "The discourse of hope is replaced with the rhetoric of cynicism and disdain," Henry Giroux declares, warning about institutions that fail youth. Prospective decline narrative brings fatalism home: "You'll grow up to be a bum like your Dad"; "We can't get ahead." Some adolescents are exposed to domestic middle-ageism. When they are sunk in miserable adolescence, someone says glibly, "These are the best years of your life." Or, "You'll be old/have gray hair/be wrinkled too someday." But not all such messages are internalized. Even if such sentences are said out of bitter "realism" by an authority figure who feels he or she is "fastened to a dying animal," in Yeats's malignant phrase, destined for an age-adjusted pink slip, or immiserated by prejudice, some children find a way to mishear the words or fail to identify or reserve a hope that their fate will be different. They may continue to fantasize richly about their own future and about social change. But not in front of the science museum monitor. When I saw that exhibit I felt the temblor of a seismic change in American age socialization.
What of adults? In invoking the figure of a "threatened child" with "weak equipment" and new "needs," I have no intention of isolating children. On the contrary, I have been preparing to make a move fundamental to the emerging field of age studies, which believes that no age class exists in a capsule, insulated from whatever is impinging on the other age classes—younger and older. The move I am making at the end of this chapter, from the vulnerability of children to the vulnerability of adults, consolidates us all imaginatively as stakeholders in age ideology and the politics of aging, with a powerful interest in secret life-course narratives. What happens to the sense of rising through an institutionalized and secure and progressive age hierarchy, as those brought up in its expectations age beyond childhood and adolescence and, for some, their college years? Adults continue to build ever more complex age identities, and progress is supposed to remain a pursuable goal. Progress means that over time some people have acquired by some means a degree of what Anthony Giddens calls "ontological security." (Bracket the specifics. Some acquire this security by agency, some by luck, some by inheritance. Some conceive progress in masculinist metaphors, of overcoming less fit adversaries or cruising directly on time's arrow. Some envision it through a version (relevant to their class, race, gender, ableness) of the life-course narrative called "the American Dream." Some satisfy this narrative desire in base ways: progress can mean exploiting sweatshop workers more invisibly.) People need to feel, Giddens says, that they can "colonise the future with some degree of success." Historian Tamara Hareven calls this "life-planning": for people who parent, it involves children and even grandchildren, not the individual life but family time reckoned in generations. Even people who don't know literary forms or who face serious obstacles may do this planning and want that progress.
But aging-beyond-youth makes any sense of security more elusive, starting earlier for the disadvantaged. Life storytelling becomes more edgily poised within the binary of progress versus decline. To deal internally with the threat of deprivations, to keep their particular narrative going, adults respond on many levels, including a recurrent "drive toward identity stabilization" or "self-continuity." In our culture, adults too want to maintain a subjective sense "of having reached a higher level of self-knowledge, of having become more self-confident, of having gained more control over one's impulses," in the words of cultural psychologist Amos Handel, studying Israeli immigrants, nurses, army recruits. He concludes, "Stability and progressive aspects of the self-narrative are not inconsistent and may actually coexist." So if you ask, with Zygmunt Bauman, "What possible purpose could the strategy of pilgrim-style 'progress' serve in this world of ours?" one answer is that stability and progress are felt as interior needs, essential to the survival of the self. Another answer lies in how the life-course opposition of progress and decline constrains narrative options in our culture. Age studies might help explode that binary. But until it does so, progress narrative—as the only apparent alternative to decline—is almost obligatory. The paradox is that in many ways and for many people, aging-past-youth is increasingly becoming a decline. I argue in the first half of this book that the structures that support progress and progress narratives are slowly being withdrawn early or late in middle life from all but the most privileged. My research uncovers bad news about the economics of the life course—detailed in chapter 5, "The High Costs of Middle Ageism": the age/wage curve peaks low at midlife for white and black women, for black men, for those with low educational attainment. 
Even for well-educated men now at midlife, the most privileged class, there are disquieting signs of loss of employment, stagnant or declining income, erosion of respect. Our age ideology presents us with an overarching contradiction. It constructs a tension—the subject of chapter 2—between two pressures: "Change, dammit, in subjection to ineluctable laws, while simultaneously never getting older" (because being young is the single best promise of being able to succeed in the future). This can be an irresolvable and frightening conundrum for anyone who is no longer young. The tension increases over the life course. But it can be a buried terror even for those still young (even ideally young: male, single, highly educated, and child free), reading their own prospective decline in the ads and through their nightmares at 4 A.M. The losers are little better equipped than children to explain why progress is merited in their case. The unconscious belief produced by these circumstances—that decline is as inevitable as disease and has an early onset—is now so widespread that it seeps willy-nilly into artifacts like the booth in the Boston Science Museum. In light of this, the booth is what the joke calls a second opinion: Patient: "Doctor, doctor, you tell me I have a terminal illness. I want a second opinion!" Doctor: "All right, you're ugly too." Everyone should be aware of the economics of the life course, the cult of youth, and other forces of middle-ageism. But should people who parent use this knowledge to "predict" decline to their children? Or say how early it began in their own lives? (Plan their surgeries in front of the kids?) Many, including me, would argue for defensive optimism: "You'll have a good life whatever happens." It would require a heart of ice, a false realism, a poor understanding of my data, or a failure of will to be more menacing. Children also need a foundation of heightened age consciousness on which to build their future age identity and confront future risks.
Adults need to become good enough age critics to explain the joke.
The rest of this book disentangles—as only a reconceptualized age studies can do—other central, toxic, unrecognized stories about age and aging from the welter of age culture. It will demystify their sources. As we identify the harms, we can figure out how to protect ourselves better—the children who sat in the booth, others waiting their turn to be exposed, the rest of us. We can resist more effectively only by changing age culture. Chapter 2 discusses why change should be possible, and the whole book suggests how. Activists can frame better arguments—building on a wiser, progressive form of age consciousness—to keep the concept of decline from becoming a reality for more people. There are socioeconomic contexts and political choices that might enable more people to attain more security over longer periods of life time, and there are narratives that can tell their full story more honestly. The overarching question is, How might more people of all ages develop a collective identification with the whole life world—especially the ages of life ahead? Only through such imaginative solidarity can we maintain our precious sense of self-continuity and possibility within the dangerous age ideology we confront in the twenty-first century. (Shades of another prospective narrative.) Is it too blissful to imagine, as our goal, being able to feel at home in the life course at every age?