Twenty–First Century Higher Education
The students in our study went through and were potentially shaped by four–year colleges and universities that existed in a particular historic moment—one in which the importance of rigorous academic study had been largely abandoned throughout many parts of contemporary higher education institutions. There are complex cultural, sociological, and historical explanations for the character of twenty–first century higher education that are worth touching upon to provide context for the empirical analysis that follows. While we have discussed some of this historical context in our prior work, we aim to extend those insights and focus more explicitly on the question of how higher education came to abandon academic rigor and promote social engagement for undergraduate students.
One widely shared explanation for the contemporary state of higher education, popularized by many institutional critics, depicts an organizational sector whose failings are a function of the creep of corporatization into academia’s once–hallowed halls. This deepening corporatization is argued to have led to a decline in the role of faculty relative to school administrators, and a corresponding marginalization of academic pursuits and student learning. “Now, the Zeitgeist is the market,” public policy scholar David Kirp has commented. “Still, embedded in the very idea of the university—not the storybook idea, but the university at its truest and best— are values that the market does not honor: the belief in a community of scholars and not a confederacy of self–seekers; in the idea of openness and not ownership; in the professor as a pursuer of truth and not an entrepreneur; in the student as an acolyte whose preferences are to be formed, not a consumer whose preferences are to be satisfied.”
Sheila Slaughter and Gary Rhoades refer to this as the emergence of “academic capitalism,” where actors within higher education in recent decades have brought “the corporate sector inside the university.” They argue that this corporatization manifests itself in multiple ways, including expanded ties with private–sector firms, growing attention to the economic utility of research endeavors, and increased market–based interactions with students, who have been redefined as consumers. The quality of instruction is compromised, according to Slaughter and Rhoades, because “expanded managerial capacity is also directed toward restructuring faculty work to lower instructional costs (although not costs generally).”
Scholars looking for evidence of a corporate turn in higher education can easily find ample material to support this claim by turning to various provocations provided by contemporary higher education administrators. Consider, for example, Rick Matasar, former law school dean and current New York University vice president for university enterprise initiatives, who wrote in his “Commercialist Manifesto”: “Commercialism is here, now, and it is not going away… . We are a business, deal with it.” Or perhaps consider the comments of the former president of George Washington University, Stephen Trachtenberg, who asserted that marketing colleges was similar to selling vodka—raising prices and improving packaging are generally sufficient to lure customers, as people mistakenly “equate price with the value of their education.”
Higher education, however, is a particular type of business. It is heavily subsidized by public provision and, with the exception of for–profit entities, untaxed. Higher education is also a business that has increasingly become characterized by the growth of administration and a marginalization of faculty. “Every year, hosts of administrators and staffers are added to college and university payrolls, even as schools claim to be battling budget crises,” political scientist Benjamin Ginsberg notes in The Fall of the Faculty: The Rise of the All–Administrative University and Why It Matters. “As a result universities are filled with armies of functionaries—the vice presidents, associate vice presidents, provosts, associate provosts, vice provosts, assistant provosts, deans, deanlets, deanlings, each commanding staffers and assistants—who, more and more, direct the operation of every school.” Again, empirical support for these claims is not hard to find. Ginsberg presents an analysis from the National Center for Education Statistics showing relative changes in the number of full–time equivalent (FTE) students to higher–education personnel from 1975 to 2005. According to this analysis, student–faculty ratios have remained roughly constant—16:1 or 15:1—while student–administrative ratios have changed from 84:1 to 68:1 and student–professional staff ratios from 50:1 to 21:1.
There is a good deal of truth in these popular critiques of higher education, but we believe that they often fail to adequately describe the transformation of colleges and universities in terms that are immediately relevant to our focus on students’ collegiate experiences. For example, the increase in noninstructional staff and administration is not simply a case of bureaucratic bloat. Rather, the expansion is required by an organizational logic inherent to a particular higher–education model, which developed over the past century and has considerably strengthened in recent decades in the United States. One could imagine, after all, a corporate business that was focused on student learning—one that, for example, not only catered to satisfying students’ nonacademic needs as consumers, but also carefully measured and reported student learning outcomes to demonstrate value to clients. As a whole, however, the higher–education sector has been at best ambivalent about such assessment efforts.
It is not just that the relationship between faculty and administrators has changed as higher education increasingly has become managed along organizational principles found in corporations. More importantly, educators have increasingly ceded their authority to students, and administrators have shifted institutional emphasis from students’ academic and moral development to their personal growth and well–being. Empowering and catering to students as consumers has only exacerbated these broader and deeper changes that have come to characterize US higher education.
Historian Christopher Loss has astutely highlighted the extent to which the current character of higher education developed from a convergence of several forces, including the development of a “personnel perspective” that, following World War I, was embraced and adopted by higher–education administrators; the legal and cultural undermining of school authority in the 1960s; and the willingness of the federal government to provide financial support to subsidize the enterprise without seeking administrative oversight or demonstrated student achievement. As higher–education institutions began to grow and become more diverse in the twentieth century, administrators moved away from their earlier commitments, often religiously informed, to ensure moral development and character formation for students. Instead, they turned to the field of organizational psychology and embraced models of personnel management. According to Loss, beginning in the 1920s, “personality and the belief in pliable selfhood eclipsed character as the ‘chief purpose of college education.’ ” Specifically, the institutional model of student services was developed and promoted organizationally by the American College Personnel Association (ACPA). According to proponents of the personnel perspective, colleges needed to be brought “into closer organizational touch with [their] students” to solve their problems of maladjustment and attrition. College admission offices began to ask applicants for detailed information about their personal lives so that they could choose students who would fit the specific organization and, theoretically, so that they could provide individualized programs of counseling. During this time period, colleges and universities also began to pay increasing attention to promoting extracurricular activities and university–affiliated housing. Beginning in the 1920s, students’ involvement in extracurricular activities such as student clubs, athletics, and fraternities and sororities was argued to enhance their success.
While the “personnel perspective” or the student service model was ostensibly promoted to address the high attrition rates colleges and universities were experiencing in the 1920s—when 35 percent of students who entered four–year institutions did not finish their first year and more than half of students did not graduate—the approach flourished, according to Loss, because it was aligned with institutional interests. Specifically, administrators had a rationale for expanding their institutions’ scale and scope, as well as their own role within those organizations. Although the student service model never worked as well as administrators hoped it would (college attrition rates have remained high to this day), “this fact did not weaken administrators’ faith in the personnel perspective; it strengthened it.” The model’s success reflected its fit with organizational interests, not its demonstrated efficiency in improving student outcomes per se.
If colleges began to adopt a personnel approach in the 1920s, it was not until the last fifty years, with the growth of student rights, changes in college financing, and broader cultural adoption of a therapeutic ethic, that the implications of this model for collegiate life were fully felt. Prior to the 1960s, US courts granted colleges and universities broad in loco parentis rights to regulate student behavior both on and off campus. Following a period of student rights contestation from 1969 to 1975, the ability of US schools, including colleges and universities, to regulate student behaviors was significantly diminished and constrained. A landmark Supreme Court decision in 1975, Goss v. Lopez, guaranteed rudimentary due process rights to elementary and secondary students in public schools who faced even minor day–to–day school discipline. While Goss v. Lopez was about public high school students, lawyers quickly came to apply this precedent broadly and effectively to private schools and higher–education institutions, which were argued to have an implicit contract that afforded rights to students in these settings. As is often the case when organizations face uncertainty in their legal environments, schools began to develop internal structures that mimicked those found externally. Colleges and universities detailed rights and procedures in student handbooks, administrative committees were established to handle disciplinary matters bureaucratically, the right to appeal administrative decisions through grievance procedures was articulated, and adversarial legal challenges became considerably more prevalent.
In 1974, colleges and universities were also faced with new federal regulations enacted into law through the Family Educational Rights and Privacy Act (FERPA), which prevented them from reporting student grades or disciplinary matters to employers, graduate schools, and parents without prior student consent. Students applying to graduate school had the right to view recommendation letters written by their professors. “Ours is the age of judges and legislators who routinely second–guess decisions with which they disagree,” one education lawyer recently commented, “even if it means substituting their own views for the considered judgment of educational professionals.”
It was during this period that colleges and universities also began to require the use of course evaluations, in which students were further empowered to evaluate their instructors. Given these changes, it is not surprising that in recent decades researchers have found that cheating and plagiarizing have increased dramatically and the hours students spend studying have plummeted. One way to get a sense of the extent of these changes is to contrast the current situation with Howard Becker and colleagues’ depiction of faculty–student academic relationships in the 1960s. In Making the Grade, they argued that faculty and administrator “subjection” of students was nearly complete: “They decide what students are to do, when they are to do it, something of how it is to be done, and what rewards or punishments will be given to those who do or do not meet the standards.” Today’s students, by contrast, can rely on social networks to identify course sequences with few academic demands, tame professors whose standards are considered too demanding, and earn high marks by studying little more than an hour per day.
Colleges and universities were reluctant to challenge student interests as they increasingly became financially dependent on satisfying the demands of students acting as consumers. “Students could not have won such hegemony if parents had not frequently abdicated their authority,” sociologist David Riesman has noted, “often siding with their children against secondary school teachers over matters of student discipline, and at the postsecondary level unwilling to support institutional demands where these conflicted with student preferences.” Studying the revealed preferences of students, as demonstrated by their choice of which college to attend, economist Brian Jacob and his colleagues have shown that the majority of students manifest consumer preferences for institutional investments in college amenities, such as student activities, sports, and dormitories. Of course it was not just permissive parenting, but increased public subsidies—including poorly understood guaranteed student loans—that facilitated this particular brand of student consumerism. Absent these forms of financing, it is possible that students would articulate a different set of consumer preferences more closely aligned with a conception of college as a long–term investment in one’s future. As Riesman noted, “If they decided to attend, they might not feel that they were having a subsidized lunch whose nutritional quality need not be examined as carefully as if one were paying out of one’s own eventual pocket.”
Increased higher–education institutional commitment to promoting a personnel perspective that celebrated self–exploration and social well–being was also well aligned with broader cultural trends occurring in society. Specifically, in recent decades a therapeutic ethic increasingly came to underlie modern conceptions of the self and society. Sociologist Christian Smith, for example, has argued that “the de facto dominant religion among contemporary US teenagers is what we might well call ‘Moralistic Therapeutic Deism.’ ” According to Smith: “Being moral in this faith means being the kind of person that other people will like, fulfilling one’s personal potential, and not being socially disruptive or interpersonally obnoxious.” Sociologist Jennifer Silva has documented the broad diffusion of this therapeutic model in the lives of contemporary emerging adults. Silva argues that it has become ubiquitous in contemporary US culture, “propagated through school psychologists, family services, the service economy, self–help literature, online support groups, addiction recovery groups, medical trials, or even talk shows such as Oprah.”
Given these changes, colleges and universities became spaces where both students and the institutions they inhabited were increasingly focused on personal development and social engagement. As sociologist Steven Brint documented in his examination of undergraduate time use in the University of California system, while students spent thirteen hours a week studying, they spent more than three times that amount on recreation (twelve hours socializing with friends, eleven hours using computers for fun, six hours watching television, six hours exercising, five hours on hobbies, and three hours on other forms of entertainment). Higher education institutions today are academically adrift but socially alive, active, and attentive. This emphasis on the social sphere at least partly reflects the role of schools in socializing students for adult roles in society.
Colleges and universities, like all schools, typically function to instill students with behaviors and attitudes aligned with the contemporary values of the society of which they are a part. “Educational transformations are always the result and symptom” of a set of larger social transformations that have produced “new ideas and needs,” sociologist Émile Durkheim noted. From this perspective, one would argue that if colleges, universities, and the students in their midst are increasingly focused on personal growth, individual well–being, and social engagement, it is likely the case that this new pedagogical orientation is seen by many as closely aligned with the personalities thought necessary for successful transitions to adulthood.
Following World War II, sociologist David Riesman, in The Lonely Crowd, described the emergence of young adults in the upper middle class of urban cosmopolitan areas who were increasingly “other–directed”—focused on getting along with others, rather than being grounded by their own deeply held “inner–directed” values and motivations. Sociability and sensitivity to social groups was understood increasingly as a requirement “for success and for marital and personal adaptation.” According to Riesman, schools that were focused on socializing students for middle–class adult roles would increasingly embrace approaches aligned with the development of these “other–directed” dispositions.
Riesman highlighted how extracurricular activities and group projects were increasingly promoted to socialize students for this new model of individual development. “Play, which in the earlier epoch is often an extracurricular and private hobby, shared at most with a small group, now becomes part of the school enterprise itself, serving a realistic purpose,” Riesman observed. In promoting group activities, the teacher redefines his or her role in a manner similar to that of an “industrial relationships department in a modern factory … increasingly concerned with cooperation between men and men and between men and management, as technical skill becomes less and less of a major concern” (our emphasis). The teacher emphasizes that “what matters is not [students’] industry or learning as such, but their adjustment in their group, their cooperation, their (carefully stylized and limited) initiative and leadership.” According to Riesman, “The other–directed child is taught at school to take his place in a society where the concern of the group is less with what it produces than with its internal group relations, its morale.”
While Riesman’s dichotomy of “inner–directed” and “other–directed” ideal types can easily be critiqued on both theoretical and empirical grounds—for example, we are highly skeptical of drawing such sharp distinctions between intrinsic and extrinsic motivation—his work successfully articulated an emerging feature of cultural and institutional life that is consistent with social science research and general observation. Media scholar Todd Gitlin wrote in a 2000 foreword to Riesman’s book that contemporary students “born into a world of rock music and TV … [have] lived their entire lives as other–directed.” Gitlin maintained that “by the 1980s, the ‘exceptional sensitivity to the actions and wishes of others’ … had long since been institutionalized into the norms of talk shows and ‘sensitivity training.’ ” More importantly for our purposes, the promotion of this other–directed orientation had been institutionalized into the structure of US higher education.
The “personnel perspective” that historian Christopher Loss shows came to dominate higher education’s approach to student development was oriented to a general therapeutic ethos infused with cultural assumptions that social sensitivity, sociability, and interpersonal competencies were at the core of psychological adjustment and well–being. Higher–education institutions, largely through the promotion of social engagement, extracurricular activities, and group learning, focused on individual growth and self–realization, the development of personality and identity, tolerance for difference, and the ability to get along with others. Colleges and universities not only promoted these values, commitments, and activities on their campuses but also actively selected students on these factors when given the opportunity, as sociologist Mitchell Stevens has documented in his ethnography of selective college admissions:
Admission to places like the College typically requires considerably more than academic accomplishment, and mothers and fathers with their eyes on top schools begin investing in the development of their children’s extracurricular abilities many years before college begins… . At least part of their incentive for doing all of this is the hope that their incremental investments will sum to the “talented” athletes and musicians favored by admissions offices at the nation’s most selective schools.
Middle–class parenting that promotes “concerted cultivation” of youth through extracurricular activities is reinforced and legitimated by colleges that select on and emphasize these orientations for individual development.
After students enter college, institutions continue to support nonacademic pursuits, as is carefully documented in sociologists Elizabeth Armstrong and Laura Hamilton’s recent in–depth qualitative study of a cohort of female students from college entry to exit at a flagship public research university. This research provides a detailed and illuminating window into the extent to which social, not academic, engagement dominates campus life for most students. Armstrong and Hamilton documented that while some students were focused on learning, mobility, and professional training, the largest pathway through college was a “party pathway” implicitly promoted and supported by the way in which the school was organized. “The party pathway is a main artery through the university, much like a well–paved eight–lane highway directing traffic into a major city,” Armstrong and Hamilton noted. “On–ramps are numerous and well marked, and avoiding it completely requires intent, effort, and intricate knowledge of alternative routes.”
While college students’ focus on social activities is not new, the extent to which those activities are now perceived as being closely aligned with adult development and the purpose of college arguably is. The cultivation of character, grit, perseverance, social obligation, and duty is a vanishing feature of campus life at the beginning of the twenty–first century. Instead, college increasingly is focused on personal exploration and the development of young adults who are socially acclimated for middle–class societal roles. Many of the college students in our study have come to believe that “it’s not what you know, it’s who you know.” The colleges where they enrolled did not fundamentally challenge that logic. Instead, they implicitly offered a friendly amendment: “It’s not what you know, it’s who you know and who you are” in terms of interpersonal competencies, psychological well–being, and capacity for social adjustment.
Emerging Adulthood and an Unforgiving Economic Environment
Who these college graduates truly are, and how their life course conditions and outcomes should be assessed, are, however, questions for empirical investigation and general debate. One way to assess graduates’ life outcomes is to ask whether colleges support their movement towards adulthood. In general, there is agreement from scholars and the larger public on the definition of what constitutes traditional markers of adult status. For example, sociologist Mary Waters and her colleagues have asserted, “In the United States, becoming an adult is achieved when a person takes on a set of socially valued roles associated with finishing schooling, leaving home, starting work, entering into serious relationships, and having children.” Transitions to adulthood do not require the accomplishment of all of these conditions, although it is generally assumed (given prevailing social norms) that the majority of them should be satisfied.
Social, economic, and cultural changes in society have led in recent decades to increasing numbers of individuals, including college graduates, not making traditional adult transitions either in their twenties or beyond. Indeed, researchers studying contemporary transitions to adulthood have noted that “much of the pertinent action occurs in the early thirties.” Social scientists have documented that “the transition from adolescence to adulthood has in recent years become more complicated, uncertain, and extended than ever before.” The likelihood of young adults living at home has increased significantly from the 1970s to today. In addition, the age of marriage has been delayed six years, from an average of age twenty in 1960 to older than twenty–six today.
The reasons for these changes in life course development are complex. For example, some scholars have emphasized the importance of structural factors, such as changes in the economy. In a recent book, sociologist Katherine Newman has argued that “globalization has ensured that the economic conditions that underwrote the earlier, more traditional, road to adulthood no longer hold,” and that “new entrants fall back into the family home because—unless they are willing to take a significant cut in their standard of living, the last resort these days—they have no other way to manage the life to which they are accustomed.” Other sociologists have emphasized cultural factors that underlie these changes. Christian Smith writes, “The adult world is teaching its youth all too well. But what it has to teach too often fails to convey what any good society needs to pass onto its children.” Pointing to the rise of moral individualism, relativism, and consumerism, Smith asserts that “American culture itself seems to be depleted of some important cultural resources that it would pass onto youth if it had them.”
While a range of structural and cultural factors have been responsible for changes in the timing of adulthood in our society, one interesting institutional feature that has received less attention is the extent to which individuals, particularly from middle–class social backgrounds, have been spending more and more time in colleges and universities. The extent to which higher education is therefore implicated, not at the periphery but at the core of these changing patterns, is worth emphasizing. As a set of interdisciplinary scholars organized by the MacArthur Foundation and focused on transitions to adulthood succinctly noted, “The hope for and necessity, if not always the reality, of obtaining post–secondary education (or additional training through the military or an apprenticeship) has created the growing gap between the end of adolescence and achievement of adult statuses.”
What is accomplished and what fails to be accomplished at college is thus central to the transitions of many emerging adults. This is true for a number of reasons. As an increasing percentage of individuals are going to college and taking longer to complete undergraduate degrees, large numbers of individuals are living for longer periods of time in relatively unsupervised residential halls or independently, as opposed to living under the auspices of parental authority and commuting from home. National studies of college freshmen show a significant decline in the percentage of students living with parents or relatives, from 21.3 percent in 1973 to 14.3 percent in 2006. With the exception of a handful of Scandinavian countries (Sweden, Norway, Denmark, and Finland), undergraduate students in European countries are much more likely to live at home with their parents and commute to college. For example, 75 percent of Italian college students, 55 percent of Spanish, 48 percent of French, and 22 percent of English and Welsh students live at home with parents during their college years. In the United States, students are not only going to college but are increasingly going away to college, and are spending longer and longer periods of time there.
Consequently, large numbers of students—for increasing amounts of time—are deeply immersed in collegiate social life; they are embedded in peer climates that sociologists have characterized as “adolescent societies”—or perhaps, given the age of the students in these settings, what we might call “emerging adult societies.” As we have described above, the power of these young adult peer subcultures is enhanced by school authorities, as social engagement with peers is not discouraged but, rather, is institutionally advocated, endorsed, and subsidized. For many undergraduates, college is understood and experienced primarily in terms of social interaction with their peers. Sociologist Michael Rosenfeld observed that “by the 1970s, coeducational college dorms were common, curfews were a thing of the past, and the college campus had become an important site of social and sexual experimentation.” For many individuals from middle– and upper–class social origins, the college years begin a period of independence that allows for the exploration of a wide range of life–course pathways. This period of individual exploration and experimentation can last well beyond college as young adults attempt to “find themselves.” Researchers studying transitions to adulthood have observed that “growing numbers of young people give themselves an early sabbatical to travel and experience life or engage in a community service project before deciding what they are going to do with their lives.”
While overall trends provide useful contours for understanding the transition to adulthood, as well as how colleges are implicated in this process, individual trajectories vary with respect to different components of undergraduate education—including development of general collegiate skills, college majors, and selectivity of the institutions attended—that are potentially associated with students’ post–college trajectories. Individual lives also unfold and are profoundly shaped by social background and historic circumstances. In his classic work Children of the Great Depression, Glen Elder demonstrated that the cohort of people who had been children during the Great Depression experienced long–term consequences with respect to their subsequent adult careers, marital formation, health, and world–views. Interestingly, Elder found that while working–class youth suffered long–term negative outcomes from the economic hardships in their lives, middle–class children who experienced economic deprivation but had familial resources as a buffer often assumed greater responsibility for taking care of others, and had largely positive adult outcomes. The consequences of delayed transition to adulthood in general, as well as the difficulties that recent college graduates experienced specifically as a result of the economic downturn, likely vary on the basis of social origins. As Frank Furstenberg and his colleagues have noted about emerging adulthood, “The ability of families to manage this long and complex period clearly varies greatly by the resources they possess or those they can access through formal or informal ties.” Whether delayed transitions to adulthood are a cause for concern is thus partially dependent on the extent to which emerging adults have resources that allow them to be adrift for a while, before they “find themselves” to lead more directed and purposeful lives.
The college students we followed in our study, who largely graduated in 2009, faced particularly difficult economic conditions associated with the 2007 recession. While they faced dismal economic circumstances that contributed to the labor market difficulties observed in our study, their conditions were perhaps not as dire as some social commentators have argued. Claims in the popular media, for example, featured unemployed and indebted college students joining Occupy Wall Street and suing their colleges for malpractice. Increasingly, commentators were asking whether college was worth it, and often explicitly invoked our prior work to question the value of undergraduate education. In spite of this sometimes shrill commentary, social scientific research on how college graduates fared during the recent economic downturn demonstrated that college–educated young adults continued to experience significant advantages in finding desirable employment, relative to those young adults without a degree. Sociologist David Grusky and colleagues write, “The deteriorating market situation of recent college graduates, while real and troubling, is nonetheless less extreme than that experienced by less–educated groups.”
Copyright notice: Excerpted from Aspiring Adults Adrift: Tentative Transitions of College Graduates by Richard Arum and Josipa Roksa, published by the University of Chicago Press. ©2015 by University of Chicago Press. All rights reserved. This text may be used and shared in accordance with the fair-use provisions of U.S. copyright law, and it may be archived and redistributed in electronic form, provided that this entire notice, including copyright information, is carried and provided that the University of Chicago Press is notified and no fee is charged for access. Archiving, redistribution, or republication of this text on other terms, in any medium, requires the consent of the University of Chicago Press. (Footnotes and other references included in the book may have been removed from this online version of the text.)