
Where Higher Education Went Wrong

Today’s colleges and universities are little more than credentialing factories.


“Whether education can foster and improve culture or not, it can surely adulterate and degrade it. For there is no doubt that in our headlong rush to educate everybody, we are lowering our standards, and more and more abandoning the study of those subjects by which the essentials of our culture — of that part of it which is transmissible by education — are transmitted; destroying our ancient edifices to make ready the ground upon which the barbarian nomads of the future will encamp in their mechanised caravans.”

T.S. Eliot, Notes Towards the Definition of Culture

Plumbers and proctologists both clean our pipes. Plumbers merely attend to a later stage of waste-disposal. Proctologists earn more, but plumbers can still make a decent living. And there the similarities between the two end. In today’s America, they occupy utterly distinct social spheres. Proctologists attend house parties in the Hamptons, while plumbers set foot inside only in the event of a real emergency. Proctologists are the “educated” elite. Plumbers are… well, plumbers. They’re the guys who wear denim overalls, as exemplified by Super Mario, the most famous plumber of all time. Then we have working-class hero Joe the Plumber, who made his name speaking up for small businesses at an Obama campaign stop during the 2008 presidential campaign. The question, of course, is why. What is it that elevates proctologists and relegates plumbers to the lower rungs of our social totem pole? Is the distinction justified? And the even more important question is what it says about us that we make the distinction at all.


A Debate in Which We Were the Losers

Nearly a century ago, the writer and journalist Walter Lippmann and the philosopher John Dewey engaged in an extended debate in print on the subject of the general public’s fitness for democracy. Much had changed, both agreed, since the days when the Founding Fathers had envisioned gentlemen-farmers contemplating and voting on all the significant issues of the day, issues with which their experience of life had made them largely familiar. The nation’s growth and industrialization had made it, Lippmann argued, “altogether too big, too complex, and too fleeting for direct acquaintance.” Gone were the days of the kind of “omnicompetence” that democracy required to be effective, i.e., informed citizens making wise decisions about their own futures. Now we were stabbing in the dark of a “pseudo-environment” in which “everything is on the plane of assertion and propaganda,” and we “believe whatever fits most comfortably with [our] prepossessions.” Lippmann’s solution to this problem was to have policymakers take their marching orders not from the general public but from apolitical experts who would make “unseen facts” intelligible to the politicos. 

Dewey’s response to Lippmann’s proposal was that “[n]o government by experts in which the masses do not have the chance to inform the experts as to their needs can be anything but an oligarchy managed in the interests of the few.” Democracy had to rest on an “assumption of responsibility” by the demos. Lippmann’s “omnicompetence” was unnecessary:

There is no limit to the intellectual endowment which may proceed from the flow of social intelligence when that circulates by word of mouth from one to another in the communications of the local community…. We lie, as Emerson said, in the lap of an immense intelligence. But that intelligence is dormant and its communications are broken, inarticulate and faint until it possesses the local community as its medium.

Due to the progressive impoverishment of the entire category of “the local,” Dewey’s vision of an America in which thriving local communities serve as natural networks within which local color—the reality on the ground—can filter up and out while expert knowledge filters down and through is receding further and further from view. Centralized government and multinational corporations dominate the field, and we are far more likely to feed our news into and get it from globe-spanning social-media conglomerates than friends and neighbors we trust. At the same time, we have self-administered a mega-dose of Lippmann’s prescription of government-by-experts, and Dewey’s prediction that keeping the experts non-partisan would be impossible and that it was a recipe for oligarchy has come to pass.


To avoid the distraction of Covid politics, I will not delve here into what, to many, is the most glaring recent example of the problem, nor will I discuss the fraught question of whether it was a good idea to allow ourselves to spend a few years being governed by Big Pharma shill Anthony “La Science, c’est Moi” Fauci. The larger point, as Alasdair MacIntyre wrote in After Virtue, is that “in the social world of corporations and governments[,] private preferences are advanced under cover of… the findings of experts.” We have been disempowered on every level. Big Pharma, the AMA’s medical-licensing scheme, and the insurance industry have taken decision-making about the most critical matters of life and death out of our hands. “First, do no harm” has gone by the wayside; most doctors long ago gave up the healing arts for the far more lucrative practice of pill-pushing, including pill-pushing us directly into the jaws of the opiate epidemic. Always advanced under cover of protecting us from ourselves, other licensing and accreditation schemes in the legal industry, accounting, education, and other métiers create artificial monopolies, enriching certain educational institutions and professional organizations.

In nearly every walk of life, the same dividing line is drawn in the sand and then fortified with ivory towers, separating those who have education, expertise, and credentials from those who—it would seem—do not. When, in stark contrast to Hillary Clinton’s “basket of deplorables,” Mitt Romney’s “47 percent,” or Obama’s folks who “get bitter and cling to guns or religion,” Trump proclaimed that he “love[d] the poorly educated,” it was a greater slap in the face of all those establishment interests committed to this dividing line than any of Bernie Sanders’ “billionaire class” rhetoric could ever have been. The left-populist Christopher Lasch made essentially this point in the mid-1990s in Revolt of the Elites:

According to [New Republic editor Mickey] Kaus [in The End of Equality], the most serious threat to democracy, in our time, comes not so much from the maldistribution of wealth as from the decay or abandonment of public institutions in which citizens meet as equals. … The ‘routine acceptance of professionals as a class apart’ strikes Kaus as an ominous development. So does their own ‘smug contempt for the demographically inferior.’ Part of the trouble, I would add, is that we have lost our respect for honest manual labor. We think of ‘creative’ work as a series of abstract mental operations performed in an office, preferably with the aid of computers, not as the production of food, shelter, and other necessities.

The Rise of Big Ed

It was not always thus. Up through much of the nineteenth century, college, while conferring some degree of prestige and some tangible economic benefit, was not seen by most families as either necessary or ultimately worth the cost. After the Civil War and especially with the coming of the Gilded Age, university enrollment began to grow, and another surge came with the burgeoning popularity of collegiate athletics and all the now-familiar hallmarks of college social debauchery in the 1920s. During this same era, medical schools and law schools began requiring some college or a college degree before students could begin their professional training. Between 1920 and the end of World War II, college enrollment increased from 5 percent to 15 percent of high-school graduates.

The real spike in the graph came with 1944’s G.I. Bill, making college education a great bargain for returning veterans. A university education was now no longer just for the children of the wealthy elite. Before World War II, most major state universities had enrollments between 3,000 and 6,000, while in the decades after the war, many grew to 10 times that size. The Cold War, and especially the shock to the system that came after the Soviets beat the U.S. to space with the launch of the Sputnik satellite in 1957, prompted President Eisenhower to initiate a renewed education push and the passage of education-funding legislation such as the National Defense Education Act (NDEA) of 1958. Americans needed to get the message that education—and especially higher education—was the key to competing in a more technologically complex world.

That college enrollment, in the ensuing years, offered a means of avoiding the Vietnam draft made that message go down a whole lot easier. But when the draft ended in 1973, with the reputation of universities tarnished in the minds of many Americans as a result of their role in the ’60s counterculture, the federal government had to step in to tip the scale once again. Thus, in the 1970s, generous financial aid appeared on the scene, initially taking the primary form of Pell Grants and later moving increasingly to low-interest loans. Such programs contributed to university enrollment, while simultaneously enabling the era of skyrocketing tuition costs (research strongly suggests that the availability of more generous loans has driven tuition hikes), with the ultimate cost being shifted to a combination of taxpayers and enrollees’ future selves. 

Large enrollments and high tuitions meant that universities had become lucrative business opportunities. The era of Big Ed, with its administrative bloat and prominent system of standardized testing, was upon us. As a 2018 article in the Atlantic on the skyrocketing cost of education in America explained, the upshot, based on OECD data, is that “U.S. colleges spend more on non-teaching staff than on teachers, which is upside down compared with every other country.”

A Wage Premium Driven by Credential Inflation

College was being pushed upon us by government and Big Ed alike, with the two often working in close collaboration. And there is no question that heeding the advice we were given and getting a college degree, as they all said we should, was a wise financial strategy, though that’s less true for the 36 percent of college enrollees who do not graduate or for the full quarter of college graduates for whom the degree never pays off, especially those who major in subjects such as psychology, visual arts, music, philosophy, religion, education, or ethnic or gender studies (as opposed to engineering or computer science). Nevertheless, on average, while college graduation as recently as 1980 offered a 23 percent wage premium over high-school graduation, by 2019, that number had risen to 77 percent.

The more important and interesting question, however, is whether this massive wage premium is actually being driven by anything like a 77 percent advantage in skills portable to the workplace. Several independent lines of evidence suggest the answer is no. Only 46 percent of college graduates even work in their field of study, with 29 percent working in a different field and 16 percent being of working age but unemployed. A 2011 study by sociologists Richard Arum and Josipa Roksa concluded that 45 percent of students show no significant improvement in learning in their first two years of college, and 36 percent of students show no significant improvement the whole way through, while even those who improve generally do so very modestly. To allay any suspicion that the actual benefit of higher education is clustered at the top, i.e., in elite universities, a 2020 study of 28,339 graduates of 294 universities found that despite the substantially higher salaries graduates of elite institutions command, for every 1,000-place drop on the university rankings, the concomitant drop in job performance is a meager 1.9 percent. Moreover, even this meager advantage in overall job performance by elite graduates comes at a substantial cost—and not one easily measured in dollars and cents. Telling us what we already know, the authors explain that “graduates from top universities tend to be less friendly, are more prone to conflict, and are less likely to identify with their team.” While not constituting definitive proof of anything, such data strongly suggest that college education is not directly arming us with marketable skills—or, really, with much of anything, and certainly not anything that should result in a 77 percent wage hike.

The wage premium, then, is likely being paid to college graduates for some reason other than the skills they gain in college. A bit of independent confirmation for this hypothesis comes from merely thinking back, if you are a college graduate, on your own experiences. Most of us, for better or for worse—and I will return later to the question of whether it is for better or for worse—didn’t spend our college years learning anything employment-focused. Unless you were something like a computer-science major who went on to a job in I.T., you probably spent college pursuing your passions, dabbling in this and that or taking classes that aimed toward a “practical” major but still did not directly arm you with marketable skills. The truth is that, unless you went to technical school, professional school, or graduate school for professional training (and even then I can testify that most attorneys learn no more than 5 percent of what they need to know in law school), more than likely you picked up the skills you need during your first years on the job.

So, again, why the college premium? The most obvious explanation is that college has come to serve a gatekeeper function in our economic life. If you are hiring for any profession that requires intellect and responsibility, the fact that someone got into college and then made it the whole way through will give you evidence of both qualities. If your applicant graduated from a top university, so much the better. A 2017 Harvard survey of 600 employers found that 61 percent routinely tossed out the resumes of applicants without four-year degrees even if the applicants were qualified for the position and even though 63 percent of the respondents had trouble filling middle-skill jobs, i.e., jobs such as food-service management, childcare, and supervisors of production, construction, or retail-sales workers. They did so, moreover, despite the fact that they had to pay a premium of between 11 percent and 30 percent for college-educated workers, while getting, in exchange, no “material improvements in productivity.” Because of employers’ increasing degree fixation, as the supply of college graduates entering the job market grew while the supply of well-paying jobs became increasingly targeted toward those college graduates, we saw, as we would expect to see, the formation of an ever-starker dividing line between college graduates and the lower-caste Morlocks toiling away in the trenches.

Creating Castes, Serving Two Masters and Pleasing Neither

Does creating that stark dividing line between the roughly 40 percent of us who graduate from college and the majority who do not—especially when it is not tied to any meaningful difference in skills or erudition—serve any positive social good? The question is rhetorical, but it is also empirical. In the 1950s, the prominent Berkeley sociologist Robert Nisbet had already written of “the growing popular conception of university education as a means less of illumination than as an avenue to social status and intellectual certainty.” There is substantial evidence that university education today makes us more certain of ourselves and more intolerant of others, while attributing to them extreme positions they do not hold. As the historian Adam Garfinkle has written, these superficially educated minions “contribute scantily supported opinions about things they don’t really understand, validating the old saw that a little bit of knowledge can be a dangerous thing.” A 2019 study by the polling and analytics firm PredictWise, retained by the Atlantic to analyze partisan prejudice, found that a high level of education was strongly correlated with political intolerance of those who held different views.

A 2019 study by the “More in Common” project similarly found that “the more educated a person is, the worse their Perception Gap”—their distorted view of and tendency to attribute extreme positions to those on the “other side.” Interestingly, however, the research found that this error was being committed only by educated Democrats: “Democrats’ understanding of Republicans actually gets worse with every additional degree they earn. This effect is so strong that Democrats without a high school diploma are three times more accurate than those with a postgraduate degree.” The reason? Monocultural groupthink: “Highly educated Democrats are the most likely to say that ‘most of [their] friends’ share their political beliefs.” Even being more plugged in to the media, as more educated people tend to be (especially when it comes to consuming more “in depth” coverage), may actually only deepen existing bias in our contemporary polarized media landscape. More in Common: “We found that the more news people consumed, the larger their Perception Gap.”

Why is American higher education such an abject failure? Our system of higher education simultaneously and confusingly aims at two entirely distinct goals—and, as such, is discharging neither of them especially well. It is, on the one hand, supposed to prepare us for the labor market by arming us with practical skills, and on the other hand, supposed to turn us into Renaissance men and women, who have a deeper understanding of our world. But the most popular major in college today is “business”—a nonsense major that neither educates anyone in the broad sense nor prepares them for anything in particular in the narrow sense of the workaday world. Christopher Lasch observed that most college students “get little training in writing (unless ‘Commercial English’ is an acceptable substitute), seldom read a book, and graduate without exposure to history, philosophy, or literature.” 

The reality is that the goal of pure workplace preparation, as I have already suggested above, can be discharged far more directly and efficiently by technical and professional schools of the sort more common for aspiring professionals in Europe. The brain trust at Google, one of the nation’s elite employers, has recently figured this out and taken it to a whole other level, as the company has opened up a no-college-degree-required free Coursera course to teach the exact skills necessary in its workplace, which then doubles as a means for Google to assess its aspiring engineers based on their coursework. 

As for the other goal of American universities—their traditional function of turning us into more deeply learned individuals—the ideal kind of program to achieve that end is something that is sadly all-but-entirely absent in our universities today. It is a program like the one at St. John’s College that offers no choose-your-own-adventure-style majors and courses to wide-eyed kids who typically have not the faintest clue of what is or is not worth learning. St. John’s offers a single “great books” curriculum that has students marching systematically through four years of education in the finest, most important, and most canonical texts the West has on offer, forming, in the process, a tight-knit academic community capable of participating in intensive discussions of the texts all are reading. Students do not emerge from such a program as ready-made accountants, nurses, or engineers, but they emerge as far more aware human beings, understanding on a deeper level than most of us who we are and where we came from.

Thus, while our universities make us neither more productive nor more enlightened, we continue to tout college education as though it were doing both. Because education in the broad sense is an unquestioned social good, we persist in trafficking in the prejudice that the college-educated set is, in some important respect, better than the rest, even if the actual current complexion of our universities offers us no sound reason to believe that. Indeed, as we have seen above, we have reason to think higher education is actually creating a class of wealthy, intolerant, bigoted, and ignorant snobs. Most proctologists, like most “educated” white-collar professionals, are as a rule profoundly ignorant on every matter beyond the ambit of their narrow field. The proctologist knows stuff about the human body—and, especially, about the human butt—that the plumber doesn’t. That is the extent of his enlightenment.

Aside from making graduates substantially wealthier—often unjustifiably so—the principal feat that higher education today accomplishes is the creation of castes. It creates, standing above all, its Brahmins, the high priests of academia and the country parsons staffing schools—who are, ironically, some of the least capable and qualified and some of the most ideologically extreme professionals in America. One rung below them are the Kshatriyas, our aristocrats, professional-school graduates, doctors and lawyers, commanding respect and substantial salaries. Next come the Vaishyas, our merchants and tradesmen, ordinary college graduates. Finally, we have the Shudras, the non-college educated servants and manual laborers, who become untouchable Dalits, deplorables, if they dare to let their political leanings take a right-populist turn. 

The Education of Partisans

The plumber and the proctologist, in a sane world, would be merely different kinds of technicians, one of whom simply needs to know more facts to do his job and, thus, commands a somewhat higher salary than the other. These individuals live in increasingly distinct worlds not because they inherently belong to utterly distinct sociocultural strata, but because our society has left them artificially stratified. With the college-educated and non-college-educated occupying more and more separate worlds, the party that used to stand with the labor movement and the working class has developed a profound contempt for the working class, which in turn overwhelmingly votes Republican nowadays. 

When universities became a monoculture, their graduates came to represent that same monoculture. Between 2000 and 2020, college-educated voters went from favoring Republicans by an 11-point margin to favoring Democrats by a 13-point margin. And, just as we would expect, as our college-educated elites have adopted an ever-more sneering attitude toward Republicans, Republicans have responded in kind, taking an ever-dimmer view of higher education, a phenomenon certain to exacerbate the partisan education gap. If President Biden’s “loan forgiveness”—another instance of Democratic elites’ unconcealed contempt for the working class, entailing a massive regressive tax transferring wealth from the disproportionately poorer non-college educated to the college-educated, including those earning up to $125,000—should survive legal challenges, the “educated” and less “educated” will become yet more polarized.

Elites are inevitable. From ancient times to the modern age, from absolute monarchies to communist states and everything in between, there has not yet been devised, in all humanity’s history, a large, complex society in which a class of sociocultural elites did not arise. A society is healthy—and I contend a society with virtually any form of government can be healthy or sick—when, above all else, the relationship between its regular folks and elites is organic and symbiotic. This occurs when, on the one hand, the elites are an “image” of the “commoners” writ large, representing the commoners and their needs and desires before kings and gods alike while serving as paragons of virtue, transmitting to the regular folks “in their character and behavior,” in the words of sociologist Philip Rieff, the culture’s system of “moral demands.” The commoners, on the other hand, should ideally admire and respect elites, even as they hold them to account and infuse elite culture with everyday vitality and dynamism.

But a big problem manifests when we send so many to college, who then graduate thinking of themselves as “elites.” This is what complexity scientist Peter Turchin has dubbed “elite overproduction,” a state of affairs in which more of us have claims on elite status and its privileges than society can reasonably accommodate, which, in turn, creates social instability when those aspirants left out in the cold inevitably become resentful and begin to question the very foundations of the culture that made them what they are. “Elite overproduction” is not the only way a healthy society falls ill, but it is one sure way to get there. In a sick society, the elites are a Walter Lippmannesque class apart, differing markedly in their beliefs and desires from the commoners, and looking upon regular folks with contempt. Rather than modeling manners, virtue, and veneration for exalted cultural traditions, the elites become narcissistic, lazy, and soft, demanding little of themselves, leading and promoting depraved lifestyles, enacting their unconscious but well-deserved self-disgust, modeling ignorance of and disdain for the monumental cultural achievements and traditions from which they and their entire society draw their lifeblood. 

The College Fix: Toward Real Education

How can we nurse our sick society back to health? The process will be long and laborious, but I believe that the first critical step is to recognize that our confused and faulty system of education has created this toxic divide. We must begin by decoupling true education from mere career preparation. As surveys report, “Almost all students say access to a well-paying job is a primary reason for attending college.” These people are not interested in an education for its own sake and do not strive to attain one in college. Let’s make their true interests and their actual attainments transparent, both for them and for everyone else. 

Just as plumbers complete a training program to qualify for their license, proctologists should be able to skip the pretense of “college education,” which deludes them and us into thinking they represent a higher species of being, and attend a medical-training program, albeit somewhat longer than the plumbing equivalent, to be licensed for their job. And the same is true for all those other professionals—lawyers, investment bankers, accountants, management consultants, marketing folks, programmers, and so on—who presently walk around thinking of themselves as “elite.”

Education is not mere job training. It is, rather, as T.S. Eliot wrote, “the process by which the community seeks to open its life to all the individuals within it and enable them to take their part in it. It attempts to pass on to them its culture, including the standards by which it would have them live.” That process—while ideally lifelong and enacted by society as a whole—begins, formally, in schools. But for those who are truly interested in furthering their formal education after graduation day comes and goes, a few universities should exist to serve that end. 

The universities of the future must be more akin to seminaries—rigorous and demanding, far more modest than our current universities in their means, but aiming far higher in their ultimate ends. There must be no majors, no menus of course options, no mascots, no organized athletics, no swanky student centers. The appeal of universities must be to those who relish a monastic existence and a monastic community dedicated wholly and exclusively to the study of “the best that has been thought and said in the world.” Some may choose to stay for many years, or for life. Others may get their fill and venture back out to devote themselves to the dissemination of this sacred knowledge to the wider world. In the end, such monkish scholars, like saints in our midst, might restore the vital place of education, so that it is no longer an expensive commodity that separates the haves from the have-nots but rather is a polestar that guides us all, to whatever extent we are each illuminated by its light.