Much has been made of recent news that about 60 percent of college students are female, “higher education’s dirty little secret,” as reported by the Wall Street Journal. Though obviously a disconcerting trend for men—who in increasing numbers are out of the workforce, addicted to drugs and porn, and killing themselves—there’s another, not-so-secret, not-so-little bit of dirt: the declining quality of education America’s college students are receiving. Indeed, even advanced degrees serve less as a marker of intellectual acumen than as an (irrelevant) measure of technocratic success for entry into our digital economy.
Consider a short course, a couple of hours long, that I recently took on maintaining safe and secure practices in the digital age. The instructor, who had multiple advanced degrees, had written a dissertation on the social and sexual practices of lesbians in Namibia. She was also, apparently, a Wiccan priest. And her self-professed pronouns were “she/them.” Towards the end of the class, a student expressed incredulity at the foolish things people do on the internet and their mobile devices, and what those persons allow third parties to know about them. With a self-assured smile, the instructor shrugged and said: “Well, some people just aren’t well informed.”
“Informed” was an interesting and even illuminating word choice, particularly coming from someone adhering to a religion popularized by a retired British civil servant in the 1950s. People, apparently, simply don’t have the proper knowledge to make good decisions because of poor education, poor access to trustworthy information, and perhaps even a little bit of “user error.” But what “knowledge” is actually valuable and integral to human flourishing?
For most folks in my class, I’d wager the answer is predominantly utilitarian, if not even clinical: Good knowledge is that which enables people to make decisions that protect and promote their personal welfare, be it economic, social, physical, or psychological. Technocratic expertise that helps a person acquire a job with a good salary is good knowledge. So is developing emotional intelligence to have amicable relationships at work and in one’s free time. So too is maintaining a solid exercise regimen and practicing “mindfulness,” perhaps through yoga or meditation.
A predominantly therapeutic, technocratic understanding of knowledge would be largely foreign to most humans throughout history, and even to our grandparents. The dramatic scientific developments that have transformed our world in the last couple of generations explain why: Forty years ago, nobody needed lessons in exercising prudential judgment with a computer or cell phone; people in more agriculture-based societies didn’t need Orangetheory or Peloton to teach them how to stay fit. There has been a paradigm shift in what modern humans value as knowledge.
Take the simple fact that someone can now spend their years in a state-sponsored public educational institution studying the sexual practices of people in southwest Africa. The social sciences have become risibly niche and arcane, focused on the bizarre musings and peccadilloes of activist academics who prize originality and provocativeness above all else. Justifiably, they have become an endless source of mockery, from an annual column in the New Criterion to the purposeful publication of fabricated research in leading academic journals.
Cardinal John Henry Newman identified this trend a century and a half ago, observing in The Idea of a University that “there is a demand for a reckless originality of thought, and a sparkling plausibility of argument.” The academic elite, Newman notes, tend to flaunt their intellect “daily before the public,” wearing clothes “like the silkworm’s, out of themselves!” The intelligentsia of his day were increasingly beholden to “random theories and imposing sophistries and dashing paradoxes, which carry away half-formed and superficial intellects.”
Newman contrasts this intellectual superficiality with a more ancient understanding of the university, one which properly appreciates the meaning of the very word itself: a unification of diverse studies and people into a common, cohesive group with an ultimate, underlying focus. In an ancient and medieval context, this ultimate telos was God Himself, from whom all knowledge, both theoretical and technical, derives. Thus “religious doctrine is knowledge, in as full a sense as Newton’s doctrine is knowledge.”
The modern secular university has not actually abandoned the principle of uniting all knowledge under a single, overarching framework. If anything, academia is becoming more unified around a new final end—it is just that this telos isn’t transcendent, but immanent. It does not drive the student outside himself, but inwards, towards the narcissistic, celebratory self. This is manifested most palpably in racial, sexual, and gender identitarianism, which has seeped into every academic discipline, from literature and history to mathematics and biology, as well as theology. One need only attend a professional conference or read a journal to see how identitarianism has become the unifying feature of both the academy and its research arms.
The unintended consequence of this has been to undermine the veracity of all academic disciplines by making them subject to activist politics, something Newman predicted too. “To withdraw Theology from the public schools is to impair the completeness and to invalidate the trustworthiness of all that is actually taught in them,” he declares. In abandoning theology, we are left with an anthropology that is impoverished by ever more radical progressive politics, “degenerating into error and quackery, because they are carried to excess.” Newman adds: “It is not only the loss of theology, it is the perversion of other sciences.”
This is how we come to have people with advanced degrees with a focus on the sexual behavior of Namibian lesbians. Obviously, studying and understanding cultures has tremendous value: In learning about other peoples, we are better able to contemplate the human condition, both as a universal phenomenon and in what makes each culture unique. Yet separated from some unifying, transcendent principle—such as man’s common divine origin and end—our research becomes increasingly bizarre, self-absorbed, and, frankly, worthless. College students now take courses not on Shakespeare, Renaissance Italy, or introductory Latin, but on Miley Cyrus, Harry Potter, beauty pageants, and, no surprise, the phallus. We possess credentials, but what these credentials actually communicate about our knowledge of reality is tenuous at best.
Amusingly, this is perceptible even in the case of my Wiccan, alternative pronoun-identifying instructor. She was not actually teaching a course in anything related to her advanced social science degrees, but rather a topic related to the use of technology. I’m not sure anyone in the course would have paid money to attend a lecture on her actual academic specialty. She spent tens of thousands, perhaps hundreds of thousands of dollars on her degrees, and here she was teaching an introductory class about applying common sense to one’s social media accounts and iPhone (it was, admittedly, a half-decent course).
This is the impoverished reality of academia, particularly the soft sciences: not actually teaching anything particularly valuable or useful, apart from enough basic reading, writing, and speaking skills to find work in a digital economy that is entirely unrelated to one’s university education. The object of a university, argues Newman, is supposed to be to help its students “fill their respective posts in life better, and [make] them more intelligent, capable, active members of society.” Now we simply shoot for credentialism for the sake of entering the technocratic workforce, attaining a decent standard of living, and earning the respect of our peers.
If one’s goals in life are regular access to single-source chocolate, personal trainers and yoga classes, and annual vacations to trendy locations, I suppose one has achieved success, at least as far as today’s young, credentialed generation understands it. And, as long as one embraces or at least makes obeisance to the ever-shifting norms of racial, sexual, and gender identity, one can assume the mantle of being “informed.” It seems like pretty weak tea to me, and not just because I think people changing their pronouns, identifying as other than their biological sex, or practicing Wiccanism have taken leave of their senses.
Why would American men want to attend universities where they will be shamed for their toxic masculinity, cisgender and patriarchal norms, or, depending on their race, “white privilege”? You cannot blame men for their growing lack of interest in playing along with this expensive, self-flagellating game. Nor is there much hope for heroism in a worldview that, claiming to be radical and provocative, descends into digital age conformity.
To be properly informed, argues Newman, is to perceive all reality in light of God and the divine order. It is to orient all our knowledge towards the betterment of ourselves and our fellow man, and to contemplate, however inchoately, how our finitude can illuminate transcendent truths. Instead, modern man perceives reality in light of his fickle, self-congratulatory self, for the sake of nothing more than careerism and fleeting sensual pleasure. The sad irony is that in dispensing with the divine, we also dispensed with man and woman, the former descending into despair, the latter into an overhyped professionalism that fails to satisfy our deepest needs.
Casey Chalk writes about religion and culture issues for The American Conservative and is a contributing editor for the New Oxford Review. He has degrees in history and teaching from the University of Virginia, and a master’s in theology from Christendom College. He is the author of The Persecuted: True Stories of Courageous Christians Living Their Faith in Muslim Lands (Sophia Institute Press).