
Americans Used to be Proud of their Universities

Global prestige slips as schools prioritize faddish scholarship and political agitation.

Americans used to be proud of their universities. The reasons were simple. Universities taught students how to think and showed them how to communicate complex ideas. They gave many young people expertise, a vocation, and a useful purpose in life. They nurtured an understanding of the finest aspects of human cultural flourishing. In their pursuit of athletic excellence, they promoted pride across local communities. And on top of all this, through research and discovery, they added unceasingly to our understanding of the world and our place within it.

Today, however, radicalized faculties and belligerent students are debasing the old model, causing some Americans to have second thoughts. As we start a new academic year, for example, the University of Missouri, a locus of racial protest in 2015, is experiencing a sustained decline in enrollment. It turns out that a banner year for political fundamentalism on any campus, even among a minority of students, is a marketing fiasco for an institution whose core business is supposed to be the discovery and dissemination of truth.

But what’s happening in American higher education is far worse than a mere PR disaster, both for the universities themselves and for America as a whole.

Last year, Tsinghua University in Beijing published more computer science papers than MIT and Stanford combined, according to at least one respected source, the Essential Science Indicators. More broadly, China now accounts for more than 20 percent of the world’s academic literature in computer science, up from just five percent at the turn of the century. Meanwhile, the U.S. share has declined to less than 15 percent, down from 33 percent in 2000 and over 40 percent in the early 1980s. (See the National Science Foundation’s Science and Engineering Indicators for international data on computer science outputs through 2013.)

Software may well be revolutionizing the global economy, a trend that will only accelerate as machine learning and artificial intelligence become embedded in our daily lives. But American higher education doesn’t care: it is preoccupied with an alternative vision of the future.

Unfazed by the machine age, American institutions have been squandering capital on hysterical and unverifiable scholarship in a host of faddish, politically contentious disciplines in the humanities and social sciences. At the same time, judging by the bracing culture of outrage on U.S. campuses, they have ramped up production of a new version of an old product: the graduate who is energized not by knowledge but by ideology.

Thus, while their counterparts in Asia have been discovering how to control the world through mathematics and software, a good portion of the present generation of American students has wasted its energies on arguments about gendered restrooms, male privilege, and whether controversial conservatives should be allowed to speak. Meanwhile, even the politically indifferent on U.S. campuses collude in an imprudent fantasy: heroically, in this era of failing newspapers and unemployed journalists, the country’s universities and colleges still graduate more communication and journalism specialists than computer scientists, despite a recent uptick in computer science enrollments.

Americans should be worried. History shows that changes in the intellectual wealth of nations matter, and that a divergence in the priorities of different countries’ higher education systems can have serious consequences. Take the case of Britain and Germany, brilliantly analyzed by Otto Keck in National Innovation Systems: A Comparative Analysis, a volume edited by the great evolutionary economist Richard R. Nelson and published in the early 1990s.

Throughout most of the 19th century, Germany lagged behind Britain economically and technologically. Then, in the 1880s, drawing on the ideas of Wilhelm von Humboldt and driven by the efforts of the Prussian bureaucrat Friedrich Althoff, the Germans pioneered a new model of the university: research-intensive and focused on scientific and technological inquiry.

German universities experienced rapid growth in enrollments and research funding in technical fields, and they quickly gained global leadership in a range of areas, including chemistry, physics, and engineering. By the outbreak of the First World War, the number of engineering students in German tertiary education was 10 times that in Britain.

This had far-reaching consequences. For much of the 19th century, patents registered with the U.S. Patent Office were twice as likely to be British as German. Yet with deeper knowledge and a more systematic approach to discovery, Germany’s university-trained inventors eventually outstripped their British counterparts. By 1913, Germans were named on roughly 50 percent more U.S. patents than Britons, a reversal from which Britain never recovered.

The impact was especially marked in chemicals and pharmaceuticals, the emerging, knowledge-intensive industries of the era. In the early 19th century, German graduates routinely found work in Britain, much as Chinese engineers and scientists do now in the U.S. But over time, Germany’s university-trained scientists began creating more opportunities at home. By the early 20th century, Germany was the world’s top pharmaceutical exporter and accounted for 90 percent of the world’s synthetic dye exports.

Could something similar be happening with the U.S. and China today? American universities still sit at the pinnacle of all the major higher education league tables. Yet their position is more fragile than is widely appreciated.

Beyond elite universities such as Harvard and Stanford, the number of U.S. institutions in the global top 500 has declined steadily for at least a decade. (For some interesting data, compare the numbers in 2017 with those in 2004.) At the same time, in fields like physics, chemistry, mathematics, engineering, and computer science, U.S. universities no longer hold the dramatic lead over the rest of the world that they enjoyed in the late 20th century.

Lazy administrators justify such changes as the natural result of shifts in the relative prosperity of nations. But they are also a by-product of differences in investment patterns, student attitudes, and community perceptions about the purpose of higher education. Compound this with the Chinese government’s announcement, issued just last month, that it is doubling down on computer science, with the goal of becoming the global leader in artificial intelligence by 2030, and one could argue that a transition is indeed underway.

The optimistic counterargument is that government-sponsored initiatives don’t always work out as planned, as the Japanese proved with their ill-fated Fifth Generation Computer Systems project in the 1980s. One might also discount the need for concern in the U.S., since the most significant advances in artificial intelligence and machine learning are actually emerging from American corporations, not universities. Firms like Alphabet, Amazon, Apple, and Facebook, not to mention a host of smaller players, already lead the world in software and are developing formidable capabilities in artificial intelligence.

Yet in the long run, corporations succeed only on the strength of their people’s caliber and values. And there is a profound distinction between a society that trains technologists and one that educates complainants.

A campus culture that prioritizes political agitation over the disciplined search for truth will tend to produce citizens who rank feelings over reason, and whose lives are increasingly burdened with perceptions of past injustice. By contrast, a university system focused on technological achievement is more likely to teach its citizens to prize rationality, objectivity, and value creation. After all, even a mediocre programmer must understand both the laws of logic and the constraints of reality, neither of which seems to be a requirement for some modern liberal arts degrees.

America leads the world in innovation because American know-how and optimism have consistently sought to create a better future through ingenuity and discovery. In their pursuit of rigorous knowledge and technical excellence, American universities have long helped to supercharge the imagination of their own country and of the world. But a surge of Chinese capability in computing and artificial intelligence hints at a very different fate, one in which the U.S. may come to regret its giddy campus radicals and the slow fading of its universities’ standing in technological fields.

Thomas Barlow advises universities globally on research strategy and is the author, most recently, of A Theory of Nothing.
