I began my first class as a teacher of undergraduates—still only a graduate student myself—forty-four years ago this month. It was only a handful of years since I had been a college freshman, but I already held the opinion that higher education—“the academy”—was in crisis. Over the years since, as student, teacher, and scholar, in institutions secular and religious, foreign and American, humble and elite, I have not wavered in that opinion. The crisis appears to be perpetual.
The academy frequently experiences pseudo-crises and takes them for real ones. Budget crises, enrollment crises, employment crises, adjunct-reliance crises, preferential-admission crises, tuition-inflation crises, doctorate-overproduction crises, research-integrity crises, grade-inflation crises, student-radicalism crises, academic-freedom crises: you name them, they appear with numbing regularity, sometimes several at once. For the most part these are, in the long view, not crises but evanescent problems, often self-inflicted; students, faculty, and staff may be ill-treated, whole institutions may go under, but the academy writ large chugs along, puffing great clouds of smoke as always.
The real crisis of the academy is an identity crisis. With noble, notable exceptions, colleges and universities have no very clear idea of what they are for—what they exist to do. Yes, in the research laboratories of the “STEM” fields, progress on scientific problems is confidently made, and the question “What are we doing here?” is probably not so pervasive. But scientific progress—the “relief of man’s estate,” in Francis Bacon’s phrase—is only intelligible in light of some coherent idea of what a human being is, and what the excellence of such a being consists of. On such questions, the modern academy is not so much in a state of disagreement among differing answers (which might be a sign of health) as in a state of incomprehension about the questions themselves.
The nature and excellence of human beings would seem to be the subject of the humanities, and the understanding of their relations with one another the subject of the social sciences. But these two divisions of the academy are where the real identity crisis resides. This can be seen in curricula and pedagogy—in what is studied and how it is taught. In a 1959 address titled “What Is Liberal Education?” Leo Strauss observed, “liberal education consists in reminding oneself of human excellence, of human greatness.” It “supplies us,” he said, “with experience in things beautiful.” But today the academy is—and has been for a long time—radically uncertain that the human mind can distinguish the beautiful from the ugly, or can say what human nature is, let alone human excellence or greatness.
I recently reread parts of Allan Bloom’s The Closing of the American Mind, which I read when it was new in 1987 and have revisited more than once, and I am struck by how well its argument holds up:
Our present educational problems cannot seriously be attributed to bad administrators, weakness of will, lack of discipline, lack of money, insufficient attention to the three R’s, or any of the other common explanations that indicate things will be set aright if we professors would just pull up our socks. All these things are the result of a deeper lack of belief in the university’s vocation.
Bloom is unduly nostalgic for the 1950s, which he calls “one of the great periods of the American university,” perhaps because those were the salad days of his own education. But he is right that “the university now offers no distinctive visage to the young person,” who finds a “democracy of the disciplines” that is “really an anarchy, because there are no recognized rules for citizenship and no legitimate titles to rule.” Bloom discerns in the humanities in particular that “there is no semblance of order, no serious account of what should and should not belong, or of what its disciplines are trying to accomplish or how.” When we look in the college course catalogue or examine the intellectual output of the professoriate, it is hard to disagree.
But if the 1950s were not so great, then when was the academy not in an identity crisis? Perhaps about 200 years ago—or so might be one reading of The Battle of the Classics (2020), by University of Maryland classics professor Eric Adler. The book’s subtitle, How a Nineteenth-Century Debate Can Save the Humanities Today, gives a somewhat misleading impression, considering that in Adler’s view the defenders of the traditional humanities curriculum, with its heavy emphasis on required mastery of Greek and Latin classics, “were trounced” in the debate with those who wanted to open up the curriculum of the emerging modern university to elective courses of study.
But this was chiefly because, as Adler shows, the traditionalists had already lost their grasp of what a humanist curriculum is for—“the creation of better human beings”—and had fallen into platitudes about “skills” like “critical thinking.” Thus they unwittingly made their program subservient to the social sciences, which could claim to measure the attainment of such skills. In the twentieth century Irving Babbitt and others would champion a “New Humanism,” emphasizing substantive content over skills, but their movement failed to gain much traction outside a few notable core curricula that barely checked the open-menu momentum of the academy’s courses of study. Adler notes that Babbitt himself never proposed any definite humanities curriculum—and Adler, for that matter, closes with only the barest sketch of one of his own. But as a history of where the American academy has been and how it got where it is, The Battle of the Classics is instructive reading.
One reason to doubt the prospects for reintegrating the academy’s identity is that if such an instauration must rely on Adler’s own field of classics, the hopes are dim indeed. It was something of a bellwether when Princeton decided no longer to require the mastery of any ancient language in order to graduate with a degree in classics. But the problem goes deeper, as Wellesley classicist Mary Lefkowitz observed in her 2008 book History Lesson: A Race Odyssey. In the 1990s, Lefkowitz discovered that a colleague in Wellesley’s Africana Studies department was teaching students that the ancient Greeks had “stolen” all their apparent intellectual innovations from Africa, and that pseudo-scholarly works in print were making the same bogus claims. She responded by writing a book exploding such pretensions (Not Out of Africa), co-editing another (Black Athena Revisited), and attempting to prevent her colleague from continuing such pedagogical malpractice at their college. For this she found herself largely hung out to dry by Wellesley, and sued by her colleague. History Lesson is her memoir of her ultimate survival—one can hardly call it a triumph—in a struggle that she should have won in five minutes, not the five years it took.
Lefkowitz is refreshingly direct about how deep the rot goes in the academy. The “postmodern” attitude that pervades some disciplines holds that there are “no such things as fact or objectivity,” while the principle of academic freedom is absurdly invoked to shield the teaching of “what is demonstrably false.” As Lefkowitz laments, “If our best students are not taught to distinguish between history and propaganda, and are encouraged blindly to adopt partisan or racial or tribal loyalties, then they are not being taught well at all.” She survived the attack on her career because the courts of law still take some cognizance of truth claims, not because her college or her colleagues rallied around her.
Lefkowitz’s brief memoir does not consider how it is that the displacement of facts and objectivity by propaganda and partisan loyalties came about. But Carl Trueman’s latest book, To Change All Worlds, explains it very well. Subtitled Critical Theory from Marx to Marcuse, Trueman’s book is by no means a screed against the critical theorists he considers, chiefly from the Frankfurt School. Each is given his due, and his say—Korsch, Horkheimer, Lukács, Adorno, Benjamin, Marcuse—and the reader is permitted to draw his own conclusions. But the judgment is inescapable. These thinkers’ systems should be rejected not just by Christians (Trueman’s intended readership) but by anyone who holds that truth is independent of our wishes.
Karl Korsch is a representative example. Trueman writes:
For him, what makes a philosophy, idea, or theory true is not its simple correspondence to reality. Reality, after all, cannot [for Korsch] be accessed in a straightforward, nonideological manner. Rather truth values are determined by whether a particular idea or claim furthers the revolutionary cause.
Hence “the articulation of critical theory is itself a part of revolutionary praxis.” The trouble, however, is that critical theory’s “historicism and deep suspicion of essentialism prohibit it from articulating a clear anthropology that then prevents it from offering a cogent view of the future in anything more than hopeful pieties.”
Critical theory of this kind has infected, and in some places fully dominated, various academic disciplines and programs, and the result is the displacement of real inquiry with ideology. When changing the world supplants understanding the world, teachers aren’t teaching and students aren’t learning. Instead they are agitating, and authentic academic content is replaced by agitprop.
A special case of critical theory, which has come to the fore in the fifteen months since the Hamas massacres in Israel, is the ideological movement against “settler colonialism,” which Adam Kirsch ably critiques in his 2024 book On Settler Colonialism. In most of the “settler colonialist” countries—Canada, Australia, the United States—the discourse of critique “is primarily a conversation among ‘settlers’ about their own identity, and what it offers is less a program for action than a political theology.” These countries are not simply going to go away. But when this largely academic discourse turns to Israel, with the accusation that Jews are the “settler colonialists” unjustly “invading” the land of “indigenous” Palestinians, an otherwise ineffectual ideology comes to justify murderous rage and terrorism, perversely accusing Israel of “genocide” while excusing attacks on the Jewish nation that would themselves be genocidal.
Thus do value-relativism (the view that all moral claims are purely subjective) and fact-relativism (the view that there are not even “facts,” only assertions advanced for reasons of self-interest) result not in an easygoing tolerance in which everyone gets along, but in furious ideological conflicts in which power, boldness, and sheer unabashed nerve are the currency of discourse. It is not surprising that the modern academy has hatched such a combination of solipsism and thuggishness. On the Right today one hears a good deal of half-baked complaining about “cultural Marxism,” but Trueman and Kirsch show us that critical theory and “settler colonialism” discourse really do have a lineal descent, in part, from Marx. Odd as it may seem, there is a line running from Marx’s notion of alienation to today’s preening “land acknowledgments” and the student chants to “globalize the intifada.”
How can the academy recover its vocation, its true identity as a center of humane inquiry? That will be the subject of next month’s Bookshelf.
Image by Nate Hovee and licensed via Adobe Stock.