Rage—Goddess, sing the rage of Achilles,
Murderous and doomed.
—The Iliad. Book I.
The Bob Hope got itself played in Palm Springs last week—despite all the efforts of Aeolus, god of wind—and watching it always reminds me of the third hole at Silver Rock, a shortish par three, when Justin Leonard’s caddie corrected me on the yardage I was giving Derek Anderson, who was then still Cleveland’s hope for the future. Silver Rock is one of those modern courses with many, many tee boxes installed by architects fighting a rear-guard action against the equipment-makers—a war that has all of the vitality Rome’s legions on the Rhine must have had in the century or two after Marcus Aurelius—and looking in the yardage book, I’d mistaken the tee box we were actually on for another, having simply overlooked one. The yardage I’d given Derek was something like 12 yards off: enough to put him on the wrong club. Justin’s caddie corrected me, which might have been the end of it but for the tenor of the man’s voice. He was angry.
Now, golf and anger are no strangers to each other: “Some emotions,” even the great Bobby Jones once said, “cannot be endured with a golf club in your hands.” “Terrible” Tommy Bolt, a U.S. Open-winning subscriber to Jones’ theory, advised not only to throw your clubs in front of you (it saves a walk), but also never to break both your putter and driver in the same round: canny pieces of advice from a man not unfamiliar with helicoptering drivers or putters.
Nowadays, of course, such displays of temper are hugely frowned upon, perhaps in keeping with the general vibe of today’s world: my great-uncle, who was city editor of the Chicago Daily News far back in the last century, was renowned for his temper—he “ruled the staff … in fiery justice,” his obituary said—as were a lot of city editors at the time. Twenty years ago, though, even a leading candidate for the Oldest Living City Editor, Julius Parker of the Chattanooga Free Press, then 79, admitted to the American Journalism Review that he tried “not to shout as much as I used to.”
Even so rarefied an air as academia, which one might suppose has as little to do with the clatterings of a newsroom as a milkmaid has to do with a milking machine, isn’t immune to a change in the culture as a whole. For instance, John Milton’s foremost living scholar, Stanley Fish (of Berkeley, Johns Hopkins, Columbia, and, most notoriously, Duke), recently wrote in one of his columns (“The Digital Humanities and the Transcending of Morality”) for the New York Times’ digital edition—which he had, until the very column I am citing, refused to call a blog—that “the new forms of communication—blogs, links, hypertext, re-mixes, mash-ups, multi-modalities and much more—that have emerged with the development of digital technology” challenge the old model of scholarship entirely. It’s a claim that might appear quite unrelated to the one in the previous paragraph—it doesn’t follow that angry city editors have anything to do with scholarship, exactly—but a closer examination of Fish’s argument might reveal that even if the two worlds of newspapering and scholarship aren’t in harmony, they’re singing a similar song.
The reason Fish gives for refusing to call his blog a blog is, it seems, exactly the reason many defenders of what’s being called the “digital humanities” proclaim as the chief virtue of blogging and other, newer forms of scholarly communication. Blogs, and other forms of writing on the Internet, are “provisional, ephemeral, interactive, communal, available to challenge, interruption and interpolation, and not meant to last,” whereas for the past 50 years or so Fish has been “building arguments that are intended to be decisive, comprehensive, monumental, definitive and, most important, all mine.” But for those practicing the new forms of scholarship, such ends are mistaken.
What the “digital humanities” promises, according to Fish (who is their enemy, so it is perhaps wise to take his account with a grain of salt), is a mode of scholarship in which “knowledge is available in a full and immediate presence” to everyone everywhere: which is to say, the usual kind of left-wing millenarianism. (Indeed, The Digital Humanities Manifesto 2.0 explicitly describes itself as having a “utopian core shaped by its genealogical descent from the counterculture/cyberculture of the 60s and 70s.”) The promise is, as Fish notes Milton described it when facing an earlier version of the same sort of thing, that we should be “all in all.” In other words, even if Fish and, say, my great-uncle might have had serious disagreements about … well, virtually everything, the digital humanities people might describe them as being roughly similar in their views about what, for instance, might constitute a proper piece of writing.
It’s true, to be sure, that my great-uncle Clem’s standing orders to his reporters (“Short words … short sentences … short leads … short paragraphs”) aren’t quite the style of Fish, the Ivy League professor—nor, equally surely, that of Milton, who virtually defines a “difficult” style of writing—but I suspect he’d have agreed with Fish’s point about the relation between death and writing. “To be mortal,” Fish says, is not only to be “capable of dying” but also to have a “beginning, middle and end,” which is what “sentences, narratives, and arguments have”—and from which the “digital humanities,” it seems, promises to liberate us. As Fish, the old scholar of Milton, knows, that’s what’s always promised, and as Milton knew (it’s what Paradise Lost is about, after all), it’s what we never get.
Still, it’s true that both newspapering and academia are getting rather a larger reminder of the significance of mortality these days than either might like. The death knell has been sounding for both occupations for decades: Clem’s newspaper, the Daily News, went under in 1978, and the transformation of the image of humanities professors from august persons protected by tenure, remote in their wood-paneled offices, into bespectacled, goatee-wearing adjuncts who are probably working more than one job (a job that, if they are lucky, is not at a McDonald’s) is not only well underway, but nearly complete in many places. In that sense, the vision of the “digital humanities” looks rather like an attempt to make the inevitable a cheering, rather than an awful, vision of the future.
That vision of the future, however, despite what it might say about being “inclusive” and the like (“all in all”), necessarily leaves some things out: presumably, it has no room for beginnings and endings, or arguments, or anger. Or—here one assumes—golf: which is, after all, a sport devoted to beginnings (like, say, tee boxes) and endings (holes), arguments (which tee box was it?), and very often anger. That’s all right: it’s in the nature of radicalism to deny the present. What isn’t clear, at least to Fish I suppose, is just how to make all of that disappear without, at the same time, effectively making much else disappear as well.
“You can’t make an omelette without breaking eggs,” goes one of the oldest leftist remarks—skewered by George Orwell, who asked, “Yes, but where is the omelette?” If, for instance, the claim of the “digital humanities” is that breaking down “the more traditional structures of academic publishing,” as Fish cites one Matthew Kirschenbaum as arguing, will somehow lead to—well, something, anyway—the economic data certainly don’t show it: all the indicators have been flashing red for some decades. For most of the American population, many, many observers have noted, wages have remained more or less the same since about 1972.
I don’t want to spend a lot of time rehearsing the whole case, the evidence for which litters today’s landscape—it is the reason for the Occupy Wall Street movement—but let me select a few pieces of it. The Nobel Prize-winning economist Paul Krugman, reviewing Edward N. Wolff’s Top Heavy: A Study of the Increasing Inequality of Wealth in America, observes that the evidence for increasing economic inequality is “overwhelming, and it comes from many sources—from government agencies like the Bureau of the Census, from Fortune’s annual survey of executive compensation, and so on.” And that inequality has itself been unequal: “the top 5 percent have gotten richer compared with the next 15, the top 1 percent compared with the next 4, the top 0.25 percent compared with the next 0.75, and onwards all the way to Bill Gates.” Each level, in other words, has seen its income soar—Bill Gates’ wealth has expanded not arithmetically, but according to a multiple: a multiple that, for the Bill Gates category (the top .01 percent), is 497 percent.
Despite that, the Official American Left—ensconced in its ivory tower—has little to say about income inequality, even if it has a lot to say about protecting the rights of minorities. As even the notorious Marxist professor of literature Terry Eagleton has written, the very “idea of a creative majority movement” has “come to seem like a contradiction in terms” to many academics. In that sense, maybe golf, and anger, might have something to teach—and maybe that lesson isn’t necessarily that remote from the dusty halls of academe. The Iliad, after all—widely regarded as the beginning, along with the Pentateuch, of Western literature—begins with Homer invoking the Muse’s help to tell his tale: the story of the anger of Achilles. As for golf, anyone who says he’s played without feeling one emotion on the first tee and another on the final green is lying: if the game is about nothing else, it is about beginnings, middles, and endings.
And, also, keeping score.