His Dark Materials

But all these in their pregnant causes mixed
Confusedly, and which thus must ever fight.
Unless the Almighty Maker them ordain
His dark materials to create more worlds
—Paradise Lost II, 913-16

One of the theses of what’s known as the “academic Left” in America is that “nothing is natural,” or, as the literary critic (and “tenured radical”) Stanley Fish more properly puts it, “the thesis that the things we see and the categories we place them in … have their source in culture rather than nature.” It’s a thesis, however, that seems obviously wrong in the case of professional golf. Without taking the time to do a full study of the PGA Tour’s website, which does list place of birth, it seems indisputable that most of today’s American tour players originate south of the Mason-Dixon line: either in the former Confederacy or in other Sun Belt states. It’s difficult, then, to argue that there’s something about “Southern culture” that gives Southerners a leg up toward the professional ranks, rather than simply the opportunity to play golf more days of the year.

Let’s just look, in order to keep things manageable, at the current top ten: Jordan Spieth, this year’s Masters winner, is from Texas, while Jimmy Walker, in second place, is just up the road in Oklahoma. Rory McIlroy doesn’t count (though he is from Northern Ireland, for what that’s worth), while J.B. Holmes is from Kentucky. Patrick Reed is also from Texas, and Bubba Watson is from Florida. Dustin Johnson is from South Carolina, while Charley Hoffman is from southern California. Hideki Matsuyama is from Ehime, Japan, on the southern island of Shikoku, while Robert Streb rounds out the top ten and keeps the score even between Texas and Oklahoma.

Not until we reach Ryan Moore, at the fifteenth spot, do we find a golfer from an indisputably Northern state: Moore is from Tacoma, Washington. Washington, however, was not admitted to the Union until 1889; not until the seventeenth spot do we find a golfer from a Civil War-era Union state besides California. Gary Woodland, as it happens one of the longest drivers on tour, is from Kansas.

This geographic division has been largely stable throughout the history of American golf. It’s true of course that many great American golfers were Northerners, particularly at the beginning of the game (like Francis Ouimet, “Chick” Evans, or Walter Hagen—from Massachusetts, Illinois, and Michigan respectively), and arguably the greatest of all time was from Ohio: Jack Nicklaus. But Byron Nelson and Ben Hogan were Texans, and of course Bobby Jones, one of the top three golfers ever, was a Georgian.

Yet while it might be true that nearly all of the great players are Southern, the division of labor in American golf is that nearly all of the great courses are Northern. In the latest Golf Digest ranking, for instance, only three of the top twenty courses—Augusta National, which is #1, Seminole in Florida, and Kiawah in South Carolina—are in the South. New York (home to Winged Foot and Shinnecock, among others) and Pennsylvania (home to Merion and Oakmont) have the most courses in the top twenty; other Northern states included Michigan, Illinois, and Ohio. If it were access to great courses that made great golfers, in other words (a thesis with a greater affinity for the notion that “culture,” rather than “nature,” produces great golfers), then we’d expect the PGA Tour to be dominated by Northerners.

That of course is not so, which perhaps makes it all the stranger that, if looked at by region, it is usually “the South” that champions “culture” and “the North” that champions “nature”—at least if you consider, as a proxy, how evolutionary biology is taught. Consider for instance a 2002 map generated by Lawrence S. Lerner of California State University at Long Beach:

[Map: Lawrence S. Lerner’s 2002 state-by-state evaluation of how evolution is taught in U.S. public schools]

(Link here: http://bigthink.com/strange-maps/97-nil-where-and-how-evolution-is-taught-in-the-us). I realize that the map may be dated now, but still, with some exceptions, it generally shows that evolutionary biology is at least a controversial idea in the states of the former Confederacy, while Union states like Connecticut, New Jersey, and Pennsylvania are ranked by Professor Lerner as “Very good/excellent” in the matter of teaching Darwinian biology. In other words, it might be said that the states producing the best golfers are the ones with both the best weather and a belief that nature has little to do with anything.

Yet, as Professor Fish’s remarks above demonstrate, it’s the “radical” humanities professors of the nation’s top universities who are the foremost proponents of the notion that “culture” trumps “nature”—a fact that the cleverest creationists have not let slide. An article entitled “The Postmodern Sin of Intelligent Design Creationism” in a 2010 issue of Science and Education, for instance, lays out how “Intelligent Design Creationists” “try to advance their premodern view by adopting (if only tactically) a radical postmodern perspective.” In Darwinism and the Divine: Evolutionary Thought and Natural Theology, Alister McGrath not only argues “that it cannot be maintained that Darwin’s theory caused the ‘abandonment of natural theology,’” but also approvingly cites Fish: “Stanley Fish has rightly argued that the notion of ‘evidence’ is often tautologically determined by … interpretive assumptions.” So there really is a sense in which the deepest part of the Bible Belt fully agrees with the most radical scholars at Berkeley and other top schools.

In Surprised By Sin: The Reader in Paradise Lost, Stanley Fish’s most famous work of scholarship, Fish argues that Satan is evil because he is “the poem’s true materialist”—and while Fish might say that he is merely reporting John Milton’s view, not revealing his own, it’s still difficult not to take away the conclusion that there’s something inherently wrong with the philosophical doctrine of materialism. (Not to be confused with the vulgar notion that life consists merely in piling up stuff, the philosophic version says that all existence is composed only of matter.) Or with the related doctrine of empiricism: “always an experimental scientist,” Fish has said more recently in the Preface to Surprised By Sin’s Second Edition, Satan busies himself “by mining the trails and entrails of empirical evidence.” Fish of course would be careful to distance himself from more vulgar thinkers regarding these matters—a distance that is there, to be sure—but it’s not difficult to see why creationists would mine him for their own views.

Now, one way to explain that might be that both Fish and his creationist “frenemies” are drinking from the Pure Light of the Well of Truth. But there’s a possible materialistic candidate to explain just why humanities professors might end up with views similar to those of the most fundamentalist Christians: a similar mode of production. The political scientist Anne Norton remarks, in a book about the conservative scholar Leo Strauss, that the pedagogical technique pursued by Strauss—reading “a passage in a text” and asking questions about it—is also one pursued in “the shul and the madrasa, in seminaries and in Bible study groups.” At the time of Strauss’ arrival in the United States as a refugee from a 1930s Europe about to be engulfed in war, “this way of reading had fallen out of favor in the universities,” but as a result of Strauss’ career at the University of Chicago, along with those of Mortimer Adler (who founded the Great Books Program) and Robert Hutchins, it has since become at least a not-untypical pedagogical method in the humanities.

At the least, that mode of humanistic study would explain what the philosopher Richard Rorty meant when he repeated Irving Howe’s “much-quoted jibe—‘These people don’t want to take over the government; they just want to take over the English Department.’” It explains, in other words, just how the American left might have “become an object of contempt,” as Rorty says—because it is a left that no longer believes that “the vast inequalities within American society could be corrected by using the institutions of a constitutional democracy.” How could it, after all, given a commitment against empiricism or materialism? Taking a practical perspective on the American political machinery would require adopting just the beliefs that are suicidal if your goal is to achieve tenure in the humanities at Stanford or Yale.

If you happen to think that most things aren’t due to the meddling of supernatural creatures, and you’ve given up on thoughts of tenure because you dislike both creationist nut-jobs and that “largely academic crowd cynical about America, disengaged from practice, and producing ever-more-abstract, jargon-ridden interpretations of cultural phenomena,” while at the same time you think that putting something in the place of God called “the free market”—which is what, exactly?—isn’t the answer either, why, then, the answer is perfectly natural.

You are writing about golf.


Thought Crimes

 

How often have I said to you that when you have eliminated the impossible, whatever remains, however improbable, must be the truth?
Sherlock Holmes
    The Sign of Four (1890).

 

Whence heavy persecution shall arise
On all, who in the worship persevere
Of spirit and truth; the rest, far greater part,
Will deem in outward rites and specious forms
Religion satisfied; Truth shall retire
Bestuck with slanderous darts, and works of faith
Rarely be found: So shall the world go on …
John Milton
   Paradise Lost
   Book XII 531-37

 

When Tiger Woods, just after four o’clock Eastern time, hit a horrific duck-hook tee shot on Augusta National’s 13th hole during the third round of the Masters tournament Saturday, the golfer sent one of George Carlin’s “seven dirty words” after it, live on air. About an hour later, around a quarter after five, the announcer Ian Baker-Finch caught himself before uttering a taboo phrase: although he began by saying “back,” the Australian quickly corrected himself by saying “second nine.” To the novice Masters viewer the two incidents might appear quite different (Baker-Finch’s slip, that is, being far less offensive), but longtime viewers are aware that, had Baker-Finch not saved himself, his error would have been the more serious incident—to the extent, in fact, that he might have lost his job. Just why that is so is difficult to explain to outsiders unfamiliar with Augusta National’s particular vision of decorum; it may, however, be explained by one of the broadcast’s few commercials: an advert whose tagline connects a golf commentator’s innocent near-mistake to an argument about censorship conducted at the beginning of this year—in Paris, at the business end of a Kalashnikov.

France is a long way from Georgia, however, so let’s begin with why what Ian Baker-Finch almost said would have been far worse than Tiger’s f-bomb. In the first place, that is because, as veterans of watching the Masters know, the announcing team is held to very strict standards largely unique to this sporting event. Golf is, in general, far more concerned with “decorum” and etiquette than other sports—it is, as its enthusiasts often remark, the only one where competitors regularly call penalties on themselves—but the Masters tournament examines the language of its broadcasters to an extent unknown even at other golf tournaments.

In 1966, for example, broadcaster Jack Whitaker—as described in the textbook Sports Media: Planning, Production, and Reporting—“was canned for referring to Masters patrons as a ‘mob,’” while in 1994 Gary McCord joked (as told by Alex Myers in Golf Digest) “that ‘bikini wax’ is used to make Augusta National’s greens so slick”—and was unceremoniously dumped. Announcers at the Masters, in short, are well aware they walk a fine line.

Hence, while Baker-Finch’s near-miss was by no means comparable to McCord’s attempts at humor, it was serious because it would have broken one of the known “Augusta Rules,” as John Feinstein called them in Moment of Glory: The Year Underdogs Ruled Golf. “There are no front nine and back nine at Augusta but, rather, a first nine and a second nine,” Feinstein wrote, a rule that, it’s said, developed because the tournament’s founders, the golfer Bobby Jones and the club chairman Clifford Roberts, felt “back nine” sounded too close to “back side.” The Lords of Augusta, as the club’s members are sometimes called, will not stand for “vulgarity” from their announcing team—even if the golfers they are watching are sometimes much worse.

Woods, for example (as the Washington Post reported), “followed up a bad miss left off the 13th tee with a curse word that was picked up by an on-course microphone, prompting the CBS announcers to intone, ‘If you heard something offensive at 13, we apologize.’” Yet even had Baker-Finch uttered the unutterable, he would only have suggested what Woods baldly verbalized; still, it’s unimaginable that Woods could suffer the fate a CBS announcer would, or be penalized in any way. The uproar that would follow if, for instance, the Lords decided to ban Tiger from further tournaments would make all previous golf scandals appear tame.

The difference in treatment could, of course, conceivably be justified by the fact that Woods is a competitor (and four-time winner) in the tournament while announcers are ancillary to it. In philosophic terms, players are essential while announcers are contingent: players just are the tournament because without them, no golf. The same can’t quite be said about any particular broadcaster (though, when it comes to Jim Nantz, part of the broadcast since 1986, it might be close). From that perspective, then, it might make sense that Tiger’s “heat-of-the-moment” f-bomb is not as significant as a slip of the tongue by an announcer trained to speak in public could be.

Such, at least, might be a rationale for the differing treatment accorded golfers and announcers: so far as I am aware, neither the golf club nor CBS has come forward with an explanation of the difference. It was while I was turning this over in my mind that one of the tournament broadcast’s few commercials came on—and I realized just why the difference between Tiger’s words and, say, Gary McCord’s in 1994 had stuck in my brain.

The ad in question consisted of different people reciting, over and over again, a line spoken by IBM pioneer Thomas Watson in 1915: “All of the problems of the world could be settled easily if men were only willing to think.” Something about this phrase—repeated so often it became quite literally a mantra, defined by Wikipedia as a “sacred utterance, numinous sound”—rattled something in my head and sparked a brief Internet investigation. It seems that, for IBM, that last word—think—became a catchword after 1915: it was plastered on company ephemera, gave the company magazine its name, and even, in recent times, became the basis for the name of such products as the ThinkPad. The sentence, it could be said, is the official philosophy of the company.

As philosophies go it seems inarguable that this is rather a better one than, for instance, one that might demand “silence your enemies wherever possible.” It is, one might say, a hopeful sentence—if only people were willing to use their rationality, the difficult and the intractable could be vanquished. “Think,” in that sense, is a sentiment that seems quite at odds with the notion of censorship: without airing what someone is thinking, it appears impossible to believe that anything could be settled. In order to get people to think, it seems clear that they must be allowed to talk.

Such, at least, is one of the strongest pillars of the concept of “free speech,” as the English and law professor Stanley Fish has pointed out. Fish quotes, as an example of the argument, the Chairman of the National Endowment for the Humanities, James A. Leach, who gave a speech in 2009 claiming that “the cornerstone of democracy is access to knowledge.” In other words, in order to achieve the goal outlined by Watson (solving the world’s problems), it’s necessary to put everyone’s views in the open in order that they might be debated—a notion usually conceptualized, in relation to American law, as the “marketplace of ideas.”

That metaphor traces back to American Supreme Court justice Oliver Wendell Holmes, Jr.’s famous dissent in a case called Abrams v. United States, decided in 1919. “The ultimate good desired,” as Holmes wrote in that case (interestingly, in the light of his theory, against the majority opinion), “is better reached by free trade in ideas—that the best test of truth is the power of the thought to get itself accepted in the competition of the market.” That notion, in turn, can (as Fish observes) be followed back to the English philosopher John Stuart Mill, and even beyond.

“We can never be sure that the opinion we are endeavoring to stifle is a false opinion,” Mill wrote in his On Liberty, “and if we were sure, stifling it would be an evil still.” Yet further back, the thought connects to John Milton’s Areopagitica, where the poet wrote “Let [Truth] and Falsehood grapple; who ever knew Truth put to the worse in a free and open encounter?” That is, so long as opinions can be freely shared, any problem could in principle be solved—more or less Thomas Watson’s point in 1915.

Let’s be clear, however, about what is and what is not being said. That is, the words “in principle” above are important because I do not think that Watson or Mill or Milton or Holmes would deny that there are many practical reasons why it might be impossible to solve problems with a meeting or a series of meetings. No one believes, for instance, that the threat of ISIS could be contained by a summit meeting between ISIS and other parties—the claim that Holmes & Watson (smirk) et al. would make is just that the said threat could be solved if only that organization’s leaders would agree to a meeting. Merely objecting that such conceivable meetings are often not practical isn’t, in that sense, a strong objection to the idea of the “idea market”—which asserts that in conditions of what could be called “perfect communication” disagreement is (eventually) impossible.

That however is precisely why Fish’s argument against the “market” metaphor is such a strong one: it is Fish’s opinion that the “marketplace” metaphor is just that—a metaphor, not a bedrock description of reality. In an essay entitled “Don’t Blame Relativism,” in fact, Fish apparently denies “the possibility of describing, and thereby evaluating” everything “in a language that all reasonable observers would accept.” That is, he denies the possibility that is imagined by Thomas Watson’s assertion regarding “[a]ll of the problems of the world”: the idea that, were only everyone reasonable, all problems could be solved.

To make the point clearer: while in Watson’s metaphor (which is also Milton’s and Mill’s and Holmes’), in theory everything can be sorted out if only everyone came to the bargaining table, to Fish such a possibility is not only practically impossible, but also theoretically impossible. Fish’s objection to the “market” idea isn’t just that it is difficult, for instance, to find the right translators to speak to different sides of a debate in their own language, but that even were all conditions for perfect communication met, that would not guarantee the end of disagreement.

It’s important to note at this point that this is a claim Fish needs to make in order to make his argument stick, because if all he does is advance historically based arguments to the effect that at no point in human history has the situation described by Watson et al. ever existed, their partisans can counterclaim that just because no one has yet seen perfect communication, that’s no reason to think it might not someday be possible. Such partisans might, for example, quote Alice Calaprice’s The Quotable Einstein, which asserts that Einstein once remarked that “No amount of experimentation can prove me right; a single experiment can prove me wrong.” Or, as the writer Nassim Nicholas Taleb has put the same point while asserting that it ultimately traces back through John Stuart Mill to David Hume: “No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.” In other words, Fish could be right that no such perfect communication has ever existed, but it would be logically invalid to claim that such evidence implies it could never be possible.

To engage his opponents, then, Fish must take to the field of “theory,” not just adduce historical examples. That is why Fish cannot merely claim that regimes which profess the creed of Watson and Holmes and so on in theory do not actually follow that creed in reality, though he does make that argument. He points out, for instance, that even in the Areopagitica, otherwise a passionate defense of “free speech,” Milton allowed that while “free speech” is all well and good for most people most of the time, he did not mean “tolerated popery” (i.e., Catholics), because as that religion (according to Milton) “extirpates all religious and civil supremacies, so itself should be extirpate.”

In other words, Milton explains that anything that threatens the idea of “free speech” itself—as Catholicism, in Milton’s day arguably in the throes of the Inquisition, then did—should not be included in the realm of protected speech, since that “which is impious or evil absolutely against faith or manners no law can possibly permit that intends not to unlaw itself.” And while it might be counterclaimed that in Milton’s time “free speech” was imperfectly realized, Fish also demonstrates that while Catholicism no longer constitutes a threat to modern “free speech” regimes, there are still exceptions to what can be said publicly.

As another American Supreme Court justice, Robert Jackson, would put the point centuries later, “the constitutional Bill of Rights”—including, one presumes, the free-speech-protecting First Amendment—is not “a suicide pact.” Or, as Fish himself put the same point, even today the most tolerant governments still ask themselves, regarding speech, “would this form of speech or advocacy, if permitted to flourish, tend to undermine the very purposes for which our society is constituted?” No government, in other words, can allow the kind of speech that threatens to end the practice of free speech itself.

Still, that is not enough to disrupt the “free speech” argument, because even if perfect communication has not yet been exemplified on this earth, that does not mean that it could not be someday. To make his point, Fish has to go further, which he does in an essay called “There’s No Such Thing As Free Speech, And It’s A Good Thing Too.”

There, Fish says that he is not merely claiming that “saying something … is a realm whose integrity is sometimes compromised by certain restrictions”—that would be the above argument, where historical evidence is advanced—but rather “that restriction, in the form of an underlying articulation of the world that necessarily (if silently) negates alternatively possible articulations, is constitutive of expression.” The claim Fish wants to make in short—and it is important to see that it is the only argument that can confront the claims of the “marketplace of ideas” thesis—is that restrictions, such as Milton’s against Catholicism, aren’t the sad concessions we must make to an imperfect world, but are in fact what makes communication possible at all.

To those who take what’s known as a “free speech absolutism” position, such a notion might sound deeply subversive, if not heretical: the answer to pernicious opinions, in the view of the free speech absolutist, is not to outlaw them, but to produce more opinions—as Oliver Wendell Holmes, Mill, and Milton all advise. The headline of an editorial in Toronto’s Globe and Mail puts the point elegantly: “The lesson of Charlie Hebdo? We need more free speech, not less.” But what Fish is saying could be viewed in the light of the narrative described by the writer Nassim Nicholas Taleb about how he derived his saying regarding “black swans” under the influence of John Stuart Mill and David Hume.

Taleb says that “Hume had been irked by the fact that science in his day … had experience[d] a swing from scholasticism, entirely based on deductive reasoning,” to “an overreaction into naive and unstructured empiricism.” The difficulty, as Hume recognized, “is that, without a proper method”—or, as Fish might say, a proper set of constraints—“empirical observations can lead you astray.” It’s possible, in other words, that amping up the production of truths will not—indeed, perhaps cannot—produce Truth.

In fact, Taleb argues (in a piece entitled “The Roots of Unfairness: the Black Swan in Arts and Literature”) that in reality, rather than in the fantasies of free speech absolutists, the production of very many “truths” may tend to reward a very few examples at the expense of the majority—and thus that “a large share of the success” of those examples may simply be due to “luck.” The specific market Taleb examines in this essay is the artistic and literary world, but like many other spheres—such as “economics, sociology, linguistics, networks, the stock market”—that world is subject to “the Winner-Take-All effect.” (Taleb reports that Robert H. Frank defined the effect, in his article “Talent and the Winner-Take-All Society,” as “markets in which a handful of top performers walk away with the lion’s share of total rewards.”) The “free speech absolutist” position would define the few survivors of the “truth market” as being, ipso facto, “the Truth”—but Taleb is suggesting that such a position takes a more sanguine view of the market than may be warranted.

The results of Taleb’s investigations imply that such may be the case. “Consider,” he observes, “that, in publishing, less than 1 in 800 books represent half of the total unit sales”—a phenomenon similar to that found by Art De Vany at the cinema in his Hollywood Economics. And while those results might be dismissed as the product of crass commercial motives, in fact the “academic citation system, itself supposedly free of commercialism, represents an even greater concentration” than that found in commercial publishing, and—perhaps even more alarmingly—there is “no meaningful difference between physics and comparative literature”: both display an equal amount of concentration. In all these fields, a very few objects are hugely successful, while the great mass sink like stones into the sea of anonymity.

These results are not confined to artistic or scientific production; they apply to subjects as diverse as the measurement of the coast of England and the error rates in telephone calls. George Zipf, for example, found that the rule applied to the “distribution of words in the vocabulary,” while Vilfredo Pareto found it applied to the distribution of income in any given society.

“Now,” asks Taleb, “think of waves of one meter tall in relation to waves of 2 meters tall”—there will inevitably be many more one meter waves than two meter waves, and by some magic the ratio between the two will be invariant, just as, according to what linguists call “Zipf’s Law,” “the most frequent word [in a given language] will occur approximately twice as often as the second most frequent word, three times as often as the third most frequent word,” and so on. As the Wikipedia entry for Zipf’s Law (from which the foregoing definition is taken) observes, the “same relationship occurs in many other rankings unrelated to language, such as the population ranks of cities in various countries, corporation sizes, income rankings, ranks of number of people watching the same TV channel, and so on.” All of these subjects are determined by what have come to be known as power laws—and according to some researchers, they even apply to subjects as seemingly immune to them as music.
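The rank-frequency relationship Zipf’s Law describes is easy to check for oneself. What follows is a minimal Python sketch (mine, not Taleb’s or Zipf’s) that counts the words in whatever plain-text file you point it at (the filename below is only a stand-in) and prints the ratio of the most frequent word’s count to each of the next few; under Zipf’s Law those ratios should come out near 2, 3, 4, and so on.

# A minimal sketch of checking Zipf's Law against a text, using only the
# Python standard library. The filename is hypothetical; substitute any
# plain-text file you have on hand.
from collections import Counter
import re

def zipf_check(path, top=5):
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    counts = Counter(words).most_common(top)
    first_count = counts[0][1]
    for rank, (word, count) in enumerate(counts, start=1):
        # Zipf predicts count is roughly first_count / rank,
        # so first_count / count should be roughly equal to rank.
        print(f"rank {rank}: {word!r} occurs {count} times; "
              f"ratio to rank 1 = {first_count / count:.2f}")

zipf_check("paradise_lost.txt")  # hypothetical filename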

Zipf himself, in order to explain the distribution he discovered among words, proposed that it could be explained by a kind of physical process, rather than discernment on the part of language-users: “people aim at minimizing effort in retrieving words; they are lazy and remember words that they have used in the past, so that the more a word is used, the more likely it is going to be used in the future, causing a snowball effect.” The explanation has an intuitive appeal: it appears difficult to argue that “the” (the most common English word) communicates twice as much information as “be” (the second-most common English word). Still less does such an argument explain why those word distributions should mirror the distributions of American cities, say, or the height of the waves on Hawaii’s North Shore, or the metabolic rates of various mammals. The widespread appearance of such distributions, in fact, suggests that rather than being determined by forces “intrinsic” to each case, the distributions are driven by a natural law that cares nothing for specifics.
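Zipf’s “snowball effect” is, in modern terms, a rich-get-richer process, and it can be simulated in a few lines. The sketch below is only an illustration of that mechanism under stated assumptions (a speaker reuses a word in proportion to how often it has already been used, with a small fixed chance of coining a new one); it is not drawn from Zipf or Taleb. No word in the simulation is intrinsically better than any other, yet a handful end up taking most of the tally.

# A minimal simulation of a rich-get-richer ("snowball") process, using only
# the Python standard library. The probabilities and step count are arbitrary
# assumptions chosen for illustration.
import random
from collections import Counter

def snowball(steps=50_000, new_word_chance=0.05, seed=0):
    rng = random.Random(seed)
    history = [0]        # start with one "word," labeled 0
    next_label = 1
    for _ in range(steps):
        if rng.random() < new_word_chance:
            history.append(next_label)           # coin a brand-new word
            next_label += 1
        else:
            history.append(rng.choice(history))  # reuse: odds proportional to past use
    return Counter(history)

counts = snowball()
total = sum(counts.values())
top_ten = counts.most_common(10)
print("share of all uses taken by the ten most-used words:",
      round(sum(c for _, c in top_ten) / total, 2))
for rank, (label, count) in enumerate(top_ten, start=1):
    print(f"rank {rank}: word {label} used {count} times")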

So far, it seems, “we have no clue about the underlying process,” as Taleb says. “Nothing can explain why the success of a novelist … bears similarity to the bubbles and informational cascades seen in the financial markets,” much less why both should “resemble the behavior of electricity power grids.” What we can know is that, while according to the “free speech absolutist” position “one would think that a larger size of the population of producers would cause a democratization,” in fact “it does not.” “If anything,” Taleb notes, “it causes even more clustering.” The “free speech absolutist” position predicts that the production of more speech results in a closer approximation of the Truth; the empirical results, however, suggest that more production merely results in a smaller number of products becoming more successful, for reasons that may have nothing to do with their intrinsic merits.

These results suggest that perhaps Stanley Fish has it right about “free speech,” and thus that the Lords of Augusta—like their spiritual brethren who shot up the offices of Charlie Hebdo in early January this year—have it completely right in the tight rein they hold over the announcers who work their golf tournament: Truth could be the result of, not the enemy of, regulation. The irony, of course, is that this also suggests the necessity of regulation in areas aside from commentary about golf and golfers—a result that, one suspects, the Lords of the Masters would not favor, and one that puts them in uncomfortable company. Allahu akbar, presumably, sounds peculiar with a Southern accent.

The End of Golf?

And found no end, in wandering mazes lost.
Paradise Lost, Book II, 561

What are sports, anyway, at their best, but stories played out in real time?
Charles P. Pierce, “Home Fields,” Grantland

We were approaching our tee shots down the first fairway at Chechessee Creek Golf Club, where I am wintering this year, when I got asked the question that, I suppose, will only be asked more and more often. As I got closer to the first ball I readied my laser rangefinder—the one that Butler National Golf Club, outside of Chicago, finally required me to get. The question was this: “Why doesn’t the PGA Tour allow rangefinders in competition?” My response was this, and it was nearly immediate: “Because that’s not golf.” That’s an answer that, perhaps, appeared clearer a few weeks ago, before the United States Golf Association announced a change to the Rules of Golf in conjunction with the Royal and Ancient of St. Andrews. It’s still clear, I think—as long as you’ll tolerate a side-trip through both baseball and, for hilarity’s sake, John Milton.

Through the rest of this year, any player in a tournament conducted under the Rules of Golf is subject to disqualification should he or she take out a cell phone during a round to consult a radar map of incoming weather. But come the New Year, that will be permitted: as the Irish Times wonders, “Will the sight of a player bending down to pull out a tuft of grass and throwing skywards to find out the direction of the wind be a thing of the past?” Perhaps not, but the new decision certainly says where the wind is blowing in Far Hills. Technology is coming to golf, as, it seems, to everything.

At some point, and likely not that far away, all relevant information will be available to a player in real time: wind direction, elevation, humidity, and, you know, yardage. The question will be: is that still golf? When the technology becomes robust enough, will the game be simply a matter of executing shots, as if all the great courses of the world were simply your local driving range? If so, it’s hard to imagine the game in the same way: to me, at least, part of the satisfaction of playing isn’t just hitting a shot well, it’s hitting the correct shot—not just flushing the ball on the sweet spot, but seeing it fly (or run) up toward the pin. If everyone is hitting the correct club every time, does the game become simply a repetitive exercise to see whose tempo is particularly “on” that day?

Amateur golfers think golf is about hitting shots; professionals know that golf is about selecting what shots to hit. One of the great battles of golf, to my mind, is the contest of the excellent ball-striker vs. the canny veteran. Bobby Jones vs. Walter Hagen, to those of you who know your golf history: Jones was known for the purity of his hits, while Hagen, like Seve Ballesteros, was known for his ability to recover from his impure ones. Or we can generalize the point and say golf is a contest between ballstriking and craftiness. If that contest goes, does the game go with it?

That thought would go like this: golf is a contest because Bobby Jones’ ability to hit every shot purely is balanced by Walter Hagen’s ability to hit every shot correctly. That is, Jones might hit every shot flush, but he might not hit the right club; Hagen might not hit every shot flush, but he will hit the correct club, or to the correct side of the green or fairway, or the like. But if Jones can get the perfection of information that will allow him to hit the correct club more often, that might be a fatal advantage—paradoxically ending the game entirely, because golf becomes simply an exercise in who has the better reflexes. The idea is similar to the way in which the high pitching mound became, by the late 1960s, such an advantage for pitchers that hitting went into a tailspin; in 1968 Bob Gibson became close to unhittable, striking out 268 batters and posting a 1.12 ERA.

As it happens, baseball is (once again) wrestling with questions very like these at the moment. It’s fairly well known at this point that the major leagues have developed a system called PITCHf/x, which is capable of tracking every pitch thrown in every game throughout the season—yet still, that system can’t replace human umpires. “Even an automated strike zone,” wrote Ben Lindbergh in the online sports magazine Grantland recently, “would have to have a human element.” That’s for two reasons. One is the more-or-less obvious one that, while an automated system has no trouble judging whether a pitch is over the plate or not (“inside” or “outside”), it has no end of trouble judging whether a pitch is “high” or “low.” That’s because the strike zone is judged not only by each batter’s height, but also by batting stance: two players who are the same height can still have different strike zones because one might crouch more than the other, for instance.

There is, however, a perhaps more deeply rooted reason why umpires will likely never be replaced: while it’s true that major league baseball’s PITCHf/x can judge nearly every pitch in every game, every once in (a very great) while the system just flat out doesn’t “see” a pitch. It doesn’t even register that a ball was thrown. So all the people calling for “robot umpires” (it’s a hashtag on Twitter now) are, in the words of Dan Brooks of Brooks Baseball (as reported by Lindbergh), “willing to accept a much smaller amount of inexplicable error in exchange for a larger amount of explicable error.” In other words, while the great majority of pitches would likely be called more accurately, it’s also the case that the mistakes made by such a system would be a lot more catastrophic than mistakes made by human umpires. Imagine, say, that Zack Greinke was pitching a perfect game—and the system just didn’t see a pitch.

These are, however, technical issues regarding mechanical aids, not quite the existential issue of what we might term a perfectly transparent market. Yet they demonstrate just how difficult such a state would, in practical terms, be to achieve: like arguing over whether communism or capitalism is better in its pure state, maybe this is an argument that will never become anything more than a hypothetical for a classroom. The exercise, however, as seminar exercises are meant to do, illuminates something about the object in question: a computer doesn’t know the difference between the first pitch of April and the last pitch of the World Series’ last game—and we do—and that, I think, tells us something about what we value in both baseball and golf.

Which is what brings up Milton, since the obvious (ha!) lesson here could be the one that Stanley Fish, the great explicator of John Milton, says is the lesson of Milton’s Paradise Lost: “I know that you rely upon your senses for your apprehension of reality, but they are unreliable and hopelessly limited.” Fish’s point refers to a moment in Book III, when Milton is describing how Satan lands upon the sun:

There lands the Fiend, a spot like which perhaps
Astronomer in the Sun’s lucent Orb
Through his glaz’d optic Tube yet never saw.

Milton compares Satan’s arrival on the sun to the sunspots that Galileo (whom Milton had met) witnessed through his telescope—at least, that is what the first part of the thought appears to imply. The last three words, however—yet never saw—rip away that certainty: the comparison that Milton carefully sets up between Satan’s landing and sunspots, he then tells the reader, is actually nothing like what happened.

The pro-robot crowd might see this as a point in favor of robots, to be sure—why trust the senses of an umpire? But what Fish, and Milton, would say is quite the contrary: Galileo’s telescope “represents the furthest extension of human perception, and that is not enough.” In other words, no matter how far you pursue a technological fix (i.e., robots), you will still end up with more or less the problems you had before, only they might be more troublesome than the ones you have now. And pretty obviously, a system that was entirely flawless for every pitch of the regular season—which encompasses, remember, thousands of games just at the major league level, not even to mention the number of individual pitches thrown—and then just didn’t see a strike three that (would have) ended a Game 7 is not acceptable. That’s not really what I meant by “not golf” though.

What I meant might best be explained by reference to (surprise, heh) Fish’s first major book, the one that made his reputation: Surprised by Sin: The Reader in Paradise Lost. That book set out to bridge what had seemed an unbridgeable divide, one that had existed for nearly two centuries at least: a divide between those who read the poem (Paradise Lost, that is) as being, as Milton asked them to, intended to “justify the ways of God to men,” and those who claimed, with William Blake, that Milton was “of the Devil’s party without knowing it.” Fish’s argument was quite ingenious: in essence, it was that Milton’s technique was true to his intention, but that the technique, misunderstood, could easily explain how some readers could misread him so badly. Which is rather broad, to be sure—as in most things, the Devil is in the details.

What Fish argued was that Paradise Lost could be read as one (very) long instance of what are now called “garden path” sentences: grammatical sentences that begin in a way that appears to direct the reader toward one interpretation, only to reveal their true meaning at the end. Very often, they require the reader to go back and reread the sentence, as in “Time flies like an arrow; fruit flies like a banana.” Another example is Emo Philips’ line “I like going to the park and watching the children run around because they don’t know I’m using blanks.” They’re sentences, in other words, where the structure implies one interpretation at the beginning, only to have that interpretation snatched away by the sentence’s end.

Fish argued that Paradise Lost was, in fact, full of these moments—and, more significantly, that they were there because Milton put them there. One example Fish uses is just that bit from Book III, where Satan gets compared, in detail, with the latest developments in solar astronomy—until Milton jerks the rug out with the words “yet never saw.” Satan’s landing is just like a sunspot, in other words … except it isn’t. As Fish says,

in the first line two focal points (spot and fiend) are offered the reader who sets them side by side in his mind … [and] a scene is formed, strengthened by the implied equality of spot and fiend; indeed the physicality of the impression is so persuasive that the reader is led to join the astronomer and looks with him through a reassuringly specific telescope (‘glaz’d optic Tube’) to see—nothing at all (‘yet never saw’).

The effect is a more-elaborate version of that of sentences like “The old man the boats” or “We painted the wall with cracks”—typical examples of garden-path sentences. Yet why would Milton go to the trouble of constructing the simile if, in reality, the things being compared are nothing alike? It’s Fish’s answer to that question that made his mark on criticism.

Throughout Paradise Lost, Fish argues, Milton again and again constructs his language “in such a way that [an] error must be made before it can be acknowledged by the surprised reader.” That isn’t an accident: in a sense, it takes the writerly distinction between “showing” and “telling” to its end-point. After all, the poem is about the Fall of Man, and what better way to illustrate that Fall than by demonstrating it—the fallen state of humanity—within the reader’s own mind? As Fish says, “the reader’s difficulty”—that is, the continual state of thinking one thing, only to find out something else—“is the result of the act that is the poem’s subject.” What, that is, were Adam and Eve doing in the garden, other than believing things were one way (as related by one slippery serpent) when actually they were another? And Milton’s point is that trusting readers to absorb the lesson by merely being told it is just what got the primordial pair in trouble in the first place: the reason Paradise Lost needs writing at all is that our First Parents didn’t listen to what God told them (you know: don’t eat that apple).

If Fish is right, then Milton concluded that just to tell readers, whether of his time or ours, isn’t enough. Instead, he concocted a fantastic kind of riddle: an artifact where, just by reading it, the reader literally enacts the Fall of Man within his own mind. As the lines of the poem pass before the reader’s eyes, she continually credits the apparent sense of what she is reading, only to be brought up short by a sudden change in sense. Which is all very well, it might be objected, but even if that were true about Paradise Lost (and not everyone agrees that it is), it’s something else to say that it has anything to do with baseball umpiring—or golf.

Yet it does, and for just the same reason that Paradise Lost applies to wrangling over the strike zone. One reason we couldn’t institute a system that might simply fail to see a pitch is that, while certainly we could take or leave most pitches—nobody cares about the first pitch of a game, for instance, or the middle out of the seventh inning during a Cubs-Rockies game in April—there are some pitches that we absolutely must know about. And if we consider what gives those pitches more value than other pitches—and surely everyone agrees that some pitches have more worth than others—then what we have to arrive at is that baseball doesn’t just take place on a diamond, but also takes place in time. Baseball is a narrative, not a pictorial, art.

To put it another way, what Milton does in his poem is just what a good golf architect does for the golf course: it isn’t enough to be told you should take a five-iron off this tee, while on another a three wood. The golfer has to be shown it: what you thought was one state of affairs was in fact another. And not merely that—because that, in itself, would only be another kind of telling—but that the golfer—or, at least, the reflective golfer—must come to see the point as he traverses the course. If a golf hole, in short, is a kind of sentence, then the assumptions with which he began the hole must be dashed by the time he reaches the green.

As it happens, this is just what the Golf Club Atlas says about the fourth at Chechessee Creek, where a “classic misdirection play comes.” At the fourth tee, “the golfer sees a big, long bunker that begins at the start of the fairway and hooks around the left side.” But the green is to the right, which causes the golfer to think “‘I’ll go that way and stay away from the big bunker.’” Yet, because there is a line of four small bunkers somewhat hidden down the right side, and bunkers to the right near the green, “the ideal tee ball is actually left center.” “Standing behind the hole”—that is, once play is over—“the left to right angle of the green is obvious and clearly shows that left center of the fairway is ideal,” which makes the fourth “the cleverest hole on the course.” And it is, so I’d argue, because it uses precisely the same technique as Milton.

That, in turn, might be the basis for an argument for why getting yardages by hand (or rather, foot) is so necessary to the process of professional golf at the highest level. As I mentioned, amateur golfers think golf is about hitting shots, while professionals know that golf is about selecting what shots to hit. Amateurs look at a golf hole and think, “What a pretty picture,” while a professional looks at one and thinks of the sequence of shots it would take to reach the goal. That’s why, even though so much of golf design is conjured by way of pretty pictures, whether in oils or photographs, and even though it might be thought that pictures, being “artistic,” are antithetical to the mechanistic forces of computers, the idea that it is the beauty of golf courses that makes the game irreducible to analysis gets things precisely wrong.

Machines, that is, can paint a picture of a hole that can’t be beat: just look at the innumerable golf apps available for smart phones. But computers can’t parse a sentence like “Time flies like an arrow; fruit flies like a banana.” While computers can call (nearly) every pitch over the course of a season, they don’t know why a pitch in the seventh inning of a World Series game is more important than a spring training game. If everything is right there in front of you, then computers or some other mechanical aids are quite useful; it’s only when the end of a process causes you to re-evaluate everything that came before that you are in the presence of the human. Working out yardages without the aid of a machine forces the kind of calculations that can see a hole in time, not in space—to see a hole as a sequence of events, not (as it were) a whole.

Golf isn’t just the ability to hit shots—it’s also, and arguably more significantly, the ability to decide what the best path to the hole is. One argument for why further automation wouldn’t harm the game in the slightest is the tale told by baseball umpiring: no matter how far technological answers are sought, it’s still the case that human beings must be involved in calling balls and strikes, even if not in quite the same way as now. Some people, that is, might read Milton’s warning about astronomy as saying that pursuing that avenue of knowledge is a blind alley, when what Milton might instead be saying is just that the mistake is to think that there could be an end to the pursuit: that is, that perfect information could yield perfect decision-making. We can extend “human perception” all we like—it will not make a whit of difference.

Milton thought that was because of our status as Original Sinners, but it isn’t necessary to take that line to acknowledge limitations, whether they are of the human animal in general or just endemic to living in a material universe. Some people appear to take this truth as a bit of a downer: if we cannot be Gods, what then is the point? Others, and this seems to be the point of Paradise Lost, take this as the condition of possibility: if we were Gods, then golf (for example) would be kind of boring, as merely the attempt to mechanically re-enact the same (perfect) swing, over and over. But Paradise Lost, at least in one reading, seems to assure us that that state is unachievable. As technology advances, so too will human cleverness: Bobby Jones can never defeat Walter Hagen once and for all.

Yet, as the example of Bob Gibson demonstrates, trusting to the idea that, somehow, everything will balance out in the end is just as dewy-eyed as anything else. Sports can ebb and flow in popularity: look at horse racing or boxing. Baseball reacted to Gibson’s 13 shutouts and Denny McLain’s 31 victories in 1968, as well as Carl Yastrzemski’s heroic charge to a .301 batting average, the lowest average ever to win the batting crown. Throughout the 1960s, says Bill James in The New Bill James Historical Abstract, Gibson and his colleagues competed in a pitcher’s paradise: “the rules all stacked in their favor.” In 1969, the pitcher’s mound was lowered from 15 to 10 inches and the strike zone was squeezed too, from the shoulders down to the armpits and from the calves up to the top of the knee. The tide of the rules began to turn the other way, until the offensive explosion of the 1990s.

Nothing, in other words, happens in a vacuum. Allowing perfect yardages, I suspect, advantages the ballstrikers at the expense of the crafty shotmakers. To preserve the game, then—a game which, contrary to some views, isn’t always the same, and changes in response to events—would require some compensating rule change. Just what that might be is hard, for me at least, to say at the moment. But it’s important, if we are to still have the game at all, to know what it is and is not, what’s worth preserving and why we’d like to preserve it. We can sum it up, I think, in one sentence. Golf is a story, not a picture. We ought to keep that which allows golf to continue to tell us the stories we want—and, perhaps, need—to hear.

The Anger of Achilles

Rage—Goddess, sing the rage of Achilles,
Murderous and doomed.
The Iliad. Book I.

The Bob Hope got itself played in Palm Springs last week—despite all the efforts of Aeolus, god of wind—and watching it always reminds me of a day on the third hole at Silver Rock, a shortish par three, when Justin Leonard’s caddie corrected me on the yardage I was giving Derek Anderson, who was then still Cleveland’s hope for the future. Silver Rock is one of those modern courses with many, many tee boxes installed by architects fighting a rear-guard action against the equipment-makers—a war that has all of the vitality Rome’s legions on the Rhine must have had in the century or two after Marcus Aurelius—and, looking in the yardage book, I’d mistaken the tee box we were actually on for another because I’d missed seeing one of them. The yardage I’d given Derek was something like 12 yards off: enough to put him on the wrong club. Justin’s caddie corrected me, which might have been the end of it but for the tenor of the man’s voice. He was angry.

Now, golf and anger are no strangers to each other: “Some emotions,” even the great Bobby Jones once said, “cannot be endured with a golf club in your hands.” “Terrible” Tommy Bolt, a U.S. Open-winning subscriber to Jones’ theory, advised not only throwing your clubs in front of you (it saves a walk), but also never breaking both your putter and your driver in the same round: canny pieces of advice from a man not unfamiliar with helicoptering drivers or putters.

Nowadays, of course, such displays of temper are hugely frowned upon, perhaps in keeping with the general vibe of today’s world: my great-uncle, who was city editor of the Chicago Daily News far back in the last century, was renowned for his temper—he “ruled the staff … in fiery justice,” his obituary said—as were a lot of city editors at the time. Twenty years ago, though, even a leading candidate for Oldest Living City Editor, Julius Parker of the Chattanooga Free Press, then 79, admitted to the American Journalism Review that he tried “not to shout as much as I used to.”

Even so rarefied an air as academia, which one might suppose has as little to do with the clatterings of a newsroom as a milkmaid has to do with a milking machine, isn’t immune to a change in the culture as a whole. For instance, John Milton’s foremost living scholar, Stanley Fish (of Berkeley, Johns Hopkins, Columbia, and, most notoriously, Duke), recently wrote in one of his columns (“The Digital Humanities and the Transcending of Mortality”) for the New York Times’ digital edition—which he had, until the very column I am citing, refused to call a blog—that “the new forms of communication—blogs, links, hypertext, re-mixes, mash-ups, multi-modalities and much more—that have emerged with the development of digital technology” challenge the old model of scholarship entirely. It’s a claim that might appear quite unrelated to the one in the previous paragraph—it doesn’t follow that angry city editors have anything to do with scholarship, exactly—but a closer examination of Fish’s argument might reveal that even if the two worlds of newspapering and scholarship aren’t in harmony, they’re singing a similar song.

The reasons Fish gives for refusing to call his blog a blog are, it seems, exactly the reasons many defenders of what’s being called the “digital humanities” proclaim as the virtues of the practice of blogging and other, newer, forms of scholarly communication. Blogs, and other forms of writing on the Internet, are “provisional, ephemeral, interactive, communal, available to challenge, interruption and interpolation, and not meant to last,” whereas for the past 50 years or so Fish has been “building arguments that are intended to be decisive, comprehensive, monumental, definitive and, most important, all mine.” But for those practicing the new forms of scholarship, such ends are mistaken.

What the “digital humanities” promises, according to Fish (as their enemy, perhaps it is wise to take his point with a grain of salt), is a mode of scholarship in which “knowledge is available in a full and immediate presence” to everyone everywhere: which is to say, the usual kind of left-wing millenarianism. (Indeed, The Digital Humanities Manifesto 2.0 explicitly describes itself as having a “utopian core shaped by its genealogical descent from the counterculture/cyberculture of the 60s and 70s.”) The promise is, as Fish notes, one Milton described while facing an earlier version of the same sort of thing: that we should be “all in all.” In other words, even if Fish and, say, my great-uncle might have had serious disagreements about … well, virtually everything, the digital humanities people might describe them as roughly similar in their views about what, for instance, might constitute a proper piece of writing.

It’s true, to be sure, that Clem’s standing orders to his reporters (“Short words … short sentences … short leads … short paragraphs”) aren’t quite the style of Fish, the Ivy League professor—nor, equally surely, that of Milton, who virtually defines a “difficult” style of writing—but I suspect he’d have agreed with Fish’s point about the relation between death and writing. “To be mortal,” Fish says, is not only to be “capable of dying” but also to have a “beginning, middle and end,” which is what “sentences, narratives, and arguments have”—and from which the “digital humanities,” it seems, promises to liberate us. As Fish, the old scholar of Milton, knows, that’s what’s always promised, and as Milton knew (it’s what Paradise Lost is about, after all), it’s what we never get.

Still, it’s true that both newspapering and academia are getting rather a larger reminder of the significance of mortality these days than either might like. The death knell has been sounding for both occupations for decades: Clem’s newspaper, the Daily News, went under in 1978, and the transformation of the image of the humanities professor from an august person protected by tenure and remote in a wood-paneled office to a bespectacled, goatee-wearing adjunct who is probably working more than one job (a job that, if the adjunct is lucky, is not at a McDonald’s) is not only well underway, but nearly over in many places. In that sense, the vision of the “digital humanities” looks rather like an attempt to make the inevitable into a cheering, rather than an awful, vision of the future.

That vision of the future, however, despite what it might say about being “inclusive” and the like (“all in all”), necessarily leaves some things out: presumably, it doesn’t include beginnings and endings, or arguments, or anger. Or—here one assumes—golf: which is, after all, a sport devoted to beginnings (like, say, tee boxes) and endings (holes), given to arguments (which tee box was it?), and very often productive of anger. That’s all right: it’s in the nature of radicalism to deny the present. What isn’t clear, at least to Fish I suppose, is just how to make all of that disappear without, at the same time, effectively making much else disappear as well.

“You can’t make an omelette without breaking eggs,” goes one of the oldest leftist remarks—skewered by George Orwell, who asked “Yes, but where is the omelette?” If, for instance, the claim of the “digital humanities” is that breaking down “the more traditional structures of academic publishing,” as Fish cites one Matthew Kirschenbaum arguing, will somehow lead to—well, something, anyway—that something certainly can’t be found in the economic data: all the indicators have been flashing red for some decades. For most of the American population, many observers have noted, wages have remained more or less the same since about 1972.

I don’t want to spend a lot of time rehearsing the whole case—the evidence litters today’s landscape, and it is the reason for the Occupy Wall Street movement—but let me select a few pieces of it. The Nobel Prize-winning economist Paul Krugman, reviewing Edward N. Wolff’s Top Heavy: A Study of the Increasing Inequality of Wealth in America, observes that the evidence for increasing economic inequality is “overwhelming, and it comes from many sources—from government agencies like the Bureau of the Census, from Fortune’s annual survey of executive compensation, and so on.” And that inequality has itself been unequal: “the top 5 percent have gotten richer compared with the next 15, the top 1 percent compared with the next 4, the top 0.25 percent compared with the next 0.75, and onwards all the way to Bill Gates.” Each level, in other words, has seen its income soar relative to the level below it—Bill Gates’ wealth has expanded not arithmetically, but according to a multiple: a multiple that is, for the Bill Gates category (the top .01 percent), 497 percent.

Despite that, the Official American Left—ensconced in its ivory tower—has little to say about income inequality, even if it has a lot to say about protecting the rights of minorities. As even the notorious Marxist professor of literature Terry Eagleton has written, the very “idea of a creative majority movement” has “come to seem like a contradiction in terms” to many academics. In that sense, maybe golf, and anger, might have something to teach—and maybe that lesson isn’t necessarily that remote from the dusty halls of academe. The Iliad, after all—widely regarded as the beginning, along with the Pentateuch, of Western literature—begins with Homer invoking the Muse’s help to tell his tale: the story of the anger of Achilles. As for golf, anyone who says he’s played without feeling one emotion on the first tee and another on the final green is lying: if the game is about nothing else, it is about beginnings, middles, and endings.

And, also, keeping score.

Golf Architecture as Narrative Art

You think you can leave the past behind,
You must be out of your mind.
If you think you can simply press rewind,
You must be out of your mind, son
You must be out of your mind.
—Magnetic Fields, “You Must Be Out of Your Mind,” Realism

I sometimes get asked just what the biggest difference is between the amateur and the professional games, and about as often I want to say, “Amateurs always start on the first tee.” This is a smart-alecky remark, but it isn’t just smart-alecky. For over a century the United States Open sent every player off from the first tee for the first two days of the tournament, a tradition that ended in 2002 at Bethpage in New York. Now, only the Masters and the Open Championship in the U.K. still start everyone on the first tee every day. Mostly nobody notices, in part because televised golf encourages a kind of schizoid viewing habit: we skip from hole to hole, shot to shot, seemingly at random, without order.

“Here’s Ernie at 11,” the announcer will say, never mind that the last thing we saw was the leaders hitting their approach shots into 7, and right before that we saw Player X finishing up at 18. All of this treats the golf course like a deck of cards to be dealt at random: which is precisely the opposite of how the amateur player always sees a golf course, one hole at a time.

Pro golf, both on television and in the way the players themselves experience it, is different. A golf course, like a book, is designed to be played in a certain order, which makes golf architecture different from other kinds of architecture and from other kinds of art like painting or sculpture, however much the brochures and the television announcers like to mention this week’s “breathtaking beauty.” Golf architecture has at least as much in common with temporal arts like music or narrative: what’s important isn’t just what’s happening now but what’s happened before.

Did the architect create the illusion that those bunkers weren’t a problem on the last hole, causing you to play safe on this one—or vice versa? Maybe two greens with similar-looking slopes will play differently because the grain runs differently on each. There are a lot of games architects can play that take advantage of what we’ve learned—or thought we learned—on previous holes.

Mostly though the obvious tricks are easily discovered, or only work once. Courses like that are like murder mysteries spoiled once somebody tells you just how Mr. Green bought it from a rutabaga poisoned by the maid, who turns out to be employed by … and who the hell cares. A course worth playing is one that continues to bewilder even after you know the secret of it. Nobody gives a damn if you know “Rosebud” was the sled—Citizen Kane is still good. Good architecture, I would submit, tells a story.

Maybe the best example of what I mean is Riviera Country Club in Los Angeles, where the tour plays the L.A. Open every year. Widely acclaimed as an architect’s dream course, Riviera is also remarkably fun to play while still being one of the toughest tracks the professionals face every year. The first tee begins a few steps, quite literally, from the clubhouse, on a patch of grass high above the rest of the course. The tee shot drops out of the sky just as you do from the heights—Icarus (or Lucifer) plummeting, as Milton says, “toward the coast of Earth beneath,/Down from th’ Ecliptic, sped with hop’d success.” The first is the easiest hole on the course, a par five with not only an elevated tee but a wide landing zone to receive the shot. The green is wide, and in general it’s a lullaby of a hole.

The second, however, turns the tables quickly. It’s a long dog-legged par four with out-of-bounds (the driving range) left and trees right: the tee shot is either to a narrow piece of fairway or, riskier, over the neck of the dogleg and the trees on the right. Either way, the approach is to a very narrow green with deep bunkers left and a hillside of very tall rough on the right. The professionals regard a four here as dearly as a five is cheap on the first hole. The second usually plays as the toughest hole on the course.

Whereas the first hole rewards the bomber, the second favors the straight-shooter. In other words, what worked on the first hole is exactly what’s penalized on the next, and vice versa. Riviera continues on like this all the way around the course, giving and taking away options throughout and always mixing it up: what worked on the last hole won’t necessarily work on the next; in fact, following the same strategy or style of play is exactly what leads to big numbers.

What’s really astonishing about Riviera is that it doesn’t matter whether you know what’s coming: knowing that the first hole is easy, and why, and that the second is hard, and why, doesn’t change things. There isn’t any short-cut—such as is often found on the videogame Golden Tee, for instance—that, once discovered, dissolves the problem the next hole presents. That ability to confound is rare in a golf course. Most courses reward a particular style—Jack Nicklaus’ courses are notorious for rewarding high fades, the shot Nicklaus liked to hit in his prime.

The great courses, though, not only mix up styles, but also tell a story. As Rob says in High Fidelity, “You gotta kick it off with a killer to grab attention. Then you gotta take it up a notch. But you don’t want to blow your wad. So then you gotta cool it off a notch. There are a lot of rules.” Rob’s point owes something perhaps to Stanley Fish, the Miltonist, who argued in Surprised By Sin that the way Paradise Lost works is to ensnare the reader constantly, setting up one expectation after another, dashing each in turn.

At Riviera, for instance, the first two holes raise hopes and then dash them—or conceivably raise them to a higher pitch, should you somehow make a miraculous birdie on the second. The rest of the course continues to toy with a player’s mind. Two years ago Geoff Ogilvy, the Australian pro I’ve written about before, talked with Geoff Shackelford of Golf Digest about the short 10th hole and how important that hole’s place in the routing is:

“The eighth and ninth holes are very hard, but you know that the 10th and 11th [a reachable par 5] offer a couple of birdie or even eagle chances. So [the 10th hole] sits in the round at the perfect time,” says Ogilvy. “It’s definitely a much better hole than it [would be] if you teed off there to start your round when the dynamics just aren’t nearly the same.”

Sequence matters, in other words, even if, as at Riviera, players are guaranteed to start at least one round on the 10th hole because the first two days use split tees.

Medinah, where I usually work, often takes a lot of crap from the big-name golf writers on just that point: Bradley Klein, for instance, who’s not only the architecture critic for Golfweek but was also a PGA Tour caddie and a professor of political science, doesn’t think much of the course. In 1999, he said it was “stunningly mediocre.” Klein doesn’t convince me. Maybe that’s because I am—perhaps more so than anyone on the planet—familiar with the course, but it might also be that Klein either isn’t aware of the role of narrative in architecture, or isn’t familiar enough with Medinah to understand its narrative.

There’s a stretch of holes, for instance, that I think illustrates what I’ll call the High Fidelity or Paradise Lost principle pretty well: the ninth through the eleventh. The first and the last holes of this stretch are both dogleg-left par fours, sandwiching a long par five that goes directly into the prevailing wind. The ninth and the eleventh look similar to the unwary: both require you to choose between trying to carry the dogleg with a driver off the tee or laying up with some other club. But the tee shot on nine is into the prevailing wind and uphill, while the eleventh is with the prevailing wind and downhill. What worked on the first one won’t work on the other. In addition, the tenth is so long, and so much into the wind, that the player usually thinks more club is necessary on the eleventh tee—but that’s usually exactly the wrong choice.

Medinah just underwent a renovation last year—again—so I will see how the changes went and report back on them here. What I wanted to do first, though, was to describe how I’m going to understand that change, which is to evaluate the golf course through the story it tells. Playing the course as the architect meant it to be played is one advantage the amateur has over the professional: the PGA Tour isn’t far removed from the shotgun starts that are a feature of your typical pro-am event, where it doesn’t matter what hole you start on. But enjoying the structure, the internal logic, of course design is not only one of the game’s pleasures; it is also, I think, a means of improving your own golf, since understanding what the architect wants is a big step towards lowering your score. “But to convince the proud what signs avail?” Milton says in Paradise Lost, “Or wonders move the obdurate to relent?” Reading the signs in order is the amateur’s one advantage over the professional—a pleasure not unlike the bite of a noted apple.