“‘The frog is almost five hundred million years old. Could you really say with much certainty that America, with all its strength and prosperity, with its fighting man that is second to none, and with its standard of living that is the highest in the world, will last as long as … the frog?’”
—Joseph Heller. Catch-22. (1961).
 … the fall of empires which aspired to universal dominion could be predicted with very high probability by one versed in the calculus of chance.
—Laplace. Théorie Analytique des Probabilités. (1814).


If sexism exists, how could it be proved? A recent lawsuit—Chen-Oster v. Goldman Sachs, Inc., filed in New York City on 19 May, 2014—aims to do just that. The suit makes four claims: that Goldman’s women employees make less than men at the same positions; that a “disproportionate” number of men have been promoted “over equally or more qualified women”; that women employees’ performance was “systematic[ally] underval[ued]”; and that “managers most often assign the most lucrative and promising opportunities to male employees.” The suit, then, echoes many of the themes developed by feminists over the past two generations, and in a general sense will perhaps be accepted, or even cheered, by those Americans sensitive to feminism. But those Americans may not be aware of the potential dangers of the second claim: dangers that threaten not merely the economic well-being of the majority of Americans, including women, but also America’s global leadership. Despite its seeming innocuousness, the second claim is potentially an existential threat to the future of the United States.

That, to be sure, is a broad assertion, and one that seems disproportionate to the magnitude of the lawsuit: it hardly seems likely that a suit over employment law, even one involving a firm as important to the global financial machinery as Goldman Sachs, could threaten the future of the United States. Yet few today would deny the importance of nuclear weapons—nor that they pose an existential threat to humanity itself. And if nuclear weapons are such a threat, then the reasoning that led to those weapons must be at least as important as, if not more important than, the weapons themselves. As I will show, the second claim poses a threat to exactly that chain of reasoning.

That, again, may appear a preposterous assertion: how can a seemingly-minor allegation in a lawsuit about sexism have anything to do with nuclear weapons, much less the chain of logic that led to them? One means of understanding how requires a visit to what the late Harvard biologist Stephen Jay Gould called “the second best site on the standard tourist itinerary of [New Zealand’s] North Island—the glowworm grotto of Waitomo Cave.” Upon the ceiling of this cave, it seems, live fly larvae whose “illuminated rear end[s],” Gould tells us, turn the cave into “a spectacular underground amphitheater”—an effect that, it appears, mirrors the night sky. But what’s interesting about the Waitomo Cave is that it does this mirroring with a difference: upon observing the cave, Gould “found it … unlike the heavens” because whereas stars “are arrayed in the sky at random,” the glowworms “are spaced more evenly.” The reason is that the “larvae compete with, and even eat, each other—and each constructs an exclusive territory”: since each larva has more or less the same power as every other larva, each territory is more or less the same size. Hence, as Gould says, the heaven of the glowworms is an “ordered heaven,” as opposed to the disorderly one visible on clear nights around the world—a difference that illuminates not only just what’s wrong with the plaintiffs’ second claim in Chen-Oster v. Goldman Sachs, Inc., but also how that claim concerns nuclear weapons.

Again, that might appear absurd: how can understanding a Southern Hemispheric cavern help illuminate—as it were—a lawsuit against the biggest of Wall Street players? To understand how requires another journey—though this one is in time, not space.

In 1767, an English clergyman named John Michell published a paper with the unwieldy title of “An Inquiry into the Probable Parallax, and Magnitude of the Fixed Stars, from the Quantity of Light Which They Afford us, and the Particular Circumstances of Their Situation.” Michell’s purpose in the paper, he wrote, was to inquire whether the stars “had been scattered by mere chance”—or, instead, by “their mutual gravitation, or to some other law or appointment of the Creator.” Since (according to Michell’s biographer, Russell McCormmach) Michell assumed “that a random distribution of stars is a uniform distribution,” he concluded that—since the night sky does not resemble the roof of the Waitomo Cave—the distribution of stars must be the result of some natural law. Or even, he hinted, the will of the Creator himself.

So things might have stayed had Michell’s argument “remained buried in the heavy quartos of the Philosophical Transactions”—as James Forbes, the Professor of Natural Philosophy at Edinburgh University, would write nearly a century later. But Michell’s argument hadn’t; several writers, it seems, took it as evidence for the existence of the supernatural. Hence, Forbes felt obliged to refute an argument that, he thought, was “too absurd to require refutation”: to think—as Michell did—that “a perfectly uniform and symmetrical disposition of the stars over the sky” “could alone afford no evidence of causation,” Forbes wrote, would be “palpably absurd.” The reasoning behind Forbes’ objection, in turn, is what connects both to the Goldman lawsuit—and to nuclear weapons.

Forbes made his point by an analogy to flipping a coin: to think that the stars had been distributed randomly because they were evenly spaced across the sky, he wrote, would be as ridiculous as expecting that “on 1000 throws [of a fair coin] there should be exactly 500 heads and 500 tails.” In fact, the Scotsman pointed out, mathematics demonstrates that in such a case of 1000 throws “there are almost forty chances to one [i.e., nearly 98%], that some one of the other possible events shall happen instead of the required one.” In 1000 throws of a fair coin, there’s less than a three percent chance that the flipper will get exactly 500 heads: it’s simply a lot more likely that there will be some other number of heads. In his essay about the Waitomo Cave, Gould put the same point like this: “Random arrays always include some clumping … just as we will flip several heads in a row quite often so long as we can make enough tosses.” Because the stars clump together, Forbes argued, that is evidence that they are randomly distributed—not of a benevolent Creator, as Michell thought. Forbes’ insight into how to detect randomness, or chance, in astronomical data had implications far beyond the stars: in a story that would take much more space than this essay to tell, it eventually led a certain Swiss patent clerk to take up the phenomenon called “Brownian motion.”
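Forbes’ arithmetic is easy to check today. A minimal sketch in Python (the function name is mine, not Forbes’) computes the exact binomial probability of 500 heads in 1000 fair tosses:

```python
from math import comb

def prob_exact_heads(n: int, k: int) -> float:
    """Probability of exactly k heads in n fair-coin flips: C(n, k) / 2**n."""
    return comb(n, k) / 2**n

p = prob_exact_heads(1000, 500)
print(f"P(exactly 500 heads in 1000 flips) = {p:.4f}")  # ≈ 0.0252
print(f"Odds against: about {(1 - p) / p:.0f} to 1")    # ≈ 39 to 1
```

The result, roughly thirty-nine to one against, matches Forbes’ “almost forty chances to one.”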

The clerk, of course, was Albert Einstein; the subject of his 1905 paper, “On the Movement of Small Particles Suspended In a Stationary Liquid Demanded by the Molecular-Kinetic Theory of Heat,” was the tendency—“easily observed in a microscope,” Einstein remarks—for tiny particles to move in an apparently spontaneous manner. What Einstein realized (as the physicist Leonard Mlodinow put it in his 2008 book, The Drunkard’s Walk: How Randomness Rules Our Lives) was that the “jiggly” motion of dust particles and so on results from collisions between them and even smaller particles, and so “there was a predictable relationship between factors such as the size, number, and speed of the molecules and the observable frequency and magnitude of the jiggling.” In other words, “though the collisions [between the molecules and the larger particles] occur very frequently, because the molecules are so light, those frequent isolated collisions have no visible effects” for the most part—but once in a while, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction,” there are enough hits to send the particle moving. Or, to put it another way, when 1,000 coin flips come up nearly all heads, the particle will move. Put in that fashion, to be sure, Einstein’s point might appear obscure at best—but as Mlodinow goes on to say, it is no accident that this seemingly minor paper became the great physicist’s “most cited work.” That’s because the ultimate import of the paper was to demonstrate the existence … of the atom. Which is somewhat of a necessity for building an atom bomb.

The existence of the atomic bomb, then, can be said to depend on the insight developed by Forbes: just how significant the impact of chance can be in the formation of both the very large (the universe itself, according to Forbes) and the very small (the atom, according to Einstein). The point both men attempted to make, in turn, is that order is something very rare—in this universe, at any rate (whatever may be the case in others). Far more common is disorder—which brings us back to Goldman Sachs and the existence of sexism.

It is the contention of the second point in the plaintiffs’ brief in Chen-Oster v. Goldman Sachs, Inc., remember, that there exists (as University of Illinois English professor Walter Benn Michaels has noted) a “‘“stark” underrepresentation’ [of women] in management” because “‘just 29 percent of vice presidents, 17 percent of managing directors, and 14 percent of partners’” are women. Goldman Sachs, as it happens, has roughly 35,000 employees—about 0.01% of the total population of the United States, which is 323 million. Of those Americans, as of the 2010 Census, women number about 157 million, compared to around 151 million men. Hence, the question to be asked about the Goldman Sachs lawsuit (and I write this as someone with little sympathy for Goldman Sachs) is this: if the reasoning Einstein followed to demonstrate the existence of the atom is correct, and if the chance of landing exactly 500 heads when tossing a coin 1000 times is less than three percent, how much less likely is it that a sample of 35,000 people will exactly mirror the proportions of 323 million? The answer, it would seem, is rather low: it’s simply a lot more likely that Goldman Sachs would have something other than a proportionate ratio of men to women, just as it’s a lot more likely that stars should clump together than be equally spaced like the worms in the New Zealand cave. And that is to say that the disproportionate number of men in leadership positions at Goldman Sachs is merely evidence of the absence of a pro-woman bias at Goldman Sachs, not evidence of the existence of a bias against women.
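The question can be given a rough number. The sketch below, using the population figures quoted above and assuming (for illustration only) that Goldman’s 35,000 employees were a random draw from the population, computes the binomial probability of the single most likely outcome, an exact mirror of the population’s ratio; logarithms via lgamma avoid overflow at this scale:

```python
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    # log[ C(n, k) * p**k * (1-p)**(n-k) ], computed with lgamma
    # because C(35000, k) is far too large for floating point
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

n = 35_000                # approximate Goldman Sachs headcount
p = 157 / (157 + 151)     # women's share of the cited 2010 census figures
k = round(n * p)          # the single most likely number of women
prob = exp(log_binom_pmf(n, k, p))
print(f"P(sample of {n:,} exactly mirrors the population) = {prob:.4f}")
```

Even the most likely single outcome carries a probability of well under one percent—exact mirroring is, as the essay says, a long shot.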

To which it might be replied, of course, that the point isn’t the exact ratio, but rather that it is so skewed toward one sex: what are the odds, it might be said, that all three categories of employee should be similarly bent in one direction? Admittedly, that is an excellent point. But it’s also a point that’s missing from the plaintiffs’ brief: there is no mention of a calculation of the particular odds in the case, despite the fact that the mathematical techniques necessary to perform those calculations have been known since long before the atomic bomb, or even Einstein’s paper on the existence of the atom. And it’s that omission, in turn, that concerns not merely the place of women in society—but ultimately the survival of the United States.
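The calculation the brief omits is itself straightforward to sketch. With a hypothetical headcount (the 1,000 below is my assumption, not a figure from the case), a normal approximation to the binomial gives the chance that a group drawn from a roughly 50/50 pool would turn out 29 percent women or fewer:

```python
from math import erfc, sqrt

def tail_prob(n: int, k: int, p: float = 0.5) -> float:
    """Normal approximation to P(X <= k) for X ~ Binomial(n, p)."""
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    z = (k + 0.5 - mu) / sigma        # continuity correction
    return 0.5 * erfc(-z / sqrt(2))   # standard normal CDF at z

# Hypothetical: 1,000 vice presidents drawn at random from a 50/50 pool,
# of whom 29% or fewer turn out to be women
print(tail_prob(1000, 290))
```

The probability is vanishingly small—which is why the concession above matters: a skew of that size is not the kind of clumping that chance alone produces often.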

After all, the reason that the plaintiffs in the Goldman Sachs suit do not feel the need to include calculations of the probability of the disproportion they mention—despite the fact that it is the basis of their second claim—is that the American legal system is precisely structured to keep such arguments at bay. As Oliver Roeder observed in FiveThirtyEight last year, for example, the justices of the U.S. Supreme Court “seem to have a reluctance—even an allergy—to taking math and statistics seriously.” And that reluctance is not limited to the justices alone: according to Sanford Levinson, a University of Texas professor of law and government interviewed by Roeder in the course of reporting his story, “top-level law schools like Harvard … emphasize … traditional, classical legal skills” at the expense of what Levinson called “‘genuine familiarity with the empirical world’”—i.e., the world revealed by techniques pioneered by investigators like James Forbes. Since, as Roeder observes, all nine current Supreme Court justices attended either Harvard or Yale, that suggests that the curriculum followed at those schools has a connection to the decisions reached by their judicial graduates.

Still, that exclusion might not be so troublesome were it limited merely to the legal machinery. But as Nick Robinson reported last year in the Buffalo Law Review, attorneys have “dominated the political leadership of the United States” throughout its history: “Since independence,” Robinson pointed out there, “more than half of all presidents, vice presidents, and members of Congress have come from a law background.” If the leadership class of the United States is drawn largely from American law schools, then, and American law schools train students to disdain mathematics and the empirical world, it seems plausible to conclude that much of the American leadership class is specifically trained to ignore both the techniques revealed by Forbes and the underlying reality they disclose: the role played by chance. Hence, while such a divergence may allow plaintiffs like those in the Goldman case to make allegations of sexism without performing the hard work of actually demonstrating them mathematically, it may also have consequences for actual women—living, say, in a nation increasingly characterized by a vast difference between the quantifiable wealth of those at the top (like people who work for Goldman Sachs) and that of everyone else.

And not merely that. For decades if not centuries, Americans have bemoaned the woeful performance of American students in mathematics: “Even in Massachusetts, one of the country’s highest-performing states,” Elizabeth Green observed in one of the latest of these reports, in the New York Times in 2014, “math students are more than two years behind their counterparts in Shanghai.” And results like that, as the journalist Michael Lewis put the point several years ago in Vanity Fair, risk “ceding … technical and scientific leadership to China”—and since, as demonstrated, it was knowledge of mathematics (and specifically of the mathematics of probability) that made the atomic bomb possible, that implies, conversely, that ignorance of the subject is a serious threat to national existence. Yet few Americans, it seems, have considered whether the fact that students do not take mathematics (and specifically probability) seriously has anything to do with the fact that the American leadership class rules such topics, quite literally, out of court.

Of course, as Lewis also pointed out in his recent book, The Undoing Project: A Friendship that Changed Our Minds, American leaders may not be alone in ignoring the impact of probabilistic reasoning: when, after the Yom Kippur War—which had caught Israel’s leaders wholly by surprise—future Nobel Prize winner Daniel Kahneman and intelligence officer Zvi Lanir attempted to “introduce a new rigor in dealing with questions of national security” by replacing intelligence reports written “‘in the form of essays’” with “probabilities, in numerical form,” they found that “the Israeli Foreign Ministry was ‘indifferent to the specific probabilities.’” Kahneman suspected that the ministry’s indifference, Lewis reports, was due to the fact that Israel’s leaders’ “‘understanding of numbers [was] so weak that [the probabilities did not] communicate’”—but betting that the leadership of other countries will continue to match the ignorance of our own does not appear particularly wise. Still, as Roeder noted for FiveThirtyEight, not every American is willing to keep rolling those dice: Levinson, Roeder reported, thinks that the “lack of rigorous empirical training at most elite law schools” requires the “long-term solution” of “a change in curriculum.” And that, in turn, suggests that Chen-Oster v. Goldman Sachs, Inc. might be more than a flip of a coin over the existence of sexism on Wall Street.


Green Jackets ’n’ Blackfaces

But if you close your eyes,
Does it almost feel like
Nothing’s changed at all?
—Bastille (2013)



Some bore will undoubtedly claim, this April week, that the Masters is unique among golf’s major tournaments because it is the only one held at the same course every year—a claim not only about as fresh as a pimento cheese sandwich but also refuted by the architectural website Golf Club Atlas. “Augusta National,” the course’s entry on that website reads, “has gone through more changes since its inception than any of the world’s twenty or so greatest courses.” But the club’s jive by no means stops there; just as the club—and the journalists who cover the tournament—likes to pretend its course is timeless, so too does the club—what with the sepia photos of Bobby Jones, the talk of mint juleps, the bright azaleas, the “limited commercial interruptions” and the old-timey piano music of the tournament broadcast—like to pretend it is an island of “the South” in a Yankee sea. The performance is worthy of one of the club’s former members: Freeman Gosden, who became a member of Augusta National as a result of the riches and fame thrown off by the radio show he created in 1928 Chicago—Amos ’n’ Andy.

Gosden played Amos; his partner, Charles Correll, played Andy. The two actors had met in Durham, North Carolina in 1920, and began performing together in Chicago soon afterwards. According to Wikipedia, both were “familiar with minstrel traditions”: the uniquely American art form in which white performers would sing and tell jokes and stories while pretending to be black, usually while wearing blackface—that is, covering their faces with black makeup. The show they created, about two black cab drivers, translated those minstrel traditions to radio and became the most successful minstrel show in American history. Amos ’n’ Andy lasted 32 years on the radio—the last performance came in 1960—and while it lasted only a few years on television in the early 1950s, the last rerun played on American air as late as 1966.

The show made Gosden and Correll so rich, in fact, that by the early 1950s Gosden had joined the Augusta National Golf Club, and sometime thereafter the actor had become so accepted that he joined the group known as “the Gang.” This was a circle of seven golfers that formed around General Dwight Eisenhower—who had led the amphibious Allied invasion of France on the beaches of Normandy in 1944—after the former war hero was invited to join the club in 1948. Gosden had, in other words, arrived: there was, it seems, something inherently entertaining about a white man pretending to be something he wasn’t.

Gosden, however, was arguably not the only minstrel performer associated with Augusta National: the golf architecture website Golf Club Atlas claims that the course itself performs a kind of minstrelsy. Originally, Augusta’s golf course was designed by the famed golf architect Alister MacKenzie—who also designed such courses as Cypress Point in California and Crystal Downs in Michigan—in consultation with Bobby Jones, the great player who won 13 major championships. As a headline from The Augusta Chronicle, the town’s local newspaper, once proclaimed, “MacKenzie Made Jones’ Dream Of Strategic Course Into Reality.” But in the years since, the course has been far from timeless: as Golf Club Atlas points out, it has in fact gone through “a slew of changes from at least 15 different ‘architects.’” As it now stands, the course is merely pretending to be a MacKenzie.

Nearly every year since the Masters began in 1934, the course has undergone some tweak or another: whereas once “Augusta National could have been considered amongst the two or three most innovative designs ever,” it has now been so altered—according to the Golf Club Atlas article—that to “call it a MacKenzie course is false advertising as his features are essentially long gone.” To say that the course Tiger Woods won on is the same as the one that Jack Nicklaus or Ben Hogan won on, thus, is to make a mockery of history.

The primary reason the Atlas can make that claim stick is that the golf club has flouted Jones’ and MacKenzie’s original intent, which was to build a course like one they both revered: the Old Course at St. Andrews. Jones loved the Old Course so much that, famously, he was later made an honorary citizen of the town, while for his part MacKenzie wrote a book, The Spirit of St. Andrews, that was not published until 1995, decades after his death. And as anyone familiar with golf architecture knows, the Old Course is distinguished by the “ground game”: where the golfer does better to keep his ball rolling along the ground, following its contours, rather than flying it through the air.

As Golf Club Atlas observes, “Jones and MacKenzie both shared a passion for the Old Course at St. Andrews, and its influence is readily apparent in the initial design” because “the ground game was meant to be the key at Augusta National.” That intent, however, has been lost; in a mordant twist of history, the reason for that loss is arguably due to the success of the Masters tournament itself.

“Ironically, hosting the Masters has ruined one of MacKenzie’s most significant designs,” says the Atlas, because “much of the money that the club receives from the Invitational is plowed back into making changes to the course in a misguided effort to protect par.” Largely, “protecting par” has been interpreted by the leadership of the golf club to mean “to minimize the opportunity for the ground game.” As Rex Hoggard—repeating a line heard about the course for decades—wrote in an article for the Golf Channel’s website in 2011, it’s “important to hit the ball high at Augusta National”—a notion that would be nonsensical if Jones and MacKenzie’s purpose had been kept in view.

In short, while the Atlas acknowledges that “an invitation to play Augusta National remains golf’s most sought-after experience,” it also believes—perhaps shockingly—that “fans of Alister MacKenzie would be better served to look elsewhere for a game.” Though the golf club, and the television coverage, might work to present the course as a static beauty, in fact that effect is achieved through endless surgeries that have effectively made the course other than it was. The Augusta National golf course, thus, is a kind of minstrel.

Similarly, the presentation of the golf club as a specifically Southern institution—perhaps above all, by ensuring that the chairman of the club, the only member who regularly speaks to the media, possesses a Georgia drawl (as recent chairmen Hootie Johnson and Billy Payne have)—is belied by the club’s history. Consider, in that light, a story from the beginnings of the club itself, a story ably told in Curt Sampson’s The Masters: Golf, Money, and Power in Augusta, Georgia.

In January of 1933—the depths of the Great Depression—a New York investment banker named Clifford Roberts approached the Southern Railroad System with a proposal: “comfortable conveyance for one hundred New Yorkers to and from Augusta, Georgia”—at a discount. “Business was so bad,” Roberts himself would later write in his history of the golf club, “that the railroad promised not only a special low rate, but all new Pullman equipment with two club cars for card players and two dining cars.” In this way, Sampson writes, “the grand opening of the Augusta National Golf Club began in a railroad station in New York City.”

Most golf fans, if they are aware of the club that holds the tournament at all, know only that it was founded by Bobby Jones when he retired from competitive golf following his annus mirabilis of 1930, in which he won the Grand Slam—all four major tournaments in the same year. But, as Sampson’s story demonstrates, it was Clifford Roberts who made Jones’ vision a reality by raising the money to build it—and that money came largely from New York, not the South.

Sixty of the 100 men Roberts recruited to join the club before it opened were from New York City: the Augusta National Golf Club would be, as Sampson puts it, “a private enclave for rich Yankees in the heart of the South, just sixty-eight years after the Civil War.” Sampson calls the idea “bizarre”—but in fact it is only bizarre if one has a particularly narrow idea of “the South.” Augusta National’s status as a club designed to allow Yankees to masquerade as Southerners only seems ridiculous if it’s assumed that the very idea of “the South” itself is not a kind of minstrelsy—as, in fact, it arguably is.

Links between New York finance and the South, that is, long predated the first golf shot at the new course. It’s often forgotten, for instance, that—as historians Charles and John Lockwood pointed out in the New York Times in 2011—after South Carolina declared it would secede in December of 1860, “the next call for secession would not come from a Southern state, but from a Northern city—New York.”

On 7 January of the bleak “Secession Winter” of ’61, the two historians note, New York’s mayor, Fernando Wood, spoke to the city council to urge that it follow the Southern state and secede. The mayor was merely articulating the “pro-Southern and pro-independence sentiment” of the city’s financiers and traders—a class buoyed by the fact that “the city’s merchants took 40 cents of every dollar that Europeans paid for Southern cotton.” The Southern staple (and the slaves whose labor grew that crop) had, in other words, “helped build the new marble-fronted mercantile buildings in lower Manhattan, fill Broadway hotels and stores with customers, and build block after block of fashionable brownstones north of 14th Street.” Secession of the South put all those millions of dollars at risk: to protect its investments, Mayor Wood was proposing, New York might have to follow the South out of the Union.

Such a move would have had disastrous consequences. The city was the site of the vast Brooklyn Navy Yard, which in the months after the fall of Fort Sumter in Charleston Harbor would assemble the fleet that not only would blockade the Southern coast, but would, in November of ’61, land an army at Hilton Head, South Carolina, the heart of secessionism—a fleet only exceeded by the armada General Eisenhower would gather against Normandy in the late winter and spring of 1944. But even more importantly, in that time the taxes collected by the New York Customs House virtually paid the entire federal government’s budget each year.

“In 1860,” as the Lockwoods write, “tariffs on imported goods collected at ports … provided $56 million of the $64.6 million of federal revenue, and more than two-thirds of imports by value passed through New York.” If New York seceded, in other words, the administration of president-elect Abraham Lincoln would be bankrupt before it took office: the city, as it were, held the nation’s government by a golden leash.

But New York City did not follow the South out of the Union: when the cannons fired at Fort Sumter that April, New York joined the rest of the nation in confirming the sentiments of Daniel Webster’s Second Reply to Hayne: “Liberty and Union, Now and Forever, One and Inseparable!” Over a hundred thousand would turn out to the “Great Sumter Rally” at (the appropriately-named) Union Square in the city on 20 April, after the fall of the federal fort in Charleston Harbor. It was, perhaps, the largest expression of New York’s patriotism before the fall of the towers overlooking the city at the dawn of the twenty-first century.

Mayor Wood himself spoke at that rally to affirm his support for “the Union, the government, the laws and the flag”—reversing his course from mere months before, a turn that has perhaps served to obscure how close the city’s ties were to a region, and an economic system, that had turned away from all of those institutions. But denying those ties, however politically expedient, did not conjure them away. Indeed, the very existence of the Augusta National Golf Club is testament to just how enduring the ties between New York and the Deep South may be.

Still, of course, none of these acts of minstrelsy—the golf course’s masquerade as the work of a designer whose work barely survives, the golf club’s disguise as a Southern institution when in fact it has been largely the work of Yankee financiers, or even the South’s own pretense—could be said to matter, really, now. Except for one detail: those links, some might say, extend into the present. Perhaps the biggest story in American political history over the past century is how the party that won the Civil War, the party of Lincoln, has become the defender, instead of the antagonist, of that vision of the South portrayed every year by the Masters tournament. It’s an act of minstrelsy that lies at the heart of American political life today.

In 1962, wrote Ian Haney-Lopez (John H. Boalt Professor of Law at the University of California, Berkeley) for Salon in 2013, “when asked which party ‘is more likely to see that Negroes get fair treatment in jobs and housing,’ 22.7 percent of the public said Democrats and 21.3 percent said Republicans, while over half could perceive no difference between the two.” The masks of the two parties were, on this issue, interchangeable.

Yet, by the summer of 1963, the conservative journalist Robert Novak could report from the Republican National Committee’s meeting in Denver that a “good many, perhaps a majority of the party’s leadership, envision political gold to be mined in the racial crisis by becoming in fact, though not in name, the White Man’s Party.” It was a harvest that would first be reaped the following year: running against Lyndon Johnson, who had—against long odds—passed the 1964 Civil Rights Act, the Republican nominee, Barry Goldwater, outright won five states of the Deep South: Louisiana, Alabama, Georgia, Mississippi, and South Carolina. It was the first time since the end of Reconstruction and the beginning of Jim Crow that a Republican presidential nominee had swept the Deep South.

Still, those states—and electoral votes—were not enough to carry Goldwater to the White House. But they formed the prelude to the election that did make those votes count: 1968, won by Richard Nixon. According to one of Nixon’s political strategists that year, Kevin Phillips, that election demonstrated the truth of the thesis Phillips would lay out in his 1969 book, The Emerging Republican Majority: “The Negro problem, having become a national rather than a local one, is the principal cause of the breakup of the New Deal coalition”—the coalition that had delivered landslides for Franklin Roosevelt and, in 1964, for Johnson. Phillips predicted that a counter-coalition would emerge that would be “white and middle class,” would be “concentrated in the South, the West, and suburbia,” and would be driven by reaction to “the immense midcentury impact of Negro enfranchisement and integration.” That realignment would come to be called Nixon’s “Southern Strategy.”

The “Southern Strategy,” as Nixon’s opponent in 1972, George McGovern, would later remark, “says to the South:”

Let the poor stay poor, let your economy trail the nation, forget about decent homes and medical care for all your people, choose officials who will oppose every effort to benefit the many at the expense of the few—and in return, we will try to overlook the rights of the black man, appoint a few southerners to high office, and lift your spirits by attacking the “eastern establishment” whose bank accounts we are filling with your labor and your industry.

Ian Haney López argues, in the book from which this excerpt is taken—entitled Dog Whistle Politics: How Coded Racial Appeals Have Reinvented Racism and Wrecked the Middle Class, published by Oxford University Press—that it is the wreckage from Nixon's course that surrounds us today: economic attacks on the majority enabled by nearly transparent racial coding. He may or may not be right—but what might be of interest to future historians is the role, large or small, that the Augusta National Golf Club may have played in that drama.

The golf club certainly played an outsize role in the Eisenhower administration: according to the Augusta Chronicle, Eisenhower made 45 trips to the golf club during his life: "five before he became president, 29 while president and 11 after his last term." And just as certainly the club provided more than recreation for the general and president.

One Augusta member (Pete Jones) would, according to Sampson and other sources, “offer Ike $1 million for his 1952 campaign for president.” (“When Pete Jones died in a plane crash in 1962,” Sampson reports, “he had $60,000 in his wallet.”) Even before that, Clifford Roberts had arranged for one Augusta member, a publisher, to buy the general’s memoirs; the money made Eisenhower financially secure for the first time in his life.

It was members of the golf club, in short, who provided the former Supreme Allied Commander with both the advice and the financial muscle to reach for the Republican nomination for president in 1952. His friends while in Augusta, as Sampson notes, included such figures as Robert Woodruff of Coca-Cola, "Bud (washing machines) Maytag, Albert (General Motors) Bradley, Alfred S. (Singer Sewing Machines) Bourne" and other captains of industry. Another member of the golf club was Ralph Reed, president of American Express, who would later find a job for the general's wartime driver, Kay Summersby.

All of which is, to be sure, a long way from connecting the club directly to Nixon and the "Southern Strategy." There's a great deal of testimony, in fact, that would appear to demonstrate the contrary. According to Golf Digest, for example, Nixon "once told Clifford Roberts"—the storied golf club's sometimes-malevolent dictator—"that he wouldn't mind being a member of Augusta National, and Roberts, who didn't like him any better than Eisenhower did, said, 'I didn't know you were that interested in golf.'" "And that," goes the story, "was the end of that." Sampson's work tends to confirm the point: a few of Ike's cronies at the club, Sampson reports, "even urged Ike to dump Dick in 1956," the year the general ran for re-election.

Still, the unprovable is not the same as the unimaginable. Take, for instance, the testimony of Charlie Sifford, the man Lee Trevino called the "Jackie Robinson" of golf: Sifford broke the game's color barrier in 1961, after the attorney general of California threatened to sue the PGA of America over its "whites only" clause. Sifford fought for years to be invited to play in the Masters tournament, only to be denied despite winning two tournaments on the PGA Tour (the 1967 Greater Hartford Open and the 1969 Los Angeles Open). In his autobiography, Just Let Me Play, Sifford quoted Clifford Roberts as saying, "As long as I live, there will be nothing at the Masters besides black caddies and white players."

Sampson, for one, discounts this as implausible—though, for what it's worth, he says only that it is unlikely Roberts would have actually said such a thing, not that Roberts was incapable of thinking it. Nevertheless, golfers in the Masters tournament were required to take "local" (i.e., black) caddies until 1983, six years after Roberts shot himself in the head beside Ike's Pond on the grounds of the club in late September, 1977. (The chairman, it's said, took a drop.) Of course, the facts of the golf club's caddie policy mean nothing in themselves, nor would Clifford Roberts's private thoughts regarding race. But the links between the club, the South, and the world of money and power remain, and whatever the future course of the club, or the nation, those forged in the past—no matter the acts of minstrelsy designed to obscure them—remain.

Now, and forever.