Just as ancient Greek and Roman propagandists insisted, the Carthaginians did kill their own infant children, burying them with sacrificed animals and ritual inscriptions in special cemeteries to give thanks for favours from the gods, according to a new study.
—The Guardian, 21 January 2014.


Just after the last body fell, at three seconds after 9:40 on the morning of 14 December, the debate began: it was about, as it always is, whether Americans ought to follow sensible rules about guns—or whether they ought to be easier to obtain than, say, the right to pull fish out of the nearby Housatonic River. Many words have been written about the Sandy Hook killings since the day that Adam Lanza—the last body to fall—killed 20 children and six adults at the elementary school he once attended, but few of them have examined the culpability of some of the very last people one might expect: the denizens of the nation’s universities. After all, it’s difficult to accuse people who are themselves largely in favor of gun control of aiding and abetting the National Rifle Association—Pew Research reported, in 2011, that more than half of people with more than a college degree favored gun control. And yet, over the past several generations a doctrine has gained ground that, I think, has not only allowed academics to absolve themselves of engaging in debate on the subject of gun control, but has actively harmed the possibility of accomplishing it.

Having said that, it is important to acknowledge that virtually all academics—even those who consider themselves “conservative” politically—are in favor of gun control: when, for example, Texas recently passed a law legalizing the carrying of guns on college campuses, Daniel S. Hamermesh, a University of Texas emeritus professor of economics (not exactly a discipline known for its radicalism), resigned his position, citing a fear for his own and his students’ safety. That’s not likely accidental, because not only do many academics oppose guns in their capacities as citizens, but academics have a special concern when it comes to guns: as Firmin DeBrabander, a professor of philosophy at the Maryland Institute College of Art, argued in the pages of Inside Higher Ed last year against laws similar to Texas’, “guns stand opposed” to the “pedagogical goals of the classroom,” because while in the classroom “individuals learn to talk to people of different backgrounds and perspectives,” guns “announce, and transmit, suspicion and hostility.” If anyone has a particular interest in controlling arms, in other words, it’s academics, since their work is specifically designed to foster what DeBrabander calls “open and transformative exchange” that may air “ideas [that] are offensive.” So the notion that academics may in fact be an obstacle to achieving sensible policies regarding guns might appear ridiculous on the surface.

Yet there’s actually good reason to think that academic liberals bear some responsibility for the United States’ inability to regulate guns like every other industrialized—I nearly said, “civilized”—nation on earth. That’s because changing gun laws would require specific demands for action, and as political science professor Adolph Reed, Jr. of the University of Pennsylvania put the point not long ago in Harper’s, these days the “left has no particular place it wants to go.” That is, to many on campus and off, making specific demands of the political sphere is itself a kind of concession—or in other words, as journalist Thomas Frank remarked a few years ago about the Occupy Wall Street movement, today’s academic left teaches that “demands [are] a fetish object of literal-minded media types who stupidly crave hierarchy and chains of command.” Demanding changes to gun laws is, after all, a specific demand, and to make specific demands is, from this sophisticated perspective, a kind of “sell out.”

Still, how did the idea of making specific demands become a derided form of politics? After all, the labor movement (the eight-hour day), the suffragette movement (women’s right to vote) and the civil rights movement (an end to Jim Crow) all made specific demands. How then has American politics arrived at the diffuse and essentially inarticulable argument of the Occupy movement—a movement within which, Elizabeth Jacobs claimed in a report for the Brookings Institution while the camp in Zuccotti Park still existed, “the lack of demands is a point of pride?” I’d suggest that one possible way the trick was turned was through a 1967 article written by one Robert Bellah, of Harvard: an article that described American politics, and its political system, as a “civil religion.” By describing American politics in religious rather than secular terms, Bellah opened the way towards what some have termed the “non-politics” of Occupy and other social movements—and, incidentally, allowed children like Adam Lanza’s victims to die.

In “Civil Religion in America,” Bellah—who received his bachelor’s from Harvard in 1950, and then taught at Harvard until moving to the University of California at Berkeley in 1967, where he continued until the end of his illustrious career—argued that “few have realized that there actually exists alongside of and rather clearly differentiated from the churches an elaborate and well-institutionalized civil religion in America.” This “national cult,” as Bellah terms it, has its own holidays: Thanksgiving Day, Bellah says, “serves to integrate the family into the civil religion,” while “Memorial Day has acted to integrate the local community into the national cult.” Bellah also remarks that the “public school system serves as a particularly important context for the cultic celebration of the civil rituals” (a remark that, incidentally, perhaps has played no little role in the attacks on public education over the past several decades). Bellah further argues that various speeches by American presidents like Abraham Lincoln and John F. Kennedy are examples of this “civil religion” in action: he spends particular time with Lincoln’s Gettysburg Address, which, he notes, the poet Robert Lowell observed is filled with Christian imagery and constitutes “a symbolic and sacramental act.” In saying so, Bellah is merely following a longstanding tradition regarding both Lincoln and the Gettysburg Address—a tradition, however, that does not have the political valence that Bellah, or his literal spiritual followers, might think it does.

“Some think, to this day,” wrote Garry Wills of Northwestern University in his magisterial Lincoln at Gettysburg: The Words that Remade America, “that Lincoln did not really have arguments for union, just a kind of mystical attachment to it.” It’s a tradition that Wills says “was the charge of Southerners” against Lincoln at the time: after the war, Wills notes, Alexander Stephens—the only vice president the Confederate States ever had—argued that the “Union, with him [Lincoln], in sentiment rose to the sublimity of a religious mysticism.” Still, it’s also true that others felt similarly: Wills points out that the poet Walt Whitman wrote that “the only thing like passion or infatuation” in Lincoln “was the passion for the Union of these states.” Nevertheless, it’s a dispute that might have fallen by the historical wayside if it weren’t for the work of literary critic Edmund Wilson, who called his essay on Lincoln (collected in a relatively famous book, Patriotic Gore: Studies in the Literature of the American Civil War) “The Union as Religious Mysticism.” That book, published in 1962, seems at least to have influenced Lowell—the two were, if not friends, at least part of the same New York City literary scene—and, through Lowell, to have reached Bellah: or so it seems plausible.

Even if there was no direct route from Wilson to Bellah, however, it seems indisputable that the notion—taken from Southerners—concerning the religious nature of Lincoln’s arguments for the American Union became widely transmitted through American culture. Richard Nixon’s speechwriter, William Safire—later a longtime columnist for the New York Times—was familiar with Wilson’s ideas: as Mark Neely observed in his The Fate of Liberty: Abraham Lincoln and Civil Liberties, on two occasions in Safire’s novel Freedom, “characters comment on the curiously ‘mystical’ nature of Lincoln’s attachment to the Union.” In 1963, William J. Wolfe of the Episcopal Theological School of Cambridge, Massachusetts claimed that “Lincoln is one of the greatest theologians in America,” in the sense “of seeing the hand of God intimately in the affairs of nations”; the following year the theologian Reinhold Niebuhr published an essay entitled “The Religion of Abraham Lincoln.” Sometime in the early 1960s and afterwards, in other words, the idea took root among some literary intellectuals that the United States was a religious society—not one based on an entirely secular philosophy.

At least when it comes to Lincoln, at any rate, there’s good reason to doubt this story: far from being a religious person, Lincoln has often been described as non-religious or even an atheist. His longtime friend Jesse Fell—so close to Lincoln that it was he who first suggested what became the famous Lincoln-Douglas debates—for instance once remarked that Lincoln “held opinions utterly at variance with what are usually taught in the church,” and Lincoln’s law partner William Herndon—who was an early fan of Charles Darwin’s—said that the president also was “a warm advocate of the new doctrine.” Being committed to the theory of evolution—if Lincoln was—doesn’t mean, of course, that the president was therefore anti-religious, but it does mean that the notion of Lincoln as religious mystic has some accounting to do: if he was religious, it was apparently in no very simple way.

Still, as mentioned, the view of Lincoln as a kind of prophet did achieve at least some success within American letters—but, as Wills argues in Lincoln at Gettysburg, that success has in turn obscured what Lincoln actually argued concerning the structure of American politics. As Wills remarks, for instance, “Lincoln drew much of his defense of the Union from the speeches of [Daniel] Webster, and few if any have considered Webster a mystic.” Webster’s views, in turn, descend from a line of American thought that goes back to the Revolution itself—though its most significant moment was at the Constitutional Convention of 1787.

Most especially, to one James Wilson, a Scottish emigrant, delegate to the Constitutional Convention of 1787, and later one of the first justices of the Supreme Court of the United States. If Lincoln got his notions of the Union from Webster, then Webster got his from Supreme Court Justice Joseph Story: as Wills notes, Theodore Parker, the Boston abolitionist minister, once remarked that “Mr. Justice Story was the Jupiter Pluvius [Raingod] from whom Mr. Webster often sought to elicit peculiar thunder for his speeches and private rain for his own public tanks of law.” Story, for his part, got his notion from Wilson: as Linda Przybyszewski notes in passing in her book, The Republic According to John Marshall Harlan (a later justice), Wilson was “a source for Joseph Story’s constitutional nationalism.” And Wilson’s arguments concerning the constitution—which he had a strong hand in making—were hardly religious.

At the constitutional convention, one of the most difficult topics to confront the delegates was the issue of representation: one of the motivations for the convention itself, after all, was the fact that under the previous terms of government, the Articles of Confederation, each state, rather than each member of the Continental Congress, possessed a vote. Wilson had already, in 1768, attacked the problem of representation as being one of the foremost reasons for the Revolution itself—the American colonists were supposed, by British law, to be fully as much British subjects as a Londoner or Mancunian, yet had no representation in Parliament: “Is British freedom,” Wilson therefore asked in his Considerations on the Nature and Extent of the Legislative Authority of the British Parliament, “denominated from the soil, or from the people, of Britain?” That question was very much the predecessor of the question Wilson would ask at the convention: “For whom do we make a constitution? Is it for men, or is it for imaginary beings called states?” To Wilson, the answer was clear: constitutions are for people, not for tracts of land.

Wilson also made an argument that would later be echoed by Lincoln: he drew attention to the disparities of population between the several states. At the time of the convention, Pennsylvania—just as it is today—was a much more populous state than New Jersey, a difference that made no difference under the Articles of Confederation, under which all states had the same number of votes: one. “Are not the citizens of Pennsylvania,” Wilson therefore asked the Convention, “equal to those of New Jersey? Does it require 150 of the former to balance 50 of the latter?” Lincoln echoed the argument when, in order to illustrate the differences between free states and slave states, he noted—in October of 1854, at Peoria, in the speech that would mark his political comeback—that

South Carolina has six representatives, and so has Maine; South Carolina has eight presidential electors, and so has Maine. This is precise equality so far; and, of course they are equal in Senators, each having two. Thus in the control of the government, the two States are equals precisely. But how are they in the number of their white people? Maine has 581,813—while South Carolina has 274,567. Maine has twice as many as South Carolina, and 32,679 over. Thus each white man in South Carolina is more than the double of any man in Maine.

The point of attack for both men, in other words, was precisely the same: the matter of representation in terms of what would later be called a “one man, one vote” standard. It’s an argument that hardly appears “mystical” in nature: since the matter turns upon ratios of numbers to each other, it seems more apposite to describe the point of view adopted here as, if anything, “scientific”—if it weren’t for the fact that even the word “scientific” seems too dramatic for a matter that appears to be far more elemental.
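Since the point turns on ratios, the arithmetic is easy to verify; here is a minimal sketch in Python, using only the population figures as Lincoln quotes them in the Peoria speech above:

```python
# White-population figures as Lincoln quotes them at Peoria (1854)
maine = 581_813
south_carolina = 274_567

# "Maine has twice as many as South Carolina, and 32,679 over."
assert maine - 2 * south_carolina == 32_679

# With identical representation (6 representatives, 8 electors, and
# 2 senators apiece), each voter's weight scales inversely with
# population, so each South Carolinian counts for more than two Mainers:
print(round(maine / south_carolina, 2))  # prints 2.12
```

The exact difference of 32,679 confirms Lincoln was working from the figures directly rather than gesturing at them—precisely the non-mystical habit of mind the paragraph describes.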

Were Lincoln or Wilson alive today, then, it seems that the first point they might make about the gun control debate is that it is a matter about which Congress is greatly at variance with public opinion: as Carl Bialik reported for FiveThirtyEight this past January, whenever Americans are polled “at least 70 percent of Americans [say] they favor background checks,” and furthermore an October 2015 poll by CBS News and the New York Times “found that 92 percent of Americans—including 87 percent of Republicans—favor background checks for all gun buyers.” Yet, as virtually all Americans are aware, it has become essentially impossible to pass any sort of sensible legislation through Congress: a fact dramatized this spring by a “sit-down strike” staged by members of Congress. What Lincoln and Wilson might further say is that the trouble can’t be solved by a “religious” approach: what they presumably would recommend instead is changing a system that inadequately represents the people. That isn’t the answer that’s on offer from academics and others on the American left, however. Which is to say that, soon enough, there will be another Adam Lanza to bewail—another of the sacrifices, one presumes, that the American left demands Americans must make to what one can only call their god.


High Anxiety

Now for our mountain sport …

—Cymbeline, Act III, Scene 3

High Hampton


Entrances to Wade Hampton Golf Club and High Hampton Inn and Country Club, North Carolina

Walt Whitman once said, as anyone who saw Bull Durham knows, that baseball would function to draw America together after the Civil War: the game, the poet said, would “repair our losses and be a blessing to us.” Many Americans have not lost this belief in the redemptive power of sports: as recently as 2011 John Boehner, then-Speaker of the House of Representatives, played a much-ballyhooed round of golf with President Barack Obama—along with many other outlets, Golf Digest presented the event as presaging a new era of American unity: the “pair can’t possibly spend four hours keeping score, conceding putts, complimenting drives, filling divots, retrieving pond balls, foraging for Pro V1s and springing for Kit Kats off the snack cart,” argued the magazine, “without finding greater common ground.” Golf would thus be the antidote to what the late Columbia University history professor Richard Hofstadter, in 1964, called the “paranoid style”: the “heated exaggeration, suspiciousness, and conspiratorial fantasy” that Hofstadter found to be a common theme in American politics then, and whose significance has seemingly only grown since. Yet while the optimism surrounding the “golf summit” seemed warranted—golf is, after all, a game that cannot really be played without trust in your opponents; it’s only on the assumption that everyone is honest that the game works at all—as everyone knows by now, the summit failed: Boehner was, more or less, forced out of office this summer by those members of his party who, Boehner said, got “bent out of shape” over his golf with the president. While golf might, in other words, furnish a theoretical model for harmonious bipartisanship, in practice it has proved largely useless for preventing political polarization—a result that anyone who has traveled Highway 107 in western North Carolina might have realized.
Up there, among the Great Smoky Mountains, there sits a counterexample to the dream of political consensus: the Wade Hampton Golf Club.

Admittedly, that a single golf club could furnish evidence enough to smack down the flights of fancy of a Columbia University professor like Hofstadter—and a Columbia University alumnus like Barack Obama—might appear a bit much: there’s a seeming disconnect between the weightiness of the subject matter and the evidential value of an individual golf club. What could the existence of the Wade Hampton Golf Club add to (or detract from) Hofstadter’s assertions about the dominance of this “paranoid style,” examples of which range from the anti-Communist speeches of Senator Joseph McCarthy in the 1950s to the anti-Catholic, “nativist” movements of the 1830s and 1840s to the Populist denunciations of Wall Street during the 1890s? Yet the existence of the Wade Hampton Golf Club does constitute strong evidence against one of the pieces of evidence Hofstadter adduces for his argument—and in doing so unravels not only the rest of Hofstadter’s spell, as a kitten does a ball of string, but also the fantasy of “bipartisanship.”

One of the examples of “paranoia” Hofstadter cited, in other words, was the belief held by “certain spokesmen of abolitionism who regarded the United States as being in the grip of a slaveholders’ conspiracy”—a view that, Hofstadter implied, was not much different from the contemporary belief that fluoridation was a Soviet plot. But a growing number of historians now believe that Hofstadter was wrong about those abolitionists: according to historian Leonard Richards of the University of Massachusetts, for instance, there’s a great deal of evidence for “the notion that a slaveholding oligarchy ran the country—and ran it for their own advantage” in the years prior to the Civil War. The point is more than an academic one: if it’s all just a matter of belief, then the idea of bipartisanship makes a certain kind of sense; all that matters is whether those we elect can “get along.” But if not, then that would suggest that what matters is building the correct institutions, rather than electing the right people.

Again, that seems a rather larger question than the existence of a golf club in North Carolina is capable of answering. The existence of the Wade Hampton Golf Club, however, tends to reinforce Richards’ view if for no other reason than its name alone: the very biography of the man the golf club was named for, Wade Hampton III, lends credence to Richards’ notion about the real existence of a slave-owning, oligarchical conspiracy, because Hampton was after all not only a Confederate general during the Civil War, but also the possessor (according to the website of the Civil War Trust, which attempts to preserve Civil War battlefields) of “one of the largest collections of slaves in the South.” Hampton’s career, in other words, demonstrates just how entwined slaveowners were with the “cause” of the South—and if secession was largely the result of a slave-owning conspiracy during the winter of 1860, it becomes a great deal easier to think that said conspiracy did not spring fully grown only then.

Descended from an obscenely wealthy family whose properties stretched from near Charleston in South Carolina’s Lowcountry to Millwood Plantation near the state capital of Columbia and all the way to the family’s summer resort of “High Hampton” in the Smokies—upon the site of which the golf club is now built—Wade Hampton was intimately involved with the Southern cause: not only was he one of the richest men in the South, but at the beginning of the war he organized and financed a military unit (“Hampton’s Legion”) that would, among other exploits, help win the first big battle of the war, near the stream of Bull Run. By the end of the war Hampton had become one of only two men—the other being Nathan Bedford Forrest—without prior military experience to achieve the rank of lieutenant general. In that sense, Hampton was exceptional—only eighteen other Confederate officers achieved that rank—but in another he was representative: as recent historical work shows, much of the Confederate army had direct links to slavery.

As historian Joseph T. Glatthaar has put the point in his General Lee’s Army: From Victory to Collapse, “more than one in every four volunteers” for the Confederate army in the first year of the war “lived with parents who were slaveholders”—as compared with the general population of the South, in which merely one in every twenty white persons owned slaves. If non-family members are included, or if economic connections like those to whom soldiers rented land or sold crops prior to the war are allowed, then “the vast majority of the volunteers of 1861 had a direct connection to slavery.” And if the slaveowners could create an army that could hold off the power of the United States for four years, it seems plausible they might have joined together prior to outright hostilities—which is to say that Hofstadter’s insinuations about the relative sanity of “certain” abolitionists (among them, Abraham Lincoln) don’t have the value they may once have had.

After all, historians have determined that the abolitionists were certainly right when they suspected the motives of the slaveowners. “By itself,” wrote Roger Ransom of the University of California not long ago, “the South’s economic investment in slavery could easily explain the willingness of Southerners to risk war … [in] the fall of 1860.” “On the eve of the war,” as another historian noted in the New York Times, “cotton comprised almost 60 percent of America’s exports,” and the slaves themselves, as yet another historian—quoted by Ta-Nehisi Coates in The Atlantic—has observed, were “the largest single financial asset in the entire U.S. economy, worth more than all manufacturing and railroads combined.” Collectively, American slaves were worth 3.5 billion dollars—at a time when the entire budget for the federal government was less than eighty million dollars. Quite literally, in other words, American slaveowners could have funded the federal government’s entire annual budget roughly forty-three times over.
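The multiple claimed at the end of that paragraph follows directly from the two figures quoted; a quick sketch (numbers as given in the sources quoted above, not independently verified):

```python
# Figures quoted above: the aggregate valuation of American slaves
# versus the annual federal budget on the eve of the Civil War.
slave_asset_value = 3_500_000_000  # "3.5 billion dollars"
federal_budget = 80_000_000        # "less than eighty million dollars"

# The slaveowners' assets could have covered the federal budget
# roughly forty-three times over:
print(slave_asset_value / federal_budget)  # prints 43.75
```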

Slaveowners thus had, in the words of a prosecutor, both means and motive to revolt against the American government; what’s really odd about the matter, however, is that Americans have ever questioned it. The slaveowners themselves fully admitted the point at the time: in South Carolina’s “Declaration of the Immediate Causes which Induce and Justify the Secession of South Carolina from the Federal Union,” for instance, the state openly lamented the election of a president “whose opinions and purposes are hostile to slavery.” And not just South Carolina: “Seven Southern states had seceded in 1861,” as the dean of American Civil War historians, James McPherson, has observed, “because they feared the incoming Lincoln administration’s designs on slavery.” When those states first met together at Montgomery, Alabama, in February of 1861 it took them only four days to promulgate what the New York Times called “a provisional constitution that explicitly recognized racial slavery”; in a March 1861 speech Alexander Stephens, who would become the vice president of the Confederate States of America, argued that slavery was the “cornerstone” of the new government. Slavery was, as virtually anyone who has seriously studied the matter has concluded, the cause motivating the Southern armies.

If so—if, that is, the slaveowners created an army so powerful that it could hold off the power of the United States for four years, simply in order to protect their financial interests in slave-owning—it then seems plausible they might have joined together prior to the beginning of outright hostilities. Further, if there was a “conspiracy” to begin the Civil War, then the claim that there was one in the years and decades before the war becomes just that much more believable. And if that possibility is tenable, then so is the claim by Richards and other historians—themselves merely following a notion that Abraham Lincoln himself endorsed in the 1850s—that the American constitution formed “a structural impediment to the full expression of Northern voting power” (as one reviewer has put it)—and that the answer to political problems is thus not “bipartisanship,” or in other words the election of friendlier politicians, but rather structural reform.

Such, at least, might be the lesson anyone might draw from the career of Wade Hampton III, Confederate general—in light of which it’s suggestive that the Wade Hampton Golf Club is not some relic of the nineteenth century. Planning for the club began, according to the club’s website, in 1982; the golf course was not completed until 1987, when it was named “Best New Private Course” by Golf Digest. More suggestive still, however, is the fact that under the original bylaws, “in order to be a member of the club, you [had] to own property or a house bordering the club”—rules that resulted, as one golfer has noted, in a club of “120 charter and founding members, all from below the Mason-Dixon Line: seven from Augusta, Georgia and the remainder from Florida, Alabama, and North Carolina.” “Such folks,” as Bradley Klein once wrote in Golfweek, “would have learned in elementary school that Wade Hampton III, 1818-1902, who owned the land on which the club now sits, was a prominent Confederate general.” That is, in order to become a member of Wade Hampton Golf Club you probably knew a great deal about the history of Wade Hampton III—and you were pretty ok with that.

The existence of the Wade Hampton Golf Club does not, to be sure, demonstrate a continuity between the slaveowners of the Old South and the present membership of the club that bears Hampton’s name. It is, however, suggestive: if it is true, as many Civil War historians now say, that prior to 1860 there was a conspiracy to maintain an oligarchic form of government, what are we to make of a present in which—as former Secretary of Labor Robert Reich recently observed—“the richest one-hundredth of one percent of Americans now hold over 11 percent of the nation’s total wealth,” a proportion greater than at any time since before 1929 and the start of the Great Depression? Surely, one can only surmise, the answer is easier to find than a mountain hideaway far above the Appalachian clouds, and requires no poetic vision to see.

Our Game

Pick-up truck with Confederate battle flag and bumper stickers.


[Baseball] is our game: the American game … [it] belongs as much to our institutions, fits into them as significantly, as our constitutions, laws: is just as important in the sum total of our historic life.
—Walt Whitman, April 1889.

The 2015 Chicago Cubs are now a memory, yet while they lived nearly all of Chicago was enthralled—not least because of the supposed prophecy of a movie starring a noted Canadian. For this White Sox fan, the enterprise reeked of the phony nostalgia baseball has become enveloped by, of the sort sportswriters like to invoke whenever they, for instance, quote Walt Whitman’s remark that baseball “is our game: the American game.” Yet even while, to their fans, this year’s Cubs were a time machine to what many envisioned as a simpler, and perhaps better, America—much as the truck pictured may be such a kind of DeLorean to its driver—in point of fact the team’s success was built upon precisely the kind of hatred of tradition that was the reason Whitman thought baseball was “America’s game”: baseball, Whitman said, had “the snap, go, fling of the American character.” It’s for that reason, perhaps, that the 2015 Chicago Cubs may yet prove a watershed edition of the Lovable Losers: they might mark not only the return of the Cubs to the elite of the National League, but also the resurgence of a type of thinking that was in the vanguard in Whitman’s time and—like World Series appearances for the North Siders—of rare vintage since. It’s a resurgence that may, in a year of Donald Trump, prove far more important than the victories of baseball teams, no matter how lovable.

That, to say the least, is an ambitious thesis: the rise of the Cubs signifies little but that their new owners possess a lot of money, some might reply. But the Cubs’ return to importance was undoubtedly caused by the team’s adherence, led by former Boston general manager Theo Epstein, to the principles of what’s been called the “analytical revolution.” It’s a distinction that was made clear during the divisional series against the hated St. Louis Cardinals: whereas, for example, St. Louis manager Mike Matheny asserted, regarding how baseball managers ought to handle their pitching staff, that managers “first and foremost have to trust our gut,” the Cubs’ Joe Maddon (as I wrote about in a previous post, “Why Joe Maddon bats the pitcher eighth”) spent his entire season doing such things as batting his pitcher eighth, on the grounds that statistical analysis showed that by doing so his team gained a nearly-infinitesimal edge.

Since the Cubs hired former Boston Red Sox general manager Theo Epstein, few franchises in baseball have been as devoted to what is known as the “sabermetric” approach. When the Cubs hired him, Epstein was already well-known for “using statistical evidence”—as the New Yorker’s Ben McGrath put it a year before Epstein’s previous team, the Boston Red Sox, overcame their own near-century of futility in 2004—rather than relying upon what Epstein’s hero, the storied Bill James, has called “baseball’s Kilimanjaro of repeated legend and legerdemain”—the sort embodied by Matheny’s apparent reliance on seat-of-the-pants judgment.

Yet, while Bill James’ sort of thinking may be astonishingly new to baseball’s old guard, it would have been old hat to Whitman, who had the example of another Bill James directly in front of him. To follow the sabermetric approach after all requires believing (as the American philosopher William James did according to the Internet Encyclopedia of Philosophy), “that every event is caused and that the world as a whole is rationally intelligible”—an approach that not only would Whitman have understood, but applauded.

Such at least was the argument of the late American philosopher Richard Rorty, whose lifework was devoted to preserving the legacy of late nineteenth and early twentieth century writers like Whitman and James. To Rorty, both of those earlier men subscribed to a kind of belief in America rarely seen today: both implicitly believed in what James’ follower John Dewey would call “the philosophy of democracy,” in which “both pragmatism and America are expressions of a hopeful, melioristic, experimental frame of mind.” It’s in that sense, Rorty argued, that William James’ famous assertion that “the true is only the expedient in our way of thinking” ought to be understood: what James meant by lines like this was that what we call “truth” ought to be tested against reality in the same way that scientists test their ideas about the world via experiments instead of relying upon “guts.”

Such a frame of mind, however, has been out of fashion in academia since at least the 1940s, Rorty often noted: by then, Robert Hutchins and Mortimer Adler of the University of Chicago were already reviling the philosophy of Dewey and James as “vulgar, ‘relativistic,’ and self-refuting.” To say, as James did, “that truth is what works” was—according to thinkers like Hutchins and Adler—“to reduce the quest for truth to the quest for power.” To put it another way, Hutchins and Adler provided the Ur-example of what’s become known as Godwin’s Law: the idea that, sooner or later, every debater will claim that the opponent’s position logically ends in Nazism.

Such thinking is by no means extinct in academia: indeed, much of Rorty’s work at the end of his life was devoted to demonstrating how the sorts of arguments Hutchins and Adler enlisted for their conservative politics had become the very lifeblood of those supposedly opposed to the conservative position. That’s why, to those whom Rorty called the “Unpatriotic Academy,” the above picture—taken at a gas station just over the Ohio River in southern Indiana—will confirm the view of the United States held by those who “find pride in American citizenship impossible” and “associate American patriotism with an endorsement of atrocities”: to such people, America and science are more or less of a piece with the kind of nearly-explicit racism on display in the photograph of the truck.

The problem with those sorts of arguments, Rorty wanted to claim in return, is that they are all too willing to take the views of some conservative Americans at face value: the view, for instance, that “America is a Christian country.” That sentence is remarkable precisely because it is not taken from the rantings of some Southern fundamentalist preacher or Republican candidate, but is instead the opening sentence of an article by the novelist and essayist Marilynne Robinson in, of all places, the New York Review of Books. That it could appear there, I think Rorty would have said, shows just how much today’s academia really shares the views of its supposed opponents.

Yet, as Rorty was always arguing, the ideas held by the pragmatists are not so easily dismissed as the mere American jingoism that the many critics of Dewey and James and the rest would like to portray them as—nor is “America” so easily conflated with simple racism. That is because the arguments of the American pragmatists were (arguably) simply a restatement of a set of ideas held by a man who lived long before North America had even been added to the world’s maps: a man known to history as Ibn Khaldun, who was born in Tunis, on Africa’s Mediterranean coastline, in the year 1332 of the Western calendar.

Khaldun’s views of history, as set out in his book the Muqaddimah (“Introduction,” often known by its Greek title, Prolegomena), can be seen as forerunners of the ideas of John Dewey and William James, as well as of the ideas of Bill James and the front office of the Chicago Cubs. According to a short one-page biography of the Arab thinker by one “Dr. A. Zahoor,” for example, Khaldun believed that writing history required such things as “relating events to each other through cause and effect”—much as both men named William James believed that events, on the baseball diamond or off it, are not inexplicable. As Khaldun himself wrote:

The rule for distinguishing what is true from what is false in history is based on its possibility or impossibility: That is to say, we must examine human society and discriminate between the characteristics which are essential and inherent in its nature and those which are accidental and need not be taken into account, recognizing further those which cannot possibly belong to it. If we do this, we have a rule for separating historical truth from error by means of a demonstrative method that admits of no doubt.

This statement is, I think, hardly distinguishable from what the pragmatists or the sabermetricians are after: the discovery of what Khaldun calls “those phenomena [that] were not the outcome of chance, but were controlled by laws of their own.” In just the same way that Bill James and his followers wish to discover things like when, if ever, it is permissible or even advisable to attempt to steal a base, or lay down a bunt (both, he says, are more often inadvisable strategies, precisely on the grounds that employing them leaves too much to chance), Khaldun wishes to discover ways to identify ideal strategies in a wider realm.
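The sabermetric case against the stolen base can be sketched with a little arithmetic. A minimal illustration, using approximate run-expectancy figures in line with commonly published tables (the specific numbers here are illustrative assumptions, not Bill James’s own): attempting a steal only pays when the success rate clears a surprisingly high break-even threshold, which is why the strategy is so often inadvisable.

```python
# Illustrative run-expectancy values: expected runs scored in the
# remainder of the inning for three base-out states. The figures are
# approximations of widely published tables, used here only to show
# the shape of the argument.
RE_RUNNER_ON_1ST_0_OUT = 0.85   # runner on first, nobody out (status quo)
RE_RUNNER_ON_2ND_0_OUT = 1.10   # steal succeeds: runner on second, nobody out
RE_BASES_EMPTY_1_OUT = 0.27     # steal fails: runner erased, one out

def breakeven_steal_rate(re_now: float, re_success: float, re_failure: float) -> float:
    """Success rate p at which the attempt neither gains nor loses
    expected runs: solve p*re_success + (1-p)*re_failure = re_now."""
    return (re_now - re_failure) / (re_success - re_failure)

p = breakeven_steal_rate(RE_RUNNER_ON_1ST_0_OUT,
                         RE_RUNNER_ON_2ND_0_OUT,
                         RE_BASES_EMPTY_1_OUT)
print(f"break-even success rate: {p:.0%}")  # roughly 70%
```

Under these assumed figures a runner must succeed roughly seven times in ten merely to break even—leaving, as the essay puts it, too much to chance for most base-stealers.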

Assuming, then, that Dewey and James were right to claim that such ideas ought to be one and the same as the idea of “America,” we could say that Ibn Khaldun was, if not the first, certainly one of the first Americans—that is, one of the first to believe in those ideas we would later come to call “America.” That Khaldun was entirely ignorant of such places as southern Indiana should, by these lights, no more count against his Americanness than Donald Trump’s ignorance of more than geography ought to count against his. Indeed, conducted according to this scale, it should be no contest as to which—between Donald Trump, Marilynne Robinson, and Ibn Khaldun—is the more likely to be a baseball fan. Nor, need it be added, which is the better American.

July Days

Other lands have their vitality in a few, a class, but we have it in the bulk of our people.

—Walt Whitman

And so it is July. The grass, so lush and green in April and May, has begun to brown over in spots, and everyone is, just now, realizing that the early season is over and they are, just now, about as good as they are going to get this season. And it’s dawned on some—not you, I hope—that this is probably about as good at this game as they ever will be. For the professionals it has become make-or-break time, the time of year to put some serious money in the bank, or at least enough to keep their tour cards for another year, or at least get into the finals of Q-School, or second stage, or some kind of status on the Nationwide Tour, or something, just something to keep from having to go home again—home to that insurance job the brother-in-law’s been talking about, or that club pro job somebody promised once, “if it didn’t work out.” And so July is, for golf, not a lazy, happy time at all: it is a time of cruelty, and of victims piling up like the cracked shells of turtles beside a Florida highway.

July is also, by design or happenstance, the month of the Open Championship, or as we colonials like to call it, the British Open—which is, often, a championship of misfortune and sorrow, of too-proud Frenchmen, horrible bounces, and the heartbreak of old men allowed a brief glimpse of the glorious past … before that door is closed on them, wickedly and forever. The Masters is, of course, the tournament of hope, like the spring it heralds, and the U.S. Open, usually, is the tournament of the expected: it is a hard tournament, but the winner is nearly always the man who’s played the most consistently, so that it (mostly) feels like justice has been done by the end of it. But the Open is a tournament of darkness and mystery, and there’s hardly a year that goes by without someone wondering what might have been, if only …

At least some of that mystery has, in the past, come from the ignorance of us Americans—both the players themselves and us, the audience at home. An American watching the Open always has the uneasy sense that the spectacle on display is some different game that, coincidentally, has many of the same trappings and the same spelling as the familiar old game but is in fact something entirely other, something strange and uncanny. Why is that man using his putter—the flag stick isn’t even in the picture! Or, why hasn’t Tiger hit his driver in two days? And so on.

This year, however, some have the odd sense that we have already seen this tournament: the shot of the year, for instance, is probably Charl Schwartzel’s 120-foot chip-in on the first hole of the final round of the Masters—with a six-iron. What American player would even have thought of that? (Ask yourself: would you?) It was the kind of shot that Americans only see once a year, at the Open, but there it was at the course most Americans might think of as epitomizing the high-flying aerial American game: Augusta. (They’d be wrong about that, in one sense—because Augusta is actually receptive to a ground-game, but it’s true that the players who’ve dominated the Masters have been high-ball players.) And, to be sure, the U.S. Open was the coronation of a new king of European, and British, golf: Rory McIlroy.

So this year’s Open begins with, perhaps, a new sense of itself: the winner of the tournament is always introduced with the title “the champion golfer of the year,” and if, in past decades, the words have always been imbued with some sense of irony (who ever thought Bobby Locke, as great as he was, was the match of Nelson or Hogan or Palmer?), there’s a notion on the march, now, that maybe those words are not just another relic of the nineteenth century, a token of past imperial splendor. More than a decade ago, Britain tried to re-invent itself—to “re-brand,” as the advertisers say—acclaiming the election of Tony Blair’s New Labour as the final entombment of the old, class-bound, traditional England. Maybe it did and maybe it didn’t, but perhaps it’s true that the children of the ’90s, Rory McIlroy among them, really did grow up with a different sense of themselves and their possibilities, and that maybe—it’s impossible to know—that’s made a difference.

Almost certainly it’s made a difference in the game of golf: where once it was the Americans who came to Europe and sneered (Sam Snead, famously, first saw St. Andrews and thought he was looking at a pasture), now it’s the Europeans who seem self-confident, who look at the great American cathedrals of the game—Augusta, Pebble Beach—and view them as just another route to a paycheck. And possibly—in golf, at least—that’s what’s necessary to produce: that sense that all the world has just been born, and that you are the equal of anyone in it.

What’s astonishing, though maybe not as astonishing as some might like, is that traditionally that sort of sensibility has been the special province of Americans, not Europeans. It’s what George Orwell, that canny Englishman, meant when he said that what he admired about Walt Whitman, poet of America, was that Whitman really conveyed how, in what now might be a long-ago America, “Everyone had inside him, like a kind of core, the knowledge that he could earn a decent living, and earn it without bootlicking.” Whitman himself defined freedom as the ability “to walk free and own no superior,” which is just the sort of sensibility that, it now seems, is more readily to hand on the far side of the Atlantic than on this.

Some time ago, the neoconservative David Brooks asserted that the difference between young African-Americans and young people of African descent in France (who were then rioting) was that African-Americans always had the option to go to college, whereas “in France the barriers to ascent are higher”—but as the newspaper that published Brooks (The New York Times) was later forced to admit, social mobility “is not higher in the United States than in Britain or France.” The reality today, according to the social scientists who study such things, is that a young person with aspirations is probably better off going to Berlin than to Los Angeles or New York or Chicago. And maybe that’s hard for Americans to hear, given that entire libraries are filled with books telling us that what makes us who we are is just that sense that anybody can be anything—the entire line of thought condensed in the old line that, in America, anybody can be president.

Yet while our present executive does, in some kind of 21st-century manner, exemplify the cliché, it’s also true that Rory McIlroy has probably seen more real political change in his lifetime than many Americans twice his age. It’s well-known, for instance, that to be an incumbent congressman in America is as near as it is possible to get to guaranteed employment outside the law or academia, while Rory witnessed, at the ripe age of 10, one of the most historic constitutional changes ever seen in the world: the House of Lords Act 1999, which ended the hereditary peers’ automatic right to sit in Parliament. In other words, Rory saw what Washington and Jefferson and Adams and company put their lives and fortunes at risk to have a chance to see: the end of the nobility as a real political force in Britain. Not since the 1960s has anybody in America put forward an idea as monumental as that, but Britain in the 1990s not only talked about it—it acted on it. Young Americans, on the other hand, have simply watched as a mostly-moribund clique of liberals has tried to hang on to victories that were won by 1968, as the siege engines of the ravenously greedy have drawn in ever tighter.

To say that the one has anything to do with the other (politics, golf) is, to be sure, just the sort of thing that isn’t done in America today—though just where the idea came from that there are things that are and aren’t done is a bit of a question—and anyway it amounts to nothing when deciding whom to bet on for the Open, which, as I’ve mentioned, is probably the hardest of the major championships to handicap, because the rolls and folds of a links course—the only kind the Open is played on—can be so capricious. It’s unlikely that Rory McIlroy can follow up his victory in America with another in his “home” major—he hasn’t, for instance, played against serious competition since winning at Congressional. But if he can, in the seriousness and cruelty of July, he might say to the world that it is Europe—that “ancient bone-yard,” as Orwell called it—that is America now.