Double Down

There is a large difference between our view of the US as a net creditor with assets of about 600 billion US dollars and BEA’s view of the US as a net debtor with total net debt of 2.5 trillion. We call the difference between these two equally arbitrary estimates dark matter, because it corresponds to assets that we know exist, since they generate revenue but cannot be seen (or, better said, cannot be properly measured). The name is taken from a term used in physics to account for the fact that the world is more stable than you would think if it were held together only by the gravity emanating from visible matter. In our measure the US owns about 3.1 trillion of unaccounted net foreign assets. [Emp. added]
—Ricardo Hausmann and Federico Sturzenegger.
“U.S. and Global Imbalances: Can Dark Matter Prevent a Big Bang?”
13 November 2005.

 

Last month WikiLeaks, the quasi-journalistic platform, released a series of emails that included (according to the editorial board of The Washington Post) “purloined emailed excerpts” of Hillary Clinton’s “paid speeches to corporate audiences” from 2013 to 2015—the years in which Clinton withdrew from public life while building a war-chest for her presidential campaign. In one of those speeches, she expressed what the board of the Post calls “her much-maligned view that ‘you need both a public and a private position’”—a position that, the Post harrumphs, “is playing as a confession of two-facedness but is actually a clumsy formulation of obvious truth”: namely, that politics cannot operate “unless legislators can deliberate and negotiate candidly, outside the glare of publicity.” To the Post, in other words, thinking that people ought to believe the same things privately as they loudly assert publicly is the sure sign of a naïveté verging on imbecility; almost certainly, the Post’s comments draw a dividing line in American life between those who “get” that distinction and those who don’t. Yet, while the Post sees fit to present Clinton’s comments as a sign of her status as “a knowledgeable, balanced political veteran with sound policy instincts and a mature sense of how to sustain a decent, stable democracy,” in point of fact they demonstrate—far more than Donald Trump’s ridiculous campaign—just how far from a “decent, stable democracy” the United States has become: because as those who, nearly a thousand years ago, first set in motion the conceptual revolution that resulted in democracy understood, there is no thought or doctrine more destructive of democracy than the idea that there is a “public” and a “private” truth.

That’s a notion that is likely difficult for the Post’s audience to grasp. Presumably educated at the nation’s finest schools, the Post’s audience can see no issue with Clinton’s position because the way toward it has been prepared for decades: it is, in fact, one of the foundational doctrines of current American higher education. Anyone who has attended an American institution of higher learning over the past several decades, in other words, will have learned a version of Clinton’s belief that truth can come in two (or more) varieties, because that is what intellectuals of both the political left and the political right have asserted for more than half a century.

The African-American novelist James Baldwin asserted, for example, in 1949 that “literature and sociology are not the same,” while in 1958 the conservative political scientist Leo Strauss dismissed “the ‘scientific’ approach to society” as ignoring “the moral distinctions by which we take our bearings as citizens and”—in a now-regrettable choice of words—“as men.” It’s become so unconscious a belief among the educated, in fact, that even some scientists themselves have adopted this view: the biologist Stephen Jay Gould, for instance, towards the end of his life argued that science and religion constituted what he called “non-overlapping magisteria,” while John Carmody, a physician turned writer for The Australian, more prosaically—and seemingly modestly—asserted not long ago that “science and religion, as we understand them, are different.” The motives of those arguing for such a separation are usually thought to be inherently positive: agreeing to such a distinction, in fact, is nearly a requirement for admittance to polite society these days—which is probably why the Post can assert that Clinton’s admissions are a sign of her fitness for the presidency, instead of being disqualifying.

To the Post’s readers, in short, Hillary Clinton’s doubleness is a sign of her “sophistication” and “responsibility.” It’s a sign that she’s “one of us”—she, presumably unlike the trailer trash interested in Donald Trump’s candidacy, understands the point of Rashomon! (Though Kurosawa’s film does not—because logically it cannot—imply the view of ambiguity so often attributed to it: if Rashomon makes the claim that reality is ultimately unknowable, how can we know that?) But those who think thusly betray their own lack of sophistication—because, in the long history of humanity, this isn’t the first time that someone has tried to sell a similar doctrine.

Toward the height of the Middle Ages the works of Aristotle were rediscovered in Europe, in part through contacts with Muslim thinkers like the twelfth-century Andalusian Ibn-Rushd—better known in Europe as “Averroes.” Aristotle’s works were extremely exciting to students used to a steady diet of Plato and the Church Fathers—precisely because at points they contradicted, or at least appeared to contradict, those same Church Fathers. (Which was also, as it happened, what interested Ibn-Rushd about Aristotle—though in his case, the Greek philosopher appeared to contradict Muslim, instead of Christian, sources.) That, however, left Aristotle enthusiasts with a problem: if they continued to read the Philosopher (Aristotle) and his Commentator (Averroes), they would embark on a collision course with the religious authorities.

In The Harmony of Religion and Philosophy, it seems, Averroes taught that “philosophy and revelation do not contradict each other, and are essentially different means of reaching the same truth”—a doctrine that his later Christian followers turned into what became known as the doctrine of “double truth.” According to a lecturer at the University of Paris in the thirteenth century named Siger of Brabant, for instance, “there existed a ‘double truth’: a factual or ‘hard’ truth that is reached through science and philosophy, and a ‘religious’ truth that is reached through religion.” To Siger and his crowd, according to Encyclopedia Britannica, “religion and philosophy, as separate sources of knowledge, might arrive at contradictory truths without detriment to either.” (Which was not the same as Averroes’ point, however: the Andalusian scholar “taught that there is only one truth, but reached in two different ways, not two truths.”) Siger of Brabant, in other words, would have been quite familiar with Hillary Clinton’s distinction between the “public” and the “private.”

To some today, of course, that would merely point to how contemporary Siger of Brabant was, and how fuddy-duddy were his opponents—like Stephen Tempier, the bishop of Paris. As if he were some 1950s backwoods Baptist preacher denouncing Elvis or the Beatles, in 1277 Tempier denounced those who “hold that something is true according to philosophy but not according to the Catholic faith, as if there are two contrary truths.” Yet, while some might want to portray Siger, thusly, as a forerunner to today’s tolerant societies, in reality it was Tempier’s insistence that truth comes in mono, not stereo, that (seemingly paradoxically) led to the relatively open society we at present enjoy.

People who today would make that identification, that is, might be uneasy if they knew that part of the reason Siger believed his doctrine was his belief in “the superiority of philosophers to the common people,” or that Averroes himself warned “against teaching philosophical methods to the general populace.” Two truths, in other words, easily translated into two different kinds of people—and make no mistake, these doctrines did not imply that these two differing types were “separate but equal.” Instead, they were a means of asserting the superiority of the one type over the other. The doctrine of “double truth,” in other words, was not a forerunner to today’s easygoing societies.

To George Orwell, in fact, it was a prerequisite for totalitarianism: Siger’s theory of “double truth,” in other words, may be the origin of the concept of “doublethink” as used in Orwell’s 1984. In that 1949 novel, “doublethink” is defined as

To know and not to know, to be conscious of complete truthfulness while telling carefully constructed lies, to hold simultaneously two opinions which cancelled out, knowing them to be contradictory and believing in both of them, to use logic against logic, to repudiate morality while laying claim to it, to believe that democracy was impossible and that the Party was the guardian of democracy, to forget whatever it was necessary to forget, then to draw it back into memory again at the moment when it was needed, and then promptly to forget it again, and above all, to apply the same process to the process itself – that was the ultimate subtlety: consciously to induce unconsciousness, and then, once again, to become unconscious of the act of hypnosis you had just performed. Even to understand the word ‘doublethink’ involved the use of doublethink.

It was a point Orwell had been thinking about for some time: in his 1946 essay “Politics and the English Language,” he had denounced the political language that “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” To Orwell, the doctrine of the “double truth” was just a means of sloughing off the feelings of guilt or shame naturally produced by human beings engaged in such manipulations—a technique vital to totalitarian regimes.

Many in today’s universities, to be sure, have a deep distrust for Orwell: Louis Menand—who not only teaches at Harvard and writes for The New Yorker, but grew up in a Hudson Valley town named for his own great-grandfather—perhaps summed up the currently fashionable opinion of the English writer when he noted, in a drive-by slur, that Orwell was “a man who believed that to write honestly he needed to publish under a false name.” The British novelist Will Self, in turn, has attacked Orwell as the “Supreme Mediocrity”—and in particular takes issue with Orwell’s stand, in “Politics and the English Language,” in favor of the idea “that anything worth saying in English can be set down with perfect clarity such that it’s comprehensible to all averagely intelligent English readers.” It’s exactly that part of Orwell’s position that most threatens those of Self’s view.

Orwell’s assertion, Self says flatly, is simply “not true”—an assertion that Self explicitly ties to issues of minority representation. “Only homogeneous groups of people all speak and write identically,” Self writes against Orwell; in reality, Self says, “[p]eople from different heritages, ethnicities, classes and regions speak the same language differently, duh!” Orwell’s big argument against “doublethink”—and thusly, totalitarianism—is in other words just “talented dog-whistling calling [us] to chow down on a big bowl of conformity.” Thusly, “underlying” Orwell’s argument “are good old-fashioned prejudices against difference itself.” Orwell, in short, is a racist.

Maybe that’s true—but it may also be worth noting that the sort of “tolerance” advocated by people like Self can also be interpreted, and has been for centuries, as in the first place a direct assault on the principle of rationality, and in the second place an abandonment of millions of people. Such, at least, is how Thomas Aquinas would have received Self’s point. The Angelic Doctor, as the Church calls him, asserted that Averroeists like Siger could be refuted on their own terms: the Averroeists claimed, Aquinas remarked, that philosophy demonstrated conclusions that were necessarily true even while faith taught the contrary—a position that would lead those who held it to think “that faith avows what is false and impossible.” According to Aquinas, the doctrine of the “double truth” would imply that belief in religion was as much as admitting that religion was foolish—at which point you have admitted that there is only a single truth, and it isn’t a religious one. Hence, Aquinas’ point was that, despite what Orwell feared in 1984, it simply is not psychologically possible to hold two opposed beliefs in one’s head simultaneously: whenever someone is faced with a choice like that, that person will inevitably choose one side or the other.

In this, Aquinas was merely following his predecessors. To the ancients, this was known as the “law of non-contradiction”—one of the ancient world’s three fundamental laws of thought. “No one can believe that the same thing can (at the same time) be and not be,” as Aristotle himself put that law in the Metaphysics; nobody can (sincerely) believe one thing and its opposite at the same time. As the Persian philosopher Avicenna—demonstrating that this law was hardly limited to Europeans—put it centuries later: “Anyone who denies the law of non-contradiction should be beaten and burned until he admits that to be beaten is not the same as not to be beaten, and to be burned is not the same as not to be burned.” Or finally, as Arthur Schopenhauer wrote centuries after that in The World as Will and Representation (using the heavy-handed vocabulary of German philosophers), “every two concept-spheres must be thought of as either united or as separated, but never as both at once; and therefore, although words are joined together which express the latter, these words assert a process of thought which cannot be carried out” (emp. added). If anyone says the contrary, these philosophers implied, somebody is selling something.
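Put in the spare notation of propositional logic (a gloss of my own, not a formula any of these writers used), the law they are all invoking is simply

$$\neg\,(p \wedge \neg p)$$

for any proposition p: it can never be the case that p and not-p hold together. A “public” truth and a contrary “private” one therefore do not add up to two kinds of truth; they add up to the abandonment of truth altogether.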

The point that Aristotle, Aquinas, Avicenna, and Orwell were making, in other words, is that the law of non-contradiction is essentially identical to rationality itself: a nearly foolproof method of performing the most basic of intellectual tasks—above all, telling honest and rational people from dishonest and duplicitous ones. And that, in turn, would lead to their second refutation of Self’s argument: by abandoning the law of non-contradiction, people like Siger (or Self) were also effectively setting themselves above ordinary people. As one commentator on Aquinas writes, the Good Doctor insisted that if something is true, then “it must make sense and it must make sense in terms which are related to the ordinary, untheological ways in which human beings try to make sense of things”—as Orwell saw, that position is related to the law of non-contradiction, and both are related to the notion of democratic government, because telling which candidate is the better one is the very foundation of that form of government. When Will Self attacks George Orwell for being in favor of comprehensibility, in other words, he isn’t attacking Orwell alone: he’s actually attacking Thomas Aquinas—and ultimately the very possibility of self-governance.

While the supporters of Hillary Clinton like to describe her opponent as a threat to democratic government, in other words, Donald Trump’s minor campaign arguably poses far less threat to American freedoms than hers does: from one point of view, Clinton’s accession to power actually threatens the basic conceptual apparatus without which there can be no democracy. Of course, given that during this presidential campaign virtually no attention has been paid, say, to the findings of social scientists (like Ricardo Hausmann and Federico Sturzenegger) and journalists (like those who reported on The Panama Papers) that while many conservatives bemoan deficits like the U.S. budget deficit or the trade imbalance, in fact there is good reason to believe that such gaps are actually the result of billions (or trillions) of dollars being hidden by wealthy Americans and corporations beyond the reach of the Internal Revenue Service (an agency whose budget has been gutted in recent decades by conservatives)—well, let’s just say that there’s good reason to suspect that Hillary Clinton’s campaign may not be what it appears to be.

After all—she said so.


The Commanding Heights

The enemy increaseth every day; 
We, at the height, are ready to decline.
—Julius Caesar, Act IV, Scene 3.

 

“It’s Toasted”: the two words that began the television series Mad Men. The television show’s protagonist, Don Draper, comes up with them in a flash of inspiration during a meeting with the head of Draper’s advertising firm’s chief client, the cigarette brand Lucky Strike: like every cigarette company, Lucky Strike has to come up with a new campaign in the wake of a warning from the Surgeon General regarding the health risks of smoking. Don’s solution is elegant: by simply describing the manufacturing process of making Luckies—a process that is essentially the same as that of every other cigarette—the brand does not have to make any kind of claim about smokers’ health at all, and thusly can bypass any consideration of scientific evidence. It’s a great way to introduce a show about the advertising business, as well as one of the great conflicts of that business: the opposition between reality, as represented by the Surgeon General’s report, and rhetoric, as represented by Draper’s inspirational flash. It’s also what makes Mad Men a work of historical fiction: in the first place, as documented by Thomas Frank’s The Conquest of Cool: Business Culture, Counterculture, and the Rise of Hip Consumerism, there really was, during the 1950s and 60s, a conflict in the advertising industry between those who trusted in a “scientific” approach to advertising and those who, in Frank’s words, “deplored conformity, distrusted routine, and encouraged resistance to established power.” But that conflict also enveloped more than the advertising field: in those years many rebelled against a “scientism” that was thought confining—a rebellion that in many ways is with us still. Yet, though that rebellion may have been liberating in some senses, it may also have had certain measurable costs to the United States. Among those costs, it seems, might be height.

Height, or a person’s stature, is of course something most people regard as akin to the color of the sky or the fact of gravity: a baseline feature of the world, incapable of change. In the past, whatever leads one person to tower over others—or to look up to them in turn—might have been ascribed to God; today some might view height as the inescapable result of genetics. In one sense, this is true: as Burkhard Bilger says in the New Yorker story that inspired my writing here, the work of historians, demographers, and dietitians has shown that, with regard to height, “variations within a population are largely genetic.” But while height differences within a population are, in effect, a matter of genetic chance, that is not so when it comes to comparing different populations to each other.

“Height,” says Bilger, “is a kind of biological shorthand: a composite code for all the factors that make up a society’s well-being.” In other words, while you might be a certain height, and your neighbor down the street might be taller or shorter, both of you will tend to be taller or shorter than people from a different country—and the degree of shortness or tallness can be predicted by what sort of country you live in. That doesn’t mean that height is independent of genetics, to be sure: all human bodies are genetically programmed to grow during only three stages of our lives—in infancy, between the ages of six and eight, and in adolescence. But as Bilger notes, “take away any one of forty-five or fifty essential nutrients”—at any of these stages—“and the body stops growing.” (Iodine, for example, also affects mental development.) What that means is that when large enough populations are examined, it can be seen whether a population as a whole is getting access to those nutrients—which in turn means it’s possible to get a sense of whether a given society is distributing resources widely … or not.
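A toy sketch may make the within-versus-between distinction concrete (the numbers here are invented for illustration, not taken from Bilger or anyone else’s data): give two simulated populations the same genetic spread of heights, subtract a fixed nutritional deficit from one of them, and the individual-level overlap remains huge even as the population averages pull cleanly apart.

```python
import random
import statistics

# A toy illustration with invented numbers (not Bilger's data): two
# populations share the same genetic spread of heights, but chronic
# undernutrition shifts one population's average downward.
random.seed(3)

GENETIC_SD_CM = 7.0          # assumed within-population spread ("genetic luck")
WELL_FED_MEAN_CM = 178.0     # assumed mean height with full access to nutrients
NUTRITION_DEFICIT_CM = 10.0  # assumed stunting from missing nutrients

well_fed = [random.gauss(WELL_FED_MEAN_CM, GENETIC_SD_CM)
            for _ in range(5_000)]
deprived = [random.gauss(WELL_FED_MEAN_CM - NUTRITION_DEFICIT_CM, GENETIC_SD_CM)
            for _ in range(5_000)]

print(f"well-fed mean:  {statistics.mean(well_fed):.1f} cm")
print(f"deprived mean:  {statistics.mean(deprived):.1f} cm")
# Any given individual from the deprived group may still tower over a given
# individual from the well-fed group, but averaged over thousands of people
# the nutritional signal is unmistakable.
```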

One story Bilger tells, about Guatemala’s two main ethnic groups, illustrates the point: one of them, the Ladinos, who claim descent from the Spanish colonizers of Central America, were of average height. But the other group, the Maya, descended from the region’s indigenous inhabitants, “were so short that some scholars called them the pygmies of Central America: the men averaged only five feet two, the women four feet eight.” Since the two groups shared the same (small) country, with essentially the same climate and natural resources, researchers initially assumed that the difference between them was genetic. But that assumption turned out to be false: when anthropologist Barry Bogin measured Mayans who had emigrated to the United States, he found that they were “about as tall as Guatemalan Ladinos.” The difference between the two ethnicities was not genetic: “The Ladinos,” Bilger writes, “who controlled the government, had systematically forced the Maya into poverty”—and poverty, because it can limit access to the nutrients essential during growth spurts, is systematically related to height.

It’s in that sense that height can quite literally be a measure of the degree of freedom a given society enjoys: historically, Guatemala has been a hugely stratified country, with a small number of landowners presiding over a great number of peasants. (Throughout the twentieth century, in fact, the political class was engaged in a symbiotic relationship with the United Fruit Company, an American company that possessed large-scale banana plantations in the country—hence the term “banana republic.”) Short people are, for the most part, oppressed people; tall people, conversely, are mostly free people: it’s no accident that the Dutch, citizens of one of the freest countries in the world, are also the tallest people on earth.

Americans, at one time, were the tallest people in the world: in the eighteenth century, Bilger reports, Americans were “a full three inches taller than the average European.” Even so late as the First World War, he also says, “the average American soldier was still two inches taller than the average German.” Yet, a little more than a generation later, that relation began to change: “sometime around 1955 the situation began to reverse.” Since then all Europeans have been growing, as have Asians: today “even the Japanese—once the shortest industrialized people on earth—have nearly caught up with us, and Northern Europeans are three inches taller and rising.” Meanwhile, American men are “less than an inch taller than the average soldier during the Revolutionary War.” And that difference, it seems, is not due to the obvious source: immigration.

The people who work in this area are of course aware that, because the United States is a nation of immigrants, immigration might skew the height data: clearly, if someone grows up in, say, Guatemala and then moves to the United States, that could warp the results. But the researchers Bilger consulted have considered the point: one includes only native-born, English-speaking Americans in his studies, for example, while another notes that, because of the changes to immigration law during the twentieth century, the United States now takes in far too few immigrants to bias the figures. But if not immigration, then what?

For my own part, I find the coincidence of 1955 too much to ignore: it’s around the mid-1950s that Americans began to question a view of the sciences that had grown up a few generations earlier. In 1898, for example, the American philosopher John Dewey could reject “the idea of a dualism between the cosmic and the ethical” and suggest that “the spiritual life … [gets] its surest and most ample guarantees when it is learned that the laws and conditions of righteousness are implicated in the working processes of the universe.” Even as late as 1941, the intellectual magazine The New Republic could publish an obituary of the famed novelist James Joyce—author of what many people feel is the finest novel in the history of the English language, Ulysses—that proclaimed Joyce “the great research scientist of letters, handling words with the same freedom and originality that Einstein handles mathematical symbols.” “Literature as pure art,” the magazine then said, “approaches the nature of pure science”—suggesting, as Dewey said, that reality and its study did not need to be opposed to some other force, whether religion and morality or art and beauty. But just a few years later, elite opinion began to change.

In 1949, for instance, the novelist James Baldwin would insist, against the idea of The New Republic’s obituary, that “literature and sociology are not the same,” while a few years later, in 1958, the philosopher and political scientist Leo Strauss would urge that the “indispensable condition of ‘scientific’ analysis is then moral obtuseness”—an obtuseness that, Strauss would go on to say, “is not identical with depravity, but […] is bound to strengthen the forces of depravity.” “By the middle of the 1950s,” as Thomas Frank says, “talk of conformity, of consumerism, and of the banality of mass-produced culture were routine elements of middle-class American life”—so that “the failings of capitalism were not so much exploitation and deprivation as they were materialism, wastefulness, and soul-deadening conformity”: a sense that Frank argues provided fuel for the cultural fires of the 1960s that were to come, and that the television show Mad Men documents. In other words, during the 1950s and afterwards, Americans abandoned a scientific outlook, and meanwhile, Americans have also grown shorter—at least relative to the rest of the world. Correlation, as any scientist will tell you, does not imply causation, but it does suggest that Lucky Strike might not be unique any more—though as any ad man would tell you, “America: It’s Toast!” is not a winning slogan.

Hot Shots

 

… when the sea was calm all boats alike
Show’d mastership in floating …
—William Shakespeare.
     Coriolanus, Act IV, Scene 1 (1608).

 

 

“Indeed,” wrote the Canadian scholar Marshall McLuhan in 1964, “it is only too typical that the ‘content’ of any medium blinds us to the character of the medium.” Once it was a well-known line among literate people, though it is much less so now. It occurred to me recently, however, as I read an essay by Walter Benn Michaels of the University of Illinois at Chicago, in the course of which Michaels took issue with Matthew Yglesias of Vox. Yglesias, Michaels tells us, tried to make the argument that

although “straight white intellectuals” might tend to think of the increasing economic inequality of the last thirty years “as a period of relentless defeat for left-wing politics,” we ought to remember that the same period has also seen “enormous advances in the practical opportunities available to women, a major decline in the level of racism … and wildly more public and legal acceptance of gays and lesbians.”

Michaels replies to Yglesias’ argument by pointing out that “10 percent of the U.S. population now earns just under 50 percent of total U.S. income”—a figure that is, unfortunately, just the tip of the economic iceberg when it comes to inequality in America. But the real problem—the problem that Michaels’ reply does not do justice to—is that there is a logical flaw in the kind of “left” that we have now: one that advocates for the rights of minorities rather than labors for the benefit of the majority. That is, a “cultural” left rather than a scientific one: the kind we had when, in 1910, American philosopher John Dewey could write (without being laughed at) that Darwin’s Origin of Species “introduced a mode of thinking that in the end was bound to transform the logic of knowledge, and hence the treatment of morals, politics, and religion.” The physicist Freeman Dyson discovered why when he was just twenty years old, after Winston Churchill’s government paid him to think about what was really happening in the flak-filled skies over Berlin.

The British had a desperate need to know, because they were engaged in bombing Nazi Germany at least back to the Renaissance. Hence they employed Dyson as a statistician, to analyze the operations of Britain’s Bomber Command. Specifically, Dyson was to investigate whether bomber crews “learned by experience”: whether the more missions each crew flew, the better it became at blowing up Germany—and the Germans in it. Obviously, if they did, then Bomber Command could try to isolate what those crews were doing and teach it to the others so that Germany and the Germans might be blown up better.

The bomber crews themselves believed, Dyson tells us, that as “they became more skillful and more closely bonded, their chances of survival would improve”—a belief that, for obvious reasons, was “essential to their morale.” But as Dyson went over the statistics of lost bombers, examining the relation between experience and loss rates while controlling for the effects of weather and geography, he discovered the terrible truth:

“There was no effect of experience on loss rate.”

The life of each bomber crew, in other words, depended on chance, not skill, and the crews’ belief in their own expertise was just an illusion in the face of horror—an illusion that becomes all the more awful when you know that, of the 125,000 aircrew who served in Bomber Command, 55,573 were killed in action.
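The shape of Dyson’s finding is easy to reproduce with a toy simulation (a sketch of my own, not his actual wartime analysis, and with an invented loss probability): give every sortie the same fixed chance of being shot down, regardless of experience, and “veteran” crews will still accumulate, yet their loss rate on the next mission will be no better than anyone else’s.

```python
import random

# A toy model (mine, not Dyson's actual analysis): every sortie carries the
# same assumed 4% chance of being shot down, no matter how many missions the
# crew has already survived. We then ask whether "experience" seems to help.
random.seed(0)

LOSS_PROB = 0.04       # assumed constant per-sortie loss probability
N_CREWS = 100_000      # simulated crews
MAX_MISSIONS = 30      # length of a tour

flights = [0] * MAX_MISSIONS   # crews that flew their (k+1)-th mission
losses = [0] * MAX_MISSIONS    # crews lost on that mission

for _ in range(N_CREWS):
    for k in range(MAX_MISSIONS):
        flights[k] += 1
        if random.random() < LOSS_PROB:
            losses[k] += 1
            break  # crew lost; it flies no further missions

for k in (0, 4, 9, 19, 29):
    print(f"mission {k + 1:2d}: {flights[k]:6d} crews flew, "
          f"loss rate = {losses[k] / flights[k]:.3f}")
# Every line hovers around 0.04: the surviving "veterans" exist, but their
# experience buys them nothing on the next sortie -- the same flat
# relationship between experience and loss rate that Dyson reports finding.
```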

“Statistics and simple arithmetic,” Dyson therefore concluded, “tell us more about ourselves than expert intuition”: a cold lesson to learn, particularly at the age of twenty—though that can be tempered by the thought that at least it wasn’t Dyson’s job to go to Berlin. Still, the lesson is so appalling that perhaps it is little wonder that, after the war, it was largely forgotten, and has only been taken up again by a subject nearly as joyful as the business of killing people on an industrial scale is horrifying: sport.

In one of the most cited papers in the history of psychology, “The Hot Hand in Basketball: On the Misperception of Random Sequences,” Thomas Gilovich, Robert Vallone and Amos Tversky studied how “players and fans alike tend to believe that a player’s chances of hitting a shot are greater following a hit than following a miss on the previous shot”—but “detailed analysis … provided no evidence for a positive correlation between the outcomes of successive shots.” Just as, in other words, the British airmen believed some crews had “skill” that kept them in the air, when in fact all that kept them aloft was, say, the poor aim of a German anti-aircraft gunner or a happily-timed cloud, so too did the three co-authors find that, in basketball, people believed some shooters could get “hot.” That is, reel off seemingly impossible numbers of shots in a row, like when Ben Gordon, then with the Chicago Bulls, knocked down 9 consecutive three-pointers against Washington in 2006. But in fact such streaks are just what chance produces, given a player’s overall shooting percentage: toss a coin enough times and the coin will produce “runs” of heads and tails too.
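The coin-toss point is easy to check for yourself (a sketch of my own, with a made-up shooter, not the Gilovich, Vallone, and Tversky data): a 50-percent shooter whose shots are strictly independent will still rattle off long strings of makes.

```python
import random

# A sketch of the coin-toss point (my own toy shooter, not the
# Gilovich-Vallone-Tversky data): every shot is an independent 50-50
# proposition, with no "heat" built in at all.
random.seed(1)

def longest_streak(shots):
    """Length of the longest run of consecutive makes."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

N_SHOTS = 1_000   # shots over a season (assumed)
HIT_PROB = 0.5    # constant hit probability

season = [random.random() < HIT_PROB for _ in range(N_SHOTS)]
print("longest run of makes:", longest_streak(season))
# Usually somewhere around nine or ten in a row -- Ben Gordon territory --
# from a process that is, by construction, nothing but a fair coin.
```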

The “hot hand” concept in fact applies to more than simply the players: it extends to coaches also. “In sports,” says Leonard Mlodinow in his book The Drunkard’s Walk: How Randomness Rules Our Lives, “we have developed a culture in which, based on intuitive feelings of correlation, a team’s success or failure is often attributed largely to the ability of the coach”—a reality that perhaps explains just why, as Florida’s Lakeland Ledger reported in 2014, the average tenure of NFL coaches over the past decade has been 38 months. Yet as Mlodinow also says, “[m]athematical analysis of firings in all major sports … has shown that those firings had, on average, no effect on team performance”: fans (and perhaps more importantly, owners) tend to think of teams rising and falling based on their coach, while in reality a team’s success has more to do with the talent on its roster.
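Why owners keep believing otherwise is itself predictable: fire the coach after a losing streak and the team will usually improve anyway, through nothing but regression to the mean. The sketch below (my own toy model with invented parameters, not Mlodinow’s analysis) gives coaches no effect at all, and most post-firing teams still “get better.”

```python
import random

# A regression-to-the-mean sketch (my own toy model with invented numbers,
# not Mlodinow's analysis): each team has a fixed "talent" that sets its true
# win probability, and coaches contribute nothing. Fire the coach after any
# long losing streak and the team will usually look better afterward anyway.
random.seed(2)

N_TEAMS = 10_000
GAMES = 40     # games before and after a potential firing
STREAK = 6     # losing streak that triggers a firing (assumed rule)

fired = improved = 0
for _ in range(N_TEAMS):
    talent = random.uniform(0.3, 0.7)   # true, unchanging win probability
    results = [random.random() < talent for _ in range(2 * GAMES)]
    first_half, second_half = results[:GAMES], results[GAMES:]
    # Did the team lose STREAK straight games somewhere in the first half?
    had_streak = any(not any(first_half[i:i + STREAK])
                     for i in range(GAMES - STREAK + 1))
    if had_streak:
        fired += 1
        if sum(second_half) > sum(first_half):
            improved += 1

print(f"{fired} coaches 'fired'; {improved / fired:.0%} of their teams "
      f"won more games afterward -- with no coaching effect in the model.")
```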

Yet while sports are a fairly trivial part of most people’s lives, that is not true when it comes to our “coaches”: the managers that run large corporations. As Diane Stafford reported in the Kansas City Star a few years back, American corporations have as little sense of the real value of CEOs as NFL owners have of their coaches: the “pay gap between large-company CEOs and average American employees,” Stafford said, “vaulted from 195 to 1 in 1993 to 354 to 1 in 2012.” Meanwhile, more than a third “of the men who appeared on lists ranking America’s 25 highest-paid corporate leaders between 1993 and 2012 have led companies bailed out by U.S. taxpayers, been fired for poor performance or led companies charged with fraud.” Just like the Lancasters flown by Dyson’s aircrews, American workers (and their companies’ stockholders) have been taken for a ride by men flying on the basis of luck, not skill.

Again, of course, many in what’s termed the “cultural” left would insist that they, too, stand with American workers against the bosses, that they, too, wish things were better, and that they, too, think paying twenty bucks for a hot dog and a beer is an outrage. What matters, however, isn’t what professors or artists or actors or musicians or the like say—just as it didn’t matter what Britain’s bomber pilots thought about their own skills during the war. What matters is what their jobs say. And the fact of the matter is that cultural production, whether in academia or in New York or in Hollywood, rests on the same illusion as thinking you’re a hell of a pilot, or that you must be “hot,” or that Phil Jackson is a genius. That might sound counterintuitive, of course—I thought writers and artists and, especially, George Clooney were all on the side of the little guy!—but, as McLuhan says, what matters is the medium, not the message.

The point is likely easiest to explain in terms of the academic study of the humanities, because at least there people are forced to explain themselves in order to keep their jobs. What one finds, across the political spectrum, is some version of the same dogma: students in literary studies can, for instance, refer to American novelist James Baldwin’s insistence, in the 1949 essay “Everybody’s Protest Novel,” that “literature and sociology are not the same,” while, at the other end of the spectrum, political science students can refer to Leo Strauss’ attack on “the ‘scientific’ approach to society” in his 1958 Thoughts on Machiavelli. Every discipline in the humanities has some version of the point, because without such a doctrine it couldn’t exist: without it, there’s just a bunch of people sitting in a room reading old books.

The effect of these dogmas can perhaps best be seen by reference to the philosophical version of them, which has the benefit of at least being clear. David Hume posed what is now called the “is-ought problem”; as the Scotsman claimed in A Treatise of Human Nature, “the distinction of vice and virtue is not founded merely on the relations of objects.” Later, in 1903’s Principia Ethica, the British philosopher G.E. Moore called the same point the “naturalistic fallacy”: the idea that, as J.B. Schneewind of Johns Hopkins has put it, “claims about morality cannot be derived from statements of facts.” The advantage for philosophers is clear enough: if it’s impossible to talk about morality or ethics strictly by the light of science, that certainly justifies talking about philosophy to the exclusion of anything else. But in light of the facts about shooting hoops or being shot out of the sky over Germany, I would hope that the absurdity of Moore’s “idea” ought to be self-evident: if it can be demonstrated that something is a matter of luck, and not skill, that changes the moral calculation drastically.

That, then, is the problem with running a “left” based around the study of novels or rituals or films or whatever: at the end of the day, the study of the humanities, just like the practice of the arts, discourages the thought that, as Mlodinow puts it, “chance events are often conspicuously misinterpreted as accomplishments or failures.” And without such a consideration, I would suggest, any talk of “values” or “morality” or whatever you would like to call it is empty. It matters whether your leader is lucky or skillful, it matters whether success is the result of hard work or of who your parents are—and a “left” built on the opposite premises is not, to my mind, a “left” at all. Although many people in the “cultural left,” then, may believe that their overt exhortations to virtue outweigh the covert message told by their institutional positions, reality tells a different tale: if you tell people they can fly, you should not be shocked when they crash.

His Dark Materials

But all these in their pregnant causes mixed
Confusedly, and which thus must ever fight.
Unless the Almighty Maker them ordain
His dark materials to create more worlds
—Paradise Lost II, 913-16

One of the theses of what’s known as the “academic Left” in America is that “nothing is natural,” or, as the literary critic (and “tenured radical”) Stanley Fish more properly puts it, “the thesis that the things we see and the categories we place them in … have their source in culture rather than nature.” It’s a thesis, however, that seems obviously wrong in the case of professional golf. Without taking the time to do a full study of the PGA Tour’s website, which does list place of birth, it seems indisputable that most of today’s American tour players originate south of the Mason-Dixon line: either in the former Confederacy or in other Sun Belt states. Thus it seems difficult to argue that it’s something about “Southern culture,” rather than simply the opportunity to play golf more days a year, that gives Southerners a leg up toward the professional ranks.

Let’s just look, in order to keep things manageable, at the current top ten: Jordan Spieth, this year’s Masters winner, is from Texas, while Jimmy Walker, in second place, is from just up the road in Oklahoma. Rory McIlroy doesn’t count (though he is from Northern Ireland, for what that’s worth), while J.B. Holmes is from Kentucky. Patrick Reed is also from Texas, and Bubba Watson is from Florida. Dustin Johnson is from South Carolina, while Charley Hoffman is from southern California. Hideki Matsuyama is from Ehime, Japan, which is located on the southern island of Shikoku in the archipelago, while Robert Streb rounds out the top ten and keeps the score even between Texas and Oklahoma.

Not until we reach Ryan Moore, at the fifteenth spot, do we find a golfer from an indisputably Northern state: Moore is from Tacoma, Washington. Washington, however, was not admitted to the Union until 1889; not until the seventeenth spot do we find a golfer from a Civil War-era Union state besides California. Gary Woodland, as it happens one of the longest drivers on tour, is from Kansas.
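Tallied crudely (my own quick count, using the birthplaces as the standings above list them, and leaving the two non-Americans aside), the top ten shakes out like this:

```python
from collections import Counter

# Birthplaces of the current top ten as listed above; the regional labels
# are my own rough classification, following the essay's North/South split.
top_ten = {
    "Jordan Spieth": "Texas",
    "Jimmy Walker": "Oklahoma",
    "Rory McIlroy": "Northern Ireland",   # not American; excluded below
    "J.B. Holmes": "Kentucky",
    "Patrick Reed": "Texas",
    "Bubba Watson": "Florida",
    "Dustin Johnson": "South Carolina",
    "Charley Hoffman": "California",
    "Hideki Matsuyama": "Ehime, Japan",   # not American; excluded below
    "Robert Streb": "Oklahoma",
}

SOUTHERN_OR_SUN_BELT = {"Texas", "Oklahoma", "Kentucky", "Florida",
                        "South Carolina", "California"}

regions = Counter(
    "South/Sun Belt" if state in SOUTHERN_OR_SUN_BELT else "North"
    for state in top_ten.values()
    if state not in ("Northern Ireland", "Ehime, Japan")
)
print(regions)   # Counter({'South/Sun Belt': 8}) -- no Northerners at all
```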

This geographic division has been largely stable in the history of American golf. It’s true of course that many great American golfers were Northerners, particularly at the beginnings of the game (like Francis Ouimet, “Chick” Evans, or Walter Hagen—from Massachusetts, Illinois, and New York respectively), and arguably the greatest of all time was from Ohio: Jack Nicklaus. But Byron Nelson and Ben Hogan were Texans, and of course Bobby Jones, one of the top three golfers ever, was a Georgian.

Yet while it might be true that nearly all of the great players are Southern, the division of labor in American golf is that nearly all of the great courses are Northern. In the latest Golf Digest ranking, for instance, out of the top twenty courses only three—Augusta National, which is #1, Seminole in Florida, and Kiawah in South Carolina—are in the South. New York (home to Winged Foot and Shinnecock, among others) and Pennsylvania (home to Merion and Oakmont) had the most courses in the top twenty; other Northern states included Michigan, Illinois, and Ohio. If it were access to great courses that made great golfers, in other words—a thesis that would appear to have a greater affinity with the notion that “culture,” rather than “nature,” is what produces great golfers—then we’d expect the PGA Tour to be dominated by Northerners.

That of course is not so, which perhaps makes it all the stranger that, if looked at by region, it is usually “the South” that champions “culture” and “the North” that champions “nature”—at least if you consider, as a proxy, how evolutionary biology is taught. Consider for instance a 2002 map generated by Lawrence S. Lerner of California State University at Long Beach:

[Map: Lawrence S. Lerner’s 2002 state-by-state evaluation of how evolution is taught in U.S. public schools]

(Link here: http://bigthink.com/strange-maps/97-nil-where-and-how-evolution-is-taught-in-the-us). I realize that the map may be dated now, but still—with some exceptions—it generally shows that evolutionary biology is at least a controversial idea in the states of the former Confederacy, while Union states like Connecticut, New Jersey, and Pennsylvania are ranked by Professor Lerner as “Very good/excellent” in the matter of teaching Darwinian biology. In other words, it might be said that the states that are producing the best golfers are the ones with both the best weather and a belief that nature has little to do with anything.

Yet, as Professor Fish’s remarks above demonstrate, it’s the “radical” humanities professors of the nation’s top universities that are the foremost proponents of the notion that “culture” trumps “nature”—a fact that the cleverest creationists have not let slide. An article entitled “The Postmodern Sin of Intelligent Design Creationism” in a 2010 issue of Science and Education, for instance, lays out how “Intelligent Design Creationists” “try to advance their premodern view by adopting (if only tactically) a radical postmodern perspective.” In Darwinism and the Divine: Evolutionary Thought and Natural Theology, Alister McGrath not only argues “that it cannot be maintained that Darwin’s theory caused the ‘abandonment of natural theology,’” but also approvingly cites Fish: “Stanley Fish has rightly argued that the notion of ‘evidence’ is often tautologically determined by … interpretive assumptions.” So there really is a sense in which the deepest part of the Bible Belt fully agrees with the most radical scholars at Berkeley and other top schools.

In Surprised By Sin: The Reader in Paradise Lost, Stanley Fish’s most famous work of scholarship, Fish argues that Satan is evil because he is “the poem’s true materialist”—and while Fish might say that he is merely reporting John Milton’s view, not revealing his own, still it’s difficult not to take away the conclusion that there’s something inherently wrong with the philosophical doctrine of materialism. (The philosophic version, not to be confused with the vulgar notion that life consists merely in piling up stuff, says that all existence is composed only of matter.) Or with the related doctrine of empiricism: “always an experimental scientist,” Fish has said more recently in the Preface to Surprised By Sin’s Second Edition, Satan busies himself “by mining the trails and entrails of empirical evidence.” Fish of course would be careful to distance himself from more vulgar thinkers regarding these matters—a distance that is there, sure—but it’s easy to see why creationists would mine him for their own views.

Now, one way to explain that might be that both Fish and his creationist “frenemies” are drinking from the Pure Light of the Well of Truth. But there’s a possible materialistic candidate to explain just why humanities professors might end up with views similar to those of the most fundamentalist Christians: a similar mode of production. The political scientist Anne Norton remarks, in a book about the conservative scholar Leo Strauss, that the pedagogical technique pursued by Strauss—reading “a passage in a text” and asking questions about it—is also one pursued in “the shul and the madrasa, in seminaries and in Bible study groups.” At the time of Strauss’ arrival in the United States as a refugee from a 1930s Europe about to be engulfed in war, “this way of reading had fallen out of favor in the universities,” but as a result of Strauss’ career at the University of Chicago, along with those of the philosopher Mortimer Adler (who co-founded the Great Books program) and the educator Robert Hutchins, it has become at least a not-untypical pedagogical method in the humanities since.

At the least, that mode of humanistic study would explain what the philosopher Richard Rorty meant when he repeated Irving Howe’s “much-quoted jibe—‘These people don’t want to take over the government; they just want to take over the English Department.’” It explains, in other words, just how the American left might have “become an object of contempt,” as Rorty says—because it is a left that no longer believes that “the vast inequalities within American society could be corrected by using the institutions of a constitutional democracy.” How could it, after all, given a commitment against empiricism or materialism? Taking a practical perspective on the American political machinery would require embracing just the beliefs that are suicidal if your goal is to achieve tenure in the humanities at Stanford or Yale.

If you happen to think that most things aren’t due to the meddling of supernatural creatures, and you’ve given up on thoughts of tenure because you dislike both creationist nut-jobs and that “largely academic crowd cynical about America, disengaged from practice, and producing ever-more-abstract, jargon-ridden interpretations of cultural phenomena,” while at the same time you think that putting something in the place of God called “the free market”—which is what, exactly?—isn’t the answer either, why, then the answer is perfectly natural.

You are writing about golf.