Buck Dancer’s Choice

Buck Dancer’s Choice: “a tune that goes back to Saturday-night dances, when the Buck, or male partner, got to choose who his partner would be.”
—Taj Mahal, Oooh So Good ’n’ Blues (1973).

 

“Goddamn it,” Scott said, as I was driving down the Kennedy Expressway toward Medinah Country Club. Scott is another caddie I sometimes give rides to; he lives in the suburbs now and has to take the train into the city every morning to get his methadone pill, and I pick him up there and take him to work. On this particular morning, Scott was distracting himself from the traffic, as he often does, by playing the card game spades on his phone—a game in which, somewhat like contract bridge, two players team up against an opposing partnership. He had been matched with a bad partner: a player who, it later came to light, had declined to cover an opposing ten of spades with the king he held, playing a three of spades instead—throwing away a trump while winning nothing in return. Since that sounds, I agree, relentlessly boring, I wouldn’t have paid much attention to the whole complaint—until I realized that Scott’s grumble about his partner not only described the chief event of the previous night’s baseball game, but also explained why so many potential Democratic voters will likely sit out this election. After all, arguably the best Democratic candidate for the presidency this year will not be on the ballot in November.

What had happened the previous night was described on ESPN’s website as “one of the worst managerial decisions in postseason history”: in a one-game, extra-innings playoff between the Baltimore Orioles and the Toronto Blue Jays, Orioles manager Buck Showalter used six relief pitchers after starter Chris Tillman was pulled in the fifth inning—but he did not order his best reliever, Zach Britton, into the game at all. During the regular season, Britton had been one of the best relief pitchers in baseball: as ESPN observed, Britton had allowed precisely one earned run since April, and as Jonah Keri wrote for CBS Sports, over the course of the year Britton posted an earned run average (.53) that was “the lowest by any pitcher in major league history with that many innings [67] pitched.” (And as Deadspin’s Barry Petchesky remarked the next day, Britton had “the best ground ball rate in baseball”—which, given that the Orioles ultimately lost on a huge, moon-shot walk-off home run by Edwin Encarnacion, seems especially pertinent.) Despite the fact that the game went eleven innings, Showalter did not put Britton on the mound even once—which is to say that the Orioles ended their season with one of their best weapons sitting on the bench.

Showalter had the king of spades in his hand—but neglected to play him when it mattered. He defended himself afterward by saying, essentially, that he is the manager of the Baltimore Orioles, and that everyone else was lost in hypotheticals. “That’s the way it went,” the veteran manager said in the post-game press conference—as if the “way it went” had nothing to do with Showalter’s own choices. Some journalists speculated, in turn, that those choices were motivated by what Deadspin called “the long-held, slightly-less-long-derided philosophy that teams shouldn’t use their closers in tied road games, because if they’re going to win, they’re going to need to protect a lead anyway.” On this view, Showalter could not have known how long the game would last—only that, until his team scored some runs, it would continue. If so, it might be possible to lose by playing your ace of spades too early.

Yet not only did Showalter deny that any such philosophy was a factor in his thinking—“It [had] nothing to do with ‘philosophical,’” he said afterwards—but that view takes things precisely backward: it’s the position that imagines the Orioles scoring some runs first that’s lost in hypothetical thinking. Indisputably, the Orioles needed to shut down the Jays in order to continue the game; the non-hypothetical problem presented to the Orioles’ manager was that the O’s needed outs. Showalter had the best instrument available to him to get those outs … but didn’t use him. Which is to say that it was Showalter who got lost in his imagination, not the critics: by not using his best pitcher, Showalter was reacting to an imagined hypothetical scenario instead of responding to the actual facts playing out in front of him.

What Showalter was flouting, in other words, was a manner of thinking that is arguably the reason for such successes as the present world has: probability, the first principle of which is known as the Law of Large Numbers. First conceived by the Italian Gerolamo Cardano—the first man known to have devised the idea—during the sixteenth century, and later given rigorous form by the Swiss mathematician Jacob Bernoulli, the Law of Large Numbers holds that, as Bernoulli put it in his Ars Conjectandi of 1713, “the more observations … are taken into account, the less is the danger of straying.” Or: the more observations, the less the danger of reaching wrong conclusions. What Bernoulli is saying, in other words, is that in order to demonstrate the truth of something, the investigator should look at as many instances as possible—a rule that is, largely, the basis of science itself.
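Bernoulli’s claim is easy to check by simulation. Here is a minimal sketch in Python—the coin, its bias, and the sample sizes are illustrative assumptions of mine, not anything drawn from Bernoulli—showing that as the number of observations grows, the estimated frequency drifts ever closer to the true probability:

```python
import random

def estimate_rate(n_observations, true_probability=0.5, seed=42):
    """Estimate an event's probability from n_observations simulated trials."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_observations) if rng.random() < true_probability)
    return hits / n_observations

# The more observations taken into account, "the less is the danger of straying":
# the estimate settles toward the true value of 0.5 as the sample grows.
for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9} observations -> estimate {estimate_rate(n):.4f}")
```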

What the Law of Large Numbers says, then, is that in order to determine a course of action, one should first ask: what is more likely to happen over the long run? In the case of the one-game playoff, for instance, it’s arguable that Britton, who has one of the best statistical records in baseball, would have been less likely to give up the Encarnacion home run than the pitcher who did (Ubaldo Jimenez, 2016 ERA 5.44). Although Jimenez was not a bad ground ball pitcher in 2015—he had a 1.85 ground ball to fly ball ratio that season, putting him 27th out of 78 pitchers, according to SportingCharts.com—his ratio was dwarfed by Britton’s: as J.J. Cooper observed just this past month for Baseball America, Britton is “quite simply the greatest ground ball pitcher we’ve seen in the modern, stat-heavy era.” (Britton faced 254 batters in 2016; only nine of them got an extra-base hit.) Who would you rather have on the mound in a situation where a home run—which is, obviously, a fly ball—can end not only the game but the season?
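The arithmetic behind that question is simple enough to sketch. The per-batter rates below are made-up, illustrative numbers—not Britton’s or Jimenez’s actual statistics—but they show how quickly even small differences in the chance of a ball leaving the park compound over a single inning:

```python
def prob_of_disaster(per_batter_rate, batters_faced):
    """Chance that the bad event (say, a game-ending home run) happens at least
    once across a number of independent plate appearances."""
    return 1 - (1 - per_batter_rate) ** batters_faced

# Hypothetical per-batter home-run rates, compared over a three-batter inning.
for label, rate in [("ground-ball specialist", 0.005), ("fly-ball-prone reliever", 0.030)]:
    print(f"{label}: {prob_of_disaster(rate, 3):.1%} chance of a homer that inning")
```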

What Bernoulli’s (and Cardano’s) Law of Large Numbers does is define what we mean by “the odds”: that is, the outcome that is most likely to happen. Bucking the odds is, in short, precisely the crime Buck Showalter committed during the game with the Blue Jays: as Deadspin’s Petchesky wrote, “the concept that you maximize value and win expectancy by using your best pitcher in the highest-leverage situations is not ‘wisdom’—it is fact.” As Petchesky goes on to say, “the odds are the odds”—and Showalter, by putting all those other pitchers on the mound instead of Britton, ignored those odds.

As it happens, “bucking the odds” is just what the Democratic Party may be doing by adopting Hillary Clinton as its nominee instead of Bernie Sanders. As a number of articles noted this past spring, many polls at the time suggested that Sanders had better odds of beating Donald Trump than Clinton did. In May, Linda Qiu and Louis Jacobson noted in The Daily Beast that Sanders was making the argument that “he’s a better nominee for November because he polls better than Clinton in head-to-head matches against” Trump. (“Right now,” Sanders said then on the television show Meet the Press, “in every major poll … we are defeating Trump, often by big numbers, and always at a larger margin than Secretary Clinton is.”) At the time, the evidence suggested Sanders was right: “Out of eight polls,” Qiu and Jacobson wrote, “Sanders beat Trump eight times, and Clinton beat Trump seven out of eight times,” and “in each case, Sanders’s lead against Trump was larger.” (In fact, usually by double digits.) But, as everyone now knows, that argument did not secure the nomination for Sanders: in July, Clinton became the Democratic nominee.

To some, that ought to be the end of the story: Sanders tried, and (as Showalter said after his game) “it didn’t work out.” Many—including Sanders himself—have urged fellow Democrats to put the past behind them and work toward Clinton’s election. Yet that’s an odd position to take regarding a campaign that was, above everything, about the importance of principle over personality. Sanders’ campaign was, if anything, about the same point enunciated by William Jennings Bryan at the 1896 Democratic National Convention, in the famous “Cross of Gold” speech: the notion that the “Democratic idea … has been that if you legislate to make the masses prosperous, their prosperity will find its way up through every class which rests upon them.” Bryan’s idea, as ought to be clear, has certain links to Bernoulli’s Law of Large Numbers—among them, the notion that it’s what happens most often (or to the most people) that matters.

That’s why, after all, Bryan insisted that the Democratic Party “cannot serve plutocracy and at the same time defend the rights of the masses.” Similarly—as Michael Kazin of Georgetown University described the point in May for The Daily Beast—Sanders’ campaign fought for a party “that would benefit working families.” (A point that suggests, it might be noted, that the election of Sanders’ opponent, Clinton, would benefit others.) Over the course of the twentieth century, in other words, the Democratic Party stood for the majority against the depredations of the minority—or, to put it another way, for the principle that you play the odds, not hunches.

“No past candidate comes close to Clinton,” wrote FiveThirtyEight’s Harry Enten last May, “in terms of engendering strong dislike a little more than six months before the election.” That reality suggests, in the first place, that the Democratic Party is hardly attempting to maximize its win expectancy. But beyond those pragmatic concerns about her electability, Clinton’s candidacy represents—from the particulars of her policy positions, to her statements to Wall Street financial types, to the electoral irregularities in Iowa and elsewhere—a repudiation not simply of Bernie Sanders the person, but of the very idea of the importance of the majority that the Democratic Party once proposed and defended. What that means is that, even were Hillary Clinton to be elected in November, the Democratic Party—and those it supposedly represents—will have lost the election.

But then, you probably don’t need any statistics to know that.


Eat The Elephant

Well, gentlemen. Let’s go home.
—Sink the Bismarck! (1960).

Someday someone will die and the public will not understand why we were not more effective and throwing every resource we had at certain problems.
—FBI Field Office, New York City, to FBI Headquarters, Washington, D.C.
29 August, 2001.

 

Simon Pegg, co-writer of the latest entry in the Star Trek franchise, Star Trek Beyond, explained the new film’s title in an interview over a year ago: the studio in charge of the franchise, Pegg said, thought that Star Trek was getting “a little too Star Trek-y.” One scene in particular seems designed to illustrate graphically just how far “beyond” Beyond is willing to go: early on, the fabled starship Enterprise is torn apart by (as Michael O’Sullivan of the Washington Post describes it) “a swarm of mini-kamikaze ships called ‘bees.’” The scene is a pretty obvious signal of the new film’s attitude toward the past—but while the destruction of the Enterprise might well be read as a kind of meta-reference to the process of filmmaking (say, how movies, which are constructed by teams of people over years of work, can be torn apart by critics in a virtual instant), another way to view the end of the signature starship is in the light of how Star Trek’s creator, Gene Roddenberry, originally pitched the show: as “space-age Captain Horatio Hornblower.” The demise of the Enterprise is, in other words, a perfect illustration of a truth about today’s navies: they are examples of the punchline of the old joke about how to eat an elephant. (“One bite at a time.”) The payoff for thinking about Beyond in this second way, I would argue, is that it leads to much clearer thinking about things other than stories about aliens, or even stories themselves—like, say, American politics, where the elephant theory has held sway for some time.

“Starfleet,” the fictional organization employing James T. Kirk, Spock, and company, has always been framed as a kind of space-going navy—and as Pando Daily’s “War Nerd,” Gary Brecher, pointed out as long ago as 2002, navies are anachronistic in reality. Professionals know, as Brecher wrote fourteen years ago, that “every one of those big fancy aircraft carriers we love”—massive ships much like the fictional Enterprise—“won’t last one single day in combat against a serious enemy.” We know this not merely because of the attack on the USS Cole in 2000, which showed how two Al Qaeda men in a thousand-dollar speedboat could blow a $250 million hole in a $2 billion warship, but also because—as Brecher points out in his piece—of research conducted by the U.S. military itself: a war game entitled “Millennium Challenge 2002.”

“Millennium Challenge 2002,” which conveniently took place in 2002, pitted an American “Blue” side against a fictional “Red” force (believed to be a stand-in for Iran). The commander of “Red” was Marine Corps Lieutenant General Paul K. Van Riper, who was hired because, in the words of his superiors, he was a “devious sort of guy”—though in the event, he proved to live up to his billing a little too well for the Pentagon’s taste. Taking note of the tactics used against the Cole, Van Riper attacked Blue’s warships with cruise missiles and a few dozen speedboats loaded with enormous cans of gasoline and driven by gentlemen with an unreasonable belief in the afterlife—a (fictional) attack that sent 19 U.S. Navy vessels to the bottom in perhaps 10 minutes. In doing so, Van Riper effectively demonstrated the truth also illustrated by the end of the Enterprise in Beyond: that large naval vessels are obsolete.

Even warships like the U.S. Navy’s latest supercarrier, the Gerald R. Ford—a floating city capable of completely leveling other cities of the non-floating variety—are nevertheless, as Brecher writes elsewhere, “history’s most expensive floating targets.” That’s because they’re vulnerable to exactly the sort of assault that takes down the Enterprise: “a saturation attack by huge numbers of low-value attackers, whether they’re Persians in Cessnas or mass-produced Chinese cruise missiles.” They’re as vulnerable, in other words, as elephants are according to the old joke. Yet whereas that might be a revolutionary insight in the military, the notion that, with enough mice, even an elephant falls is old hat within American political circles.

After all, American politics has, at least since the 1980s, proceeded only by way of “saturation attacks by huge numbers of low-value attackers.” That was the whole point of what are now sometimes called “the culture wars.” During the 1980s and 1990s, as the late American philosopher Richard Rorty put it, liberals and conservatives conspired together to allow “cultural politics to supplant real politics,” and for “cultural issues” to become “central to public debate.” In those years, it was possible to make a name for oneself within departments of the humanities by attacking the “intrinsic value” of literature (while ignoring the fact that those arguments were isomorphic with similar ideas being cooked up in economics departments), while conversely many on the religious right did much the same by attacking (sometimes literally) abortion providers or the teaching of evolution in the schools. To use a phrase of the British literary critic Terry Eagleton, in those years “micropolitics seem[ed] the order of the day”—somewhere during that time politics “shift[ed] from the transformative to the subversive.” What allowed that shift to happen, I’d say, was the notion that by addressing seemingly minor-scale points instead of major-scale ones, each side might eventually achieve a major-scale victory—or, to put it more succinctly, that by taking enough small bites it could eat the elephant.

Just as the Americans and the Soviets refused to send clouds of ICBMs at each other during the Cold War, and instead fought “proxy wars” from the jungles of Vietnam to the mountains of Afghanistan, so during the 1980s and 1990s both American liberals and conservatives declined to put their chief warships to sea, and instead held them in port. But right at this point the two storylines—the story of the navy, the story of American politics—begin to converge. That’s because the story of why warships are obsolete is also the story of why the naval analogy has no application to politics whatever.

“What does that tell you,” Brecher rhetorically asks, “about the distinguished gentlemen with all the ribbons on their chests who’ve been standing up on … bridges looking like they know what they’re doing for the past 50 years?” Since all naval vessels are simply holes in the water once the shooting really starts, those gentlemen must be, he says, “either stupid or so sleazy they’re willing to make a career commanding ships they goddamn well know are floating coffins for thousands.” Similarly, what does that tell you about an American liberal left that supposedly stands up for the majority of Americans, yet has stood by while, for instance, wages have remained—as innumerable reports confirm—essentially the same for forty years? For while it is all well and good for conservatives to agree to keep their Bismarcks and Nimitzes in port, that sort of agreement does not have the same payout for those on the liberal left—as ought to be obvious to anyone with an ounce of sense.

To see why requires seeing what the two major vessels of American politics are. Named most succinctly by William Jennings Bryan at Chicago in 1896, they concern what Bryan said were the only “two ideas of government”: the first being the idea that, “if you just legislate to make the well-to-do prosperous, that their prosperity will leak through on those below,” and the “Democratic idea,” the idea “that if you legislate to make the masses prosperous their prosperity will find its way up and through every class that rests upon it.” These are the two arguments that are effectively akin to the Enterprise: arguments at the very largest of scales, capable of surviving voyages to strange new worlds—because they apply as well to the twenty-third century of the Federation as they did to Bryan’s nineteenth. But that’s also what makes them different from any real battleship: unlike the Enterprise, they can’t be taken down no matter how many attack them.

There is, however, another way in which ideas can resemble warships: both molder in port. That’s one reason why, to speak of naval battles, the French lost the Battle of Trafalgar in 1805: as Wikipedia reports, because the “main French ships-of-the-line had been kept in harbour for years by the British blockade,” the “French crews included few experienced sailors and, as most of the crew had to be taught the elements of seamanship on the few occasions when they got to sea, gunnery was neglected.” It’s perfectly all right to stay in port, in other words, if you are merely protecting the status quo—the virtue of wasting time on minor issues is clear enough if keeping things as they are is the goal. But that’s just the danger from the other point of view: the more time in port, the less able in battle—and certainly the history of the past several generations shows that supposed liberal or left types have been increasingly unwilling to take what Bryan called the “Democratic idea” out for a spin.

Undoubtedly, in other words, American conservatives have relished observing left-wing graduate students in the humanities debate—to use some topics Eagleton suggests—“the insatiability of desire, the inescapability of the metaphysical … [and] the indeterminate effects of political action.” But what might actually effect political change in the United States, assuming anyone is still interested in the outcome and not in what it means for a career, is a plain, easily readable description of how that change might be accomplished. It’s too bad that the mandarin admirals in charge of liberal politics these days appear to think that such a notion is a place where no one has gone before.

To Hell Or Connacht

And I looked, and behold a pale horse, and his name that sat on him was Death,
and Hell followed with him.
—Revelation 6:8.

In republics, it is a fundamental principle, that the majority govern, and that the minority comply with the general voice.
—Oliver Ellsworth.

In all Republics the voice of a majority must prevail.
—Andrew Jackson.

 

“They are at the present eating, or have already eaten, their seed potatoes and seed corn, to preserve life,” goes the sentence from the Proceedings of the Mansion House Committee for the Relief of Distress in Ireland During the Months of January and February, 1880. Not many are aware of it, but the Great Hunger of 1845–52 (or, in Gaelic, an Gorta Mór) was not the last Irish potato famine; by the autumn of 1879, the crop had failed again and starvation loomed for thousands—especially in the west of the country, in Connacht. (Which, Oliver Cromwell had said two centuries before, was one of the two places Irish Catholics could go if they did not wish to be murdered by his New Model Army—the other being Hell.) But this sentence records the worst fear: it was because the Irish had been driven to eat their seed potatoes in the winter of 1846 that the famine that had been brewing since 1845 became the Great Hunger in the year known as “Black ’47”: although what was planted in the spring of 1847 largely survived to harvest, there hadn’t been enough seed to plant in the first place. Hence, everyone who heard that sentence from the Mansion House Committee in 1880 knew what it meant: the coming of that rider on a pale horse spoken of in Revelation. It’s a history lesson I bring up to suggest that “eating your seed corn” also explains the coming of another specter that many American intellectuals may have assumed lay in the past: Donald Trump.

There are two hypotheses about the rise of Donald Trump to the presumptive candidacy of the Republican Party. The first—that of many Hillary Clinton Democrats—is that Trump is tapping into a reservoir of racism that is simply endemic to the United States: in this view, “’murika” is simply a giant cesspool of hate waiting to break out at any time. But that theory is an ahistorical one: why should a Trump-like candidate—that is, one sustained by racism—only become the presumptive nominee of a major party now? “Since the 1970s support for public and political forms of discrimination has shrunk significantly,” says one voice on the subject (Anna Maria Barry-Jester’s, surveying many sociological studies for FiveThirtyEight). If the studies Barry-Jester highlights are correct, and yet levels of racism remain precisely the same as in the past, then that must mean that the American public is not getting less racist—merely better at hiding it. Which raises the question: if the level of racism remains as high as it ever was, why wasn’t it enough to propel, say, former Alabama governor George Wallace to a major party nomination in 1968 or 1972? In other words, why Trump now, rather than Wallace then? Explaining Trump’s rise as a product of racism has a timing problem: it’s difficult to believe that racism has somehow become more acceptable today than it was forty or more years ago.

Yet, if not racism, then what is fueling Trump? Journalist and gadfly Thomas Frank suggests an answer: the rise of Donald Trump is not the result of racism, but of efforts to fight racism—or rather, the American Left’s focus on racism at the expense of economics. To wildly overgeneralize: Trump is not former Republican political operative Karl Rove’s fault, but rather Fannie Lou Hamer’s.

Although little known today, Fannie Lou Hamer was once famous as a leader of the Mississippi Freedom Democratic Party’s delegation to the 1964 Democratic Party Convention. On arrival, Hamer addressed the convention’s Credentials Committee to protest the seating of Mississippi’s “regular” Democratic delegation, on the grounds that the official, all-white slate of delegates had only become “official” by suppressing the votes of the state’s 400,000 black citizens—an argument that had the disadvantageous quality, from the national party’s perspective, of being true. Worse, when the “practical men” sent to negotiate with her—especially Senator Hubert Humphrey of Minnesota—asked her to withdraw her challenge on the pragmatic grounds that her protest risked losing the entire South for President Lyndon Johnson in the upcoming general election, Hamer refused: “Senator Humphrey,” she rebuked him, “I’m going to pray to Jesus for you.” With that, Hamer rejected the hardheaded, practical calculus that informed Humphrey’s logic; in doing so, she set an example that many on the American Left have followed since—an example that, to follow Frank’s argument, has provoked the rise of Trump.

Trump’s success, Frank explains, is not the result of cynical Republican electoral exploitation, but of policy choices made by Democrats: choices that suggest not only that cynical Republican choices can be matched by cynical Democratic ones, but that Democrats have abandoned the key philosophical tenet of their party’s very existence. First, though, the specific policy choices: one of them is the “austerity diet” Jimmy Carter (and Carter’s “hand-picked” Federal Reserve chairman, Paul Volcker) chose for the nation’s economic policy at the end of the 1970s. In his latest book, Listen, Liberal: or, Whatever Happened to the Party of the People?, Frank says that policy “was spectacularly punishing to the ordinary working people who had once made up the Democratic base”—an assertion Frank is hardly alone in making, because as the not-exactly-radical Fortune magazine has observed, “Volcker’s policies … helped push the country into recession in 1980, and the unemployment rate jumped from 6% in August 1979, the month of Volcker’s appointment, to 7.8% in 1980 (and peaked at 10.8% in 1982).” And Carter was hardly the last Democratic president to make economic choices contrary to the interests of what might appear to be the Democratic Party’s constituency.

The next Democratic president, Bill Clinton, after all, put the North American Free Trade Agreement through Congress: an agreement that had the effect (as the Economic Policy Institute has observed) of “undercut[ting] the bargaining power of American workers” because it established “the principle that U.S. corporations could relocate production elsewhere and sell back into the United States.” Hence, “[a]s soon as NAFTA became law,” the EPI’s Jeff Faux wrote in 2013, “corporate managers began telling their workers that their companies intended to move to Mexico unless the workers lowered the cost of their labor.” (The agreement also allowed companies to extort tax breaks from state and municipal coffers by threatening to move, with the attendant long-term costs—including an inability to fight for workers.) In this way, Frank says, NAFTA “ensure[d] that labor would be too weak to organize workers from that point forward”—and NAFTA has also become the basis for other trade agreements, such as the Trans-Pacific Partnership backed by another Democratic administration: Barack Obama’s.

That these economic policies have had the effects described is, perhaps, debatable; what is not debatable is that economic inequality has grown in the United States. As the Pew Research Center reports, “in real terms the average wage peaked more than 40 years ago,” and as Christopher Ingraham of the Washington Post reported last year, “the fact that the top 20 percent of earners rake in over 50 percent of the total earnings in any given year” has become something of a cliché in policy circles. Ingraham also reports that “the wealthiest 10 percent of U.S. households have captured a whopping 76 percent of all the wealth in America”—a “number [that] is considerably higher than in other rich nations.” These figures could be multiplied; they represent a reality that even Republican candidates other than Trump—who was, for the most part, the only candidate besides Bernie Sanders to address these issues directly—began to respond to during the past year’s primary season.

“Today,” said Senator and then-presidential candidate Ted Cruz in January—repeating the findings of University of California, Berkeley economist Emmanuel Saez—“the top 1 percent earn a higher share of our national income than any year since 1928.” While the causes of these realities are still argued over—Cruz, for instance, sought to blame, absurdly, Obamacare—it’s nevertheless inarguable that the country has been radically remade economically over recent decades.

That remaking has troubling potential consequences, if those consequences have not already become real. One of them has been adequately described by the Nobel Prize-winning economist Joseph Stiglitz: “as more money becomes concentrated at the top, aggregate demand goes into a decline.” What Stiglitz means is this: say you’re Mitt Romney, who had a 2010 income of $21.7 million. “Even if Romney chose to live a much more indulgent lifestyle” than he actually does, Stiglitz says, “he would only spend a fraction of that sum in a typical year to support himself and his wife in their several homes.” “But take the same amount of money and divide it among 500 people,” Stiglitz continues, “say, in the form of jobs paying $43,400 apiece—and you’ll find that almost all of the money gets spent.” That expenditure represents economic activity: as should surely be self-evident (but apparently isn’t, to many people), a lot more happens economically if 500 people split twenty-odd million dollars than if one person keeps all of it.
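Stiglitz’s point is, at bottom, arithmetic, and it can be sketched in a few lines of Python. The spending rates below (10% for the single large income, 95% for the 500 salaries) are illustrative assumptions of mine, not Stiglitz’s figures; the only numbers taken from the passage are Romney’s $21.7 million and the $43,400 salaries it divides into:

```python
def first_round_spending(incomes, spend_rates):
    """Total first-round consumer spending, given each recipient's income and
    the fraction of that income they actually spend."""
    return sum(income * rate for income, rate in zip(incomes, spend_rates))

POT = 21_700_000  # Romney's 2010 income, per Stiglitz's example

# One household keeping the whole sum and spending (assume) 10% of it...
concentrated = first_round_spending([POT], [0.10])
# ...versus 500 households at $43,400 apiece, each spending (assume) 95%.
dispersed = first_round_spending([POT / 500] * 500, [0.95] * 500)

print(f"concentrated: ${concentrated:,.0f} enters the economy")
print(f"dispersed:    ${dispersed:,.0f} enters the economy")
```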

Stiglitz, of course, did not invent this argument: it used to be bedrock for Democrats. As Frank points out, the same theory was advanced by the Democratic Party’s presidential nominee—in 1896. As expressed by William Jennings Bryan at the 1896 Democratic Convention, the Democratic idea is, or used to be, this one:

There are two ideas of government. There are those who believe that, if you will only legislate to make the well-to-do prosperous, their prosperity will leak through on those below. The Democratic idea, however, has been that if you legislate to make the masses prosperous, their prosperity will find its way up through every class which rests upon them.

To many, if not most, members of the Democratic Party today, this argument is simply assumed to fit squarely with Fannie Lou Hamer’s claim for representation at the 1964 Democratic Convention: on the one hand, economic justice for working people; on the other, political justice for those oppressed on account of their race. But there are good reasons to think that Hamer’s claim for political representation at the 1964 convention puts Bryan’s (and Stiglitz’s) argument in favor of a broadly based economic policy in grave doubt—which might explain just why so many of today’s campus activists against racism, sexism, or homophobia look askance at any suggestion that they also demonstrate against neoliberal economic policies, and hence perhaps why the United States has become more and more unequal in recent decades.

After all, the focus of much of the Democratic Party has been on Fannie Lou Hamer’s question about minority representation, rather than majority representation. A story told recently by Elizabeth Kolbert of The New Yorker in a review of a book entitled Ratf**ked: The True Story Behind the Secret Plan to Steal America’s Democracy, by David Daley, demonstrates the point. In 1990, it seems, Lee Atwater—famous as the mastermind behind George H.W. Bush’s presidential victory in 1988 and then-chairman of the Republican National Committee—made an offer to the Congressional Black Caucus, as a result of which the “R.N.C. [Republican National Committee] and the Congressional Black Caucus joined forces for the creation of more majority-black districts”—that is, districts “drawn so as to concentrate, or ‘pack,’ African-American voters.” The bargain had an effect: Kolbert mentions the state of Georgia, which in 1990 had nine Democratic congressmen—eight of whom were white. “In 1994,” however, Kolbert notes, “the state sent three African-Americans to Congress”—while “only one white Democrat got elected.” 1994 was, of course, also the year of Newt Gingrich’s “Contract With America” and the great wave of Republican congressmen—the year Democrats lost control of the House for the first time since 1952.
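The mechanics of “packing” are worth pausing over, since the arithmetic is what made the Atwater bargain attractive to both sides. The district numbers below are invented purely for illustration—they are not Georgia’s actual figures—but they show how the same statewide vote can yield very different seat counts depending on how a party’s voters are concentrated:

```python
def seats_won(district_vote_shares):
    """Count the districts a party carries, given its vote share in each one."""
    return sum(1 for share in district_vote_shares if share > 0.5)

# Hypothetical four-district state where the party wins 55% of the vote overall.
spread = [0.55, 0.55, 0.55, 0.55]   # support drawn evenly across districts
packed = [0.95, 0.42, 0.42, 0.41]   # support "packed" into a single district

print(seats_won(spread), "seats when spread evenly")
print(seats_won(packed), "seat when packed")
```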

The deal made by the Congressional Black Caucus—implicitly allowed, in other words, by the Democratic Party’s leadership—enacted what Fannie Lou Hamer had demanded in 1964: a demand that was also a rejection of the political principle known as “majoritarianism,” the right of majorities to rule. It’s a point that has been noticed by those who follow such things: recently, some academics have begun to argue against the very idea of “majority rule.” Stephen Macedo—perhaps significantly, the Laurance S. Rockefeller Professor of Politics and the University Center for Human Values at Princeton University—recently wrote, for instance, that majoritarianism “lacks legitimacy if majorities oppress minorities and flaunt their rights.” Hence, Macedo argues, “we should stop talking about ‘majoritarianism’ as a plausible characterization of a political system that we would recommend,” on the grounds that “the basic principle of democracy” is not that it protects the interests of the majority but instead something he calls “political equality.” In other words, Macedo asks: “why should we regard majority rule as morally special?” Why should it matter, that is, if one candidate gets more votes than another? Some academics, in short, have begun to wonder publicly why we should even bother holding elections.

What is so odd about Macedo’s arguments, to a student of American history, is that he is merely echoing certain older arguments—like this one, from the nineteenth century: “It is not an uncommon impression, that the government of the United States is a government based simply on population; that numbers are its only element, and a numerical majority its only controlling power,” this authority says. But that idea is false, the writer goes on: “No opinion can be more erroneous.” The United States is, instead, “a government of the concurrent majority,” and “population, mere numbers,” are, “strictly speaking, excluded.” It’s an argument that, as it is spelled out, might sound plausible; after all, the structure of the government of the United States does have a number of features that are, “strictly speaking,” not determined solely by population: the Senate and the Supreme Court, for example, are pieces of the federal government that are, in conception and execution, nearly entirely opposed to the notion of “numerical majority.” (“By reference to the one person, one vote standard,” Frances E. Lee and Bruce I. Oppenheimer observe, for instance, in Sizing Up the Senate: The Unequal Consequences of Equal Representation, “the Senate is the most malapportioned legislature in the world.”) In that sense, one could easily imagine Macedo having written the above, or these ideas being articulated by Fannie Lou Hamer or the Congressional Black Caucus.

Except, of course, for one thing: the quotes in the above paragraph were taken from the writings of John Calhoun, the former Senator, Secretary of War, and Vice President of the United States—which, in one sense, might seem to give the weight of authority to Macedo’s argument against majoritarianism. At least, it might if not for a couple of other facts about Calhoun: not only did he personally own dozens of slaves (at his plantation, Fort Hill; now the site of Clemson University), he is also well-known as the most formidable intellectual defender of slavery in American history. His most cunning arguments after all—laid out in such works as the Fort Hill Address and the Disquisition on Government—are against majoritarianism and in favor of slavery; indeed, to Calhoun they are much the same: anti-majoritarianism is more or less the same as being pro-slavery. (A point that historians like Paul Finkelman of the University of Tulsa have argued is true: the anti-majoritarian features of the U.S. Constitution, these historians say, were originally designed to protect slavery—a point that might sound outré except for the fact that it was made at the time of the Constitutional Convention itself by none other than James Madison.) And that is to say that Stephen Macedo and Fannie Lou Hamer are choosing a very odd intellectual partner—while the deal between the RNC and the Congressional Black Caucus demonstrates that those arguments are having very real effects.

What’s really significant about Macedo’s “insights” into majoritarianism, in short, is that they show how—coming from the holder of a named chair at one of the most prestigious universities in the world—a concern, real or feigned, for minority rights can be used as a means of undermining the very idea of democracy itself. It’s in this way that activists against racism, sexism, homophobia, and other pet campus causes can effectively function as what Lenin supposedly called “useful idiots”: by dismantling the agreements that have underwritten the existence of a large and prosperous proportion of the population for nearly a century, “intellectuals” like Macedo may be helping to undo the American middle class economically. If the opinion of the majority of the people does not matter politically, after all, it’s hard to think that their opinion could matter in any other way—which is to say that arguments like Macedo’s are thus a kind of intellectual strip-mining operation: they consume the intellectual resources of the past in order to provide a short-term gain for a small number of operators.

They are, in sum, eating their seed corn.

In that sense, despite the puzzled brows of many of the country’s talking heads, the Trump phenomenon makes a certain kind of potted sense—even if it appears utterly irrational to the elite. Although his supporters might not express themselves in terms that those with elite educations find palatable—a complaint that, significantly, suggests a return to those Victorian codes of “breeding” and “politesse” that elites have always used against what used to be called the “lower classes”—there really may be an ideological link between a Democratic Party governed by the elite-educated and the current economic reality faced by the majority of Americans. That reality may be the result of the elites’ loss of faith in what even Calhoun called the “fundamental principle, the great cardinal maxim” of democratic government: “that the people are the source of all power.” So while organs of elite opinion like The New York Times and other outlets continue to crank out stories decrying the “irrationality” of Donald Trump’s supporters, it may be that Trump’s fans (Trumpettes?) are in fact in possession of a deeper rationality than those criticizing them. What their votes for Trump may signal is a recognition that, if the Republican Party has become the party of the truly rich, “the 1%,” the Democratic Party has ceased to be the party of the majority and has instead become the party of the professional class: the “10%.” Or, as Frank says, in swapping Republicans and Democrats the nation “merely exchange[s] one elite for another: a cadre of business types for a collection of high-achieving professionals.” Both, after all, disbelieve in the virtues of democracy; what may (or may not) be surprising, while also deeply terrifying, is that supposed “intellectuals” have apparently come to accept that there is no difference between Connacht—and the Other Place.

 

 

Update: In the hours since I first posted this, I’ve come across two different recent articles in magazines with “New York” in their titles: in one, for The New Yorker, Jill Lepore—a professor of history at Harvard in her day job—argues that “more democracy is very often less,” while the other, written by Andrew Sullivan for New York magazine, is entitled “Democracies End When They Are Too Democratic.” Draw conclusions where you will.