Archives for November 2008

Andrew Gelman interviewed by Will Wilkinson

Andrew Gelman, author of Red State, Blue State, Rich State, Poor State, taped a great interview with Will Wilkinson earlier in November. Enjoy!

And, though he’s a bit critical of his own performance, here’s a post over at his blog about the interview.

David E. Lewis discussing the Obama appointments in the media

Our author and Vanderbilt University Professor David E. Lewis has been discussing the Obama appointments in the media.  His new book THE POLITICS OF PRESIDENTIAL APPOINTMENTS: Political Control and Bureaucratic Performance is a timely new look at how and why presidents use political appointees and how their choices impact government performance–for better or worse.  With Obama’s recent appointments and several cabinet positions still vacant, Lewis is the right voice at the right time.  Check out an op-ed piece he wrote for the Nashville Tennessean or today’s feature in the Washington Post.

Sheldon Wolin’s DEMOCRACY INC receives 2008 Lannan Notable Book Award

Please join me in congratulating Sheldon Wolin for winning the 2008 Lannan Notable Book Award for his recent book DEMOCRACY INCORPORATED: Managed Democracy and the Specter of Inverted Totalitarianism.  This is a very prestigious and well-deserved award given to Sheldon for his original scholarship and exceptional writing.  You can read more about the Lannan awards and the Lannan Foundation in general here.

Event with Darius Rejali at British Institute of International and Comparative Law

On Thursday, November 27th from 1:00 PM to 2:00 PM, The British Institute of International and Comparative Law will host a book launch and discussion on the subject of torture and democracy featuring authors Darius Rejali and Philippe Sands. Registration is required–hope you can join us for what will no doubt be a fascinating conversation.

We are long past debating the question “is torture compatible with democracy?” – obviously it has been for some time.

Rejali and Sands will sketch out answers to questions such as: Under what circumstances is torture compatible with democracy? Why have democracies historically been such powerful innovators of torture? Will the change of government in America make a difference?

They will also consider what social scientists know to be the long-term effects of institutionalized torture, both within government and for social order more generally, and map out the challenges facing the next President.

Date and Time

Thursday, 27 November 2008

1:00 to 2:00 PM

British Institute of International and Comparative Law, Council Chamber, Charles Clore House,
17 Russell Square, London, WC1B 5JP

Darius Rejali, Professor of political science, Reed College and author of Torture and Democracy
Professor Philippe Sands QC, University College London and Matrix Chambers

Yale University Press Centennial

Congratulations go to our colleagues at Yale University Press who celebrated their centennial last Friday, November 14, by hosting a conference in New Haven on “Why Books Still Matter.”  The Press’s first hundred years have been memorialized in a new book by Nicholas Basbanes, A World of Letters: Yale University Press, 1908-2008.

Introducing Tony Rothman’s new column “METAPHYSICS: The World at Large Through the Eyes of a Scientist”

I’m pleased to introduce a new semi-monthly column by writer, physicist, and Princeton University lecturer Tony Rothman.  His most recent book, with Fukagawa Hidetoshi, is called SACRED MATHEMATICS: Japanese Temple Geometry.  Please enjoy his inaugural post!

“Do The Math”

Tony Rothman

The word “metaphysics” derives from the Greek meta ta physika. It was originally used by Aristotle’s Hellenistic editors merely to refer to his books that came after the books on physika—the things of nature. Thus “metaphysics”—after the things of nature. In this series I do not intend primarily to discuss the things of nature, the latest and most dazzling scientific discoveries, trends and fashions. I would like instead to explore how our world looks through the eyes of a professional physicist, one trained in mathematics and steeped in analytical habits. My particular area of expertise is cosmology, the study of the early universe, but like any physical scientist I value facts and data over opinion, pay close attention to the logic of an argument and show an appreciation for a carefully designed experiment or an elegant mathematical demonstration. To those of us raised in the scientific community such an outlook seems reasonable. When we listen to the news, we learn we do not think much like journalists, talk show hosts or politicians. Sometimes we wonder whether we are space aliens.

Hearing “Do the math” does frequently make me ask what planet I inhabit. Over the past few years, “Do the math” has become an American catch phrase. As far as I can tell, it usually refers to counting: “Do the Democrats have enough votes to pass this bill in Congress? You do the math.” It is a sad commentary on twenty-first century America that an activity human beings are supposed to have mastered five or six thousand years ago is considered higher mathematics. Rarely do I hear “Do the math” applied to something as advanced as multiplication; division is out of the question.

I speak seriously. For the past several years I have taught introductory physics at Princeton University. Two years ago we gave our usual final exam at the end of the second semester, which is devoted to electricity and magnetism. Princeton freshmen are easily the best undergraduates I have taught in twenty-five years of teaching and they are far better trained than I was at their age. By the end of the course we have covered some sophisticated material, including an introduction to Maxwell’s equations and even a nontrivial topic in calculus known as surface integrals. On the final exam we decided to give them a break with an easy problem. From some basic quantities they needed to arrive at an equation that amounted to A = B²C. Almost everyone got that far. We next asked: if B were lowered by a factor of one thousand, how much would C have to change to keep A the same?

Of the two hundred or so students who took the exam, approximately twenty used their brains. If B goes down a thousand times, B² goes down a million times; therefore to keep A constant C must increase by a factor of one million. That is all that was required. The vast majority of students went back to their calculators, numerically recomputed B and C from the information provided and found a new value for A. With all the arithmetic mistakes that could and did occur by doing the problem on a calculator, about 60% of the students arrived at the wrong answer. Those who got it right often said that C had to increase by a factor of 999,999.9998 and rounded their answer off to one million. This is a little like the engineer who said that two plus two equals four to within a tolerance of .0002, and got the job.

A formal way of stating the problem is to say that since A is remaining constant, the ratio A/A = 1. Therefore, the ratio of the old value of B²C to the new value of B²C must also be 1, and consequently the old value of B²C must equal the new value of B²C. Reasoning by proportions, or ratios, was known to the ancient Chinese and is something I was taught in sixth or seventh grade. It is one of the basic tools in the arsenal of any scientist. Not only has the tool been lost to the calculator generation but so has the concept of an exact answer.
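The proportional reasoning can be put in a few lines of code (a sketch of my own, not part of the exam; the function name is illustrative):

```python
# If A = B**2 * C is held constant, scaling B by b_factor forces C to
# scale by 1 / b_factor**2 -- no numerical recomputation is needed.
def c_scale_factor(b_factor: float) -> float:
    """From A_new / A_old = 1 = b_factor**2 * c_factor."""
    return 1.0 / b_factor ** 2

# B lowered by a factor of one thousand: C must grow a million-fold.
# (The exact answer is 10**6; a float may show rounding in late digits,
# exactly the 999,999.9998 trap the calculator approach falls into.)
print(c_scale_factor(1 / 1000))
```

The point of the exercise survives even in code: the ratio is known exactly before any arithmetic happens.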

The act of dividing one quantity A by another quantity B to get a ratio that compares the sizes of A and B may be the single most important act a numerate person can perform. Yet the ability to make that comparison is vanishing before our eyes. My favorite recent example is the Cingular/AT&T ad “Fewest dropped calls.” The ad is great because it is meaningless. Clearly the company with zero customers will have the fewest dropped calls. It goes without saying that the entire advertising industry is based on ignoring standards of comparison, in other words, by omitting the denominator of a fraction. “Doctors recommend…” How many doctors? What percentage?

Less amusing examples of denominator omission now affect us all. Before me is an article on plug-in hybrid cars that claims 100 mpg for the prototype. It is true, by the odometer one can drive 1,000 miles on 10 gallons of gasoline: 100 mpg. What the claim ignores is the energy needed to charge the battery and the energy lost in transmission from the power plant, both of which are significant. Without entering the debate on the merits of plug-in hybrids, what is clearly called for is a proper ratio to measure vehicle efficiency: the number of miles driven per total amount of energy required, perhaps with an adjustment for emissions.
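Such a ratio is easy to compute once the denominator counts all the energy. The sketch below uses hypothetical figures chosen only to show how grid energy deflates the odometer-only number; 33.7 kWh per gallon is the usual gasoline energy-equivalence conversion, and the 250 kWh of grid draw is invented for illustration:

```python
GASOLINE_KWH_PER_GALLON = 33.7  # standard gasoline energy-equivalence figure

def miles_per_kwh(miles: float, gallons_burned: float, grid_kwh: float) -> float:
    """Efficiency as miles per unit of *total* energy: fuel burned plus
    electricity drawn from the wall, which already embeds the charging
    and transmission losses upstream of the battery."""
    return miles / (gallons_burned * GASOLINE_KWH_PER_GALLON + grid_kwh)

odometer_only = miles_per_kwh(1000, 10, 0)    # the "100 mpg" framing
with_grid = miles_per_kwh(1000, 10, 250)      # 250 kWh is a made-up figure
print(odometer_only, with_grid)               # the honest ratio is smaller
```

Whatever the right numbers turn out to be, the denominator, not the numerator, is where the argument lives.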

More seriously, about a month ago the New York Times reported that the total money worldwide tied up in derivatives is roughly 550 trillion dollars. Whether this number is accurate, I don’t know, but I do know that it is about ten times the world’s gross product. One does not have to be a scientist to realize that there is simply not enough money in the world to cover such bets and that the system must collapse.

The moral of these anecdotes is that one number in isolation means little. Only when it is compared with a standard does it give us knowledge. Unfortunately, this elementary truth is ignored on a daily basis, not only by Princeton students, proponents of future cars and financial wizards, but by the news media as a whole. NPR, the BBC and the NY Times routinely present figures without comparison. “The number of unemployed Americans this year has increased by two million.” “Six thousand million tons of carbon were released into the atmosphere in 2006.” Are these large numbers or are they small numbers? How does one know?

Only when we learn what fraction two million people is of the total workforce does the fact acquire meaning. The simple act of quoting numbers as percentages rather than absolute figures, a procedure known to Chinese peasants three thousand years ago, would instill a great deal of numerical hygiene into public discourse.

Jonathan Macey on why less is more when it comes to government regulation

Jonathan Macey, author of Corporate Governance: Promises Kept, Promises Broken has been interviewed by Yale Law School.

A quick excerpt:

“It’s unpopular now to talk about deregulation. As I give this interview, we’re in the middle of this big market crash, and everybody seems to be a born again deregulator…My book does focus a lot on deregulation and it criticizes the Sarbanes-Oxley Act, and frankly I think recent events have proven the source of these criticisms to be correct…All the companies that we see imploding today are subject to this statute, and one thing that is painfully clear is that the increased attention to risk management that we were supposed to get with Sarbanes-Oxley we haven’t gotten. And that firms have been free to engage in really incredibly excessive risk taking and that these so-called regulatory or legislative solutions just haven’t worked very well.”

Listen to the rest of the interview here.

Celebrating Margaret Mead at the American Museum of Natural History, November 13

Margaret Mead, possibly the best-known, and certainly one of the most controversial, anthropologists in 20th-century America worked at the American Museum of Natural History for 50 years. On Thursday, November 13, at 6:30pm in the Kaufmann Theater (first floor) Nancy Lutkehaus, Professor of Anthropology, University of Southern California, author of the just-released MARGARET MEAD: THE MAKING OF AN AMERICAN ICON, and Mead’s daughter and granddaughter, Mary Catherine Bateson and Sevanne Kassarjian present memories and images of this riveting woman. Introduced by Laurel Kendall, Curator, Division of Anthropology, AMNH. A book signing will follow. This event is co-presented with the Barnard Center for Research on Women and is supported, in part, by Sara Lee Schupf.

For a complete line-up of films at the 32nd annual Margaret Mead Film Festival, visit the festival’s website.

6 quick takeaways on what really happened last night from Andrew Gelman

While most of us were sleeping off an evening of watching election returns, statistician Andrew Gelman was busy crunching numbers and creating a great series of graphs.

Six quick takeaways from his post:

1. The election was pretty close. Obama won by about 5% of the vote, consistent with the latest polls and with forecasts based on the economy.

2. As with previous Republican candidates, McCain did better among the rich than the poor.

3. The gap between young and old has increased–a lot.

4. By ethnicity: Barack Obama won 96% of African Americans, 68% of Latinos, 64% of Asians, and 44% of whites. In 2004, Kerry won 89% of African Americans, 55% of Latinos, 56% of Asians, and 41% of whites. So Obama gained the most among ethnic minorities.

5. The red/blue map was not redrawn; it was more of a national partisan swing.

6. The pre-election polls pretty much nailed the national vote.


We have been very pleased with the international interest Ray Fisman and Ted Miguel’s ECONOMIC GANGSTERS: Corruption, Violence, and the Poverty of Nations has received–from Hong Kong to the UK to Australia–and now to Shanghai.  The Shanghai Daily ran this review in their pages today. 

Fisman and Miguel’s claim to fame was a fascinating study they did in 2006 that measured corruption based on the traffic tickets diplomats in NYC received.  Chris Shea wrote about it for the NY Times here.

Just how smart are American voters?

In a recent op-ed in the Los Angeles Times, Princeton professor and author of Unequal Democracy, Larry Bartels, comments on how the electorate as a whole may be wiser and more rational than any individual.

November 3, 2008

One of the bestselling books of the 2008 election season has been “Just How Stupid Are We?” by popular historian Rick Shenkman. It presents a familiar collection of bleak results from opinion surveys documenting the many things most Americans don’t know about politics, government and history. “Public ignorance,” Shenkman concludes, is “the most obvious cause” of “the foolishness that marks so much of American politics.”

But is that really true? Does it matter whether voters can name the secretary of Defense or whether they know how long a U.S. Senate term is? The important question is not whether voters are ignorant but whether they make sensible choices despite being hazy about the details. (OK, really hazy.) If they do, that’s not stupid — it’s efficient.

Political scientists have been studying this subject for years, and they’ve found plenty of grounds for pessimism about voters’ rationality.

In the early 1950s, Paul Lazarsfeld and his colleagues at Columbia University concluded that electoral choices “are relatively invulnerable to direct argumentation” and “characterized more by faith than by conviction and by wishful expectation rather than careful prediction of consequences.” For example, voters consistently misperceived where candidates stood on important issues.

In 1960, a team of researchers from the University of Michigan described “the general impoverishment of political thought in a large proportion of the electorate.” Shifts in election outcomes, they concluded, were largely attributable to defections from long-standing partisan loyalties by relatively unsophisticated voters with little grasp of issues or ideology. A recent replication of their work found that things haven’t changed much.

The intervening decades have seen a variety of concerted attempts to overturn or evade the findings of the classic Columbia and Michigan studies, but without much success.

In the 1990s, political scientists took a different tack, acknowledging that, yes, voters were generally uninformed, but denying that the quality of their political decisions suffered much as a result. Voters, they argued, used “information shortcuts” to make rational electoral choices. These shortcuts included inferences from personal narratives, partisan stereotypes and endorsements.

In one of the most colorful examples of an information shortcut, political scientist Samuel Popkin suggested that Mexican American voters had good reason to be suspicious of President Ford in 1976 because he didn’t know how to eat a tamale — a shortcoming revealed when he made the mistake of trying to down one without first removing its cornhusk wrapper. According to Popkin, “Showing familiarity with a voter’s culture is an obvious and easy test of ability to relate to the problems and sensibilities of the ethnic group.”

Obvious and easy, yes — but was this a reliable test? Would Mexican American voters have been correct to infer that Ford was less sensitive to their concerns than his primary opponent, Ronald Reagan? I have no idea, and neither does Popkin.

In “Uninformed Votes,” a 1996 study examining presidential elections from 1972 to 1992, I took another approach, assessing how closely voters’ actual choices matched those they would have made had they been “fully informed.” I found that the actual choices fell about halfway between what they would have been if voters had been fully informed and what they would have been if made on the basis of a coin flip.

The ideal of rational voting behavior is further undermined by accumulating evidence that voters can be powerfully swayed by television ads just before an election. A major study of the 2000 presidential election suggested that George W. Bush’s razor-thin victory hinged on the fact that he had more money to spend on television ads in battleground states in the final weeks of the campaign.

Optimism about the democratic process has often been bolstered by appeals to the “miracle of aggregation” — an idea formalized in a mathematical demonstration by the social theorist Condorcet more than 200 years ago. He showed that a group trying to reach a decision by a majority vote (and in which each individual is making an independent judgment) is very likely to reach a correct decision even if each individual is only slightly more likely to reach the correct conclusion than he would simply by flipping a coin.

Applied to electoral politics, Condorcet’s logic suggests that the electorate as a whole may be much wiser than any individual voter. The only problem is that things may not work so happily. Real voters’ errors are quite unlikely to be random and statistically independent, as Condorcet’s logic requires. When thousands or millions of voters misconstrue the same relevant fact or are swayed by the same vivid campaign ad, no amount of aggregation will produce the requisite miracle — individual voters’ “errors” will not cancel out in the overall election outcome.
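Condorcet’s aggregation, and how correlated errors break it, can be checked with a small simulation. The numbers here are assumptions chosen for illustration: 10,001 voters who are each right 51% of the time, and a shock (say, a misleading ad) that occasionally biases everyone at once:

```python
import random

def majority_is_right(n_voters, p_correct, shared_error_prob, rng):
    """One majority vote. With probability shared_error_prob a common
    shock biases *every* voter toward the wrong choice, violating the
    statistical independence Condorcet's theorem requires."""
    p = (1 - p_correct) if rng.random() < shared_error_prob else p_correct
    correct = sum(rng.random() < p for _ in range(n_voters))
    return correct > n_voters / 2

def accuracy(shared_error_prob, trials=200, seed=0):
    """Fraction of simulated elections the majority decides correctly."""
    rng = random.Random(seed)
    wins = sum(majority_is_right(10_001, 0.51, shared_error_prob, rng)
               for _ in range(trials))
    return wins / trials

print(accuracy(0.0))   # independent errors: majority almost always right
print(accuracy(0.3))   # correlated errors: the "miracle" degrades sharply
```

With independent errors, a 51% individual hit rate aggregates to a near-certain correct majority; let the same shock sway everyone and no amount of aggregation rescues the outcome.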

Voters’ strong tendency to reward incumbents for peace and prosperity and punish them for bad times looks at first glance like a promising mechanism of political accountability, because it does not require detailed knowledge of issues and policy platforms. As political scientist Morris Fiorina has noted, even uninformed citizens “typically have one comparatively hard bit of data: They know what life has been like during the incumbent’s administration.”

Unfortunately, “rational” rewarding and punishing of incumbents turns out to be much harder than it seems, as my Princeton colleague, Christopher Achen, and I have found. Voters often misperceive what life has been like during the incumbent’s administration. They are inordinately focused on the here and now, mostly ignoring how things have gone earlier in the incumbent’s term. And they have great difficulty judging which aspects of their own and the country’s well-being are the responsibility of elected leaders and which are not.

This election year, an economic downturn turned into an economic crisis with the dramatic meltdown of major financial institutions. John McCain will be punished at the polls as a result. Whether the current economic distress is really President Bush’s fault, much less McCain’s, is largely beside the point.

Does all of this make voters stupid? No, just human. And thus — to borrow the title of another popular book by behavioral economist Dan Ariely — “predictably irrational.” That may be bad enough.

Thomas Kidd on “Barack Obama: Secret Muslim?” over at HNN

From a recent University of Texas poll:

When asked to identify Obama’s religion, 45 percent of respondents accurately identified him as Protestant; however, 23 percent erroneously identified him as Muslim.

In a related article over at History News Network, Thomas Kidd, author of American Christians and Islam: Evangelical Culture and Muslims from the Colonial Period to the Age of Terrorism, asks “How has the prospect of a secret Muslim as President taken such a prominent place among the cyber-myths of this election?” As he notes, this “fear” of Islam has long roots in American culture and history:

American fears about Muslims precede 9/11 by hundreds of years, with origins as early as the founding of the first English colonies in America. History also shows conflicted American attitudes toward Islam, even among conservative Christians, whose views of Islam have ranged from studied respect to apocalyptic revulsion.

Click through to read more on this historical perspective.