Ready for football? Remembering the first game between Princeton and Rutgers

It’s that time of year again! The air is saturated with the promise of cooler days ahead, the leaves are holding their breath, and school is nearly back in session. And that means one thing. Football season will soon be here. More specifically, college football. Princeton, as I’m sure you know, has quite the legacy in this area—dating back almost a century and a half.

To be precise, that legacy dates back all the way to November 6th, 1869: The day of the first official collegiate football game played between Rutgers and Princeton (then called The College of New Jersey).

Back then, the game was really a hybrid combining elements of rugby and modern-day soccer. Each team consisted of 25 players struggling to kick the ball into the opposing team’s territory. Reportedly, a mere 100 spectators gathered to watch the game, many of them sitting on a wooden fence. The players took the field, removing their hats, coats, and vests in preparation for play. Speaking of attire, some believe that the “Scarlet Knights” nickname for Rutgers came to be at this game. To differentiate themselves from Princeton, some players sported scarlet-colored scarves, worn as turbans. Thus, the Scarlet Knights were born. Alas, Rutgers defeated Princeton that day, 6-4. Six to four, you ask? That’s right. Even the score-keeping method was different back then.

What a far cry from college athletics today, especially football. If you’ve ever been to a college football game (especially a Division I game), you know what I’m talking about. In 2011, many colleges, including Michigan, Ohio State, Alabama, and Texas, had over 100,000 fans in attendance at their games. Stadiums practically ooze their team’s colors, and the roar of the crowd is deafening. Music pumps through unseen speakers, and there are always a few dedicated fans who choose to doff their shirts in favor of painting their team’s colors and/or letters onto their bodies.

People take their college football very seriously these days. There are all different types of divisions, championships, and rankings that decide when and where teams get to play. NCAA rankings determine which schools get to play for all the marbles in postseason bowl games. Amy N. Langville and Carl D. Meyer discuss these types of ranking systems in their book Who’s #1? The Science of Rating and Ranking.

The differences between college sports in the 19th century and college sports today are striking. College athletics have become an integral part of the community of higher education and of society as a whole.

But the nature of college sports today is troubling to some. On the one hand, college athletic programs serve to bring communities together and unite people who otherwise wouldn’t share any common ground. In Gaming the World, Andrei Markovits and Lars Rensmann reflect on and explain how sports influence our daily lives and help to confirm a certain local, regional, and national identity. These programs also promote health and wellness at colleges nationwide, which benefits students.

But on the other hand, many colleges and universities, in their constant need to compete with other institutions, sometimes redirect funds and other resources toward football or basketball while the academic side of the institution is forced to manage without those funds.

In addition to the funding problem, there is also an “underperformance” problem. In Reclaiming the Game, William Bowen and Sarah Levin explore the academic experiences of college athletes and other students. In one of their studies, they found that recruited athletes at some schools are four times more likely to be admitted than are other students with similar academic qualifications. They also show that the typical recruit is more likely to end up in the bottom third of the college class than are non-athletes.

It’s safe to say that the feverish fandom of college athletics can either boost or take away from the institution itself and the college experience. What’s your opinion on the matter?

If the impact of sports is a topic that interests you, and you’re intrigued by unusual applications, also check out Ignacio Palacios-Huerta’s Beautiful Game Theory: How Soccer Can Help Economics. Palacios-Huerta uses soccer as a lens to study game theory and microeconomics, covering such topics as mixed strategies, discrimination, incentives, and human preferences. He makes the case that soccer provides “rich data sets and environments that shed light on universal economic principles in interesting and useful ways.”

PS: Not to worry, Princetonians – we didn’t make a habit of losing to our northern neighbor. On May 2nd, 1866, in the first intercollegiate athletic event in Rutgers history, the Rutgers baseball team lost to Princeton, 40-2. Quite the slaughter! And Rutgers may have ended up winning the first football game 6 to 4, but a week later Princeton won the next match at home, 8 to 0.

A rematch is also on the horizon! If you’ve done your math right (and I’m sure you have) the 150th anniversary of the historic football game takes place in 2019. There have been talks of a rematch for this upcoming anniversary. Read more here.

Image credit: State Archives of Florida, Florida Memory, https://floridamemory.com/items/show/11389

Out of Ashes – Descent into Totalitarianism

Out of Ashes – Konrad Jarausch

To mark the release of Konrad H. Jarausch’s Out of Ashes, we continue with our series of vignettes describing some of the most compelling moments of twentieth century European history, many of which are discussed in Jarausch’s book. Today we remember the descent into Totalitarianism. Loop back to our earlier post on the birth of Modernism here.

October 1917, The October Revolution. Centuries of imperial rule implode as revolutions sweep through Russia, triggering political and social changes that would lead to the formation of the Soviet Union. Food is scarce and mounting civil unrest eventually culminates in open revolt, forcing the abdication of Nicholas II, the last Russian czar. On October 24th, the Bolshevik Red Guard initiates a coup with the takeover of government buildings and the Winter Palace in Petrograd, seizing power from Kerensky’s interim government. The storming of the palace, an iconic symbol of the revolution, will be immortalized in Eisenstein’s 1927 film, October.

October 1922, The March on Rome. Italian society is in disarray in October 1922, when 30,000 fascist blackshirts mass on the outskirts of Rome. Fearing arrest, their leader Benito Mussolini remains safely in Milan until King Victor Emmanuel III invites him to form a new government: he takes the train to Rome (first class), where he is appointed prime minister. A former journalist (not to mention an egomaniac) well-versed in manipulating a news story, Mussolini fakes pictures of himself marching with the blackshirts and subsequently claims to have led a mythical army of 300,000 to Rome on horseback.

Feb. 27, 1933, The burning of the Reichstag. On the evening of Feb. 27, 1933, alarms sound. The Reichstag, the German Parliament building, is in flames. Firefighters rush to the inferno, but too late: the embodiment of democracy in Germany is completely destroyed. A young, mentally disturbed Communist Dutchman named Marinus van der Lubbe is arrested in due course. Many see the charges as a pretext, but opportunistic Nazi leaders waste no time issuing an emergency decree abolishing all civil rights enshrined in the Weimar Constitution. It will be 75 years until van der Lubbe (long since beheaded for the crime) is pardoned on the basis that his conviction was politically motivated.

April 26, 1937, The bombing of Guernica. It is 4 pm on a Monday in the Basque village of Guernica, and a group of German bombers are spotted over the hills. Today is market day, and over 10,000 people are in the town, which is widely considered the cultural and spiritual capital of the Basque people. During a relentless three-hour siege aimed at breaking the Basque resistance to Nationalist forces, the town is blanket-bombed, while fighter planes ruthlessly pursue and gun down anyone who tries to flee. Women and children huddle and die in cellars; the town square is surrounded by a wall of flame. Guernica is systematically and utterly destroyed: 1,600 civilians—one third of the population—are killed or wounded. Pablo Picasso will later depict the attack, considered the first aerial assault on a civilian population, in the famous anti-war painting, Guernica. Beneath a fallen horse with a gaping wound, a dismembered soldier is depicted; his severed hand still holds a broken sword from which a flower grows.

A Q&A with Konrad Jarausch can be found here.

Mark Zuckerberg Selects “The Muqaddimah” as his Latest Book Club Read!

As part of a 2015 initiative entitled A Year of Books, Mark Zuckerberg has selected a new book every two weeks to share and discuss with the Facebook community. For the second time, a Princeton University Press book has been selected: The Muqaddimah by Ibn Khaldun is his latest pick!

The Muqaddimah, often translated as “Introduction” or “Prolegomenon,” is the most important Islamic history of the premodern world. Written by the great fourteenth-century Arab scholar Ibn Khaldûn (d. 1406), this monumental work established the foundations of several fields of knowledge, including the philosophy of history, sociology, ethnography, and economics.

Mark Zuckerberg shared his personal account of the book and some reasoning behind his selection on his Facebook page:

My next book for A Year of Books is Muqaddimah by Ibn Khaldun.

It’s a history of the world written by an intellectual who lived in the 1300s. It focuses on how society and culture flow, including the creation of cities, politics, commerce and science.

While much of what was believed then is now disproven after 700 more years of progress, it’s still very interesting to see what was understood at this time and the overall worldview when it’s all considered together.

Check out The Muqaddimah and join the conversation through Zuckerberg’s A Year of Books Facebook page!

You can read the introduction here.

Q&A with Scott L. Montgomery & Daniel Chirot, authors of The Shape of the New: Four Big Ideas and How They Made the Modern World

Scott L. Montgomery and Daniel Chirot, both of the University of Washington, recently sat down for a Q&A on their new book, The Shape of the New: Four Big Ideas and How They Made the Modern World. Read on to learn what these four Enlightenment ideas are, and why they remain so important to the understanding of the ideological and political conflicts of our own time.

Why are ideas so important to the history of the modern world and also to understanding so much of the contemporary world?

Many of our social, cultural, and political perceptions have been shaped by big ideas first argued by long-dead intellectuals. For example, Thomas Jefferson and Alexander Hamilton’s argument over the shape of democracy more than 200 years ago continues to play out today in American debates over the size and scope and purpose of government.

Why use the term ‘ideas’ rather than ideology?

Ideology refers largely to already fixed, hardened positions about certain policy choices. The ideas we cover were much broader.  The leading intellectuals who developed them understood many of the conflicting arguments and knew they had to argue their positions in order to have any lasting influence.

What are the “Four Big Ideas” of the title, and why do you focus on them?

Our focus is not on single concepts but entire systems of thought that have affected every level of social experience. Adam Smith wrote about the freedom that individuals must have to decide their material and moral lives and that, if attained, would create the most efficient, prosperous, and free society. Marx spoke of universal equality for humanity, a just and egalitarian world that would arrive due to scientific laws governing history. Darwin took evolution and turned it into a scientific theory of enormous force:  with natural selection as its main mechanism, it gave all life a secular history and human beings a new context liberated from ancient traditions of religious purpose and final principles. Finally, modern democracy gained its first major success through the founders of the United States, most notably Thomas Jefferson and Alexander Hamilton, two brilliant but flawed men whose fierce debates set down essential patterns for how to imagine and institutionalize this new political system that has spread throughout large portions of the world.

You seem to suggest that the most powerful ideas came from the Enlightenment, mainly from areas like political philosophy, economics, and theories of society or history. Is this correct?

Yes, partly, but not from political, economic, and social thought alone. Ideas of vital, even extraordinary influence also emerged in the 18th and 19th centuries from the sciences and from religious thought, as shown in our discussion of Darwin and of religious fundamentalism in Christianity and Islam. Other domains of thought, such as art and literature, played major roles in the shaping and movement of key ideas.

What are some examples of what you call the “Counter Enlightenment”?

Some hostility came from organized religions that resisted the Enlightenment’s defense of freedom of thought and skepticism about fixed dogma. Much also came from elites opposed to democratization and increased freedom for everyone. This Counter-Enlightenment has never gone away. Fascism and communism were based on powerful ideas that rejected much of the Enlightenment. Religious opposition remains in some fervent Christian denominations, and in radical Islam there is still bitter hostility to much of modern science and to any questioning of holy texts and authority. Rather than witnessing the continuing expansion of democracy and greater individual freedom that seemed to characterize the late 20th century, some governments, not least China and Russia, reject that side of the Enlightenment and propose instead illiberal forms of autocracy as better alternatives.

What does this have to do with the humanities and social sciences?

We strongly feel that college and university education no longer insists enough on the importance of teaching the ideas on which free, dynamic societies are based. To resist the paranoia about threats coming from all sorts of poorly understood sources, we have to reaffirm the importance of the great ideas that shaped so much that we value, and make it known how those ideas were used to combat ignorance and opposition to freedom. Ultimately it is imperative that we understand the ideas that oppose what we value so that we are better equipped to fight against them.

Scott L. Montgomery is an affiliate faculty member in the Henry M. Jackson School of International Studies at the University of Washington. His books include Does Science Need a Global Language?: English and the Future of Research. Daniel Chirot is the Herbert J. Ellison Professor of Russian and Eurasian Studies at the University of Washington. His books include Why Not Kill Them All?: The Logic and Prevention of Mass Political Murder (Princeton). They both live in Seattle.

Medieval Relativisms by John Marenbon

In a commencement speech at Dickinson College yesterday that focused on the virtues of free speech and free inquiry, Ian McEwan referenced the golden age of the pagan philosophers. But from the turn of the fifth century to the beginning of the eighteenth, Christian intellectuals were as fascinated as they were perplexed by the “Problem of Paganism,” or how to reconcile the fact that the great thinkers of antiquity, whose ideas formed the cornerstones of Greek and Roman civilization, were also pagans and, according to Christian teachings, damned. John Marenbon, author of the new book Pagans and Philosophers, has written a post explaining that relativism (the idea that there can be no objective right or wrong) is hardly a post-modern idea, but one that emerged in medieval times as a response to this tension.

Medieval Relativisms
By John Marenbon

Relativism is often thought to be a characteristically modern, or even post-modern, idea. Those who have looked more deeply add that there was an important strand of relativism in ancient philosophy, and they point (perhaps wrongly) to Montaigne’s remark, made late in the sixteenth century, that ‘we have no other criterion of truth or reason than the example and idea of the opinions and customs of the country where we are’ as signalling a revival of relativist thinking. But the Middle Ages are regarded as a time of uniformity, when a monolithic Christianity dominated the lives and thoughts of everyone, from scholars to peasants – a culture without room for relativism. This stereotype is wrong. Medieval culture was not monolithic, because it was riven by a central tension. As medieval Christian thinkers knew, their civilization was based on the pagan culture of Greece and Rome. Pagan philosophers, such as Plato and Aristotle, were their intellectual guides, and figures from antiquity, such as the sternly upright Cato or Regulus, the general who kept the promise he had given to his enemies even at the cost of his life, were widely cited as moral exemplars. Yet, supposedly, Christian truth had replaced pagan ignorance, and without the guidance and grace provided for Christians alone, it was impossible to live a morally virtuous life. One approach to removing this tension was to argue that the pagans in question were not really pagans at all. Another approach, though, was to develop some variety of limited relativism.

One example of limited relativism is the view proposed by Boethius of Dacia, a Master in the University of Paris in the 1260s. Boethius was an Arts Master: his job was to teach a curriculum based on Aristotle. Boethius was impressed by Aristotelian science and wanted to remain true to it even on those points where it goes against Christian teaching. For example, Christians believe that the universe had a beginning, when God created it, but Aristotle thought that the universe was eternal – every change is preceded by another change, and so on, for ever. In Boethius’s view, the Christian view contradicts the very principles of Aristotelian natural science, and so an Arts Master like himself is required to declare ‘The world has no beginning’. But how can he do so, if he is also a Christian? Boethius solves the problem by relativizing what thinkers say within a particular discipline to the principles of that discipline. When the Arts Master, in the course of teaching natural science, says ‘The world has no beginning’, his sentence means: ‘The world has no beginning according to the principles of natural science’ – a statement which is consistent with declaring that, according to Christian belief the world did have a beginning. Relativizing strategies were also used by theologians such as Henry of Ghent, Duns Scotus and William of Ockham to explain how some pagans can have even heroic virtue and yet be without the sort of virtue which good Christians alone can have.

These and other medieval relativisms were limited, in the sense that one reference frame, that of Christianity, was always acknowledged to be the superior one. But Boethius’s relativism pragmatically allowed a space for people to develop a purely rational scientific world-view on its own terms, and that of the theologians allowed them to praise and respect figures like Cato and Regulus, leaving aside the question of whether or not they are in Hell. Contemporary relativists often advocate an unlimited version of relativism, in which no reference frame is considered superior to another. But there are grave difficulties in making such relativism coherent. The less ambitious medieval approach might be the most sensible one.

John Marenbon is a senior research fellow at Trinity College, University of Cambridge, honorary professor of medieval philosophy at Cambridge, and a fellow of the British Academy. He is the author and editor of many books, including Abelard in Four Dimensions, The Oxford Handbook of Medieval Philosophy, The Cambridge Companion to Boethius, and Medieval Philosophy: An Historical and Philosophical Introduction.

A Q&A with Cormac Ó Gráda, author of Eating People is Wrong

Cormac Ó Gráda’s new collection of essays on famine—which range in focus from economic history to the psychological toll—begins with a taboo topic. Ó Gráda argues that cannibalism, while by no means a universal feature of these calamities, has probably occurred more frequently than previously recognized. Recently he answered some questions on his book, Eating People is Wrong, and Other Essays on Famine, Its Past, and Its Future, its somber title, and his early interest in the Great Irish Famine.

Why did you write this book?

CÓG: When Famine: A Short History (Princeton, 2009) came out, I wanted it to be my last book on the subject. So Eating People is Wrong was not a question of ‘what will I do next?’ I just realized a few years later that I still had ideas to contribute on topics that would make for a new, different kind of book on famine. These topics ranged from famine cannibalism to the Great Leap Forward, and from market failure to famine in the 21st century; the challenge was to merge the different perspectives that they offered into what would become this new book. The idyllic résidence I spent in the south of France courtesy of the Fondation des Treilles in the autumn of 2013 was when the different parts came together. By the end of that stay, I had a book draft ready.

What inspired you to get into your field?

CÓG: It is so long ago that I am bound to invent the answer… But I have always had an amateur interest in history—as lots of Irish people tend to have—whereas my academic training was in economics. Economic history seemed a good way of marrying the two, and that has been my chosen field since my time as a graduate student in the 1970s. I began as a kind of jack-of-all-trades economic historian of Ireland, focusing on topics as different as inheritance patterns and famine, or migration and banking. This work culminated in a big economic history of Ireland in 1994. My interest in the Great Irish Famine of the 1840s goes back to my teens, but that interest was sharpened after getting to know Joel Mokyr (also a PUP author) in the late 1970s. Economics taught me to think of the Irish in comparative terms, and that led eventually to the study of famines elsewhere. My books have all been solo efforts, but I have been very lucky and privileged to write papers with some great co-authors, and some of these papers influenced the books.

How did you come up with the title or jacket?

CÓG: The title is an ironic nod to Malcolm Bradbury’s eponymous novel (which most people seem ignorant of). A friend suggested it to me over a pint in a Dublin bar. One of the themes of the chapter on famine cannibalism, to which the title refers, is the need to realize that famines not only do terrible things to people, but that people do terrible things to one another in times of famine. Peter Dougherty and his team at PUP came up with the jacket. The image is graphic and somber without being sensationalist, which is what I had hoped for.

What is your next project?

CÓG: There is no single all-consuming project. A lot of my research in recent years has been collaborative work on British economic history with UCD colleague Morgan Kelly. So far the results of that work have appeared—when we are lucky—in academic journals rather than in books. We have plans to continue on this basis, but we are also involved in an interesting piece of research with Joel Mokyr on the origins of the Industrial Revolution, and that may eventually yield a monograph by the three of us. I also want to revise several unpublished papers in Irish economic history and to get them published singly or, perhaps, as a monograph. Finally, Guido Alfani of Bocconi University in Milan and I are editing a book on the history of famine in Europe. This is coming along well. The end product will consist of nine specialist country chapters, a cross-country analysis of the famines of World War II, and an overview by Alfani and me.

What are you currently reading?

CÓG: I am at page 630 (so another hundred or so pages to go) of Stephen Kotkin’s Stalin, vol. 1 (Penguin, 2014), which brings the story of Iosif Vissarionovich only as far as 1928. I have been interested in Soviet economic history since the late Alexander Erlich introduced me to the topic at Columbia in the 1970s, and this is what attracted me to Kotkin’s riveting tome—which, however, turns out to be rather uninterested in the economic issues! I am also reading Maureen Murphy’s Compassionate Stranger: Asenath Nicholson and the Great Irish Famine (Syracuse, 2015), an account of an eccentric but appealing American evangelist who toured Ireland, mostly on foot, in the years leading up to and during the Great Hunger. I was familiar with Nicholson’s own published accounts of her travels, but knew very little about her otherwise, so Murphy’s book is a revelation. My current bedtime reading is Henning Mankell’s The Man from Beijing (2010).

Cormac Ó Gráda is professor emeritus of economics at University College Dublin. His books include Famine: A Short History and Black ’47 and Beyond: The Great Irish Famine in History, Economy, and Memory (both Princeton).

Afghanistan President Ashraf Ghani mentions LOST ENLIGHTENMENT before Congress

Last night, Afghan President Ashraf Ghani and Afghan Chief Executive Abdullah were honored at a dinner held in the Ben Franklin Room. President Ashraf Ghani addressed the attendees of the dinner and stated, “[I]f there’s one book that you want to read please do read LOST ENLIGHTENMENT. [T]he story that Fred tells is not the story of the past. Its good news is that it’s the story of the future.” Read the transcript of the event here.

LOST ENLIGHTENMENT is available in hardcover and will be released in paperback this June. Read the first chapter of this must-read for free here.

Lost Enlightenment:
Central Asia’s Golden Age from the Arab Conquest to Tamerlane

S. Frederick Starr

Interview with n+1 co-founder and PUP author Mark Greif

As Adam Kirsch writes in Tablet Magazine’s review of n+1 co-founder Mark Greif’s widely reviewed new book, The Age of the Crisis of Man, “[t]he word ‘crisis’ itself seems to capture something essential about our relationship to history, which we now experience as a constant procession of unexpected, suddenly emerging threats.” From the Cold War to climate change, from economic recession to war in Iraq, recent decades have seen their share of anxiety-provoking episodes. And yet, it’s safe to say the “crisis of man” has become something of a throwback expression. The notion that human nature itself is under threat is an intellectual artifact of mid-century American culture. Why so?

The question, and Greif’s new book, appear to have struck nerves in today’s intellectual community, inspiring, among an explosion of coverage, Kristin Iversen’s “Man-Splaining” in Brooklyn Magazine, and a widely discussed New York Times Book Review essay by Leon Wieseltier. Recently, Greif took the time to chat with Princeton University Press about his book:

You’re best known for your work as a founder of n+1 and your essays in that magazine. What connects that New York literary world to this book?

MG: To me, they’re tightly connected. When we founded n+1, I wanted to understand how the intellectual and literary worlds worked now. The opening section of the magazine, much of which I wrote in the early issues, was “The Intellectual Situation.” I wanted to know how conventional wisdom got settled; how certain questions became “important” and “serious,” but not others; and especially why new novels and essays sometimes had influence on other debates, and sometimes seemed irrelevant or old-fashioned, past tense. During the same ten years of n+1’s attempts to intervene in literary culture, though, my “day job” in effect was as a scholar: I had been digging in the library to see, objectively, how we got where we are. I was reading through complete runs of old journals, Partisan Review, Commentary, to see how to make a twenty-first century journal. But also to see, archeologically, what had been obscured in our picture of the twentieth century. This book is the analytic and philosophical complement to n+1 for me. It’s my best effort to tell a new story of how the twentieth century determined what counts.

Can you say succinctly what the “Age of the Crisis of Man” is?

MG: Sure. It was a period in the center of the twentieth century, from the rise of Nazism to the end of the Sixties, in which we put a universal human character at the center of all “serious” discussion in public. Not incidentally, this period saw the shift of international philosophizing from continental Europe to the United States and England for a little while. And it saw a brief crest of the American novel to its high-water mark of reputation (though maybe not of literary production). And it saw dreams of utopian international order. All those strains come together around the figure of “Man.” But then the same concentration of energy helped create the civil rights and liberation movements that seemed to blow it apart.

So this is an era that we ought to remember and learn from?

MG: Not entirely. It’s not an era I want to champion. I don’t want to reify the Man debates as just one more rival aspect of the twentieth century, as if we need to add them to PBS documentaries alongside the Cold War, suburbanization, existentialism, all the ingredients of the canned version of midcentury. Many of the explicit “crisis of man” books feel empty, frankly. I want to have read them so others don’t have to! But I think the emptiness is important. My basic model of history tries to locate the empty spaces, or blank or negative spaces, in public philosophy and rhetoric and criticism: spaces that demand answers that are simply impossible to decide. These spaces set what matters, what is acceptable, what one should think or say. But as coercive as they are, they may themselves be quite weak, loose, or devoid of reason.

Does your history mean there wasn’t a “crisis of women” or crises in different communities in America, or political crises? How important is a universal “Man” to your story?

MG: Crises of women’s rights and equality exist in this period, and crises of African-American rights, and racism, segregation, white supremacy, you name it. The important thing to see is how “what counts,” as public discourse has it, makes women’s and African Americans’ claims harder to articulate in some registers—in contrast, say, to the earlier 1930s—and articulable in others. Yet later the same discourse will become a source of explosive power, as feminist and civil rights and black power speakers plant their flag on Man. Sex and race provided the most fundamental contradictions to a universal, unmarked man. But that line of difference, and how tortuously it rose to salience, is a big part of my story.

What have we lost, in the transition from the age whose portrait you give here, to the twenty-first century?

MG: That’s the toughest question. It’s very hard to look at these moments when “ideas mattered,” and novels answered “the big questions,” so to speak, and not be nostalgic. Clearly these ideas did have consequences, too: in geopolitics, in the lasting revival of human rights, in the standing of literature, as well as in the creation of a whole atmosphere of life and thought. At the same time, it’s clear that lots of thoughtful and sensitive people found the “discourse of the crisis of man” gaseous and stifling, especially as it got older. Whenever you live, you live among the mediocrities and coercions of the ideas of your own time. History usually tends either to wash them out or take them at their own valuation, while condescending to them, of course, since we always know better now.

I guess what interested me most in my own research was that I came to see it as a mistake to declare we had gone “from universalism to difference” in ideas, or in our picture of the basic human subject. As if there once was unity (even if only among an elite population), which split into groups. Universalism, difference: each of these is an intellectual project, an effort. Neither is more original or more basic than the other, at least not in the twentieth century. You can’t decline from one to the other. That was one thing I tried to point out in the book.

You say in the conclusion that you want to figure out where we start for twenty-first century thought. Do you really think you can give a starting point?

MG: The starting points are already given. The question is: How much do we understand how history has determined our presuppositions—say, what counts for us as “serious” thought, or what role literature and art play in ethical and political thinking? And then: With fuller knowledge, can we choose among our starting points? Can we say that some are stupid, and likely to lead nowhere?

Personally, I am divided about this. The historian in me thinks it’s silly to ask anyone to produce a better discourse of public debate and art from the recognition of past follies. Looking back from the future, “stupidities” are all we have; by which I mean, contingencies, symptoms, actings-out, with no way to step outside of your own time to see how eternity (or the archive, or the leisure of future historians) will regard you. Would knowing the past really help restrain or channel our impulses, now? The “intellectual” in me, on the other hand, or say the participant in culture and literature, the writer, thinks it’s obligatory to try to figure out where your opinions and discoveries come from. Then to see where they’re tending, whether you like to admit those tendencies or not, and then to throw some overboard, while telling people the terrifying prophecy of others. Like a Jeremiah. Whether other people like to hear it or not.

CLIMATE SHOCK authors on TheAtlantic.com: Will camels roam Canada again?

The last time concentrations of carbon dioxide were as high as they are today, write Marty Weitzman and Gernot Wagner, authors of Climate Shock: The Economic Consequences of a Hotter Planet, camels lived in Canada. That was a bit over 3 million years ago, of course. But how certain does science have to be for the world to act? Wagner and Weitzman had a terrific op-ed appear today on TheAtlantic.com, where they argue that climate change is best thought of as a global-scale risk management problem. Check it out here:

Will Camels Roam Canada Again?

What we know about climate change is bad enough. What we don’t could make it even worse.

Gernot Wagner and Martin L. Weitzman

You are cruising down the highway at 65 miles per hour, reading a book in your self-driving car. Your life is in the hands of a machine—an eminently benevolent one. Meanwhile, in the lane next to you, an 18-wheeler using decidedly last-century technology—relying on a fallible human driver—appears to be swerving your way.

Your car’s computer is on the case. Equipped with orders of magnitude more computing power than the Apollo moon lander, it determines with all the confidence it can muster that there’s a greater-than-50-percent chance—it’s “more likely than not”—that the truck is about to hit you.

You may want to look up from your book. More importantly, you want to know with certainty that your onboard computer will hit the brakes, even if there’s a 49-percent chance that doing so will be a false alarm.

If, instead of “more likely than not,” the danger were “likely,” “very likely,” or even “extremely likely,” the answer would be clearer still. Even if there’s a 95-percent probability of a crash, there’s still a 1-in-20 chance that nothing will happen—but no one would gamble their life on those odds. Your car’s computer hopefully will have engaged the anti-lock braking systems already.

A perfect self-driving car doesn’t exist yet, nor has the world solved global warming. But it’s surprising that, by the standards that we’d expect in a car to keep its occupants safe, the governments of the world haven’t stepped on the brakes to avoid planetary-scale global warming disaster—a 100-year-storm hitting New York every other year, frequent and massive droughts, inundated coastal cities. In 1995, the Intergovernmental Panel on Climate Change declared that it was “more likely than not” the case that global warming was caused by human activity. By 2001, it had progressed to “likely.” By 2007, it was “very likely.” By 2013, it was “extremely likely.” There’s only one step left in official IPCC lingo: “virtually certain.”

Read the rest at The Atlantic.com here.

Q&A with Maud S. Mandel, author of Muslims and Jews in France: History of a Conflict

We recently sat down for a Q&A with Maud S. Mandel to talk about her new book Muslims and Jews in France: History of a Conflict. Read the introduction for free, here.

How does your book speak to the current dialogue about tensions between Muslims and Jews in France, particularly in the wake of Charlie Hebdo?

MM: First, my book helps contextualize recent events by placing them in a longer history of Muslim-Jewish relations in France. It thus helps us understand why the violent outburst against Charlie Hebdo became intertwined with an attack against a kosher market, two sites that might not seem obviously linked to contemporary onlookers. Secondly, I think it also helps us understand the diversity of Muslim and Jewish responses during and after the violence. While French-born Muslim citizens perpetrated the attacks, a French Muslim policeman died in the conflict and a Muslim immigrant hid Jews in the grocery store. Some Jews have opted in the aftermath to leave France for other countries, while many have never considered such an option. My book helps us get a better grasp on this diversity of possible responses by showing the complex evolution of Muslims’ and Jews’ relationships to the French state and to each other.

Why did you write this book?

MM: I wrote this book in response to the outbreak of anti-Jewish violence in France in 2000, after which a number of stories came out in the media referring to the “new antisemitism” in France. The term “new” often gives an historian pause, and so I became interested in investigating what was “new” about the events that were unfolding in France. What had changed in Muslim-Jewish relations over time? And what were the forces shaping the evolution of those relations?

What was the most interesting thing you learned from writing this book?

MM: Given the centrality of the Israeli-Palestinian struggle to so much of the media coverage of Muslim-Jewish conflict in France, I had expected the story I was writing to focus largely on that issue. And yet the further I delved into the topic, the more clear it became that the legacy of French colonialism and the evolution of French politics had as great an impact on Muslim-Jewish relations as events in Israel/Palestine. Although this conclusion should not have been a surprise to an historian, given the significance of context to the study of history, I was surprised by the long shadow of French colonialism in shaping my story.

What do you think is the book’s most important contribution?

MM: As in all historical projects, my goal is to complicate simplistic understandings of the problem before us, to challenge notions of inevitability, to force us to question how and why the past took the shape that it did, and to push against monocausal explanations. This approach has pointed me to the diversity of socio-religious relationships between Muslims and Jews in France; conflict is not the only–or even the primary–way of understanding these relationships. This approach has also directed me away from conceptualizing Muslim-Jewish relations in France as arising inevitably from conflict in the Middle East. Rather, I argue that where conflict does exist, its origins and explanation are as much about France and French history as they are about Middle Eastern conflict. While global developments created fault lines around which activists began to mobilize, the nature of that mobilization (i.e. who was involved), the political rhetoric employed, and the success or lack thereof of their appeal emerged from French political transformations.

What was the biggest challenge involved with bringing this book to life?

MM: The biggest challenge involved with bringing this book to life was my stage of life when I wrote it. Newly tenured at Brown and with two young children, I faced the difficulty of finding long stretches of time away from campus and the responsibilities of home life to conduct research abroad. This book would have benefited from much longer periods of ethnographic research in Marseille, one of my key sites of investigation, but it was extremely difficult to balance all the demands of my life in such a way as to accommodate long research trips. The result was that it took me a long time to write this book, and I never felt I could immerse myself as deeply in the project as I desired.

Describe your writing process. How long did it take you to finish your book? Where do you write?

MM: As I mentioned in my answer to the last question, the book took me a long time to write. I began the research when my oldest child was two years old and it came out in print just before he turned fourteen! I wrote most of it in my home office that I share with my husband. Much of the writing happened during a couple of sabbaticals in which we shared that space with several cats. I have fond memories of those long days of writing. My process is to write everything out in long detail and then to pare down to my central argument. First drafts of most chapters thus numbered around 250-300 pages. The work of crafting chapters came in the revisions process, which I really enjoy.

What is the biggest misunderstanding people have about what you do?

MM: People often assume the study of history is either a process of learning about the facts of the past (dates and names) or laying out new information. To my mind, however, the study of history is far more of a humanistic exercise than a social science. Historians are storytellers and interpreters.


Muslims and Jews in France:
History of a Conflict
Maud S. Mandel

The Failure of Islamic Democracy, by John Owen, author of CONFRONTING POLITICAL ISLAM: Six Lessons from the West’s Past — Op-Ed Original

The Failure of Islamic Democracy
By John M. Owen IV

The recent jihadist horrors in France, Pakistan, Nigeria, Iraq, and Syria have lured our attention away from political conditions in the Middle East that indirectly helped produce them. In Turkey and Egypt “Islamic democracy” failed in 2014, and that failure will likely have long and deep repercussions for the entire region.

From northwest Africa to South Asia, majorities of Muslims routinely tell pollsters that they believe their country should either adopt literal Sharia, law derived from Islam’s holy texts, or at least follow the principles of those texts. The secularism that authoritarian Muslims imposed on their peoples from the 1920s through the 1970s is simply not popular over this vast region.

At the same time, the late Arab Spring made clear that Middle Eastern Muslims want governments that are accountable to them. The only resolution for most countries in the region, then, is some kind of Islamic democracy.

The very phrase “Islamic democracy” seems incoherent to the Western ear, and indeed any Islamic democracy could not be liberal, in the individualist and secularist sense that we mean by that term today.

What, then, is Islamic democracy? Since it took power in 2002, Turkey’s ruling AK (Justice and Development) Party has invited the world to watch it build just such a system (although its leaders insist on the term “conservative democracy”). The early years of AK Party government under Recep Tayyip Erdoğan looked promising, as the economy grew, negotiations with Kurdish separatists progressed, and Turkey even moved toward membership in the European Union. The AK Party fairly won several elections.

The unraveling began in 2013 with a crackdown on protests, and in 2014 it continued with corruption charges against Erdoğan allies, media censorship, politicization of the judiciary, and arrests of political rivals. Elected President in August after twelve years as Prime Minister, Erdoğan has made clear his determination to expand the powers of that office.

Then there is Egypt. Its stirring 2011 revolution ousted the authoritarian secular regime of Hosni Mubarak, and free elections in 2012 produced an Islamist president, Mohamed Morsi, and an Islamist majority in parliament. Openly admiring of the Turkish model, the new Egypt was poised to exemplify an Arab Islamic democracy.

But in November 2012 Morsi assumed extraordinary powers. Mounting public protests against Morsi’s power grab were followed by his ouster by Egypt’s military in July 2013, led by General Abdel Fattah al-Sisi. In 2014 al-Sisi ran nearly unopposed for President, and while in office he has suppressed the Muslim Brotherhood and all other dissenters. Egypt appears to be back where it was before 2011, only with a different former army general in charge.

Turkey’s Erdoğan has bested his opponents; Egypt’s Morsi was destroyed by his. But in both countries the experiment with Islamic democracy has failed. Each elected leader confronted powerful elites and large segments of the public who did not trust him to remain a democrat. Relations deteriorated, factions polarized, and both countries are settling into sultanism.

These depressing stories are not only about Turkey and Egypt. They are about the future of Islamic democracy itself. For nearly a century the entire Middle East has been passing through a legitimacy crisis, or a struggle over the best way to order society. The West and other regions have passed through legitimacy crises of their own in past centuries – most recently, the twentieth-century struggle between communism and liberal democracy. Prolonged spasms like these scramble political loyalties and generate unrest, revolution, and foreign interventions.

In the Muslims’ current crisis the original contenders in the struggle were secularism, pioneered by Atatürk, founder of the Turkish Republic; and Islamism, formulated by thinkers such as the Sunni Hassan al-Banna and the Shia Ruhollah Khomeini. Many Muslim and non-Muslim scholars, journalists, and politicians lately have touted Islamic democracy as a hybrid solution to this long struggle.

Western history shows that long international ideological contests are played out in the policies and performances of real countries. And they end only when a large, influential state that exemplifies one contending ideology manifestly outperforms large states exemplifying the alternative ideologies.

Consider the Cold War, a struggle between liberal democracy and communism that played out in the competition between the United States and the Soviet Union. By the 1980s America’s economic, technological, and military superiority was clear. Societal elites the world over concluded that communism did not work after all. Country after country abandoned state socialism, and liberal democracy enjoyed a period of predominance over much of the globe.

In 2011 and 2012 it appeared that the Middle East was heading for a similar resolution, with Turkey showing the superiority of Islamic democracy, Egypt following its example, and elites in neighboring societies adopting this new hybrid regime as the wave of the future. As 2015 begins, things look nearly the opposite. Tunisia, which recently held fair elections and a peaceful transfer of power, provides some hope. But if history is a good guide, Tunisia is too small and peripheral to be an exemplar or inspire imitation.

We can continue to argue over whether the retreat of Islamic democracy was inevitable or caused by other factors. We can argue over whether Islamic democracy’s time has passed, or not yet arrived. What is clear is that the Middle East’s legitimacy crisis continues, with an end no longer in sight.

John M. Owen IV is Professor of Politics, and a faculty fellow at the Institute for Advanced Studies in Culture, at the University of Virginia and author of CONFRONTING POLITICAL ISLAM.

Donald E. Canfield and Gillen D’Arcy Wood to be honored at annual conference of the American Meteorological Society

On January 7th and 8th in Phoenix, Arizona, authors Donald E. Canfield and Gillen D’Arcy Wood were recognized by the Atmospheric Science Librarians International (ASLI) for their books Oxygen: A Four Billion Year History and Tambora: The Eruption That Changed the World, respectively.

Canfield’s account of the history and importance of oxygen won the 2014 ASLI Choice Award, recognized as “a well-documented, accessible, and interesting history of this vital substance.” Wood received an honorable mention for this year’s Choice Award in History; Tambora was acknowledged as “a book that makes this extreme event newly accessible through connecting literature, social history, and science.” More general information on the awards can be found here.

Congratulations to Donald E. Canfield and Gillen D’Arcy Wood!

Oxygen:
A Four Billion Year History
Donald E. Canfield

Tambora:
The Eruption That Changed the World
Gillen D’Arcy Wood