#WinnerWednesday: Congratulations, Ellen Wu!

Ellen D. Wu – The Color of Success: Asian Americans and the Origins of the Model Minority

Finalist for the 2015 Theodore Saloutos Memorial Book Award, Immigration and Ethnic History Society

The Theodore Saloutos Memorial Book Award is given annually to the book judged best on any aspect of the immigration history of the United States. “‘Immigration history’ is defined as the movement of peoples from other countries to the United States, of the repatriation movements of immigrants, and of the consequences of these migrations, for both the United States and the countries of origin.” The Immigration and Ethnic History Society has complete information on this award here.

Wu has written on “the model minority myth” for the LA Times and has answered questions about her book here. She also won the Immigration and Ethnic History Society’s Outstanding First Book Award this year. Congratulations, Ellen!


The Color of Success:
Asian Americans and the Origins of the Model Minority
Ellen D. Wu
Hardcover | $39.50 / £27.95 | ISBN: 9780691157825
376 pp. | 6 x 9 | 19 halftones
eBook | ISBN: 9781400848874
Endorsements | Table of Contents
“The Color of Success embodies exciting developments in Asian American history. Through the lens of racial liberalism and cultural diplomacy, Ellen Wu offers a historically grounded analysis of the Asian American model minority in the contexts of domestic race politics and geopolitics, and she unveils the complexities of wartime and postwar national inclusion.”
Eiichiro Azuma, University of Pennsylvania

Medieval Relativisms by John Marenbon

In a commencement speech at Dickinson College yesterday that focused on the virtues of free speech and free inquiry, Ian McEwan referenced the golden age of the pagan philosophers. But from the turn of the fifth century to the beginning of the eighteenth, Christian intellectuals were as fascinated as they were perplexed by the “Problem of Paganism,” or how to reconcile the fact that the great thinkers of antiquity, whose ideas formed the cornerstones of Greek and Roman civilization, were also pagans and, according to Christian teachings, damned. John Marenbon, author of the new book Pagans and Philosophers, has written a post explaining that relativism (the idea that there can be no objective right or wrong) is hardly a post-modern idea, but one that emerged in medieval times as a response to this tension.

Medieval Relativisms
By John Marenbon

Relativism is often thought to be a characteristically modern, or even post-modern, idea. Those who have looked more deeply add that there was an important strand of relativism in ancient philosophy and they point (perhaps wrongly) to Montaigne’s remark, made late in the sixteenth century, that ‘we have no criterion of truth or reason than the example and idea of the opinions and customs of the country where we are’ as signalling a revival of relativist thinking. But the Middle Ages are regarded as a time of uniformity, when a monolithic Christianity dominated the lives and thoughts of everyone, from scholars to peasants – a culture without room for relativism. This stereotype is wrong. Medieval culture was not monolithic, because it was riven by a central tension. As medieval Christian thinkers knew, their civilization was based on the pagan culture of Greece and Rome. Pagan philosophers, such as Plato and Aristotle, were their intellectual guides, and figures from antiquity, such as the sternly upright Cato or Regulus, the general who kept the promise he had given to his enemies even at the cost of his life, were widely cited as moral exemplars. Yet, supposedly, Christian truth had replaced pagan ignorance, and without the guidance and grace provided for Christians alone, it was impossible to live a morally virtuous life. One approach to removing this tension was to argue that the pagans in question were not really pagans at all. Another approach, though, was to develop some variety of limited relativism.

One example of limited relativism is the view proposed by Boethius of Dacia, a Master in the University of Paris in the 1260s. Boethius was an Arts Master: his job was to teach a curriculum based on Aristotle. Boethius was impressed by Aristotelian science and wanted to remain true to it even on those points where it goes against Christian teaching. For example, Christians believe that the universe had a beginning, when God created it, but Aristotle thought that the universe was eternal – every change is preceded by another change, and so on, for ever. In Boethius’s view, the Christian view contradicts the very principles of Aristotelian natural science, and so an Arts Master like himself is required to declare ‘The world has no beginning’. But how can he do so, if he is also a Christian? Boethius solves the problem by relativizing what thinkers say within a particular discipline to the principles of that discipline. When the Arts Master, in the course of teaching natural science, says ‘The world has no beginning’, his sentence means: ‘The world has no beginning according to the principles of natural science’ – a statement which is consistent with declaring that, according to Christian belief, the world did have a beginning. Relativizing strategies were also used by theologians such as Henry of Ghent, Duns Scotus and William of Ockham to explain how some pagans can have even heroic virtue and yet be without the sort of virtue which good Christians alone can have.

These and other medieval relativisms were limited, in the sense that one reference frame, that of Christianity, was always acknowledged to be the superior one. But Boethius’s relativism pragmatically allowed a space for people to develop a purely rational scientific world-view in its own terms, and that of the theologians allowed them to praise and respect figures like Cato and Regulus, leaving aside the question of whether or not they are in Hell. Contemporary relativists often advocate an unlimited version of relativism, in which no reference frame is considered superior to another. But there are grave difficulties in making such relativism coherent. The less ambitious medieval approach might be the most sensible one.

John Marenbon is a senior research fellow at Trinity College, University of Cambridge, honorary professor of medieval philosophy at Cambridge, and a fellow of the British Academy. He is the author and editor of many books, including Abelard in Four Dimensions, The Oxford Handbook of Medieval Philosophy, The Cambridge Companion to Boethius, and Medieval Philosophy: An Historical and Philosophical Introduction.

#MammothMonday: PUP’s pups sound off on How to Clone a Mammoth

The idea of cloning a mammoth, the science of which is explored in evolutionary biologist and “ancient DNA expert” Beth Shapiro’s new book, How to Clone a Mammoth, is the subject of considerable debate. One can only imagine what the animal kingdom would think of such an undertaking, but wonder no more. PUP staffers were feeling “punny” enough to ask their best friends:


Chester reads shapiro

Chester can’t get past “ice age bones”.


Buddy reads shapiro

Buddy thinks passenger pigeons would be so much more civilized… and fun to chase.


Tux reads shapiro

Tux always wanted to be an evolutionary biologist…


Stella reads Shapiro

Stella thinks 240 pages on a glorified elephant is a little excessive. Take her for a walk.


Murphy reads shapiro

A mammoth weighs how much?! Don’t worry, Murphy. The tundra is a long way from New Jersey.


Glad we got that out of our systems. Check out a series of original videos on cloning from How to Clone a Mammoth author Beth Shapiro here.

Win a copy of Relativity: 100th Anniversary Edition by Albert Einstein through Corbis!

We are teaming up with Corbis Entertainment to offer this terrific giveaway through their official Albert Einstein Facebook page. Contest details below, but please head over to the “official Facebook page of the world’s favorite genius” to enter!

Enter for a chance to win a FREE COPY of “Relativity: 100th Anniversary Edition” by Albert Einstein!

A Q&A with Cormac Ó Gráda, author of Eating People is Wrong

Cormac Ó Gráda’s new collection of essays on famine—which range in focus from economic history to the psychological toll—begins with a taboo topic. Ó Gráda argues that cannibalism, while by no means a universal feature of these calamities, has probably occurred more frequently than previously recognized. Recently he answered some questions on his book, Eating People is Wrong, and Other Essays on Famine, Its Past, and Its Future, its somber title, and his early interest in the Great Irish Famine.

Why did you write this book?

CÓG: When Famine: A Short History (Princeton, 2009) came out, I wanted it to be my last book on the subject. So Eating People is Wrong was not a question of ‘what will I do next?’ I just realized a few years later that I still had ideas to contribute on topics that would make for a new, different kind of book on famine. These topics ranged from famine cannibalism to the Great Leap Forward, and from market failure to famine in the 21st century; the challenge was to merge the different perspectives that they offered into what would become this new book. The idyllic residency I spent in the south of France courtesy of the Fondation des Treilles in the autumn of 2013 was when the different parts came together. By the end of that stay, I had a book draft ready.

What inspired you to get into your field?

CÓG: It is so long ago that I am bound to invent the answer… But I have always had an amateur interest in history—as lots of Irish people tend to have—whereas my academic training was in economics. Economic history seemed a good way of marrying the two, and that has been my chosen field since my time as a graduate student in the 1970s. I began as a kind of jack-of-all-trades economic historian of Ireland, focusing on topics as different as inheritance patterns and famine, or migration and banking. This work culminated in a big economic history of Ireland in 1994. My interest in the Great Irish Famine of the 1840s goes back to my teens, but that interest was sharpened after getting to know Joel Mokyr (also a PUP author) in the late 1970s. Economics taught me to think of the Irish in comparative terms, and that led eventually to the study of famines elsewhere. My books have all been solo efforts, but I have been very lucky and privileged to write papers with some great co-authors, and some of these papers influenced the books.

How did you come up with the title or jacket?

CÓG: The title is an ironic nod to Malcolm Bradbury’s eponymous novel (which most people seem ignorant of). A friend suggested it to me over a pint in a Dublin bar. One of the themes of the chapter on famine cannibalism, to which the title refers, is the need to realize that famines not only do terrible things to people, but that people do terrible things to one another in times of famine. Peter Dougherty and his team at PUP came up with the jacket. The image is graphic and somber without being sensationalist, which is what I had hoped for.

What is your next project?

CÓG: There is no single all-consuming project. A lot of my research in recent years has been collaborative work on British economic history with UCD colleague Morgan Kelly. So far the results of that work have appeared—when we are lucky—in academic journals rather than in books. We have plans to continue on this basis, but we are also involved in an interesting piece of research with Joel Mokyr on the origins of the Industrial Revolution, and that may eventually yield a monograph by the three of us. I also want to revise several unpublished papers in Irish economic history and to get them published singly or, perhaps, as a monograph. Finally, Guido Alfani of Bocconi University in Milan and I are editing a book on the history of famine in Europe. This is coming along well. The end product will consist of nine specialist country chapters, a cross-country analysis of the famines of World War II, and an overview by Alfani and me.

What are you currently reading?

CÓG: I am at page 630 (so another hundred or so pages to go) of Stephen Kotkin’s Stalin, vol. 1 (Penguin, 2014), which brings the story of Iosif Vissarionovich only as far as 1928. I have been interested in Soviet economic history since the late Alexander Erlich introduced me to the topic in Columbia in the 1970s, and this is what attracted me to Kotkin’s riveting tome—which, however, turns out to be rather uninterested in the economic issues! I am also reading Maureen Murphy’s Compassionate Stranger: Asenath Nicholson and the Great Irish Famine (Syracuse, 2015), an account of an eccentric but appealing American evangelist who toured Ireland, mostly on foot, in the years leading up to and during the Great Hunger. I was familiar with Nicholson’s own published accounts of her travels, but knew very little about her otherwise, so Murphy’s book is a revelation. My current bedtime reading is Henning Mankell’s The Man from Beijing (2010).

Cormac Ó Gráda is professor emeritus of economics at University College Dublin. His books include Famine: A Short History and Black ’47 and Beyond: The Great Irish Famine in History, Economy, and Memory (both Princeton).

Which of these 15 myths of digital-age English do you believe?

One Day in the Life of the English Language by Frank Cioffi, a new style guide that eschews memorization in favor of internalizing how sentences actually work, handily refutes these 15 myths of digital-age English. Think brevity is best? Swear by your default settings? Feel sure the internet is a “total latrine”? Try out this “True or False” test and see whether you’re the digital-age wordsmith you thought you were:

1. In the age of the tweet, short and concise is always the best.
True, true, short messages are often the best. But not always. Sometimes one needs to go on at some length. Sometimes it is necessary to provide a context, especially if one is trying to communicate more than just minimal information. And sometimes the very brevity or terseness of a tweet makes it impossible to understand.

2.  My word processing program doesn’t let me change margins, spacing, or other aspects of format.
Most word processing programs can be set up to accommodate any standard style; however, you need to use the program’s capabilities and not always accept default settings. In Microsoft Word, for example, many writers allow the program its silly default—to put an extra line space between paragraphs of the same format. This default should be turned off in the “Paragraph” menu.

3. My word processing program will highlight and automatically fix any errors I make.
These automatic correction programs are notoriously unreliable, as they often “fix” writing that is in fact correct. For example, at first I thought one of my students had subject-verb agreement problems; then I noted that the program tried to get me to introduce such errors into my own work. You, not the program, are the mind behind the words. Don’t rely on your program to fix everything. Let it check—but you check too.

4.  “Logical punctuation” is the best option in most situations.
This idea usually refers to putting punctuation either inside or outside of quotation marks. The logicality of doing so or not doing so has been questioned by many. It’s probably best to follow conventions of a given style, unless you are not working within any particular field. In that case, you can invent new rules; just don’t expect others to understand or follow them.

5. People don’t really read anymore; they merely “scan a page for information.”
Gary Shteyngart brings up this idea in his 2010 novel Super Sad True Love Story. It’s interesting and has some truth to it: I agree that many people don’t read with a lot of care or seek to understand and internalize the written ideas they encounter. But some do. Think of that “some” as your audience. At the same time, consider the needs of an audience that just “scans the page.” Ask yourself, “Does this page I’ve just written include information worth scanning?”

6. Anyone can publish written material nowadays, so what’s the value of Standard Written English?
With the Internet, it’s true that anyone can publish now. And many self-publishing options are open to any writer seeking to get work in print. Simply publishing something is now less a guarantee of its excellence or importance than it once was, but if you strive to have your work read—by more than family and friends—it will have to respect some standard forms and conventions. Or to put it another way, no matter what your publishing goals, if you want people to read your work, you will have to write with a high level of competence and lucidity.

7.  People are much less precise and exact than they used to be, now that they have computers to rely on.
This is clearly not the case in all situations. In fact, people must be much more careful now with details such as spelling, especially when entering passwords or usernames. In many digital contexts, attentiveness to language accuracy is obligatory. If you are inattentive, you often can’t even use the computer or the program. If you don’t respect the syntax of a program, it just won’t run.

8.  “Talking street” is what most people want to do anyway.
I think that most people have to use multiple forms of English. They might speak one way to their family, one way to their friends, one way on their jobs, and another way, perhaps, when they need to write a paper for a college course they are taking. People can and should become multilingual.

9.  Most grammatical stuff is of minor importance—kind of too boring and persnickety to bother with.
I agree that there are more important things in the world, but I have been making the argument throughout this book that in fact these “minor” matters do seem to make a difference to some people—and a major difference to a small minority. And writ large, they make a big difference in our society. Admittedly, there is a persnickety quality to some of the material, but isn’t specialization all about being persnickety?

10.  Someone else can “wordsmith” my ideas; I just generate them.
The line between the idea and the expression of it is very fine; that is, how you say something is often inextricable from what you say. You need to take charge of not just coming up with a basic idea or notion but also of how that idea gets expressed. If you have a stake in how an idea exists in its final form, you should take great care with its exact verbal formulation.

11.  Since so many “styles” (MLA, APA, Chicago . . .) are available and used by various specialties, it’s pointless to worry about this kind of superficial overlay.
There are a lot of forms and styles, to be sure. But you need to find the form that’s conventional in your professional field and use that. If you don’t, you almost automatically label yourself an “outsider” to that field, or perhaps even an interloper. And sometimes, just abiding by the conventions of a style gains you credibility in and of itself, allows entrée into a field.

12.  There’s no possibility of an original idea anymore: it’s all been said.
One certainly feels as though this might be possible, considering the ever-expanding scope of the Internet and the existence of over seven billion human minds on the planet. However, each of us has his or her own individual experience—which is unique. And out of that, I feel, originality can emerge. You must really want that originality to emerge, though, and resist succumbing to the pressure of the multitude to simply conform to what’s standard, acceptable, predictable, dull.

13.  If something is published on the Internet, it’s true.
I know that no one really believes this. But I want to emphasize that a great deal of material on the Internet is simply false—posted by people who are not reliable, well-informed, or even honest. Much Internet material that claims to be true is in fact only a form of advertising. And finally, do keep in mind that almost anyone can create websites and post content, whether they are sane or insane, children or adults, good or evil, informed or misinformed.

14. The Internet is a total latrine.
A few years ago, I heard a well-known public intellectual give a talk for which this was the thesis. And there are certainly many things on the Internet and about the Internet that bear out such a judgment. However, there are also some amazing things, which prompt me to say that the Internet is the greatest accumulation of information and knowledge in the history of humankind. But you need to learn how to use it efficiently and effectively, and sort the good from the bad.


15.  I can cut and paste my way through any college paper assignment.
There are many opportunities to create what looks like your own work—cutting and pasting here, auto-summarizing there, adding a few transitional sentences, and mashing it all together. I don’t recommend this kind of work; it doesn’t really benefit you to create it. You want to write papers of your own, ones that express your own ideas and that use your own language. The cut-and-pasters are ultimately sacrificing their humanity, as they become people of the machine. And when they’re caught, the penalties can be severe.

How did you do?

Frank L. Cioffi is professor of English at Baruch College, City University of New York, and has taught writing at Princeton and Indiana universities and at Bard and Scripps colleges. He is the author of The Imaginative Argument: A Practical Manifesto for Writers (Princeton), among other books.

Graphics by Chris Ferrante

Happy Birthday, Søren Kierkegaard

Introversion has been having a moment of late, and today happens to be the birthday of one of the world’s most famous—and brilliant—introverts. To quote the (excellent) copy for A Short Life of Kierkegaard by Walter Lowrie, Kierkegaard was “a small, insignificant-looking intellectual with absurdly long legs, a veritable Hans Christian Andersen caricature of a man.” In life, he often hid behind pseudonyms, and yet he remains one of the most important thinkers of modern times. Read about Kierkegaard’s turbulent life in this classic biography (literary duel? Check. Tragic love affair? Check.) or sample The Seducer’s Diary, which John Updike called “an intricate curiosity—a feverishly intellectual attempt to reconstruct an erotic failure as a pedagogic success, a wound masked as a boast.”

Happy Birthday, Søren Kierkegaard.

Read Chapter 1 of The Seducer’s Diary here.

Read the Introduction to A Short Life of Kierkegaard here.

An interview with Nancy Woloch, author of A Class by Herself

Nancy Woloch’s new book, A Class by Herself: Protective Laws for Women Workers, 1890s–1990s, looks at the historical influence of protective legislation for American women workers, which served as both a step toward modern labor standards and as a barrier to equal rights. Recently, Nancy took the time to answer some questions about the book, her reasons for writing it, and the modern-day legacies of this legislation, from pregnancy law to the grassroots movement to raise the minimum wage.

Why did you write this book?

NW: Conflict over protective laws for women workers pervades twentieth-century US women’s history. These laws were everywhere. Since the early 1900s, almost every state enacted some sort of women-only protective laws—maximum-hour laws, minimum wage laws, night work laws, factory safety laws. Wherever one turns, the laws spurred debate, in the courts and in the women’s movement. Long drawn to the history of these laws and to the arguments that they generated, I saw the opportunity to carve out a new narrative: to track the rise and fall of protective laws from their roots in progressive reform to their collapse in the wake of Title VII of the Civil Rights Act of 1964, and beyond. Here was a chance to fuse women’s history and legal history, to explore social feminism, to reconstruct a “constitutional conversation,” and to ferret around all the topics that protective laws touch — from transatlantic connection to social science surveys to the rise of equal rights. Above all, the subject is contentious. Essentially, activist women disrupted legal history twice, first to establish single-sex protective laws and then to overturn them. This was irresistible.

What is your book’s most important contribution?

NW: My book shows the double imprint that protective laws for women workers left on US history. The laws set precedents that led to the Fair Labor Standards Act of 1938 and to modern labor law, a momentous achievement; they also sustained a tradition of gendered law that abridged citizenship and impeded equality until late in the century.

Which groups of women activists first supported women-only protective laws?

NW: I focus on members of the National Consumers’ League, a pressure group formed in 1898 and led as of 1899 by reformer Florence Kelley. One of the most vibrant and successful reform organizations of the Progressive Era, the NCL enabled the campaign for protective laws to move forward. I also focus on the federal Women’s Bureau, started in 1920, which inherited the mission of the NCL: to preserve and promote protective laws. Other women’s associations, too, were involved; so were women labor leaders. But the NCL and the Women’s Bureau were most crucial. Women who promoted women-only protective laws endorsed a dual rationale: the laws would redress disadvantages that women faced in the labor force and provide “industrial equality”; they would also serve as an “entering wedge” to labor standards for all workers. The dual rationale persisted, with variations, for decades.

How did you come up with the title?

NW: “A Class by Herself” is a phrase used by Justice David J. Brewer in Muller v. Oregon, the landmark Supreme Court decision of 1908 that upheld a state ten-hour law for women workers in factories and laundries. Woman, Justice Brewer stated, “is properly placed in a class by herself, and legislation designed for her protection may be sustained, even when like legislation is not necessary for men and could not be sustained.” Two issues intersect in the Muller case: Can the state impose labor standards? Is classification by sex constitutional? The fusion of issues shapes my narrative.

The Muller case remains fascinating. I am stunned by the exceptional leverage that Florence Kelley grasped when she intervened in the final appeal of the case. I am struck by the link that Muller’s lawyers posited between employers’ interests and equal rights; by the fragile relationship between the famous Brandeis brief and the Brewer opinion; and by the way that Justice Brewer challenged Brandeis for dominance. I still ask myself: Who took advantage of whom? Looking back on Muller, I find an intriguing contrast between that case and the Supreme Court case that terminally rejected the Muller principle, UAW v. Johnson Controls (1991). This is when single-sex protective laws definitively expired. Johnson Controls also offers a counter-image of the 1908 case.

Did classification by sex ever help women workers?

NW: Yes, of course. Women-only state protective laws might provide benefits to women workers. In many instances, they provided shorter hours, higher wages, or better working conditions, just as reformers envisioned. But women-only laws always had built-in liabilities. Laws based on “difference” perpetuate difference. They entail hierarchy, stratification, and unequal power. They can quash opportunity, advancement, and aspiration. Once embedded in law, classification by sex might be adapted to any goal conjured up by lawmakers, or, as a critic in the 1920s pointed out, used to impose whatever restrictions “appeal to the caprice or prejudice of our legislators.”

What sort of challenges did you face as an author?

NW: Protective laws were tough customers. They fought back; they resisted generalization; they defied narrative. Part of the challenge was that I deal with a great mass of legislation (several hundred state laws), and each type of law followed its own trajectory. I also cover the laws and their ramifications over many decades. To estimate the impact of protective laws on women workers at any given time was a hazardous undertaking; one could not easily measure the negative effects, or what one critic called the “debit side.” Changing circumstances compound the problem; the effects of the laws were always in flux. Not least, protective laws generate controversy among historians; to tackle this subject is to stroll through a minefield. A special challenge was to cope with the end of protective laws in the 1960s and 1970s.

What was the biggest surprise you encountered in writing this book?

NW: The role of “surprise” itself was a surprise. Progressive reformers who promoted women-only labor laws in the early 1900s could not see around corners, anticipate shifts in the economy, or envision changes in the female work force. Nor could their successors or their opponents. Much of my narrative is a story of close calls and near misses, of false hopes and unexpected consequences, of accident and unpredictability. The theme of the unforeseen peaks with the addition of “sex” to Title VII of the Civil Rights bill of 1964; the impact of the amended Title VII on women-only protective laws was yet more of a surprise. I was surprised myself, as narrator, by the complexity of the downfall of protective laws. I was also surprised to discover the key role that “overtime” played in my story and the gradual mutation in its meaning over the decades.

Does your subject have present-day legacies?

NW: Definitely. In a sense, single-sex protective laws sank totally out of sight when they capsized in the 1970s. But in another sense, many facets of the history of protective laws reverberate; the echoes pervade current events. Labor standards are now a global issue, as illustrated in Bangladesh in 2012 and 2013. The fire in a garment factory on the outskirts of Dhaka that killed 117 workers, so reminiscent of the 1911 Triangle fire, and the yet more lethal collapse of an 8-story building, with garment production on its upper floors, underline the need for safety regulation everywhere. Closer to home, the drive to improve labor standards continues. Most recently, we have seen a grassroots movement to raise the minimum wage and efforts to revise federal law on the threshold for overtime. Reconciling work and parenthood impels discussion. Pregnancy law remains a challenge; enforcement of the Pregnancy Discrimination Act of 1978 has spurred more litigation than anyone expected. A recent case is Young v. United Parcel Service (2015). Beyond that, demands for compensated parental leave proliferate. President Obama’s proposal to fund parental leave, though unlikely to move forward right now, at least keeps the issue on the table. Finally, equal employment opportunity cases remain a challenge, from the Lilly Ledbetter case of 2007 to the dismissed Wal-Mart case of 2011. Title VII, which catalyzed the end of single-sex protective law, turns out to be a work in progress.

Writers on Writers Giveaway


We have a new giveaway! Enter for a chance to win the complete set of Writers on Writers, a series of brief, personal books by contemporary writers about an author, past or present, who has inspired or influenced them in some way.

Each book gives the reader a window into both the life and work of the chosen author and the mind of the writer. In On Elizabeth Bishop, Colm Tóibín highlights the parallels between his life and that of his subject, particularly in their experience of loss and exile. He traces her footsteps to Nova Scotia, Key West, and Brazil and shows the reader how her influence helped to shape him as a novelist. Compared to Tóibín’s measured, deeply personal account, Alexander McCall Smith’s contribution, What W. H. Auden Can Do for You, is a playful, charming take on the manifold ways that Auden has been a guiding force in his life. McCall Smith calls him one of the best guides on how to live. He shows us how he has been inspired by Auden and how each of us can benefit from his work.

One of the most famous nineteenth-century novelists, Sir Arthur Conan Doyle has provided inspiration to many. On Conan Doyle: Or, The Whole Art of Storytelling by Pulitzer Prize-winning critic Michael Dirda is not only an engaging introduction to the author and his work, it is also a rare glimpse into the best-known of all Sherlockian groups, the Baker Street Irregulars, of which Dirda is a member. Another famous nineteenth-century author, Walt Whitman, is the subject of Pulitzer Prize-winning poet C.K. Williams. On Whitman explores the reasons why Leaves of Grass continues to inspire. Williams shows what Whitman had in common with other poets of his time and how his influence continues to be felt today.

Finally, renowned essayist Phillip Lopate describes Susan Sontag as one of the “foremost interpreters of…our recent contemporary moment” in Notes on Sontag. While admiring her free-thinking originality, Lopate is critical of her tendency toward exaggeration, feeling that it undermines her common sense. Lopate provides a clever and enjoyable reflection on his chosen writer through a series of essays, a form used by Sontag herself.

Writers on Writers is necessary reading for anyone interested in the creative process and the often-complex relationship between writers. To enter for a chance to win the complete series, please follow the directions in the RaffleCopter box below. Winners will be selected on or around May 19, 2015.

a Rafflecopter giveaway

George Akerlof and Robert Shiller pose with their new book jacket

Nobel Prize winners Robert Shiller and George Akerlof got the chance to pose with the phenomenal cover for their forthcoming book, Phishing for Phools, the lead title on our Fall 2015 list (stay tuned for the posting of our new seasonal catalog!). The drawing on the cover is an original by New Yorker cartoonist Edward Koren, and the jacket design is by our own Jason Alejandro. You can catch George talking about the book, which is a fascinating look at the central role of manipulation in economics, at this lecture at Duke University.

Akerlof and Shiller


An interview with Frank Cioffi, author of One Day in the Life of the English Language

This week we had the opportunity to ask Frank Cioffi questions about his new book, One Day in the Life of the English Language, which was recently featured in Inside Higher Ed. Cioffi offers insights on the “ethics” of usage, why grammar is “not just a set of rules,” and why students often readily grasp proper usage in exercises but struggle to apply it in their own prose.

What was the inspiration for this book?

FC: Here is what I wrote in my five-year diary on 12/28/08: “millions of sentences are uttered and written. . . Most float off into a void, never to be heard of or recalled again. Most are ‘ungrammatical,’ no doubt unable to pass the scrutiny of a gimlet-eyed grammarian. But these sentences, and those of the previous days, and those of the next ones, make up our lives. They help to form the dense linguistic net of which we are all a part. And this book seeks to both represent that net and to show how you as a writer might well make a small, a human scale, a molecule-level, improvement of it.”

In what way or ways does your handbook differentiate itself from the thousand or so English handbooks already out on the market?

FC: I guess I am trying to persuade readers that Standard Written English (SWE) matters; it’s not just something to be memorized, like how to factor polynomials or the quadratic equation, but has a real impact on how we live and function as human beings. For example, using SWE usually improves one’s capacity for communicating to a wide and varied audience. More people will understand you if you use SWE than if you use, say, a dialect or an argot.

In addition, when you don’t use SWE you run the risk of stigmatizing yourself, of giving your audience the excuse to ignore what you say (“He can’t be saying anything of any importance—he’s clearly uneducated and dumb”). Now that’s not the right response, I know, and I emphasize in my book that we should not stigmatize people because their English is unpolished or somewhat far from the “standard,” but it still happens, so people need to learn SWE in order not to be stigmatized.

For many decades now I’ve been teaching English at the college level, and I have seen a lot of handbooks. None of them, I felt, had a sufficiently human voice. Most books say, “Here it is: learn it.” I say, “Here it is, and here is why it’s important to learn it.” Fred Crews’s Random House Handbook was something of an exception, but it’s now out of print. It is also not a compact book, which mine attempts to be.

Tell us a bit more about the “voice” of a handbook.

FC: Grammar books have multiple voices: the author who is lecturing, the author who is commenting on samples of English, and the sample sentences, often also by the author. I thought there was something wrong with all of these as they exist in current texts. In particular, I wanted the sentences to come from a real world, not the one of “Dick and Jane” books.

Here is the paradox I saw: students could do worksheets or exercises very readily, but their own prose didn’t reflect the lessons of those exercises. For example, my students did a worksheet on comma splices, but comma splices still marred their writing. We did a worksheet on apostrophes, but apostrophes were still a major problem in the formal papers. Why is that?

It seemed to me that maybe in our handbooks, workbooks, and even lectures, we tended to simplify example sentences too much. We tended to make them spare and simple so as to illustrate a grammatical point. But that point is easy to understand with simple sentences. As complexity grows, the capacity for error enlarges.

At the same time, students might think, “Only a total dummy would make a mistake like this sample sentence!” or maybe “That’s not me!” Or they might think, “This book is totally condescending.”

So I wanted sample sentences that were complex.

But the problem here was that making up sentences in the sample sentence genre suddenly grew difficult, since their lack of content becomes much more apparent as they grow in elaborateness. This made me wonder about the “world” depicted in the example sentences. It’s a made-up world, a world of nonevents, a world where nothing scary or awful or threatening or sexy happens. It’s the same world that the Educational Testing Service depicts in the “fairness guidelines” that they give to test preparers, which in some ways makes sense. We don’t want to distract students from the grammatical issue at hand.

Yet the world of these sample sentences has the interesting effect of making grammar somehow disembodied, disconnected from a real world. Its sentences emerge from a world where nothing is really happening, and where nothing really matters. What message does that send to our students or to our readers?

That’s when I decided to go for real-world sentences.

These come from the “one day,” then, of your title?

FC: Yes. I didn’t want to make these the culled variety we see in Strunk and White, or Robert Graves and Alan Hodge’s book The Reader over Your Shoulder. No. I just wanted them to be from a single day, since that would show how we all make mistakes, how language is really tricky even for professionals to get just right.

So I combed magazines and newspapers published on December 29, 2008, and I tried to find examples of good sentences, elegant sentences, let’s say, as well as of sentences whose grammar struck me as “dubious,” as one of my colleagues likes to say. I came up with almost 300 of these sentences, so the book is at once a grammar handbook and a curious snapshot of history, on a day that is not particularly historical. And oddly enough, even though it’s more than six years later now, a lot of the sentences still resonate with current events.

What about the “rules” of Standard Written English: don’t you feel these need to be hammered home?

FC: As far as “learning grammar” goes, I didn’t want to provide just a set of rules, though of course I do emphasize what’s SWE and what is not. I instead argue that students and readers need to internalize the pattern and form of English sentences, really need to get inside them in a profound way, need to become, in a way, linguists themselves, in order to express themselves more fully.

In addition, I wanted to be honest. The rules of English are not apodictic: they are constantly being debated by professors; they are under constant pressure. Think of the problems with pronoun reference. Think of the “acceptable” comma splice. There are borderlands of acceptability in English that are becoming increasingly large.

And too we need to recognize that not all English needs to be SWE. We need to allow our students their own language in many situations, just as editors allowed that in the papers and magazines I looked at. One of the things we want to keep in mind is that so much of the success of one’s English has to do with accurately gauging what’s appropriate to a given situation, with assessing the audience for one’s words.

Your book also emphasizes the “ethics” of usage. Can you elaborate on this?

FC: I also suggest that grammaticality or accuracy is something that has an ethical component, since lives, careers, futures—our future—can hinge on the accuracy of English. At the same time, SWE often allows people to better express their ideas to a wider audience—people can get heard “when it matters,” if they properly gauge their audience and if they are able to be agile enough with their language to move from one register to the next, and to assume SWE when it’s needed and abandon it when it might be counterproductive, when it might sound stilted or stuffy or supercilious to use it.

What surprised you about writing and publishing this book?

FC: I was surprised by how hard it was to get published. It came close to being accepted by a couple of textbook houses, but it didn’t make the grade. One time, after three very positive outside reviews, I thought the book was as good as accepted. I was to meet with the editor soon and we were to work out the details. But then at the last minute the editor canceled our meeting and said the book could not be published by her press.

“Why not?” I wondered. Then it occurred to me that if I am writing a book that challenges the value of standard handbooks, then a publisher that has 100 such handbooks on its list isn’t likely to publish mine! This also clued me in to why it is that all the handbooks out there are so similar.

It’s as if there is a weird monopoly of ideas—we can’t rock the boat too much with new ideas or approaches, since we’re making a ton of money off of the old ones!

When I was teaching in Poland a few years ago, it was communist days, and I was complaining about censorship. One of my colleagues, though, challenged me on this: “You have censorship in America, too, you know, and it’s as repressive of new ideas as ours is, maybe more: books that aren’t deemed salesworthy are simply not published. That silences all sorts of voices.” So a book might be itself salesworthy, but might drag down the sales of the other books published by a press, so that book won’t see print, at least not by them.

So do you think your book might change the way that college writing is taught?

FC: My book attempts to get writing instructors to grapple on an ongoing basis with the complexities of English usage and grammar, and to work with students as they try to plumb these issues together. It’s not a quick fix. It’s a course of instruction in what, for many students, is a new language altogether. If we really want to change the quality of the work our students produce, we need to reimagine how the college composition course is structured, staffed, and funded.

How did you come up with the title of the book, which is a play on Solzhenitsyn’s One Day in the Life of Ivan Denisovich?

FC: I was going to call it “One Day’s Sentences in America,” but I wasn’t all that happy with that title. One day, though, my wife, Kathleen Cioffi, said, “Hey, why not call the book ‘One Day in the Life of the English Language’?” Bingo.

What are you reading right now?

FC: Right now I am reading a collection of short stories by Alberto Moravia. He is a marvelous and, I think, neglected Italian writer. His stories examine the minutiae of daily life; they explore the psychological menace and poignancy of the ordinary. In some ways they are stories about a lack of communication between people and the effects of that.

What are your next writing projects?

FC: I have several going on right now. Probably I have too many. I have three completed book manuscripts: one is about teaching entitled Beyond Zombie Pedagogy. I’ve also written a biography of my late uncle, the philosopher Frank Cioffi. And I kept a detailed diary of my life in communist Poland. The diary is maybe 700,000 words, though—I kept it for three years—so I need to cut it down and turn it into a narrative/analysis of life in Poland in the waning days of communism. Still waiting for publishers and contracts for these three books—!

I also have a volume of poetry that I’ve culled from the hundreds of poems I’ve written over the last three decades.

Really? Poetry? Perhaps you could give us a short poem?


Ok, here is a villanelle, “Noisome T. Rex”:


Fuse frayed synapses, hurt to reinvent.

Smooth feelings blunt as a plastic doll’s sex,

scrub brain raw of all, all that you repent.


Moving ‘midst throngs swarm-clogging the pavement,

lumb’ring dumb-monstrous as noisome T. Rex,

fuse frayed synapses, hurt to reinvent.


Pointless to think of her lips or prevent

recall of their blood-damp cling pre/post-X.

Scrub brain raw of all, all that you repent.


Don’t look directly—no, keep that gaze bent,

as eyes switchblade your so vulner’ble neck.

Fuse frayed synapses, hurt to reinvent.


Its fluid-flow blocked, mind needing a stent

or swift amputation—painless, unvex’d—

scrub brain raw of all, all that you repent.


Violate space through some vocal event.

Stall devolution, and fight your thrawn hex.

Scrub brain raw of all, all that you repent.

Fuse frayed synapses, hurt to reinvent.


Be sure to read the introduction here.

Jonathan Zimmerman on how to publish your Op Ed

Jonathan Zimmerman, author of the new book Too Hot to Handle: A Global History of Sex Education, also happens to be known for writing (and publishing) more op eds than any other living historian. Recently he spoke to the History News Network about his unusual success in this area—a must-listen for authors and anyone whose desktop features a few op eds looking for a home.