Mark Serreze on Brave New Arctic

In the 1990s, researchers in the Arctic noticed that floating summer sea ice had begun receding. This was accompanied by shifts in ocean circulation and unexpected changes in weather patterns throughout the world. The Arctic’s perennially frozen ground, known as permafrost, was warming, and treeless tundra was being overtaken by shrubs. What was going on? Brave New Arctic is Mark Serreze’s riveting firsthand account of how scientists from around the globe came together to find answers. A gripping scientific adventure story, Brave New Arctic shows how the Arctic’s extraordinary transformation serves as a harbinger of things to come if we fail to meet the challenge posed by a warming Earth.

Why should we care about what is going on in the Arctic?

The Arctic is raising a red flag. The region is warming twice as fast as the globe as a whole. The Arctic Ocean is quickly losing its summer sea ice cover, permafrost is thawing, glaciers are retreating, and the Greenland ice sheet is beginning to melt down. The Arctic is telling us that climate change is not something out there in some vague future. It is telling us that it is here and now, and in a big way. We long suspected that as the climate warmed, the Arctic would lead the way, and that is exactly what has happened.

There are a lot of books out there on the topic of climate change. What makes this one different and worth reading?

I wanted to get across how science is actually done. Scientists are trained to think like detectives, looking for evidence, tracking down clues, and playing on hunches. We work together to build knowledge, and stand on the shoulders of those who came before us. It’s a noble enterprise, but a very human one as well. We sometimes make mistakes (I’ve made a few doozies in my time) and go off the rails. Too often, science gets twisted up with politics. I tell it like it is, as a climate scientist who was there back when the Arctic was just beginning to stir, and both watched and participated in the story of the changing north.

You’ve hinted about how growing up in Maine got you interested in snow and ice. Can you tell us a little about this?

I grew up in coastal Maine in the 1960s and 1970s when there were some pretty impressive winters. Winter was my favorite season. I was way into daredevil sledding, and spent countless hours building the iciest, slickest track possible and modifying my sled for maximum speed. I developed a reputation for building tremendous snow forts with five or six rooms connected by tunnels. We’d go crawling through the tunnels at night and light candles in each room. Then there was the simple primal joy of watching a big Nor’easter snowstorm come through and grind commerce to a halt. The craziest winter activity I got into with my sister Mary and friend Dave was riding ice floes on the Kennebunk River. I probably should have drowned several times over, but, in retrospect, I learned a lot about the behavior of floating ice. Now, this was all back in an era when most of us were free-range kids—my mom would say, “get out of the house, I don’t want to see you ‘til dinner.” So you made your own fun and it wasn’t always safe. But it prepared me very well for a career studying snow and ice.

It took you quite a few years to be convinced of a human role in climate change. Why so long?

As mentioned, scientists are detectives, and we are always weighing the evidence. For me, it was never a question of if we would eventually see the human imprint of climate change in the Arctic—the basic physics behind greenhouse warming had been understood as far back as the late 19th century. Rather, it was a question of whether the evidence was solid enough to say that the imprint had actually emerged. The challenge we were up against is that natural variability is quite strong in the Arctic, the system is very complex, and most of the climate records we had were rather short. By the late 1990s, it was clear that we were seeing big changes, but at least to me, a lot of it still looked like natural variability. It was around the year 2002 or 2003 that the evidence became so overwhelming that I had to turn. So, I was a fence sitter for a long time on the issue of climate change, but that is how science should work. We are trained to be skeptical.

What happened in the year 2007? Can you summarize?

In the early summer of 2007, sea ice extent was below average, but this didn’t really grab anyone’s attention. That quickly changed when ice started disappearing at a pace never seen before. Through July and August, it seemed that the entire Arctic sea ice community was watching the daily satellite images with a growing sense of awe and foreboding. Huge chunks of the ice were getting eaten away. By the middle of September, when it was all over, the old record low for sea ice hadn’t just been beaten, it had been blown away. There was no longer any doubt that a Brave New Arctic was upon us. Arctic climate science was never really the same after that.

We keep hearing about how science tends to be a male-dominated field. But the impression that one gets from your book is that this isn’t really the case in climate research. Can you comment?

I don’t know what the actual numbers look like in climate science versus, say, computer science, but in my experience, when it comes to climate research, nobody really cares about your gender. What’s important is what you know and what you can contribute. What you do see, certainly, is more female graduate students now coming through the system in STEM fields (science, technology, engineering, and mathematics).

Are you frustrated by the general inaction, at least in the United States, to deal with climate change? 

I’m constantly amazed that we don’t take the issue of climate change more seriously in this country. We are adding greenhouse gases to the air. The climate is warming as a result. The physics are well understood. Just as expected, the Arctic is leading the way. Sure, there are uncertainties regarding just how warm it will get, how much sea level will rise, and how extreme events will change, but we know plenty about what is happening and where we are headed. The costs of inaction are going to far outweigh the costs of addressing this issue.

Mark C. Serreze is director of the National Snow and Ice Data Center, professor of geography, and a fellow of the Cooperative Institute for Research in Environmental Sciences at the University of Colorado at Boulder. He is the coauthor of The Arctic Climate System. He lives in Boulder, Colorado.

Martin Rees: Stephen Hawking — An Appreciation

Soon after I enrolled as a graduate student at Cambridge University in 1964, I encountered a fellow student, two years ahead of me in his studies; he was unsteady on his feet and spoke with great difficulty. This was Stephen Hawking. He had recently been diagnosed with a degenerative disease, and it was thought that he might not survive long enough even to finish his PhD. But, amazingly, he lived on to the age of 76. Even mere survival would have been a medical marvel, but of course he didn’t just survive. He became one of the most famous scientists in the world—acclaimed for his world-leading research in mathematical physics, for his best-selling books about space, time, and the cosmos, and for his astonishing triumph over adversity.

Astronomers are used to large numbers. But few numbers could be as large as the odds I’d have given, back in 1964 when Stephen received his ‘death sentence,’ against witnessing this uniquely inspiring crescendo of achievement sustained for more than 50 years. Few, if any, of Einstein’s successors have done more to deepen our insights into gravity, space, and time.

Stephen went to school in St Albans, near London, and then to Oxford University. He was, by all accounts, a ‘laid back’ undergraduate, but his brilliance nonetheless earned him a first class degree in physics, and an ‘entry ticket’ to a research career in Cambridge. Within a few years of the onset of his disease he was wheelchair-bound, and his speech was an indistinct croak that could only be interpreted by those who knew him. But in other respects fortune had favored him. He married a family friend, Jane Wilde, who provided a supportive home life for him and their three children, Robert, Lucy, and Tim.

The 1960s were an exciting period in astronomy and cosmology: this was the decade when evidence began to emerge for black holes and the big bang. In Cambridge, Stephen joined a lively research group. It was headed by Dennis Sciama, an enthusiastic and effective mentor who urged him to focus on the new mathematical concepts being developed by Roger Penrose, then at London University, which were initiating a renaissance in the study of Einstein’s theory of general relativity. Stephen mastered Penrose’s techniques and quickly came up with a succession of insights into the nature of black holes (then a very new idea), along with new arguments that our universe had expanded from a ‘big bang.’ The latter work was done jointly with George Ellis, another of Sciama’s students, with whom Stephen wrote a monograph entitled The Large-Scale Structure of Space-Time. Especially important was the realization that the area of a black hole’s horizon (the ‘one-way membrane’ that shrouds the interior of a black hole, and from within which nothing can escape) could never decrease. The analogy with entropy (a measure of disorder that likewise can never decrease) was developed further by the late Israeli theorist Jacob Bekenstein. In the subsequent decades, the observational support for these ideas has strengthened—most spectacularly with the 2016 announcement of the detection of gravitational waves from colliding black holes.

Stephen was elected to the Royal Society, Britain’s main scientific academy, at the exceptionally early age of 32. He was by then so frail that most of us suspected that he could scale no further heights. But, for Stephen, this was still just the beginning. He worked in the same building as I did. I would often push his wheelchair into his office, and he would ask me to open an abstruse book on quantum theory—the science of atoms, not a subject that had hitherto much interested him. He would sit hunched motionless for hours—he couldn’t even turn the pages without help. I wondered what was going through his mind, and if his powers were failing. But within a year he came up with his best-ever idea—encapsulated in an equation that he said he wanted on his memorial stone.

The great advances in science generally involve discovering a link between phenomena that seemed hitherto conceptually unconnected: for instance, Isaac Newton famously realized that the force making an apple fall was the same as the force that held the moon and planets in their orbits. Stephen’s ‘eureka moment’ revealed a profound and unexpected link between gravity and quantum theory: he predicted that black holes would not be completely black, but would radiate in a characteristic way. Bekenstein’s concept that black holes had ‘entropy’ was more than just an analogy. This radiation is only significant for black holes much less massive than stars—and none of these have been found. However, ‘Hawking radiation’ had very deep implications for mathematical physics—indeed one of the main achievements of string theory has been to corroborate his idea. It is still the focus of theoretical interest—a topic of debate and controversy more than 40 years after his discovery. Indeed, the Harvard theorist Andrew Strominger (with whom Stephen recently collaborated) said that this paper had caused ‘more sleepless nights among theoretical physicists than any paper in history.’ The key issue is whether information that is seemingly lost when objects fall into a black hole is in principle recoverable from the radiation when it evaporates. If it is not, this violates a deeply believed general physical principle. In 2013 he was one of the early winners of the Breakthrough Prize, worth 3 million dollars, which was intended to recognize theoretical work.
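The equation Stephen wanted on his memorial stone is widely reported to be his formula for the temperature at which a black hole of mass M radiates, given here in standard notation:

\[ T_{\mathrm{H}} = \frac{\hbar c^{3}}{8 \pi G M k_{\mathrm{B}}} \]

Here ħ is the reduced Planck constant, c the speed of light, G Newton’s gravitational constant, and k_B Boltzmann’s constant. Because the temperature falls as the mass grows, a black hole of the Sun’s mass would radiate at only about 10⁻⁷ degrees above absolute zero—which is why, as noted above, the radiation is significant only for black holes much less massive than stars.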

Cambridge was Stephen’s base throughout his career, and he became a familiar figure navigating his wheelchair around the city’s streets. By the end of the 1970s, he had advanced to one of the most distinguished posts in the University—the Lucasian Professorship of Mathematics, once held by Newton himself. He held this chair with distinction for 30 years; but reached the retiring age in 2009 and thereafter held a special research professorship. He travelled widely: he was an especially frequent visitor at Caltech, in Pasadena, California; and at Texas A&M University. He continued to seek new links between the very large (the cosmos) and the very small (atoms and quantum theory) and to gain deeper insights into the very beginning of our universe—addressing questions like ‘was our big bang the only one?’ He had a remarkable ability to figure things out in his head. But latterly he worked with students and colleagues who would write a formula on a blackboard; he would stare at it, and say whether he agreed with it, and perhaps what should come next.

In 1987, Stephen contracted pneumonia. He had to undergo a tracheotomy, which removed even the limited powers of speech he then possessed. It had been more than 10 years since he could write, or even use a keyboard. Without speech, the only way he could communicate was by directing his eye towards one of the letters of the alphabet on a big board in front of him.

But he was saved by technology. He still had the use of one hand; and a computer, controlled by a single lever, allowed him to spell out sentences. These were then declaimed by a speech synthesizer, with the androidal American accent that has since become his trademark. His lectures were, of course, pre-prepared, but conversation remained a struggle. Each word involved several presses of the lever, so even a sentence took several minutes. He learnt to economize with words. His comments were aphoristic or oracular, but often infused with wit. In his later years, he became too weak to control this machine effectively, even via facial muscles or eye movements, and his communication—to his immense frustration—became even slower.

At the time of his tracheotomy operation, he had a rough draft of a book, which he’d hoped would describe his ideas to a wide readership and earn something for his two eldest children, who were then of college age. On his recovery from pneumonia, he resumed work with the help of an editor. When the US edition of A Brief History of Time appeared, the printers made some errors (a picture was upside down), and the publishers tried to recall the stock. To their amazement, all copies had already been sold. This was the first inkling that the book was destined for runaway success—four years on bestseller lists around the world.

The feature film The Theory of Everything (where he was superbly impersonated by Eddie Redmayne, in an Oscar-winning performance) portrayed the human story behind his struggle. It surpassed most biopics in representing the main characters so well that they themselves were happy with the portrayal (even though it understandably omitted and conflated key episodes in his scientific life). Even before this film, his life and work had featured in movies. In an excellent TV docudrama made in 2004, he was played by Benedict Cumberbatch. (In 2012 Cumberbatch spoke his words in a four-part documentary, The Grand Design, made for the Discovery Channel.)

Why did he become such a ‘cult figure?’ The concept of an imprisoned mind roaming the cosmos plainly grabbed people’s imagination. If he had achieved equal distinction in (say) genetics rather than cosmology, his triumph of intellect against adversity probably wouldn’t have achieved the same resonance with a worldwide public.

The Theory of Everything conveyed with sensitivity how the need for support (first from a succession of students, but later requiring a team of nurses) strained his marriage to breaking point, especially when augmented by the pressure of his growing celebrity. Jane’s book, on which the film is based, chronicles the 25 years during which, with amazing dedication, she underpinned his family life and his career.

This is where the film ends. But it left us only half way through Stephen’s adult life. After the split with Jane, Stephen married, in 1995, Elaine Mason, who had been one of his nurses, and whose former husband had designed Stephen’s speech synthesizer. But this partnership broke up within a decade. He was sustained, then and thereafter, by a team of helpers and personal assistants, as well as his family. His daughter Lucy has written books for children with her father as coauthor. His later theories were described, and beautifully illustrated, in other books such as The Universe in a Nutshell and The Grand Design. These weren’t bought by quite as many people as his first book—but probably more readers got to the end of them.

The success of A Brief History of Time catapulted Stephen to international stardom. He featured in numerous TV programs; his lectures filled the Albert Hall, and similar venues in the US and Japan. He appeared in Star Trek and The Simpsons, in numerous TV documentaries, and in advertisements. He lectured at Clinton’s White House; he was back there more recently when President Obama presented him with the Presidential Medal of Freedom, a very rare honor for any foreigner—and of course just one of the many awards he accumulated over his career (including the UK’s Companion of Honour). In the summer of 2012, he reached perhaps his largest-ever audience when he had a star role at the opening ceremony of the London Paralympics.

His 60th birthday celebrations, in January 2002, were a memorable occasion for all of us. Hundreds of leading scientists came from all over the world to honor and celebrate Stephen’s discoveries, and to spend a week discussing the latest theories on space, time, and the cosmos. But the celebrations weren’t just scientific—that wouldn’t have been Stephen’s style. Stephen was surrounded by his children and grandchildren; there was music and singing; there were ‘celebrities’ in attendance. And when the week’s events were all over, he celebrated with a trip in a hot air balloon.

It was amazing enough that Stephen reached the age of 60; few of us then thought that he would survive 16 more years. His 70th birthday was again marked by an international gathering of scientists in Cambridge, and also with some razzmatazz. So was his 75th birthday, though now shared by several million people via a livestream on the internet. He was in these last years plainly weakening. But he was still able to ‘deliver’ entertaining (and sometimes rather moving) lectures via his speech synthesizer and with the aid of skillfully prepared visuals.

Stephen continued, right until his last decade, to coauthor technical papers and speak at premier international conferences—doubly remarkable in a subject where even healthy researchers tend to peak at an early age. Especially influential were his contributions to ‘cosmic inflation’—a theory that many believe describes the ultra-early phases of our expanding universe. A key issue is to understand the primordial seeds which eventually develop into galaxies. He proposed (as, independently, did the Russian theorist Viatcheslav Mukhanov) that these were quantum fluctuations—somewhat analogous to those involved in ‘Hawking radiation’ from black holes. He hosted an important meeting in 1982 where such ideas were thoroughly discussed. Subsequently, particularly with James Hartle and Thomas Hertog, he made further steps towards linking the two great theories of 20th century physics: the quantum theory of the microworld and Einstein’s theory of gravity and space-time.

He continued to be an inveterate traveller—despite attempts to curb this as his respiration weakened. This wasn’t just to lecture. For instance, on a visit to Canada he was undeterred by having to go two miles down a mine-shaft to visit an underground laboratory where famous and delicate experiments had been done. And on a later trip, only a last-minute health setback prevented him from going to the Galapagos. All these travels—and indeed his everyday working life—involved an entourage of assistants and nurses. His fame, and the allure of his public appearances, gave him the resources for nursing care, and protected him against the ‘does he take sugar?’ type of indignity that the disabled often suffer.

Stephen was far from being the archetypal unworldly or nerdish scientist—his personality remained amazingly unwarped by his frustrations and handicaps. As well as his extensive travels, he enjoyed trips to the theatre or opera. He had robust common sense, and was ready to express forceful political opinions. However, a downside of his iconic status was that his comments attracted exaggerated attention even on topics where he had no special expertise—for instance philosophy, or the dangers from aliens or from intelligent machines. And he was sometimes involved in media events where his ‘script’ was written by the promoters of causes about which he may have been ambivalent.

But there was absolutely no gainsaying his lifelong commitment to campaigns for the disabled, and (just in the last few months) in support of the NHS—to which he acknowledged he owed so much. He was always, at the personal level, sensitive to the misfortunes of others. He recorded that, when in hospital soon after his illness was first diagnosed, his depression was lifted when he compared his lot with a boy in the next bed who was dying of leukemia. And he was firmly aligned with other political campaigns and causes. When he visited Israel, he insisted on going also to the West Bank. Newspapers in 2006 showed remarkable pictures of him, in his wheelchair, surrounded by fascinated and curious crowds in Ramallah.

Even more astonishing are the pictures of him ‘floating’ in the NASA aircraft (the ‘vomit comet’) that allows passengers to experience weightlessness—he was manifestly overjoyed at escaping, albeit briefly, the clutches of the gravitational force he’d studied for decades and which had so cruelly imprisoned his body.

Tragedy struck Stephen Hawking when he was only 22. He was diagnosed with a deadly disease, and his expectations dropped to zero. He himself said that everything that happened since then was a bonus. And what a triumph his life has been. His name will live in the annals of science; millions have had their cosmic horizons widened by his best-selling books; and even more, around the world, have been inspired by a unique example of achievement against all the odds—a manifestation of amazing will-power and determination.

Martin Rees is Astronomer Royal of Great Britain, a Fellow of Trinity College, Cambridge, a former director of the Cambridge Institute of Astronomy, and author, most recently, of the bestselling Just Six Numbers: The Deep Forces That Shape the Universe. His forthcoming book, On the Future, will be available in October 2018.

Presenting the trailer for Heretics!: The Wondrous (and Dangerous) Beginnings of Modern Philosophy

This entertaining and enlightening graphic narrative tells the exciting story of the seventeenth-century thinkers who challenged authority—sometimes risking excommunication, prison, and even death—to lay the foundations of modern philosophy and science and help usher in a new world. With masterful storytelling and color illustrations, Heretics! offers a unique introduction to the birth of modern thought in comics form—smart, charming, and often funny. A brilliant account of one of the most brilliant periods in philosophy, Heretics! is the story of how a group of brave thinkers used reason and evidence to triumph over the authority of religion, royalty, and antiquity. Watch the trailer here:

Heretics!: The Wondrous (and Dangerous) Beginnings of Modern Philosophy by Steven Nadler & Ben Nadler from Princeton University Press on Vimeo.

Steven Nadler is the William H. Hay II Professor of Philosophy and Evjue-Bascom Professor in the Humanities at the University of Wisconsin–Madison. His books include Spinoza: A Life, which won the Koret Jewish Book Award, and Rembrandt’s Jews, which was a finalist for the Pulitzer Prize. He lives in Madison. Ben Nadler is a graduate of the Rhode Island School of Design and an illustrator. He lives in Chicago. Follow him on Instagram at @bennadlercomics.

Welcome to the Universe microsite receives a Webby

We’re pleased to announce that the accompanying microsite to Welcome to the Universe by Neil deGrasse Tyson, Michael A. Strauss, and J. Richard Gott has won a People’s Choice Webby in the Best Use of Animation or Motion Graphics category. Congratulations to Eastern Standard, the web designer, on a beautifully designed site.

Winning a Webby is especially gratifying because it honors how much fun we had making the site. We knew we wanted an unconventional approach that would mirror both the complexity and accessibility of the book it was meant to promote. Our wonderful in-house team and our creative partners, Eastern Standard, took on this challenge, and we are so happy with the results.
—Maria Lindenfeldar, Creative Director, Princeton University Press 

Creating this microsite was a wonderful experiment for us at Princeton University Press. We wanted to explore how we, as a publisher, could present one of our major books to the public in a compelling way in the digital environment. Ideally, we had a vision of creating a simple site with intuitive navigation that would give readers an inviting mini-tour through the topics of the book, Welcome to the Universe, by Neil deGrasse Tyson, Michael Strauss, and Richard Gott. The animation was meant to be subtle, but meaningful, and to gently encourage user interaction, so that the focus would always remain on immersing the reader in the content of the book – what we feel is the most interesting part! We were very happy with how it turned out and now all the more thrilled and honored that the site was chosen for a Webby!
—Ingrid Gnerlich, Science Publisher, Princeton University Press

Everyone’s favorite genius takes the spotlight

Along with Einstein fans everywhere, we’re fairly excited to binge-watch National Geographic’s upcoming series, “Genius”, premiering Tuesday, April 25. The first episode shows a young Einstein (Johnny Flynn) pondering the nature of time, a concept well covered in An Einstein Encyclopedia, along with most any other topic that could interest an Einstein devotee, from fame, to family, to politics, to myths and misconceptions. In Genius, prepare to see a showdown between a feisty young Einstein and a particularly rigid teacher. Engrossing to watch—and bound to leave viewers wanting more. Not to worry: “Teachers, education and schools attended” are covered in depth in the Encyclopedia, as are “Rivals”.

Episode 2 of Genius promises to show Einstein embarking, after much head-butting, on a love affair with the determined Mileva Maric. Because Einstein is often remembered as the lone, eccentric, Princeton-based thinker, his youthful relationship with Maric sometimes comes as a surprise even to Einstein fans. And yet in 1903, a young Albert Einstein married his confidante despite the objections of his parents. Her influence on his most creative years has given rise to much discussion—but theirs was only one of several romantic interests over the course of Einstein’s life that competed with his passion for physics. Einstein’s love life has been the subject of intense speculation over the years, but don’t believe everything you hear: the Encyclopedia’s entry “Romantic Interests: Actual, Probable, and Possible” won’t leave you guessing.

Mileva Maric, first wife of Albert Einstein

An Einstein Encyclopedia is the single most complete guide to Einstein’s life, perfect for browsing and research alike. Written by three leading Einstein scholars who draw on their combined wealth of expertise gained during their work on the Collected Papers of Albert Einstein, this accessible reference features more than one hundred entries and is divided into three parts covering the personal, scientific, and public spheres of Einstein’s life.

With science celebrated far and wide along with Earth Day this past weekend, what better time to get your dose of genius and #ReadUp.

Celebration of Science: A reading list

This Earth Day 2017, Princeton University Press is celebrating science in all its forms. From ecology to psychology, astronomy to earth sciences, we are proud to publish books at the highest standards of scholarship, bringing the best work of scientists to a global audience. We all benefit when scientists are given the space to conduct their research and push the boundaries of the human store of knowledge further. Read on for a list of essential reading from some of the esteemed scientists who have published with Princeton University Press.

The Usefulness of Useless Knowledge
Abraham Flexner and Robbert Dijkgraaf

The Serengeti Rules
Sean B. Carroll

Honeybee Democracy
Thomas D. Seeley

Silent Sparks
Sara Lewis

Where the River Flows
Sean W. Fleming

How to Clone a Mammoth
Beth Shapiro

The Future of the Brain
Gary Marcus & Jeremy Freeman

Searching for the Oldest Stars
Anna Frebel

Climate Shock
Gernot Wagner & Martin L. Weitzman

Welcome to the Universe
Neil deGrasse Tyson, Michael A. Strauss, and J. Richard Gott

The New Ecology
Oswald J. Schmitz

A peek inside The Calculus of Happiness

What’s the best diet for overall health and weight management? How can we change our finances to retire earlier? How can we maximize our chances of finding our soul mate? In The Calculus of Happiness, Oscar Fernandez shows us that math yields powerful insights into health, wealth, and love. Moreover, the important formulas are linked to a dozen free online interactive calculators on the book’s website, allowing one to personalize the equations. A nutrition, personal finance, and relationship how-to guide all in one, The Calculus of Happiness invites you to discover how empowering mathematics can be. Check out the trailer to learn more:

The Calculus of Happiness: How a Mathematical Approach to Life Adds Up to Health, Wealth, and Love, Oscar E. Fernandez from Princeton University Press on Vimeo.

Oscar E. Fernandez is assistant professor of mathematics at Wellesley College and the author of Everyday Calculus: Discovering the Hidden Math All around Us. He also writes about mathematics for the Huffington Post and on his website, surroundedbymath.com.

Welcome to the Universe microsite nominated for a Webby

We’re thrilled to announce that the microsite for Welcome to the Universe by Neil deGrasse Tyson, Michael A. Strauss, and J. Richard Gott, designed by Eastern Standard, has been nominated for a Webby in the Best Use of Animation or Motion Graphics category. Be sure to check it out and vote for the best of the internet!

Just in time for Pi Day, presenting The Usefulness of Useless Knowledge

In his classic essay “The Usefulness of Useless Knowledge,” Abraham Flexner, the founding director of the Institute for Advanced Study in Princeton and the man who helped bring Albert Einstein to the United States, describes a great paradox of scientific research. The search for answers to deep questions, motivated solely by curiosity and without concern for applications, often leads not only to the greatest scientific discoveries but also to the most revolutionary technological breakthroughs. In short, no quantum mechanics, no computer chips. This brief book includes Flexner’s timeless 1939 essay alongside a new companion essay by Robbert Dijkgraaf, the Institute’s current director, in which he shows that Flexner’s defense of the value of “the unobstructed pursuit of useless knowledge” may be even more relevant today than it was in the early twentieth century. Watch the trailer to learn more:

The Usefulness of Useless Knowledge by Abraham Flexner from Princeton University Press on Vimeo.

Michael Strauss: Our universe is too vast for even the most imaginative sci-fi

As an astrophysicist, I am always struck by the fact that even the wildest science-fiction stories tend to be distinctly human in character. No matter how exotic the locale or how unusual the scientific concepts, most science fiction ends up being about quintessentially human (or human-like) interactions, problems, foibles and challenges. This is what we respond to; it is what we can best understand. In practice, this means that most science fiction takes place in relatively relatable settings, on a planet or spacecraft. The real challenge is to tie the story to human emotions, and human sizes and timescales, while still capturing the enormous scales of the Universe itself.

Just how large the Universe actually is never fails to boggle the mind. We say that the observable Universe extends for tens of billions of light years, but the only way to really comprehend this, as humans, is to break matters down into a series of steps, starting with our visceral understanding of the size of the Earth. A non-stop flight from Dubai to San Francisco covers a distance of about 8,000 miles – roughly equal to the diameter of the Earth. The Sun is much bigger; its diameter is just over 100 times Earth’s. And the distance between the Earth and the Sun is about 100 times larger than that, close to 100 million miles. This distance, the radius of the Earth’s orbit around the Sun, is a fundamental measure in astronomy: the Astronomical Unit, or AU. The spacecraft Voyager 1, for example, was launched in 1977 and, travelling at 11 miles per second, is now 137 AU from the Sun.

But the stars are far more distant than this. The nearest, Proxima Centauri, is about 270,000 AU, or 4.25 light years away. You would have to line up 30 million Suns to span the gap between the Sun and Proxima Centauri. The Vogons in Douglas Adams’s The Hitchhiker’s Guide to the Galaxy (1979) are shocked that humans have not travelled to the Proxima Centauri system to see the Earth’s demolition notice; the joke is just how impossibly large the distance is.

Four light years turns out to be about the average distance between stars in the Milky Way Galaxy, of which the Sun is a member. That is a lot of empty space! The Milky Way contains about 300 billion stars, in a vast structure roughly 100,000 light years in diameter. One of the truly exciting discoveries of the past two decades is that our Sun is far from unique in hosting a retinue of planets: evidence shows that the majority of Sun-like stars in the Milky Way have planets orbiting them, many with a size and distance from their parent star allowing them to host life as we know it.

Yet getting to these planets is another matter entirely: Voyager 1 would arrive at Proxima Centauri in 75,000 years if it were travelling in the right direction – which it isn’t. Science-fiction writers use a variety of tricks to span these interstellar distances: putting their passengers into states of suspended animation during the long voyages, or travelling close to the speed of light (to take advantage of the time dilation predicted in Albert Einstein’s theory of special relativity). Or they invoke warp drives, wormholes or other as-yet undiscovered phenomena.
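To make these scales concrete, here is a minimal back-of-the-envelope check in Python, using the essay’s own round numbers (the constants are approximations, not precise values):

# Rough scales quoted in the essay (all distances in miles; approximate).
EARTH_DIAMETER = 8_000                  # ~ a Dubai-San Francisco flight
SUN_DIAMETER = 100 * EARTH_DIAMETER     # "just over 100 times Earth's"
AU = 93_000_000                         # Earth-Sun distance ("close to 100 million miles")
SECONDS_PER_YEAR = 365.25 * 24 * 3600

PROXIMA_AU = 270_000                    # distance to Proxima Centauri, in AU

# How many Suns would you line up to span the gap to Proxima Centauri?
suns = PROXIMA_AU * AU / SUN_DIAMETER
print(f"Suns to Proxima: {suns / 1e6:.0f} million")       # ~31 million, the essay's "30 million"

# How long would Voyager 1 take at 11 miles per second?
years = PROXIMA_AU * AU / 11 / SECONDS_PER_YEAR
print(f"Voyager 1 travel time: {years:,.0f} years")       # ~72,000, the essay's "75,000 years"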

When astronomers made the first definitive measurements of the scale of our Galaxy a century ago, they were overwhelmed by the size of the Universe they had mapped. Initially, there was great skepticism that the so-called ‘spiral nebulae’ seen in deep photographs of the sky were in fact ‘island universes’ – structures as large as the Milky Way, but at much larger distances still. While the vast majority of science-fiction stories stay within our Milky Way, much of the story of the past 100 years of astronomy has been the discovery of just how much larger than that the Universe is. Our nearest galactic neighbour is about 2 million light years away, while the light from the most distant galaxies our telescopes can see has been travelling to us for most of the age of the Universe, about 13 billion years.

We discovered in the 1920s that the Universe has been expanding since the Big Bang. But about 20 years ago, astronomers found that this expansion was speeding up, driven by a force whose physical nature we do not understand, but to which we give the stop-gap name of ‘dark energy’. Dark energy operates on length- and time-scales of the Universe as a whole: how could we capture such a concept in a piece of fiction?

The story doesn’t stop there. We can’t see galaxies from those parts of the Universe for which there hasn’t been enough time since the Big Bang for the light to reach us. What lies beyond the observable bounds of the Universe? Our simplest cosmological models suggest that the Universe is uniform in its properties on the largest scales, and extends forever. A variant idea says that the Big Bang that birthed our Universe is only one of a (possibly infinite) number of such explosions, and that the resulting ‘multiverse’ has an extent utterly beyond our comprehension.

The US astronomer Neil deGrasse Tyson once said: ‘The Universe is under no obligation to make sense to you.’ Similarly, the wonders of the Universe are under no obligation to make it easy for science-fiction writers to tell stories about them. The Universe is mostly empty space, and the distances between stars in galaxies, and between galaxies in the Universe, are incomprehensibly vast on human scales. Capturing the true scale of the Universe, while somehow tying it to human endeavours and emotions, is a daunting challenge for any science-fiction writer. Olaf Stapledon took up that challenge in his novel Star Maker (1937), in which the stars and nebulae, and cosmos as a whole, are conscious. While we are humbled by our tiny size relative to the cosmos, our brains can none the less comprehend, to some extent, just how large the Universe we inhabit is. This is hopeful, since, as the astrobiologist Caleb Scharf of Columbia University has said: ‘In a finite world, a cosmic perspective isn’t a luxury, it is a necessity.’ Conveying this to the public is the real challenge faced by astronomers and science-fiction writers alike.

Michael A. Strauss is professor of astrophysics at Princeton University and coauthor, with J. Richard Gott and Neil deGrasse Tyson, of Welcome to the Universe: An Astrophysical Tour.

This article was originally published at Aeon and has been republished under Creative Commons.

Robbert Dijkgraaf on The Usefulness of Useless Knowledge

A forty-year tightening of funding for scientific research has meant that resources are increasingly directed toward applied or practical outcomes, with the intent of creating products of immediate value. In such a scenario, it makes sense to focus on the most identifiable and urgent problems, right? Actually, it doesn’t. In his classic essay “The Usefulness of Useless Knowledge,” Abraham Flexner, the founding director of the Institute for Advanced Study in Princeton, describes a great paradox of scientific research. The search for answers to deep questions, motivated solely by curiosity and without concern for applications, often leads not only to the greatest scientific discoveries but also to the most revolutionary technological breakthroughs. This brief book includes Flexner’s timeless 1939 essay alongside a new companion essay by Robbert Dijkgraaf, the Institute’s current director. Read on for Dijkgraaf’s take on the importance of curiosity-driven research, how we can cultivate it, and why Flexner’s essay is more relevant than ever.

The title of the book, The Usefulness of Useless Knowledge, is somewhat enigmatic—what does it mean?

RD: Abraham Flexner, an educational reformer and founding director of the Institute for Advanced Study, wrote an essay with this title for Harper’s magazine in 1939. He believed that there was an indispensable connection between intellectual and spiritual life—“useless forms of activity”—and undreamed-of utility.

Cited as a philanthropic hero by Warren Buffett, Flexner was responsible for bringing Albert Einstein to America to join the Institute’s inaugural Faculty, just when Hitler came to power in 1933.

A true visionary, Flexner was acutely aware that our current conception of what is useful might suffice for the short term but would inevitably become too narrow over time. He believed that the best way to advance understanding and knowledge is by enabling leading scientists and scholars to follow their natural curiosity, intuition, and inquiry, without concern for utility but rather with the purpose of discovering answers to the most fascinating questions of their time.

Flexner’s 1939 article is reprinted in the book along with a companion essay that you have written. What did you realize in revisiting Flexner’s ideas?

RD: One large realization is that while the world has changed dramatically in terms of technological progress since Flexner’s time, human beings still wrestle with the benefits and risks of freedom, with power and productivity versus imagination and creativity. This dichotomy continues to limit our evolution, and it sometimes leads to abhorrent behavior, as we saw during Flexner’s era and as continues to haunt ours today.

A significant difference is that in the twenty-first century, we are increasingly creating a one-dimensional world determined by external metrics. Why? Our world is becoming ever larger and more complex. In order to provide some clarity, we try to quantify that world with share prices and rankings. In the process, we have exiled our intuition and have lost contact with our environment.

We need to return to timeless values like searching for the truth, while being honest about the things we don’t understand. There is also a great need for passion. I wake up every morning with the thought: I want to do something that I feel good about. As a society, we have largely lost that feeling. We need to reconsider: what kind of world do we want exactly? And what new systems do we need to do good things?

Why is curiosity-driven basic research important today and how can we cultivate it?

RD: The progress of our modern age, and of the world of tomorrow, depends not only on technical expertise, but also on unobstructed curiosity and the benefits of traveling far upstream, against the current of practical considerations. Much of the knowledge developed by basic research is made publicly accessible and so benefits society as a whole, spreading widely beyond the narrow circle of individuals who, over years and decades, introduce and develop the ideas. Fundamental advances in knowledge cannot be owned or restricted by people, institutions, or nations, certainly not in the current age of the Internet. They are truly public goods.

But driven by an ever-deepening shortage of funding, against a background of economic uncertainty, global political turmoil, and ever-shortening time cycles, research criteria are becoming dangerously skewed towards conservative short-term goals that may address more immediate problems but miss out on the huge advances that human imagination can bring in the long term.

The “metrics” used to assess the quality and impact of research proposals—even in the absence of a broadly accepted framework for such measurements—systematically undercut pathbreaking scholarship in favor of more predictable goal-directed research. It can easily take many years, decades, or sometimes a century for the societal value of an idea to come to light, as in the case of the gravitational waves predicted by Einstein’s theory of relativity and only detected last year.

In order to enable and encourage the full cycle of scientific innovation, we need to develop a solid portfolio of research in much the same way as we approach well-managed financial resources. Such a balanced portfolio would contain predictable and stable short-term investments, as well as long-term bets that are intrinsically more risky but can potentially earn off-the-scale rewards. The path from exploratory basic research to practical applications is not one-directional and linear, but rather complex and cyclic, with resultant technologies enabling even more fundamental discoveries. Flexner and I give many examples of this in our book, from the development of electromagnetic waves that carry wireless signals to quantum mechanics and computer chips.

How do curiosity and imagination enable progress?

RD: An attitude aimed at learning and investigating, wherein imagination and creativity play an important role, is essential not only in scientific institutions but in every organization. Companies and institutions themselves need to develop the inquisitive and explorative approach they would like to see in their employees. Organizations are often trapped in the framework of their own thinking. Out-of-the-box thinking is very hard, because one doesn’t know where the box is. At the basis of progress lies a feeling of optimism: problems can be solved. Organizations need to cultivate the capacity to visualize the future and define their position in it.

What conditions are necessary for the spark of a new idea or theory?

RD: If we want more imagination, creativity, and curiosity, we need to accept that people occasionally run in the wrong direction. As a business, institution, or society, we need to allow once again for failure. Encourage workers to spend a certain percentage of their time on the process of exploration. A brilliant idea never appears out of the blue, but is generated simply by allowing people to try out things. Nine times out of ten, nothing results, but something may emerge suddenly and unexpectedly. That free space and those margins of error are increasingly under pressure in our head, our role, our organization, and our society. I am worried about the loss of that exploratory force.

What don’t we know, and how does uncertainty drive advancement?

RD: How did the universe begin and how does it end? What is the origin of life on Earth and possibly elsewhere in the cosmos? What in our brain makes us conscious and human? In addition to these fundamental questions and many others, we are struggling with major issues about time and space, about matter and energy. What are our ideas on this and what questions are we trying to answer? In science, a long process precedes any outcome. In general, the media only has time and space to pay attention to outcomes. But for scientists it’s precisely the process that counts, walking together down that path. It’s the questions that engage us, not the answers.

Abraham Flexner (1866–1959) was the founding director of the Institute for Advanced Study, one of the world’s leading institutions for basic research in the sciences and humanities. Robbert Dijkgraaf, a mathematical physicist who specializes in string theory, is director and Leon Levy Professor at the Institute for Advanced Study. A distinguished public policy adviser and passionate advocate for science and the arts, he is also the cochair of the InterAcademy Council, a global alliance of science academies, and former president of the Royal Netherlands Academy of Arts and Sciences. They are the authors of The Usefulness of Useless Knowledge.

Dalton Conley & Jason Fletcher on how genomics is transforming the social sciences

Social sciences have long been leery of genetics, but in the past decade, a small but intrepid group of economists, political scientists, and sociologists have harnessed the genomics revolution to paint a more complete picture of human social life. The Genome Factor shows how genomics is transforming the social sciences—and how social scientists are integrating both nature and nurture into a unified, comprehensive understanding of human behavior at both the individual and society-wide levels. The book raises pertinent questions: Can and should we target policies based on genotype? What evidence demonstrates how genes and environments work together to produce socioeconomic outcomes? Recently, The Genome Factor‘s authors, Dalton Conley and Jason Fletcher, answered some questions about their work.

What inspired you to write The Genome Factor?

JF: Our book discusses how findings and theories in genetics and the biological sciences have shaped social science inquiry—the theories, methodologies, and interpretations of findings used in economics, sociology, political science, and related disciplines—both historically and in the newer era of molecular genetics. We have witnessed, and participated in, a period of rapid change and cross-pollination between the social and biological sciences. Our book draws out some of the major implications of this cross-pollination—we particularly focus on how new findings in genetics have overturned ideas and theories in the social sciences. We also use a critical eye to evaluate what social scientists and the broader public should believe about the overwhelming number of new findings produced in genetics.

What insights did you gain in writing the book?

JF: Genetics, and the Human Genome Project in particular, has been quite successful and influential in the past two decades, but it has also experienced major setbacks and is still reeling from years of disappointments and a paradigm shift. There has been a major re-evaluation and resetting of expectations about the clarity and power of genetic effects. Only 15 years ago, a main model was the so-called OGOD model—one gene, one disease. While there are a few important examples where this model works, it has mostly failed. This failure has had wide implications for how genetic analysis is conducted, as well as for the rethinking of previous results, many of which are now thought to be false findings. Now, much analysis is conducted using data on tens or hundreds of thousands of people, because the thinking is that most disease is caused by tens, hundreds, or even thousands of genes that each have a tiny effect. This shift has major implications for social science as well. It means genetic effects are diffuse and subtle, which makes it challenging to combine genetic and social science research. Genetics has also shifted from a science of mechanistic understanding to a large-scale data-mining enterprise. For social scientists, this approach runs counter to our norms of producing evidence. This is something we will need to struggle through in the future.
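To picture the “many genes, each with a tiny effect” model Fletcher describes, here is a toy polygenic-score sketch in Python; the sample sizes, effect sizes, and variable names are purely illustrative inventions, not taken from the book:

import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 1_000, 5_000          # illustrative sizes only

# Each person carries 0, 1, or 2 copies of each genetic variant.
genotypes = rng.integers(0, 3, size=(n_people, n_snps))

# Thousands of variants, each nudging the trait by a tiny amount.
effect_sizes = rng.normal(loc=0.0, scale=0.01, size=n_snps)

# A polygenic score is the weighted sum of a person's variants:
# no single variant matters much, but together they add up.
scores = genotypes @ effect_sizes
print("First five scores:", scores[:5].round(3))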

How did you select the topics for the book chapters?

JF: We wanted to tackle big topics across multiple disciplines. We discuss some of the recent history of combining genetics and social science, before the molecular revolution, when “genetics” were inferred from family relationships rather than measured directly. We then pivot to provide examples of cutting-edge research in economics and sociology that has incorporated genetics to push social science inquiry forward. One example is the use of population genetic changes as a determinant of levels of economic development across the world. We also focus our attention on the near future and discuss how policy decisions may be affected by the inclusion of genetic data into social science and policy analysis. Can and should we target policies based on genotype? What evidence do we have that demonstrates how genes and environments work together to produce socioeconomic outcomes?

What impact do you hope The Genome Factor will have?

JF: We hope that readers see the promise as well as the perils of combining genetic and social science analysis. We provide a lot of examples of ongoing work, but also want to show the reader how we think about the larger issues that will remain as genetics progresses. We seek to show the reader how to look through a social science lens when thinking about genetic discoveries. This is a rapidly advancing field, so the particular examples we discuss will be out of date soon, but we want our broader ideas and lens to have longer staying power. As an example, advances in gene editing (CRISPR) have the potential to fundamentally transform genetic analysis. We discuss these gene editing discoveries in the context of some of their likely social impacts.

Dalton Conley is the Henry Putnam University Professor of Sociology at Princeton University. His many books include Parentology: Everything You Wanted to Know about the Science of Raising Children but Were Too Exhausted to Ask. He lives in New York City. Jason Fletcher is Professor of Public Affairs, Sociology, Agricultural and Applied Economics, and Population Health Sciences at the University of Wisconsin–Madison. He lives in Madison. They are the authors of The Genome Factor: What the Social Genomics Revolution Reveals about Ourselves, Our History, and the Future.