PUP math editor Vickie Kearn: How real mathematicians celebrate Pi Day

Who doesn’t love Pi (aka Pie) Day? Residents here in Princeton, NJ love it so much that we spend four days celebrating. Now, to be honest, we’re also celebrating Einstein’s birthday, so we do need the full four days. I know what I will be doing on 3.14159265 but I wondered what some of my friends will be doing. Not surprisingly, a lot will either be making or eating pie. These include Oscar Fernandez (Wellesley), Ron Graham (UCSD), and Art Benjamin (who will be performing his mathemagics show later in the week). Anna Pierrehumbert (who teaches in NYC) will be working with upper school students on a pi recitation and middle school students on making pi-day buttons. Brent Ferguson (The Lawrenceville School) has celebrated at The National Museum of Mathematics in NYC, Ireland, Greece, and this year Princeton. Here he is celebrating in Alaska:


The Princeton University Math Club will be celebrating with a party in Fine Hall. In addition to eating pie and playing games, they will have a digit reciting contest. Tim Chartier (Davidson College) will be spending his time demonstrating how to estimate pi with chocolate chips while also fielding interview requests for his expert opinion on March Madness (a lot going on this month for mathematicians). Dave Richeson (Dickinson College) goes to the local elementary school each year and talks with the fifth graders about pi and its history and then eats creatively rendered pi themed pie provided by the parents.

You might be wondering why we celebrate a mathematical constant every year. How did it get to be so important? Again I went back to my pi experts and asked them to tell me the most important uses of pi. This question is open to debate by mathematicians but many think that the most important is Euler’s Identity, e^(iπ) + 1 = 0. As Jenny Kaufmann (President of the Princeton University Math Club) puts it, “Besides elegantly encoding the way that multiplication by i results in a rotation in the complex plane, this identity unites what one might consider the five most important numbers in a single equation. That’s pretty impressive!” My most practical friend is Oscar and here is what he told me: “There are so many uses for pi, but given my interest in everyday explanations of math, here’s one I like: If you drive to work every day, you take many, many pi’s with you. That’s because the circumference of your car’s tires is pi multiplied by the tires’ diameter. The most common car tire has a diameter of about 29 inches, so one full revolution covers a distance of about 29 times pi (about 7.5 feet). Many, many revolutions of your tires later you arrive at work, with lots and lots of pi’s!” Anna is also practical in that she will be using pi to calculate the area of the circular pastry she will be eating, but she also likes the infinite series for pi (pi/4 = 1 – 1/3 + 1/5 – 1/7 etc.). Avner Ash (Boston College) sums it up nicely, “We can’t live without pi—how would we have circles, normal distributions, etc.?”
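
For readers who like to check such things, here is a minimal Python sketch (an illustration added here, not part of the original post) that verifies Euler’s identity numerically and works out Oscar’s tire arithmetic:

```python
import cmath, math

# Euler's identity: e^(i*pi) + 1 should vanish, up to floating-point rounding.
print(cmath.exp(1j * math.pi) + 1)    # ~1.2e-16j, i.e. zero to machine precision

# Oscar's tire: circumference = pi * diameter for a 29-inch tire, converted to feet.
print(29 * math.pi / 12)              # ~7.59 feet per revolution
```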

One of the most important questions one asks on Pi Day is: how many digits can you recite? The largest number I got was 300 from the Princeton Math Club. However, there are quite a few impressive numbers from others, as well as some creative answers and ways to remember the digits. For example, Oscar can remember 3/14/15 at 9:26:53 because it was an epic Pi Day and Pi Time for him. Art Benjamin can recite 100 digits from a phonetic code and 5 silly sentences. Ron Graham can recite all of the digits of pi, even thousands, as long as they don’t have to be in order. Dave Richeson also knows all of the digits of pi, which are 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9.

No matter how you celebrate, remember that math, especially pi(e), is useful, fun, and delicious.

Vickie Kearn is Executive Editor of Mathematics at Princeton University Press.

J. Richard Gott: What’s the Value of Pi in Your Universe?

Carl Sagan’s sci-fi novel Contact famously introduced wormholes for rapid transit between the stars. Carl had asked his friend Kip Thorne to tell him if the physics of wormholes was tenable, and this led Thorne and his colleagues to investigate their properties. They found that traversable wormholes required exotic matter to prop them open and that, by moving the wormhole mouths, one could find general relativity solutions allowing time travel to the past. A quantum state called the Casimir vacuum, whose effects have been observed experimentally, could provide the exotic matter. To learn whether such time machines could be constructible in principle, we may have to master the laws of quantum gravity, which govern how gravity behaves on microscopic scales. It’s one of the reasons physicists find these solutions so interesting.

But in Contact there is lurking yet another fantastic sci-fi idea, which gets less publicity because it was not included in the movie version. In the book, the protagonist finds out from the extraterrestrials that the system of wormholes throughout the galaxy was not built by them, but by the long gone “old ones” who could manipulate not only the laws of physics but also the laws of mathematics! And they left a secret message in the digits of pi. In his movie Pi, Darren Aronofsky showed a man driven crazy by his search for hidden meanings in the digits of pi.

This opens the question: could pi have been something else? And if so, does pi depend on the laws of physics? Galileo said: “Philosophy is written in this grand book…. I mean the universe … which stands continually open to our gaze…. It is written in the language of mathematics.” The universe is written in the language of mathematics. Nobel laureate Eugene Wigner famously spoke of the “unreasonable effectiveness of mathematics” in explaining physics. Many philosophers take the Platonic view that mathematics would exist even if the universe did not. And cosmologist Max Tegmark goes so far as to say that the universe actually is mathematics.

Yet maybe it is the other way around. The laws of physics are just the laws by which matter behaves. They determine the nature of our universe. Maybe humans have simply developed the mathematics appropriate for describing our universe, and so of course it fits with what we see. The mathematician Leopold Kronecker said, “God created the integers, all the rest is the work of man.” Are the laws of mathematics discovered by us in the same way as we discover the laws of physics? And are the laws of mathematics we discover just those which would have occurred to creatures living in a universe with physics like ours? In our universe, physics produces individual identical particles: all electrons are the same for example. We know about integers because there are things that look the same (like apples) for us to count. If you were some strange creature in a fractal universe containing only one object—yourself—and you thought only recursively, you might not ever think of counting anything and would never discover integers.

What about π = 3.14159265.…? Might it have a different value in a different universe? In our universe we have a fundamental physical dimensionless constant, the fine structure constant α which is related to the square of the value of the electric charge of the proton in natural geometrical Planck units (where the speed of light is 1 and the reduced Planck constant is 1 and Newton’s gravitational constant is 1). Now 1/α = 137.035999… Some physicists hope that one day we may have a mathematical formula for 1/α using mathematical constants such as π and e. If a theory for the fine structure constant could be developed giving a value in agreement with observations but allowing it to be calculated uniquely from pure mathematics, and if more and more digits of the constant were discovered experimentally fulfilling its prediction, it would certainly merit a Nobel Prize. But many physicists feel that no such magic formula will ever be discovered. Inflation may produce an infinite number of bubble universes, each with different laws of physics. Different universes bubbling out of an original inflating sea could have different values of 1/α. As Martin Rees has said, the laws of physics we know may be just local bylaws in an infinite multiverse of universes. String theory, if correct, may eventually give us a probability distribution for 1/α and we may find that our universe is just somewhere in the predicted middle 95% of the distribution, for example. Maybe there could be different universes with different values of π.

Let’s consider one possible example: taxicab geometry. This was invented by Hermann Minkowski. Now this brilliant mathematician also invented the geometrical interpretation of time as a fourth dimension based on Einstein’s theory of special relativity, so his taxicab geometry merits a serious look. Imagine a city with a checkerboard pattern of equal-sized square blocks. Suppose you wanted to take a taxicab to a location 3 blocks east and 1 block north of your position; the shortest total distance you would have to travel to get there is 4 blocks. Your taxi has to travel along the streets; it does not get to travel as the crow flies. You could go 1 block east, then 1 block north, then 2 blocks east, and still get to your destination, but the total distance you traveled would also be 4 blocks. The distance to your destination would be ds = |dx| + |dy|, where |dx| is the absolute value of the difference in x coordinates and |dy| is the absolute value of the difference in y coordinates. This is not the Euclidean formula. We are not in Kansas anymore! The set of points equidistant from the origin is a set of dots in a diamond shape. See the diagram below.


Image showing an intuitive explanation of why circles in taxicab geometry look like diamonds. Wikipedia.

Now if the blocks were smaller, there would be more dots, still in a diamond shape. In the limit where the size of the blocks had shrunk to zero, one would have a smooth diamond shape as shown in the bottom section of the diagram. The set of points equidistant from the origin has a name—a “circle!” If the circle has a radius of 1 unit, the distance along one side of its diamond shape is 2 units: going from the East vertex of the diamond to the North vertex of the diamond along the diagonal requires you to change the x coordinate by 1 unit and the y coordinate by 1 unit, making the distance along one side of the diagonal equal to 2 units (ds = |dx| + |dy| = 1 + 1 units = 2 units). The diamond shape has 4 sides so the circumference of the diamond is 8 units. The diameter of the circle is twice the radius, and therefore 2 units. In the taxicab universe π = C/d = C/2r = 8/2 = 4. If different laws of physics dictate different laws of geometry, you can change the value of π.
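
To make the taxicab “circle” concrete, here is a minimal Python sketch (an illustration added here, not from the essay); it measures the diamond’s perimeter in the taxicab metric and recovers the value 4:

```python
def taxicab(p, q):
    # taxicab (L1) distance: ds = |dx| + |dy|
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

# The taxicab "unit circle" is the diamond with these four vertices.
vertices = [(1, 0), (0, 1), (-1, 0), (0, -1)]

# Each side is traversed monotonically, so its taxicab length is simply the
# taxicab distance between its endpoints: 1 + 1 = 2 units per side.
circumference = sum(taxicab(vertices[i], vertices[(i + 1) % 4]) for i in range(4))

print(circumference)        # 8
print(circumference / 2)    # 4.0 = C/d with diameter d = 2: the taxicab "pi"
```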

This taxicab geometry applies in the classic Etch A Sketch toy (look it up on Google if you have never seen one). It has a white screen, and an internal stylus that draws a black line, directed by horizontal and vertical control knobs. If you want to draw a vertical line, you turn the vertical knob. If you want to draw a horizontal line you turn the horizontal knob. If you want to draw a diagonal line, you must simultaneously turn both knobs smoothly. If the distance between two points is defined by the minimal amount of total turning of the two knobs required to get from one point to the other, then that is the “taxicab” distance between the two points. In Euclidean geometry there is one shortest line between two points: a straight line between them. In taxicab geometry there can be many different, equally short, broken lines (taxicab routes) connecting two points. Taxicab geometry does not obey the axioms of Euclidean geometry and therefore does not have the same theorems as Euclidean geometry. And π is 4.

Mathematician and computer scientist John von Neumann invented a cellular automaton universe that obeys taxicab geometry. It starts with an infinite checkerboard of pixels. Pixels can be either black or white. The state of a pixel at time step t = n + 1 depends only on its own state and the states of its 4 neighbors (the pixels with which it shares a side: north, south, east, and west) on the previous time step t = n. Causal, physical effects move like a taxicab. If the pixels are microscopic, we get a taxicab geometry. Here is a simple law of physics for this universe: a pixel stays in the same state unless an odd number of its 4 neighbors are black, in which case it switches to the opposite state on the next time step. Start with a white universe with only 1 black pixel at the origin. In the next time step it remains black while its 4 neighbors also become black. There is now a black cross of 5 pixels at the center. It has given birth to 4 black pixels like itself. Come back later and there will be 25 black pixels in a cross-shaped pattern of 5 cross-shaped patterns.

Come back still later and you can find 125 black pixels in 5 cross-shaped patterns (of 5 cross-shaped patterns). All these new black pixels lie inside a diamond-shaped region whose radius grows larger by one pixel per time step. In our universe, drop a rock in a pond, and a circular ripple spreads out. In the von Neumann universe, causal effects spread out in a diamond-shaped pattern.
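
The rule is easy to simulate; below is a minimal Python sketch (again an added illustration, with function names of my own choosing) that starts from a single black pixel and counts the black pixels step by step:

```python
from collections import Counter

def step(black):
    # A pixel flips exactly when an odd number of its 4 side-neighbors is black.
    # Equivalently: a cell is black on the next step exactly when an odd number
    # of "contributions" (one from itself if black, one from each black neighbor)
    # lands on it, which is what is counted below.
    contrib = Counter()
    for (x, y) in black:
        for dx, dy in [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]:
            contrib[(x + dx, y + dy)] += 1
    return {cell for cell, n in contrib.items() if n % 2 == 1}

black = {(0, 0)}                      # a white universe with one black pixel
for t in range(6):
    print(t, len(black))              # prints 1, 5, 5, 13, 5, 25
    black = step(black)
# The count fluctuates, but copies of the starting pattern keep reappearing:
# at step 1 there is one cross of 5 pixels, and at step 5 there are 25 pixels
# arranged as five copies of that cross.
```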

If by “life” you mean a pattern able to reproduce itself, then this universe is luxuriant with life. Draw any pattern (say a drawing of a bicycle) in black pixels and at a later time you will find 5 bicycles, and then 25 bicycles, and 125 bicycles, etc. The laws of physics in this universe cause any object to copy itself. If you object that this is just a video game, I must tell you that some physicists seriously entertain the idea that we are living in an elaborate video game right now, with quantum fuzziness providing the proof of microscopic “pixelization” at small scales.

Mathematicians in the von Neumann universe would know π = 4. (Or, if we had a taxicab universe with triangular pixels filling the plane, causal effects could spread out along three axes instead of two, a circle would look like a hexagon, and π would be 3.) In 1932, Stanislaw Golab showed that if we were clever enough in the way distances were measured in different directions, we could design laws of physics so that π might be anything we wanted from a low of 3 to a high of 4.

Back to the inhabitants of the von Neumann universe who think π = 4. Might they be familiar with the number we know and love, 3.14159265…? They might:

3.14159265… = 4 {(1/1) – (1/3) + (1/5) – (1/7) + (1/9) – …} (Leibniz)

If they were familiar with integers, they might be able to discover 3.14159265… But maybe the only integers they know are 1, 5, 25, 125, … and 4 of course. They would know that 5 = √25, so they would know what a square root was. In this case they could still find a formula for

3.14159265… = √4 · {√4/√(√4)} · {√4/√(√4 + √(√4))} · {√4/√(√4 + √(√4 + √(√4)))} · …

This infinite product involving only the integer 4 derives from one found by Vieta in 1594.
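
For the curious, here is a short Python sketch (added as an illustration; the function name is mine) that evaluates this product using no integer other than 4:

```python
import math

def pi_from_fours(n_terms=40):
    """Viete-style product from the text: pi = sqrt(4) * prod_k [ sqrt(4) / a_k ],
    where a_1 = sqrt(sqrt(4)) and a_(k+1) = sqrt(sqrt(4) + a_k)."""
    two = math.sqrt(4)           # the only integer that ever appears is 4
    a, product = 0.0, two        # 'product' starts with the leading sqrt(4)
    for _ in range(n_terms):
        a = math.sqrt(two + a)   # next nested radical
        product *= two / a
    return product

print(pi_from_fours())           # 3.141592653589793 (converges very quickly)
```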

There are indeed many formulas equal to our old friend 3.14159265… including a spectacular one found by the renowned mathematician Ramanujan. Though every real number can be represented by such infinite series, products and continued fractions, these are particularly simple. So 3.14159265… does seem to have a special intimate relationship with integers, independent of geometry. If physics creates individual objects that can be counted, it seems difficult to avoid learning about 3.14159265… eventually—“If God made the integers,” as Kronecker suggested. So 3.14159265… appears not to be a random real number and we are still left with the mystery of the unreasonable effectiveness of mathematics in explaining the physics we see in our universe. We are also left with the mystery of why the universe is as comprehensible as it is. Why should we lowly carbon life forms be capable of finding out as much about how the universe works as we have done? Having the ability as intelligent observers to ask questions about the universe seems to come with the ability to actually answer some of them. That’s remarkable.

J. Richard Gott is professor of astrophysics at Princeton University. His books include The Cosmic Web: Mysterious Architecture of the Universe. He is the coauthor of Welcome to the Universe: An Astrophysical Tour with Neil deGrasse Tyson and Michael A. Strauss.

Marc Chamberland: Why π is important

On March 14, groups across the country will gather for Pi Day, a nerdy celebration of the number Pi, replete with fun facts about this mathematical constant, copious amounts of pie, and, of course, recitations of the digits of Pi. But why do we care about so many digits of Pi? How big is the room you want to wallpaper, anyway? In 1706, 100 digits of Pi were known, and by 2013 over 12 trillion digits had been computed. I’ll give you five reasons why someone may claim that many digits of Pi are important, but they’re not all good reasons.

Reason 1
It provides accuracy for scientific measurements


This argument had merit when only a few digits were known, but today this reason is as empty as space. The diameter of the observable universe is about 93 billion light years, and the radius of a hydrogen atom is about 0.1 nanometers. So knowing Pi to 38 places is enough to tell you precisely how many hydrogen atoms you need to encircle the universe. For any mechanical calculations, probably 3.1415 is more than enough precision.

Reason 2
It’s neat to see how far we can go


It’s true that great feats and discoveries have been achieved in the name of exploration. Ingenious techniques have been designed to crank out many digits of Pi, and some of these ideas have led to remarkable discoveries in computing. But while this “because it is there” approach is beguiling, just because we can explore some phenomenon doesn’t mean we’ll find something valuable. Curiosity is great, but harnessing that energy with insight will take you farther.

Reason 3
Computer Integrity


The digits of Pi help with testing and developing new algorithms. The Japanese mathematician Yasumasa Kanada used two different formulas to generate and check over one trillion digits of Pi. To get agreement after all those arithmetic operations and data transfers is strong evidence that the computers are functioning error-free. A spin-off of the expansive Pi calculations has been the development of the Fast Fourier Transform, a ground-breaking tool used in digital signal processing.

Reason 4
It provides evidence that Pi is normal


A number is “normal” if any string of digits appears with the expected frequency. For example, you expect the digit 4 to appear 1/10 of the time, or the string 28 to appear 1/100 of the time. It is suspected that Pi is normal, and this was evidenced from the first trillion digits, when it was seen that each digit appears about 100 billion times. But proving that Pi is normal has been elusive. Why is the normality of numbers important? A normal number could be used as a random number generator. Computer simulations are a vital tool in modeling any dynamic phenomenon that involves randomness. Applications abound, including climate science, physiological drug testing, computational fluid dynamics, and financial forecasting. If easily calculated numbers such as Pi can be proven to be normal, these precisely defined numbers could be used, paradoxically, in the service of generating randomness.
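
As a rough illustration of the kind of digit-frequency evidence described above, here is a sketch that counts how often each digit appears; it assumes the third-party mpmath library is installed for generating many digits of Pi:

```python
from collections import Counter
from mpmath import mp     # assumes the mpmath library is available

mp.dps = 100_000                          # precision: ~100,000 decimal digits
digits = mp.nstr(+mp.pi, 100_000)[2:]     # decimal expansion, leading "3." dropped

counts = Counter(digits)
for d in "0123456789":
    # if Pi is normal, each digit should show up about 1/10 of the time
    print(d, counts[d], round(counts[d] / len(digits), 4))
```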

Reason 5
It helps us understand the prime numbers


Pi is intimately connected to the prime numbers. There are formulas involving infinite products that connect the primes and Pi. The knowledge flows both ways: knowing many primes helps one calculate Pi, and knowing many digits of Pi allows one to generate many primes. The Riemann Hypothesis—an unsolved 150-year-old mathematical problem whose solution would earn the solver one million dollars—is intimately connected to both the primes and the number Pi.
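
The article doesn’t say which product it has in mind; one well-known example is Euler’s product ζ(2) = ∏_p 1/(1 − p⁻²) = π²/6, so a list of primes yields an approximation of Pi. A minimal Python sketch of that particular connection (an added illustration):

```python
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

# Euler: zeta(2) = prod_p 1/(1 - p^-2) = pi^2 / 6, so pi = sqrt(6 / prod_p (1 - p^-2)).
product = 1.0
for p in primes_up_to(100_000):
    product *= 1 - 1 / (p * p)

print(math.sqrt(6 / product))   # ~3.14159..., improving as more primes are used
```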

And you thought that Pi was only good for circles.

Marc Chamberland is the Myra Steele Professor of Mathematics and Natural Science at Grinnell College. His research in several areas of mathematics, including studying Pi, has led to many publications and speaking engagements in various countries. His interest in popularizing mathematics resulted in the recent book Single Digits: In Praise of Small Numbers with Princeton University Press. He also maintains his YouTube channel Tipping Point Math that tries to make mathematics accessible to a general audience. He is currently working on a book about the number Pi.

Praeteritio and the quiet importance of Pi

by James D. Stein

Somewhere along my somewhat convoluted educational journey I encountered Latin rhetorical devices. At least one has become part of common usage–oxymoron, the apparent paradox created by juxtaposed words which seem to contradict each other; a classic example being ‘awfully good’. For some reason, one of the devices that has stuck with me over the years is praeteritio, in which emphasis is placed on a topic by saying that one is omitting it. For instance, you could say that when one forgets about 9/11, the Iraq War, Hurricane Katrina, and the Meltdown, George W. Bush’s presidency was smooth sailing.

I’ve always wanted to invent a word, like John Allen Paulos did with ‘innumeracy’, and πraeteritio is my leading candidate–it’s the fact that we call attention to the overwhelming importance of the number π by deliberately excluding it from the conversation. We do that in one of the most important formulas encountered by intermediate algebra and trigonometry students: s = rθ, the formula for the arc length s subtended by a central angle θ in a circle of radius r.

You don’t see π in this formula because π is so important, so natural, that mathematicians use radians as a measure of angle, and π is naturally incorporated into radian measure. Most angle measurement that we see in the real world is described in terms of degrees. A full circle is 360 degrees, a straight angle 180 degrees, a right angle 90 degrees, and so on. But the circumference of a circle of radius 1 is 2π, and so it occurred to Roger Cotes (who is he? I’d never heard of him) that using an angular measure in which there were 2π angle units in a full circle would eliminate the need for a ‘fudge factor’ in the formula for the arc length of a circle subtended by a central angle. For instance, if one measured the angle D in degrees, the formula for the arc length of a circle of radius r subtended by a central angle would be s = (π/180)rD, and who wants to memorize that? The word ‘radian’ first appeared in an examination at Queen’s College in Belfast, Ireland, given by James Thomson, whose better-known brother William would later be known as Lord Kelvin.

The wisdom of this choice can be seen in its far-reaching consequences in the calculus of the trigonometric functions, and undoubtedly elsewhere. First semester calculus students learn that as long as one uses radian measure for angles, the derivative of sin x is cos x, and the derivative of cos x is – sin x. A standard problem in first-semester calculus, here left to the reader, is to compute what the derivative of sin x would be if the angle were measured in degrees rather than radians. Of course, the fudge factor π/180 would raise its ugly head, its square would appear in the formula for the second derivative of sin x, and instead of the elegant repeating pattern of the derivatives of sin x and cos x that are a highlight of the calculus of trigonometric functions, the ensuing formulas would be beyond ugly.
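
The reader exercise can at least be checked numerically; here is a minimal sketch (an added illustration) comparing a finite-difference derivative of sin of an angle measured in degrees against (π/180)·cos:

```python
import math

def sin_deg(x_deg):
    # sin of an angle given in degrees
    return math.sin(math.radians(x_deg))

x = 30.0                     # degrees
h = 1e-6
numeric = (sin_deg(x + h) - sin_deg(x - h)) / (2 * h)   # central difference
exact = (math.pi / 180) * math.cos(math.radians(x))     # the "fudge factor" version

print(numeric, exact)        # both ~0.0151150: d/dx sin(x degrees) = (pi/180) cos(x degrees)
```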

One of the simplest known formulas for the computation of π is the infinite series π/4 = 1 − 1/3 + 1/5 − 1/7 + ⋯

This deliciously elegant formula arises from integrating the geometric series with ratio −x² in the equation 1/(1 + x²) = 1 − x² + x⁴ − x⁶ + ⋯

The integral of the left side is the inverse tangent function tan⁻¹ x, but only because we have been fortunate enough to emphasize the importance of π by utilizing an angle measurement system which is the essence of πraeteritio: the recognition of the importance of π by excluding it from the discussion.
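
A short Python sketch of the series (an added illustration, not part of the original essay) shows both the formula and its famously slow convergence:

```python
import math

def leibniz_pi(n_terms):
    """Partial sum of pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(n_terms))

for n in (10, 1_000, 100_000):
    approx = leibniz_pi(n)
    print(n, approx, abs(math.pi - approx))
# The error shrinks roughly like 1/n, so each extra correct digit costs about
# ten times as many terms.
```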

So on π Day, let us take a moment to recognize not only the beauty of π when it makes all the memorable appearances which we know and love, but to acknowledge its supreme importance and value in those critical situations where, like a great character in a play, it exerts a profound dramatic influence even when offstage.

James D. Stein is emeritus professor in the Department of Mathematics at California State University, Long Beach. His books include Cosmic Numbers (Basic) and How Math Explains the World (Smithsonian). His most recent book is L.A. Math: Romance, Crime, and Mathematics in the City of Angels.

Where would we be without Pi?

Pi Day, the annual celebration of the mathematical constant π (pi), is always an excuse for mathematical and culinary revelry in Princeton. Since 3, 1, and 4 are the first three significant digits of π, the day is typically celebrated on 3/14, which, in a stroke of serendipity, also happens to be Albert Einstein’s birthday. Pi Day falls on Monday this year, but Princeton has been celebrating all weekend with many more festivities still to come, from a Nerd Herd smart phone pub crawl, to an Einstein-inspired running event sponsored by the Princeton Running Company, to a cocktail-making class inside Einstein’s first residence. We imagine the former Princeton resident would be duly impressed.

Einstein enjoying a birthday/Pi Day cupcake

Pi Day in Princeton always includes plenty of activities for children, and tends to be heavy on, you guessed it, actual pie (throwing it, eating it, and everything in between). To author Paul Nahin, this is fitting. At age 10, his first “scientific” revelation was: if pi wasn’t around, there would be no round pies! Which, it turns out, is all too true. Nahin explains:

Everybody “knows” that pi is a number a bit larger than 3 (pretty close to 22/7, as Archimedes showed more than 2,000 years ago) and, more accurately, is 3.14159265… But how do we know the value of pi? It’s the ratio of the circumference of a circle to its diameter, yes, but how does that explain how we know pi to hundreds of millions, even trillions, of decimal digits? We can’t measure lengths with that precision. Well then, just how do we calculate the value of pi? The symbol π (for pi) occurs in countless formulas used by physicists and other scientists and engineers, and so this is an important question. The short answer is, through the use of an infinite series expansion.

In his book In Praise of Simple Physics, Nahin shows you how to derive such a series that converges very quickly; the sum of just the first 10 terms correctly gives the first five digits. The English astronomer Abraham Sharp (1653–1742) used the first 150 terms of the series (in 1699) to calculate the first 72 digits of pi. That’s more than enough for physicists (and for anybody making round pies)!

While celebrating Pi Day has become popular—some would even say fashionable in nerdy circles— PUP author Marc Chamberland points out that it’s good to remember Pi, the number. With a basic scientific calculator, Chamberland’s recent video “The Easiest Way to Calculate Pi” details a straightforward approach to getting accurate approximations for Pi without tables or a prodigious digital memory. Want even more Pi? Marc’s book Single Digits has more than enough Pi to gorge on.

Now that’s a sweet dessert.

If you’re looking for more information on the origin of Pi, this post gives an explanation extracted from Joseph Mazur’s fascinating history of mathematical notation, Enlightening Symbols.

You can find a complete list of Pi Day activities from the Princeton Tour Company here.

Pi Day Recipe: Apple Pie from Jim Henle’s The Proof and the Pudding

Tomorrow (March 14, 2015) is a very important Pi Day. This year’s local Princeton Pi Day Party and other global celebrations of Albert Einstein’s birthday look to be truly stellar, which is apt given this is arguably the closest we will get to 3.1415 in our lifetimes.

Leading up to the publication of the forthcoming The Proof and the Pudding: What Mathematicians, Cooks, and You Have in Common by Jim Henle, we’re celebrating the holiday with a recipe for a classic Apple Pie (an integral part of any Pi Day spread). Publicist Casey LaVela recreates and photographs the recipe below. Full text of the recipe follows. Happy Pi Day everyone!


Notes on Jim Henle’s Apple Pie recipe from Publicist Casey LaVela

The Proof and the Pudding includes several recipes for pies or tarts that would fit the bill for Pi Day, but the story behind Henle’s Apple Pie recipe is especially charming, the recipe itself is straightforward, and the results are delicious. At the author’s suggestion, I used a mixture of baking apples (and delightfully indulgent amounts of butter and sugar).

Crust:

All of the crust ingredients (flour, butter, salt) ready to go:


After a few minutes of blending everything together with a pastry cutter, the crust begins to come together. A glorious marriage of flour and butter.


Once the butter and flour were better incorporated, I dribbled in the ice water and then turned the whole wonderful mess out between two sheets of plastic wrap in preparation for folding. The crust will look like it won’t come together, but somehow it always does in the end. Magical.


Now you need to roll out and fold over the dough a few times. This is an important step and makes for a light and flaky crust. (You use a similar process to make croissants or other viennoiserie from scratch.)


I cut the crust into two (for the top crust and bottom crust) using my handy bench scraper:


Apples:

The apples cored, peeled, and ready to be cut into slices. I broke out my mandolin slicer (not pictured) to make more even slices, but if you don’t own a slicer or prefer to practice your knife skills you can just as easily use your favorite sharp knife.


Beautiful (even) apple slices:


Action shot of me mixing the apple slices, sugar, and cinnamon together. I prefer to prepare my apple pie filling in a bowl rather than sprinkling the dry ingredients over the apple slices once they have been arranged in the bottom crust. I’m not sure if it has much impact on the flavor and it is much, much messier, but I find it more fun.


Assembly:

The bottom crust in the pie plate:


Arrange the apple slices in the bottom crust:


Top with the second crust, seal the top crust to the bottom with your fingers, and (using your sharp knife) make incisions in the top crust to allow steam to escape:


The apple pie before going into the oven (don’t forget to put a little extra sugar on top):


The finished product:


There was a little crust left over after cutting, so I shaped it into another pi symbol, covered it in cinnamon and sugar, and baked it until golden brown. I ate the baked pi symbol as soon as it had cooled (before thinking to take a picture), but it was delicious!



Apple Pie

The story of why I started cooking is not inspiring. My motives weren’t pure. Indeed, they involved several important sins.

I really am a glutton. I love to eat. As a child, I ate well; my mother was a wonderful cook. But I always wanted more than I got, especially dessert. And of all desserts, it was apple pie I craved most. Not diner pies, not restaurant pies, and not bakery pies, but real, homemade apple pies.

When I was six, I had my first homemade apple pie. It was at my grandmother’s house. I don’t remember how it tasted, but I can still recall the gleam in my mother’s eye when she explained the secret of the pie. “I watched her make it. Before she put on the top crust, she dotted the whole thing with big pats of butter!”

Several times as I was growing up, my mother made apple pie. Each one was a gem. But they were too few—only three or four before I went off to college. They were amazing pies. The apples were tart and sweet. Fresh fall apples, so flavorful no cinnamon was needed. The crust was golden, light and crisp, dry when it first hit the tongue, then dissolving into butter.

I grew up. I got married. I started a family. All the while, I longed for that pie. Eventually I set out to make one.

Success came pretty quickly, and it’s not hard to see why. The fact is, despite apple pie’s storied place in American culture, most apple pies sold in this country are abysmal. A pie of fresh, tart apples and a crust homemade with butter or lard, no matter how badly it’s made, is guaranteed to surpass a commercial product.

That means that even if you’ve never made a pie before, you can’t go seriously wrong. The chief difficulty is the crust, but I’ve developed a reliable method. Except for this method, the recipe below is standard.

For the filling:
5 cooking apples (yielding about 5 cups of pieces)
1/4 to 1/3 cup sugar
2 Tb butter
1/2 to 1 tsp cinnamon
lemon juice, if necessary
1 tsp flour, maybe

For the crust:
2 cups flour
1 tsp salt
2/3 cup lard or unsalted butter (1 1/3 sticks)
water

The crust is crucial. I’ll discuss its preparation last. Assume for now that you’ve rolled out the bottom crust and placed it in the pie pan.

Core, peel, and slice the apples. Place them in the crust. Sprinkle with sugar and cinnamon. Dot with butter. Roll out the top crust and place it on top. Seal the edge however you like. In about six places, jab a knife into the crust and twist to leave a hole for steam to escape. Sprinkle the crust with the teaspoon of sugar.

Bake in a preheated oven for 15 minutes at 450°F and then another 35 minutes at 350°F. Allow to cool. Serve, if you like, with vanilla ice cream or a good aged cheddar.

Now, the crust:

Mix the flour and salt in a large bowl. Place the lard or butter or lard/butter in the bowl. Cut it in with a pastry cutter.

Next, the water. Turn the cold water on in the kitchen sink so that it dribbles out in a tiny trickle. Hold the bowl with the flour mixture in one hand and a knife in the other. Let the water dribble into the bowl while you stir with the knife. The object is to add just enough water so that the dough is transformed into small dusty lumps. Don’t be vigorous with the knife, but don’t allow the water to pool. If the water is dribbling too fast, take the bowl away from the faucet from time to time. When you’re done, the dough will still look pretty dry.

Recipes usually call for about 5 tablespoons of water. This method probably uses about that much.

Actually, the dough will look so dry that you’ll think it won’t stick together when it’s rolled out. In fact, it probably won’t stick together, but trust me. This is going to work.

Tear off a sheet of plastic wrap and lay it on the counter. Place a bit more than half the dough on the sheet and cover it with a second sheet of plastic.

With a rolling pin, roll the dough out between the two sheets. Roll it roughly in the shape of a rectangle.

It won’t look great and it probably would fall apart if you picked it up.

Don’t pick it up. Remove the top sheet of plastic wrap and fold the bottom third up, and fold the top third down, then do the same horizontally, right and left.

Now replace the top sheet of plastic wrap and roll the dough out gently into a disk.

This time it should look pretty decent. This time the dough will stick together.

You should be able to remove the top sheet of plastic and, using the bottom sheet, turn it over into the pie pan. The crust should settle in nicely without breaking.

Form the top crust the same way.

This method rolls each crust twice—usually not a good idea because working the dough makes it tough. But remarkably, crusts produced this way are tender and light. I’m not sure why but I suspect it’s because the dough is fairly dry.

Notes:
• Cooking apples are tart apples. The best I know is the Rhode Island Greening, but they’re hard to find. Baldwins and Jonathans are decent, but they’re hard to find too. The British Bramleys are terrific. I’ve made good pies from the French Calville Blanc d’Hiver. But we’re not living in good apple times. Most stores don’t sell apples for cooking. When in doubt, use a mixture.
• The lemon juice and the larger quantity of cinnamon are for when you have tired apples with no oomph. The cheese also serves this purpose. It should be a respectable old cheddar and it should be at room temperature.
• Consumption of too many commercial pies makes me loath to add flour or cornstarch to pie filling. The flour is here in case you fear your apples will be too juicy. I don’t mind juice in a pie, in moderation. If adding flour, mix the apples, sugar, cinnamon, and flour in a bowl before pouring into the crust.
• Lard is best. Its melting point is higher than butter’s. It successfully separates the flour into layers for a light, crispy crust. Butter is more likely to saturate the flour and produce a heavy crust. Some like half butter/half lard, preferring butter for its flavor. But the flavor of lard is nice too, and its porkiness is wonderful with apple.


This recipe is taken from:


The Proof and the Pudding

What Mathematicians, Cooks, and You Have in Common

Jim Henle

“If you’re a fan of Julia Child or Martin Gardner—who respectively proved that anyone can have fun preparing fancy food and doing real mathematics—you’ll enjoy this playful yet passionate romp from Jim Henle. It’s stuffed with tasty treats and ingenious ideas for further explorations, both in the kitchen and with pencil and paper, and draws many thought-provoking parallels between two fields not often considered in the same mouthful.”—Colm Mulcahy, author of Mathematical Card Magic: Fifty-Two New Effects

#PiDay Activity: Using chocolate chips to calculate the value of pi

Try this fun Pi Day activity this year. Mathematician Tim Chartier has a recipe that is equal parts delicious and educational. Using chocolate chips and the handy print-outs below, mathematicians of all ages can calculate the value of pi. Start with the Simple as Pi recipe, then graduate to the Death by Chocolate Pi recipe. Take it to the next level by making larger grids at home. If you try this experiment, take a picture and send it in and we’ll post it here.

Download: Simple as Pi [Word document]
Download: Death by Chocolate Pi [Word document]

For details on the math behind this experiment please read the article below which is cross-posted from Tim’s personal blog. And if you like stuff like this, please check out his new book Math Bytes: Google Bombs, Chocolate-Covered Pi, and Other Cool Bits in Computing.

For more Pi Day features from Princeton University Press, please click here.


 

Chocolate Chip Pi

How can a kiss help us learn Calculus? If you sit and reflect on answers to this question, you are likely to journey down a mental road different than the one we will traverse. We will indeed use a kiss to motivate a central idea of Calculus, but it will be a Hershey kiss! In fact, we will have a small kiss, more like a peck on the cheek, as we will use white and milk chocolate chips. The math lies in how we choose which type of chip to use in our computation.

Let’s start with a simple chocolatey problem that will open a door to ideas of Calculus. A Hershey’s chocolate bar, as seen below, is 2.25 by 5.5 inches. We’ll ignore the depth of the bar and consider only a 2D projection. So, the area of the bar equals the product of 2.25 and 5.5 which is 12.375 square inches.

Note that twelve smaller rectangles comprise a Hershey bar. Suppose I eat 3 of them. How much area remains? We could find the area of each small rectangle. The total height of the bar is 2.25 inches. So, one smaller rectangle has a height of 2.25/3 = 0.75 inches. Similarly, a smaller rectangle has a width of 5.5/4 = 1.375. Thus, a rectangular piece of the bar has an area of 1.03125, which enables us to calculate the remaining uneaten bar to have an area of 9(1.03125) = 9.28125 square inches.

Let’s try another approach. Remember that the total area of the bar is 12.375. Nine of the twelve rectangular pieces remain. Therefore, 9/12ths of the bar remains. I can find the remaining area simply by computing 9/12*(12.375) = 9.28125. Notice how much easier this is than the first method. We’ll use this idea to estimate the value of π with chocolate, but this time we’ll use chocolate chips!

Let’s compute the area of a quarter circle of unit radius, which equals π/4 since the full circle has an area of π. Rather than find the exact area, let’s estimate. We’ll break our region into squares as seen below.

This is where the math enters. We will color the squares red or white. Let’s choose to color a square red if the upper right-hand corner of the square is in the shaded region and leave it white otherwise, which produces:

Notice, we could have made other choices. We could color a square red if the upper left-hand corner or even middle of the square is under the curve. Some choices will lead to more accurate estimates than others for a given curve. What choice would you make?

Again, the quarter circle had unit radius so our outer square is 1 by 1. Since eight of the 16 squares are filled, the total shaded area is 8/16.

How can such a grid of red and white squares yield an estimate of π? In the grid above, notice that 8/16 or 1/2 of the area is shaded red. This is also an approximation to the area of the quarter circle. So, 1/2 is our current approximation to π/4. So, π/4 ≈ 1/2. Solving for π we see that π ≈ 4*(1/2) = 2. Goodness, not a great estimate! Using more squares will lead to less error and a better estimate. For example, imagine using the grid below:

Where’s the chocolate? Rather than shading a square, we will place a milk chocolate chip on a square we would have colored red and a white chocolate chip on a region that would have been white. To begin, the six by six grid on the left becomes the chocolate chip mosaic we see on the right, which uses 14 white chocolate chips out of the 36 total, leaving 22 milk chocolate chips. So, our estimate of π is 4·(22/36) ≈ 2.4444. We are off by about 0.697.

Next, we move to an 11 by 11 grid of chocolate chips. If you count carefully, we use 83 milk chocolate chips of the 121 total. This gives us an estimate of 4·(83/121) ≈ 2.7438 for π, which corresponds to an error of about 0.398.

Finally, with the help of public school teachers in my seminar Math through Popular Culture for the Charlotte Teachers Institute, we placed chocolate chips on a 54 by 54 grid. In the end, we used 2232 milk chocolate chips (out of 2916), giving an estimate of 3.0617 with an error of about 0.0799.

What do you notice is happening to the error as we reduce the size of the squares? Indeed, our estimates are converging to the exact area. Here lies a fundamental concept of Calculus. If we were able to construct such chocolate chip mosaics with grids of ever increasing size, then we would converge to the exact area. Said another way, as the area of the squares approaches zero, the limit of our estimates will converge to π. Keep in mind, we would need an infinite number of chocolate chips to estimate π exactly, which is a very irrational thing to do!
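
For anyone who prefers silicon to chocolate, here is a minimal Python sketch of the same counting rule (color a square, or use a milk chip, exactly when its upper right corner lies inside the quarter circle); it reproduces the chip counts quoted above. The function name is mine; the rule is the one described in the article.

```python
import math

def chip_estimate(n):
    """Grid estimate of pi from an n-by-n grid over the unit square: a square gets
    a milk chip exactly when its upper right corner (a/n, b/n) lies inside the
    quarter circle of radius 1."""
    milk = sum(1 for a in range(1, n + 1)
                 for b in range(1, n + 1)
                 if a * a + b * b <= n * n)
    return milk, 4 * milk / (n * n)

for n in (4, 6, 11, 54):
    milk, estimate = chip_estimate(n)
    print(n, milk, round(estimate, 4), round(abs(math.pi - estimate), 4))
# n=4  ->    8 milk chips, estimate 2.0
# n=6  ->   22 milk chips (14 white), estimate 2.4444
# n=11 ->   83 milk chips, estimate 2.7438
# n=54 -> 2232 milk chips, estimate 3.0617
```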

And finally, here is our group from the CTI seminar along with Austin Totty, a senior math major at Davidson College who helped present these ideas and lead the activity, with our chocolatey estimate for π.

Pi Day: “Was Einstein Right?” Chuck Adler on the twin paradox of relativity in science fiction

This post is extracted from Wizards, Aliens, and Starships by Charles Adler. Dr. Adler will kick off Princeton’s Pi Day festivities tonight with a talk at the Princeton Public Library starting at 7:00 PM. We hope you can join the fun!

For more Pi Day features from Princeton University Press, please click here.


Robert A. Heinlein’s novel Time for the Stars is essentially one long in-joke for physicists. The central characters of the novel are Tom and Pat Bartlett, two identical twins who can communicate with each other telepathically. In the novel, telepathy has a speed much faster than light. Linked telepaths, usually pairs of identical twins, are used to maintain communications between the starship Lewis and Clark and Earth. Tom goes on the spacecraft while Pat stays home; the ship visits a number of distant star systems, exploring and finding new Earth-like worlds. On Tom’s return, nearly seventy years have elapsed on Earth, but Tom has only aged by five.

I call this a physicist’s in-joke because Heinlein is illustrating what is referred to as the twin paradox of relativity: take two identical twins, fly one around the universe at nearly the speed of light, and leave the other at home. On the traveler’s return, he or she will be younger than the stay-at-home, even though the two started out the same age. This is because according to Einstein’s special theory of relativity, time runs at different rates in different reference frames.

This is another common theme in science fiction: the fact that time slows down when one “approaches the speed of light.” It’s a subtle issue, however, and is very easy to get wrong. In fact, Heinlein made some mistakes in his book when dealing with the subject, but more on that later. First, I want to list a few of the many books written using this theme:

  • The Forever War, by Joe W. Haldeman. This story of a long-drawn-out conflict between humanity and an alien race has starships that move at speeds near light speed to travel between “collapsars” (black holes), which are used for faster-than-light travel. Alas, this doesn’t work. The hero’s girlfriend keeps herself young for him by shuttling back and forth at near light speeds between Earth and a distant colony world.
  • Poul Anderson’s novel, Tau Zero. In this work, mentioned in the last chapter, the crew of a doomed Bussard ramship is able to explore essentially the entire universe by traveling at speeds ever closer to the speed of light.
  • The Fifth Head of Cerberus, by Gene Wolfe. In this novel an anthropologist travels from Earth to the double planets of St. Croix and St. Anne. It isn’t a big part of the novel, but the anthropologist John Marsch mentions that eighty years have passed on Earth since he left it, a large part of his choice to stay rather than return home.
  • Larry Niven’s novel A World out of Time. The rammer Jerome Corbell travels to the galactic core and back, aging some 90 years, while three million years pass on Earth.

There are many, many others, and for good reason: relativity is good for the science fiction writer because it brings the stars closer to home, at least for the astronaut venturing out to them. It’s not so simple for her stay-at-home relatives. The point is that the distance between Earth and other planets in the Solar System ranges from tens of millions of kilometers to billions of kilometers. These are large distances, to be sure, but ones that can be traversed in times ranging from a few years to a decade or so by chemical propulsion. We can imagine sending people to the planets in times commensurate with human life. If we imagine more advanced propulsion systems, the times become that much shorter.

Unfortunately, it seems there is no other intelligent life in the Solar System apart from humans, and no other habitable place apart from Earth. If we want to invoke the themes of contact or conflict with aliens or finding and settling Earth-like planets, the narratives must involve travel to other stars because there’s nothing like that close to us. But the stars are a lot farther away than the planets in the Solar System: the nearest star system to our Solar System, the triple star system Alpha Centauri, is 4.3 light-years away: that is, it is so far that it takes light 4.3 years to get from there to here, a distance of 40 trillion km. Other stars are much farther away. Our own galaxy, the group of 200 billion stars of which our Sun is a part, is a great spiral 100,000 light-years across. Other galaxies are distances of millions of light-years away.

From our best knowledge of physics today, nothing can go faster than the speed of light. That means that it takes at least 4.3 years for a traveler (I’ll call him Tom) to go from Earth to Alpha Centauri and another 4.3 years to return. But if Tom travels at a speed close to that of light, he doesn’t experience 4.3 years spent on ship; it can take only a small fraction of the time. In principle, Tom can explore the universe in his lifetime as long as he is willing to come back to a world that has aged millions or billions of years in the meantime.

 

Was Einstein Right?

This weird prediction—that clocks run more slowly when traveling close to light speed—has made many people question Einstein’s results. The weirdness isn’t limited to time dilation; there is also relativistic length contraction. A spacecraft traveling close to the speed of light shrinks in the direction of motion. The formulas are actually quite simple. Let’s say that Tom is in a spacecraft traveling along at some speed v, while Pat is standing still, watching him fly by. We’ll put Pat in a space suit floating in empty space so we don’t have to worry about the complication of gravity. Let’s say the following: Pat has a stopwatch in his hand, as does Tom. As Tom speeds by him, both start their stopwatches at the same time and Pat measures a certain amount of time on his watch (say, 10 seconds) while simultaneously watching Tom’s watch through the window of his spacecraft. If Pat measures a time ∆t0 going by on his watch, he will see Tom’s watch tick through less time. Letting ∆t be the amount of time on Tom’s watch, the two times are related by the formula

∆t = ∆t0/γ,

where the all-important “gamma factor” is

γ = 1/√(1 − v²/c²).

The gamma factor is always greater than 1, meaning Pat will see less time go by on Tom’s watch than on his. Table 12.1 shows how gamma varies with velocity.

Note that this is only really appreciable for speeds greater than about 10% of the speed of light. The length of Tom’s ship as measured by Pat (and the length of any object in it, including Tom) shrinks in the direction of motion by the same factor.
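
Because the formula is so simple, the gamma factor is easy to tabulate; here is a minimal Python sketch (an added illustration, not from the book excerpt):

```python
import math

def gamma(v_over_c):
    """Relativistic gamma factor for a speed v given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

for beta in (0.1, 0.5, 0.9, 0.99, 0.999):
    g = gamma(beta)
    # If Pat measures 10 seconds, he sees Tom's watch tick through 10/gamma seconds.
    print(beta, round(g, 3), round(10 / g, 3))
```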

Even though the gamma factor isn’t large for low speeds, it is still measurable. To quote Edward Purcell, “Personally, I believe in special relativity. If it were not reliable, some expensive machines around here would be in very deep trouble.” The time dilation effect has been measured directly, and is remeasured almost every second of every day in particle accelerators around the world. Unstable particles have characteristic lifetimes, after which they decay into other particles. For example, the muon is a particle with mass 206 times the mass of the electron. It is unstable and decays via the reaction

μ⁻ → e⁻ + ν̄_e + ν_μ (an electron, an electron antineutrino, and a muon neutrino).

It decays with a characteristic time of 2.22 μs; this is the decay time one finds for muons generated in lab experiments. However, muons generated by cosmic ray showers in Earth’s atmosphere travel at speeds over 99% of the speed of light, and measurements on these muons show that their decay lifetime is more than seven times longer than what is measured in the lab, exactly as predicted by relativity theory. This is an experiment I did as a graduate student and our undergraduates at St. Mary’s College do as part of their third-year advanced lab course. Experiments with particles in particle accelerators show the same results: particle lifetimes are extended by the gamma factor, and no matter how much energy we put into the particles, they never travel faster than the speed of light. This is remarkable because in the highest-energy accelerators, particles end up traveling at speeds within 1 cm/s of light speed. Everything works out exactly as the theory of relativity says, to a precision of much better than 1%.
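
The muon numbers can be checked with the same gamma factor; a quick sketch (the specific speeds below are illustrative choices, since the text says only “over 99% of the speed of light”):

```python
import math

tau = 2.22e-6                       # muon lifetime at rest, in seconds (value from the text)

for beta in (0.99, 0.995, 0.999):   # "over 99% of the speed of light"
    g = 1.0 / math.sqrt(1.0 - beta ** 2)
    # moving muons live gamma times longer as seen from the ground
    print(beta, round(g, 2), f"{g * tau:.2e} s")
# already at 0.99c the lifetime is stretched by a factor of about 7,
# matching the "more than seven times longer" figure in the text
```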

How about experiments done with real clocks? Yes, they have been done as well. The problems of doing such experiments are substantial: at speeds of a few hundred meters per second, a typical speed for an airplane, the gamma factor deviates from 1 by only about one part in 10¹³. To measure the effect, you would have to run the experiment for a long time, because the accuracy of atomic clocks is only about one part in 10¹¹ or 10¹²; the experiments would have to run a long time because the difference between the readings on the clocks increases with time. In the 1970s tests were performed with atomic clocks carried on two airplanes that flew around the world, which were compared to clocks remaining stationary on the ground. Einstein passed with flying colors. The one subtlety here is that you have to take the rotation of the Earth into account as part of the speed of the airplane. For this reason, two planes were used: one going around the world from East to West, the other from West to East. This may seem rather abstract, but today it is extremely important for our technology. Relativity is a cornerstone of a multi-billion-dollar industry, the global positioning system (GPS).

GPS determines the positions of objects on the Earth by triangulation: satellites in orbit around the Earth send radio signals with time stamps on them. By comparing the time stamps to the time on the ground, it is possible to determine the distance to the satellite, which is the speed of light multiplied by the time difference between the two. Using signals from at least four satellites and their known positions, one can triangulate a position on the ground. However, the clocks on the satellites run at different rates than clocks on the ground, in keeping with the theory of relativity. There are actually two different effects: one is relativistic time dilation owing to motion and the other is an effect we haven’t considered yet, gravitational time dilation. Gravitational time dilation means that time slows down the deeper you are in a gravitational potential well. On the satellites, the gravitational time dilation speeds up clock rates as compared to those on the ground, and the motion effect slows them down. The gravitational effect is about six times as big as the motion effect, but both must be included to calculate the total amount by which the clock rate changes. The effect is small, only about four parts in ten billion, but if relativity weren’t accounted for, the GPS system would stop functioning in less than an hour. To quote from Alfred Leick’s textbook GPS Satellite Surveying,

Relativistic effects are important in GPS surveying but fortunately can be accurately calculated. . . . [The difference in clock rates] corresponds to an increase in time of 38.3 μsec per day; the clocks in orbit appear to run faster. . . . [This effect] is corrected by adjusting the frequency of the satellite clocks in the factory before launch to 10.22999999543 MHz [from their fundamental frequency of 10.23 MHz].

This statement says two things: first, in the dry language of an engineering handbook, it is made quite clear that these relativistic effects are so commonplace that engineers routinely take them into account in a system that hundreds of millions of people use every day and that contributes billions of dollars to the world’s commerce. Second, it tells you the phenomenal accuracy of radio and microwave engineering. So the next time someone tells you that Einstein was crazy, you can quote chapter and verse back at him!
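
As a rough cross-check of the quoted 38 μs figure, here is a back-of-the-envelope Python sketch using standard textbook values for Earth’s gravitational parameter and the GPS orbital radius; it ignores Earth’s rotation and orbital eccentricity, so it lands near, not exactly on, the 38.3 μs quoted above:

```python
import math

c  = 2.998e8          # speed of light, m/s
GM = 3.986e14         # Earth's gravitational parameter, m^3/s^2
R  = 6.371e6          # Earth's radius, m (ground clock)
r  = 2.656e7          # GPS orbital radius, m (about 20,200 km altitude)

v = math.sqrt(GM / r)                     # circular orbital speed, ~3.9 km/s
motion  = -v ** 2 / (2 * c ** 2)          # special-relativistic slowdown of the orbiting clock
gravity = GM * (1 / R - 1 / r) / c ** 2   # gravitational speedup (weaker potential in orbit)

day = 86400.0
print(round(motion  * day * 1e6, 1), "microseconds/day (motion)")        # about -7
print(round(gravity * day * 1e6, 1), "microseconds/day (gravity)")       # about +46
print(round((motion + gravity) * day * 1e6, 1), "microseconds/day net")  # about +38
```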

Fantasy Physics: Should Einstein Have Won Seven Nobel Prizes?

This guest post from A. Douglas Stone is part of our celebration of all things Einstein, pi, and, of course, pie this week. For more articles, please click here. Please join Prof. Stone at the Princeton Public Library on March 14 at 6 PM for a lecture about Einstein’s quantum breakthroughs.

Cross-posted with the Huffington Post.

Thanks to RealClearScience for posting about this article!!


Albert Einstein never cared too much about receiving awards and honors, and that included the Nobel Prizes, which were established in 1901, at roughly the same time as Einstein was beginning his research career in physics. In 1905, at the age of 25, Einstein began his ascent to scientific pre-eminence and world-wide fame with his proposal of the Special Theory of Relativity, as well as a “revolutionary” paper on the particulate properties of light, his foundational work on molecular (“Brownian”) motion, and finally his famous equation, E = mc2. In 1910, he was first nominated for the Prize and was nominated many times subsequently, usually by multiple physicists, until he finally won the 1921 Prize (awarded in 1922). Surprisingly, he did not win for his most famous achievement, Relativity Theory, which was still deemed too speculative and uncertain to endorse with the Prize. Instead, he won for his 1905 proposal of the law of the photoelectric effect—empirically verified in the following decade by Robert Millikan—and for general “services to theoretical physics.” It was a political decision by the Nobel committee; Einstein was so renowned that their failure to select him had become an embarrassment to the Nobel institution. But this highly conservative organization could find no part of his brilliant portfolio that they either understood or trusted sufficiently to name specifically, except for this relatively minor implication of his 1905 paper on particles of light. The final irony in this selection was that, among the many controversial theories that Einstein had proposed in the previous seventeen years, the only one not accepted by almost all of the leading theoretical physicists of the time was precisely his theory of light quanta (or photons), which he had used to find the law of the photoelectric effect!

In keeping with his relative indifference to such honors, Einstein declined to attend the award ceremony, because he had previously committed to a lengthy trip to Japan at that time and didn’t feel it was fair to his hosts to cancel it. Moreover, when the Prize was officially announced and the news reached him during his long voyage to Japan, he neglected even to mention it in the travel diary he was keeping. He had, however, taken one practical note of it in advance. When he divorced his first wife, Mileva Marić, in 1919, he agreed to transfer to her the full prize money, a substantial sum, in the form of a trust for the benefit of her and his sons, should he eventually win.

However, while Einstein himself barely dwelt at all on this honor, it is an interesting exercise to ask how many of the distinct breakthroughs Einstein made during his productive research career, spanning primarily the years 1905 to 1925, could be judged of Nobel caliber when placed in historical context and evaluated by the standards of subsequent Nobel Prize awards. Admittedly, this analysis has a bit in common with fantasy sports, in which athletes are judged and ranked by their statistical achievements and arguments are made about who was the GOAT (“greatest of all time”). Well, why not spend a few pages on this guilty pleasure, at least partly in the service of illuminating the achievements of this historic genius, even if Einstein would not have approved?

Let’s start with the Prize he did receive, which was absolutely deserved, if only the committee had had the courage to write the citation, “for his proposal of the existence of light quanta.” The law of the photoelectric effect, which they cited, only makes sense if light behaves like a particle in some important respects, and that is what he proposed in 1905. This proposal came at a time when the wave theory of light was absolutely triumphant and was even enshrined in a critical technology: radio. Not a single physicist in the world was thinking along the same lines as Einstein, and almost none of the important theorists were convinced by his arguments for another two decades. Nonetheless, the photon concept was unambiguously confirmed in experiments by 1925, and it is now considered the paradigm for our modern quantum theory of force-carrying particles. It is the first in a family of particles known as bosons, most recently augmented by the (Nobel-winning) discovery of the Higgs particle. So the photon is a Nobel slam dunk.
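For readers who want the formula, the law Einstein proposed can be written in standard textbook notation (the symbols below are my addition, not spelled out in the original post) as

\[
K_{\max} = h\nu - W,
\]

where \(h\nu\) is the energy of a single light quantum of frequency \(\nu\) and \(W\) is the energy needed to liberate an electron from the metal. The maximum kinetic energy of the ejected electron depends on the frequency of the light, not on its intensity, which is very hard to explain unless the light arrives in discrete quanta.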

We can move next to two more “no-brainers,” the two theories of relativity: the Special Theory, proposed in 1905, and the General Theory, germinated in 1907 and completed in 1915. These are quite distinct contributions. The Special Theory introduced the Principle of Relativity, that the laws of physics must all be the same for bodies in uniform relative motion. An amazing implication of this statement is that time does not elapse uniformly, independent of the motion of observers, but rather that the time interval between events depends on the state of relative motion of the observer. Einstein was the first to understand and explain this radical notion, which is now well verified by direct experiments. Moreover, Einstein’s concept of “relativistic invariance” is built into our theory of the elementary particles, and so it has had a profound impact on fundamental physics. However, here it must be noted that the equations of Special Relativity were first written down by Hendrik Lorentz, the great Dutch physicist whom Einstein admired the most of all his contemporaries. Lorentz just failed to give them the radical interpretation with which Einstein endowed them; he also failed to notice that they implied that energy and mass were interchangeable: E = mc². There are also a few votes out there for the French mathematician Henri Poincaré, who enunciated the Principle of Relativity before Einstein, but I can’t put him in the same category as Lorentz with regard to this debate. Einstein would have been happy to share Special Relativity with Lorentz, so let’s split this one 50-50 between the two.
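To put one formula behind the claim that elapsed time depends on the observer’s motion (standard notation, added here rather than taken from the original post): a clock moving at speed \(v\) is measured to run slow by the Lorentz factor,

\[
\Delta t = \gamma\,\Delta\tau, \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},
\]

where \(\Delta\tau\) is the time read off the moving clock itself and \(c\) is the speed of light; the same framework gives the rest-energy relation \(E = mc^{2}\).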

General Relativity, on the other hand, is all Albert. As with the photon, no one on the planet had even an inkling of this idea before Einstein. Einstein realized that the question of the relativity of motion was tied up with the theory of gravity: uniform acceleration (e.g., in an elevator in empty space) is indistinguishable from the effect of gravity on the surface of a planet; it gives one the same sense of weight. From this simple seed of an idea arose arguably the most beautiful and mathematically profound theory in all of physics, Einstein’s Field Equations, which predict that matter curves space and that the geometry of our universe is non-Euclidean in general. The theory underlies modern cosmology and has been verified in great detail by multiple heroic and diverse experiments. The first big experiment, which measured the deflection of starlight as it passed by the sun during a total eclipse, is what made Einstein a worldwide celebrity. This one is probably worth two Nobel Prizes, but let’s just mark it down for one.
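For the curious, the field equations referred to here can be written compactly (standard notation, my addition; the cosmological-constant term is omitted) as

\[
R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu} = \frac{8\pi G}{c^{4}}\,T_{\mu\nu},
\]

with the curvature of spacetime on the left and the energy and momentum of matter, \(T_{\mu\nu}\), on the right: matter tells space how to curve.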

Here we exhaust what most working physicists would immediately recognize as Einstein’s works of genius, and we’re only at 2.5 Nobels. But it is a remarkable fact that Einstein’s work on early atomic theory, what we now call quantum theory, is vastly under-rated. This is partially because Einstein himself downplayed it due to his rejection of the final version of the theory, which he dismissed with the famous phrase, “God does not play dice.” But if one looks at what he actually did, the Nobels keep piling up.

The modern theory of the atom, quantum theory, began in 1900 with the work of the German physicist Max Planck, who, in what he called “an act of desperation,” introduced into physics a radical notion, quantization of energy. Or so the textbooks say. This is the idea that when energy is exchanged between atoms and radiation (e.g. light), it can only happen in discrete chunks, like a parking meter that only accepts quarters. This idea turns out to be central to modern atomic physics, but Planck didn’t really say this in his work. He said something much more provisional and ambiguous. It was Einstein in his 1905 paper—but then much more clearly in a follow-up paper on the vibrations of atoms in solids in 1907—who really stated the modern principle. It is not clear whether Planck himself accepted it fully even a decade after his seminal work (although he was given credit for it by the Nobel Prize committee in 1918). In contrast, Einstein boldly applied it to the mechanical motion of atoms, even when they are not exchanging energy with radiation, and stated clearly the need for a quantized mechanics. So despite the textbooks, Einstein clearly should have shared Planck’s Nobel Prize for the principle of quantization of energy. We are up to 3.0 Nobels for Big Al.
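In symbols, the principle amounts to saying that an oscillator of frequency \(\nu\) can only hold energies

\[
E_{n} = n h \nu, \qquad n = 0, 1, 2, \ldots,
\]

where \(h\) is Planck’s constant (again, the notation is mine, not the post’s). Einstein’s 1907 step was to apply this quantization to the vibrations of atoms in a solid, not merely to the exchange of energy between matter and radiation.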

The next one in line is rarely mentioned. After Einstein proposed his particulate theory of light in 1905, he did not adopt the view that light was simply made of particles in the ordinary sense of a localized chunk of matter, like a grain of sand. Instead, he was well aware that light interferes with itself in a manner similar to water waves (a peak can cancel a trough, leading to no wave). In 1909, he proved mathematically that the particle and wave properties were both present in a single formula describing the fluctuations of the intensity of light. Hence, he announced that the next era of theoretical physics would see a “fusion” of the particle and wave pictures into a unified theory. This is exactly what happened, but it took fourteen years for the next advance and another three (1926) for it all to fall into place. In 1923, the French physicist Louis de Broglie hypothesized that electrons, which have mass (unlike light) and had always previously been conceived of as particles, actually have wavelike properties similar to those of light. He freely admitted his debt to Einstein for this idea, but when he received the Nobel Prize for “wave-particle” duality in 1929, it was not shared. But it should have been. Another half for Albert, at 3.5 and counting.
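The duality de Broglie proposed is captured by a single relation (standard notation, added here for context): a particle of momentum \(p\) has an associated wavelength

\[
\lambda = \frac{h}{p},
\]

the matter-wave counterpart of Einstein’s \(E = h\nu\) for light quanta.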

From 1911 to 1915 Einstein took a vacation from the quantum to invent General Relativity, which we have already counted, so his next big thing came in 1916 (he didn’t leave a lot of dead time in those days). That was three years after Niels Bohr introduced his “solar system” model of the atom, in which the electrons could only travel in certain “allowed orbits” with quantized energy. Einstein went back to thinking about how atoms absorb light, with the benefit of Bohr’s picture. He realized that once an atom had absorbed some light, it would eventually give that light energy back by a process called spontaneous emission: without any particular event to cause it, the electron would jump down to a lower energy orbit, emitting a photon. This was the first time anyone had proposed that the theory of atoms involved such random, uncaused events, a notion that became a second pillar of quantum theory. In addition, he stated that there was also a causal form of emission: shining more light on the atom could cause it to release its absorbed light energy, in a process called stimulated emission. Forty-four years later, physicists invented a device that uses this principle to produce the purest and most powerful light sources in nature, the LASER (Light Amplification by Stimulated Emission of Radiation). The principles of spontaneous and stimulated emission introduced by Einstein underlie the modern quantum theory of light. One full Prize please—now at 4.5.
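In modern notation (mine, not the post’s), Einstein’s 1916-1917 argument is usually summarized by a rate equation for the number \(N_{2}\) of atoms in the upper energy level sitting in radiation of spectral energy density \(\rho(\nu)\):

\[
\frac{dN_{2}}{dt} = -A_{21} N_{2} - B_{21}\,\rho(\nu)\,N_{2} + B_{12}\,\rho(\nu)\,N_{1},
\]

where the \(A_{21}\) term is spontaneous emission, the \(B_{21}\) term stimulated emission, and the \(B_{12}\) term absorption (\(N_{1}\) being the lower-level population). A laser works by arranging for the stimulated-emission term to dominate.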

After that 1916-1917 work, Einstein had some health problems and became involved in political and social issues for a while, leading to a Nobel batting slump for a few years. (He did still collect some hits, like the prediction of gravitational waves (a double) and the first paper on cosmology and the geometry of the Universe using General Relativity (a triple).) But he came out of his slump with a vengeance in 1924, when he received a paper out of the blue from an unknown Indian physicist, Satyendranath Bose. It was yet another paper about particles of light, and although Bose did not state his revolutionary idea very clearly, reading between the lines, Einstein detected a completely new principle of quantum theory: the idea that all fundamental particles are indistinguishable. This is the standard terminology in physics, but it is actually very misleading. Here, indistinguishability is not the idea that humans can’t tell two photons apart (like identical twins); it is the idea that Nature can’t tell them apart, and in a real sense interchanging the two photons doesn’t count as a different state of light.

When Bose applied this principle to light he didn’t get anything radically new; it was just a different way of thinking about Planck’s original discovery in 1900. But Einstein then took the principle and applied it to atoms for the very first time, with amazing results. He discovered that a simple gas of atoms, if cooled down sufficiently, would cease to obey all the laws that physicists and chemists had discovered for gases over the centuries, and to which no exception had ever been found. Instead, the gas should behave like a weird liquid or super-molecule known as a Bose-Einstein condensate. But remember, Bose had no clue this would happen; he didn’t even try to apply his principle to atoms. It turns out that Einstein condensation underlies some of the most dramatic quantum effects, such as superconductivity, which is needed to make the magnets in MRI machines and has been the basis for five Nobel Prizes. No knowledgeable physicist would dispute that Einstein deserved a full Nobel Prize for this discovery, but I am sure that Einstein would have wanted to share it with Bose (who never did receive the Prize).
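The statistics Einstein extracted from Bose’s paper are now written (standard notation, added for context) as the average number of particles occupying a single-particle state of energy \(\varepsilon\),

\[
\bar{n}(\varepsilon) = \frac{1}{e^{(\varepsilon-\mu)/k_{B}T} - 1},
\]

where \(T\) is the temperature, \(k_{B}\) Boltzmann’s constant, and \(\mu\) the chemical potential. Condensation sets in when, below a critical temperature, this formula forces a macroscopic fraction of the atoms into the lowest-energy state.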

So we are at 5.0 “units” of Nobel Prize but seven trips to Stockholm. And this leaves out other arguably Nobel-caliber achievements (Brownian motion, as well as the Einstein-Podolsky-Rosen effect, which underlies modern quantum information physics). And wait a minute—when someone shares the Nobel Prize, do we refer to them as a “half-Laureate”? No way. Even scientists who get a “measly” third of a Prize are Nobel Laureates for life. Thus, by the standard we apply to normal humans, Einstein deserved at least seven Nobel Prizes. So next time you make your fantasy scientist draft, you know who to take at number one.


A. Douglas Stone is the author of Einstein and the Quantum: The Quest of the Valiant Swabian.