METAPHYSICS – The Evolution of Evolution

The Evolution of Evolution*


                                   Tony Rothman


            If there is a grand truth about science, it is that science is a collective enterprise.  Researchers trade ideas and borrow any that come their way.  When colleagues and rivals are indistinguishable, borrowing becomes what in other circles goes by the name of theft; precursors are sometimes recognized, sometimes not, and more often are simply lost in the vast flood of papers.  Vanishingly few are the discoveries made by a single individual.  Strange, then, that even today the media so often portray the great advances of science as springing fully formed from the brow of towering geniuses who work in splendid isolation.

            No better example can be found than in the current celebration of Charles Darwin’s two-hundredth anniversary.  Certainly no scientific theory of the last four hundred years has had as much impact on human thought and culture as evolution.  Yet, the shallowness of the reportage merely highlights the fact that more than any other theory, evolution has been deprived of a public genealogy.   

             Some schoolchildren do know that the famously dilatory Charles Darwin was spurred to complete his Origin of Species upon receipt of an essay by Alfred Russel Wallace containing “exactly the same theory as mine.”  Less well known is that Charles was indebted for many of his ideas to his own grandfather, Erasmus, whom he conspicuously failed to acknowledge.

            Born in Nottinghamshire in 1731, Erasmus Darwin was the seventh child of Robert and Elizabeth.  He studied medicine at Cambridge and Edinburgh and became the most successful physician in England, meanwhile inventing a plethora of devices, including a horse-drawn carriage whose steering mechanism is the one used by automobiles today.  His experiments in electro-shock therapy stimulated not only the livers of his patients but also the imagination of Mary Shelley as she wrote Frankenstein.  A friend of Ben Franklin, Erasmus was the first to understand cloud formation, made contributions to geology, was an early convert to Lavoisier’s theory of combustion and became, to top it off, the leading English poet of the day.  He also proposed the theory of evolution.

            As early as 1770 fossils unearthed during the construction of the Harecastle tunnel convinced Erasmus that life as we know it was descended from a common ancestor.  To the family coat of arms, three scallop shells, he even added the motto E conchis omnia (“everything from shells”).  He first published his ideas in two long and wildly successful poems, The Loves of the Plants and The Economy of Vegetation, which make it clear that he believed not only in a common ancestor of man but in something like the big bang theory.  His ideas are further elucidated in the posthumously published Temple of Nature and in his 1000-page treatise, the Zoonomia of 1794.

While natural selection may not be quite announced in the Zoonomia, Erasmus observes that males of certain bird species are armed with claws which the females lack and concludes these weapons cannot be for fighting external enemies; they are for fighting “for the exclusive possession of the females.”  He goes on to say, “The final cause of this contest amongst the males seems to be, that the strongest and most active animal should propagate the species, which should thence become improved.”  Erasmus also reiterates his belief that all creatures descended from common “filaments,” or molecules.  In the most celebrated passage of the Zoonomia he asks

…would it be too bold to imagine that in the great length of time since the earth began to exist, perhaps millions of ages before the commencement of the history of mankind, would it be too bold to imagine that all warm-blooded animals have arisen from one living filament, which THE GREAT FIRST CAUSE endued with animality, with the power of acquiring new parts, attended with new propensities, directed by irritations, sensations, volitions, and associations; and thus possessing the faculty of continuing to improve by its own inherent activity, and of delivering down those improvements by generation to its posterity, world without end.


Erasmus’s willingness to accept geological timescales for the age of the Earth was incredibly foresighted.  In this he was more modern than Charles, who had to contend with physicists who refused to accept a geologic age for the Earth. (It goes without saying that here Erasmus outdistances creationists of our own era.) 

On the other hand, his idea that species might mutate through “volitions” does seem to veer in the direction of Lamarck, who followed a few years later and believed that characteristics acquired during the lifetime of an organism (a giraffe stretching its neck to obtain food) would be inherited by its offspring.  Erasmus, though, apparently had more general mechanisms of heredity in mind, understanding that the differentiation of bird beaks “had been gradually produced during many generations by the perpetual endeavor of the creatures to supply the want of food.”

            To some extent it may indeed have been due to a confusion with Lamarck that Erasmus’s ideas were discredited, though by the time the Zoonomia was published England was about to go to war with France, and nobody wanted to hear about the brotherhood of man or atheistic theories.

   Why Charles Darwin, born in 1809, was so reluctant to acknowledge his own grandfather is a matter for psychologists.  In his Autobiography, Charles overflows with praise of Charles Lyell, about whose Principles of Geology he says, “The science of Geology is enormously indebted to Lyell–more so, as I believe, than to any other man who ever lived.”  As for his grandfather, however, we know that while a student at the University of Edinburgh Charles studied the Zoonomia closely.  Yet the only mention of it in the Autobiography comes during a conversation with Robert Grant, a lecturer at Edinburgh who tried to convert Charles to the views of Lamarck and Erasmus.  “I listened in silent astonishment, and as far as I can judge, without any effect on my mind.  I had previously read the Zoonomia of my grandfather, in which similar views are maintained, but without producing any effect on me.  Nevertheless, it is probable that the hearing rather early in life such views maintained and praised may have favored my upholding them under a different form in my Origin of Species.  At this time I admired greatly the Zoonomia; but on reading it a second time after an interval of ten or fifteen years, I was much disappointed; the proportion of speculation being so large to the facts given.”

In his defense, Charles took pains to point out that Erasmus was the wild theorizer and he the meticulous observer.  “I look at a strong tendency to generalise as an entire evil,” Charles once wrote to a close friend.  Indeed, perhaps the main reason The Origin of Species of 1859 occupies its deserved position in the canon of great and influential books is that it overflows with facts and observations.  Yet Origin also overflows with hundreds of names.  Not one is Erasmus. 

 At the age of seventy, Charles had a change of heart and penned a biography of his grandfather, publishing it as a 127-page preface to an 86-page essay on Erasmus by Dr. Ernst Krause of Germany.  The attempt backfired.  Not only did it touch off a lifelong feud with family friend Samuel Butler, author of Erewhon, who correctly accused Darwin of tampering with Krause’s essay so as to minimize Erasmus’s contributions, but Charles’s own essay as published damns Erasmus with faint praise.  Of his evolutionary ideas Charles says almost nothing, deferring that task to Krause.  It turns out that Charles had allowed his daughter Henrietta, who hated everything Erasmus stood for, to censor the work.  All favorable passages, including the final peroration in which Charles praised Erasmus for his generosity and prophetic spirit, were cut.  It was only in 1958 that Darwin’s granddaughter Nora Barlow restored some passages and appended correspondence relating to the controversy.  Well, as Charles put it to T.H. Huxley, “the history of error is unimportant.”

Erasable.  Anthropologist Loren Eiseley* has pointed out that Origin  was already in proof when Lyell himself caught Darwin in the act of ignoring Lamarck.  In the first edition, Darwin omitted “by inadvertence” Wallace’s name in the final summary, despite the historic joint announcement and publication of their papers.  Of course, as Darwin told Lyell, he “never got a fact or idea” from Lamarck.  One wonders what Charles would have said about Edward Blyth.

In fact he said nothing.  Blyth (1810-1873) was a friend of Darwin’s and a pioneering naturalist who wrote major papers on heredity and zoology for The Magazine of Natural History.  In those papers Blyth clearly saw the importance of variation and sexual selection, although he mistakenly interpreted natural selection as a force tending to stabilize species, rather than to diversify them.  In 1835 Blyth wrote, “…as in the brute creation, by a wise provision, the typical characters of a species are, in a state of nature, preserved by those individuals chiefly propagating, whose organisation is the most perfect, and which, consequently, by their superior energy and physical powers, are enabled to vanquish and drive away the weak and sickly, so in the human race degeneration is, in great measure, prevented by the innate and natural preference which is always given to the most comely…”

   Could Darwin have been unaware of Blyth’s work?  Well, they were friends.  Furthermore, by tracing Darwin’s own footnotes Eiseley has been able to determine that Darwin was in possession of and read the same issues of Natural History in which Blyth’s papers appeared.  Yet, in Origin Darwin repeats many of Blyth’s assertions almost verbatim without acknowledgment, and Eiseley makes a convincing case that it was Blyth, not Thomas Malthus as Darwin claimed, who started him on the road to natural selection.

If any precursors of evolution are now more forgotten than Edward Blyth, they must be Sir William Lawrence (1783-1867) and Patrick Matthew (1790-1874).*  Lawrence is forgotten intentionally, because his book Natural History of Man, published in 1819, came to conclusions so distasteful to those times (and in part to ours) that it was suppressed.  A professor at the Royal College of Surgeons, called by T.H. Huxley “one of the ablest men whom I have known,” Lawrence did not arrive at natural selection and believed that species were fixed.  Nevertheless, he emphatically rejected Lamarck’s dominant notion of acquired characteristics and stated that 1) The physical, mental and moral differences in the races of man are hereditary; 2) The different races have arisen through mutations; 3) Sexual selection has improved the beauty of advanced races and governing classes; 4) “Selection and exclusions” are the means of change and adaptation; 5) The study of man as an animal is the only proper foundation for research in medicine, morals and even in politics.  “The diversification of physical and moral endowments which characterize the various races of man,” he wrote, “must be entirely analogous in their nature, causes, and origin, to those which are observed in the rest of the animal kingdom and therefore must be explained on the same principles.”  The origins of man “cannot be settled by an appeal to the Jewish scriptures.”

   Lawrence’s insistence on treating man as an animal utterly offended the England of his day.  The Church was outraged; Lawrence was repudiated by the leaders of his profession.  The Lord Chancellor refused to allow copyright for the book on the grounds that it contradicted Scripture.  So crushed was Lawrence by the affair that he withdrew Natural History from circulation and clammed up for the rest of his life.  Nevertheless, both Darwin and Wallace read Lawrence.  Darwin does not seem to have been impressed, but Wallace was.  Indeed he may well have taken Lawrence’s ideas to their logical conclusion in postulating natural selection.  We also know that Blyth cited Lawrence as a major source, and in that way Lawrence indirectly influenced Darwin.

When Charles learned of Patrick Matthew’s book, On Naval Timber and Arboriculture, published in 1831, he himself wrote, “Mr. Patrick Matthew… gives precisely the same view on the origin of species…as that propounded by Mr. Wallace and myself…He clearly saw…the full force of the principle of natural selection.”

Certainly Matthew, writing twenty-eight years before Darwin, states, “…As the field of existence is limited and pre-occupied, it is only the hardier, more robust, better suited to circumstance individuals, who are better able to struggle forward to maturity, these inhabiting only the situations to which they have the superior habit of adaptation and greater power of occupancy than any other kind …This principle is in constant action…those only come forward to maturity from the strict ordeal by which Nature tests their adaptation to her standard of perfection and fitness to continue their kind of reproduction.”  Matthew, however, in contrast to Darwin, was a catastrophist and viewed evolution as being spurred on by geologic upheavals, interrupted by eons of stability (in this he seems to share certain ideas with modern theories).

Little is known about Matthew and it does not appear that Darwin knew of his work before publication of the Origin of Species.  Nevertheless, while with hindsight we tend to view naturalists such as Lawrence and Matthew as merely highlighting Darwin’s correctness, in their own time they did not consider themselves Darwin’s precursors.  They should be credited for their own accomplishments and, as we celebrate Darwin’s two-hundredth, let us remember that science would look very different if shadows never haunted the wings.


* A different version of this essay appeared in Everything’s Relative and Other Fables from Science and Technology (Wiley, Hoboken, NJ, 2003).

* Loren Eiseley, “Charles Darwin and Edward Blyth,” Proceedings of the American Philosophical Society 103, 94-158 (1959). 

* See, e.g., Kentwood Wells, “Sir William Lawrence: a study of pre-Darwinian ideas on heredity and variation,” and “The historical context of natural selection: The case of Patrick Matthew,” J. Hist. of Biology 4, 319-361 (1971); 6, 225-258 (1973), and references therein.


Apocalypse CERN

Tony Rothman


Between those who have watched the Large Hadron Rap on YouTube, and regard the Large Hadron Collider at CERN as the all-time greatest inspiration to pop music, and those who await the imminent destruction of the world by the black holes that the LHC is certain to create, everyone on the planet is accounted for.


You know what I’m talking about: The Large Hadron Collider, CERN’s giant particle accelerator on the border between France and Switzerland, may create ultra microscopic black holes capable of swallowing the world in a matter of months, putting an end to all misery.  Long before the black-hole flap hit the New York Times, I was party to several internet discussions about the matter and contacted by a disciple of Otto Rossler, a European chemist who opposes the LHC because of the black-hole danger.  The disciple, an artist, urgently requested that I come out publicly in support of Rossler and denounce the LHC as a threat to mankind.  Since then, Rossler has become the most visible opponent of the LHC, appearing in European newspaper, magazine, television, radio and YouTube interviews.  He claims that according to his calculations the black holes produced by the LHC are indeed a threat to earth.  Other LHC opponents, Walter Wagner and Luis Sancho, went so far as to file suit in Hawaii to prevent the machine from being turned on.  The suit was dismissed in October 2008, on jurisdictional grounds.


Let me say at once that, unlike most physicists, I have read Rossler’s paper and judge it to be a crackpot piece of work.  Rossler is an eminent chemist, and perhaps for that reason the European media has not dismissed him, but the paper belies his public claims that his calculations prove something.   He has calculated nothing at all; he has merely made a series of misinformed and incorrect statements about Einstein’s general theory of relativity. 


What interests me is not whether Rossler is correct—he isn’t.  Serious and comprehensive LHC risk assessments have been carried out; they rightly conclude that the threat posed by a black hole is about as probable as an electron turning into a dinosaur, which Sarah Palin alone might credit.  What interests me more is the attitude displayed by the media to the affair—and the response of scientists.


If the media had been interested in truth more than controversy, Rossler never would have been afforded a platform.  For its part, however, the physics community hasn’t helped.  In general its attitude has been typical: an arrogant dismissal of public concern.  Few physicists have bothered to read Rossler’s paper and fewer have countered his assertions in public.  They can’t be bothered and the reason they can’t be bothered can be found in one of the three knee-jerk responses a physicist makes to any claim: “It’s wrong,” “It’s trivial,” or “I did it first.” 


When the topic of the LHC black holes first arose, every physicist I know, including several world-class ones, immediately enlisted cosmic rays as a counterargument: cosmic-ray protons of far higher energy than the LHC will produce have been bombarding the earth for billions of years.  If a black hole were going to be produced and swallow the earth, it already would have done so, but here we are.


Unfortunately the reflexive response, designed to end discussion, only played into the hands of the alarmists.  Microscopic black holes produced by cosmic rays would be moving at high velocity and pass harmlessly through a planet-sized body, while the LHC black holes would be nearly at rest, potentially capable of causing much greater damage.  A more detailed rebuttal must thus be made (and has been), but if a physicist condescendingly espoused this argument as an expert witness at a trial, the prosecution would ensure that he left the stand with egg on his face.


Physicists do not want to deal with public concerns because they believe that the LHC belongs to them and that the public is incapable of understanding the issues.  But as the judge in the Hawaii case stated, matters such as the safety of the LHC extend far beyond the physics community.  She dismissed the case, but not on the merits.


It is perhaps time that some permanent and impartial mechanism be established to deal with scientific safety issues.  The LHC is far from the first scientific project to raise public alarm.  Walter Wagner himself filed a similar lawsuit in 2000 to prevent the Relativistic Heavy Ion Collider at Brookhaven National Laboratory from being turned on.  Throughout the 1960s and 1970s the public frequently protested recombinant DNA and low-level electromagnetic radiation.  Nuclear power has been a constant source of public protest, and more recently so have wind farms.  We must expect that in the years to come scientific safety issues will arise ever more frequently.


Past controversies have been dealt with on an ad-hoc basis, either by local communities, or by scientific panels set up by the National Research Council or even by the organizations directly involved in the controversies, which was the case with the LHC.  Such organizations have an obvious vested interest in the outcome and in the public eye are not above suspicion.  Nevertheless, it is clear that lawsuits are not the best mechanism to resolve such controversies.  When the New York Times article appeared I asked a friend, Giovanni Bonello, a justice on the European Court of Human Rights, what mechanism should be put in place to resolve scientific safety disputes.  He emphatically replied not the courts: in matters involving international organizations, such as CERN, courts are perpetually hamstrung over jurisdictional matters.  Additionally there is the highly technical nature of the issues, forcing judges to sit through weeks of expert testimony, which they do not understand.

A more reasonable model might be Doctors Without Borders.  It is a private, transnational, nongovernmental volunteer organization, which observes strict neutrality.  It is certainly easier for physicians to observe neutrality in war zones than for scientists in safety disputes.  Here the issues are technical, and physicists, not artists, will necessarily be called upon to evaluate risks such as the LHC’s.  Nevertheless, one might avoid frivolous lawsuits if laboratories voluntarily submitted to arbitration by an impartial organization, and there are enough competent scientists unattached to the involved projects to supply a pool of experts.  The main problem is that scientists would rather pursue their own research, however inconsequential it may be, than enlist in a project for the greater good.


Scientists’ reluctance to lift their heads from the sand, which I share, and the physics community’s same “it’s wrong” attitude have, apart from safety issues, led to a paradoxical situation.  In order for the putative LHC black holes to destroy the planet, they must survive long enough to do the job.  Stephen Hawking’s famous result, that black holes radiate away their mass, has led most physicists to believe that any black holes created by the LHC would evaporate after approximately a trillionth of a trillionth of a second, far too short a time to do any damage.  Nevertheless, in recent years it has become clear that Hawking’s calculation made a number of unjustified assumptions and, for example, did not take into account the effect of the black-hole radiation itself on the spacetime in which it resides.
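For scale, a back-of-the-envelope estimate of my own (not a figure drawn from the LHC safety reports): the standard semiclassical Hawking lifetime of an ordinary four-dimensional black hole of mass M is

```latex
% Semiclassical Hawking evaporation time for a black hole of mass M:
t_{\mathrm{evap}} \;\simeq\; \frac{5120\,\pi\, G^{2} M^{3}}{\hbar c^{4}}
% For an assumed LHC-scale mass M ~ 10 TeV/c^2 ~ 2 x 10^{-23} kg,
% this formally gives t ~ 10^{-84} s.
```

For a hypothetical LHC-scale mass this gives a lifetime vastly shorter even than the Planck time of about 10^-44 seconds, which is itself a warning: at such masses the semiclassical assumptions behind the formula have already broken down.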


A few years ago, Grigory Vilkovisky, a Russian physicist, published a trilogy of papers claiming that if one properly took this effect into account, black holes would evaporate only about half their mass; the rest would remain.  If Vilkovisky’s conclusion is correct, it would not only radically alter our ideas of black-hole physics, but would have a tremendous impact on our ideas about dark matter and would pave the way for the possibility that any black holes created at CERN might actually survive long enough to be taken seriously.


Unlike Rossler, Vilkovisky is a highly respected physicist.  I should acknowledge that I have known him for thirty years.  Bryce DeWitt, the “father of quantum gravity,” regularly referred to Grisha as “the smartest young man in the Soviet Union,” and I consider him a physicist of a similar caliber to Hawking.  He has had an unfortunate career, is something of a recluse and is currently in bad health.  In contrast to Rossler’s paper, however, Vilkovisky’s have been published through standard channels in a peer-reviewed journal.  They consist of calculations that a physicist can in principle follow.  They may be wrong, and most of Vilkovisky’s colleagues with whom I have spoken think they probably are.  On the other hand, none of these scientists has read them.


So: Vilkovisky’s papers, which deserve more attention than Rossler’s, are long and difficult and have met with no response.  But conceivably they are right.  Rossler, whose papers are easier to read and nonsensical, has become the focus of media, even scientific attention.  Clearly this picture needs to be corrected, and the physics community is the only one that can do it.  





METAPHYSICS – Al and Marilyn

Al and Marilyn*

Tony Rothman


Film lovers over forty may remember the scene in Nicolas Roeg’s 1985 Insignificance where “The actress,” who bears an uncanny resemblance to Marilyn Monroe, explains the theory of relativity to “The Professor,” whose wild hair leaves no doubt as to his identity.  One wonders whether Roeg could make his film today with impunity, because Albert and Marilyn have more in common than relativity; they have in common celebrity.


Several years ago I had a book in press, Everything’s Relative and Other Fables From Science and Technology.  Given the title, the publisher’s house artist not unreasonably designed a cover that included a photographic image of Albert Einstein.  The publisher (Wiley) had properly licensed the photo from Bill Gates’ firm Corbis.  One would have thought that would end the matter.


One would have thought.  Six weeks before publication I received a frantic email from the editor.  Albert was to be stricken from the cover.  Why?  For fear of being sued by the “Einstein estate.”  To a physicist who grew up miles from Einstein’s home in Princeton, New Jersey, the phrase “Einstein estate” rang oddly.  Albert died in 1955; his children and literary secretary are all dead.  What Einstein estate?  The editor didn’t know; evidently the attorneys did, and despite my strenuous protests Einstein vanished, to be replaced by a locomotive and E=mc2.


A little investigation revealed that indeed no “Einstein estate” existed, but that the Hebrew University of Jerusalem and Princeton University Press owned the rights to all of Einstein’s writings which were not copyrighted by anyone else.  Moreover, HUJ had authorized Beverly Hills’ Roger Richman Agency to license Einstein’s image for promotional purposes and to “prevent unauthorized use of the likeness and image of Albert Einstein.”  Yes, a Hollywood agency specializing in protecting the rights of movie stars claimed to have exclusive rights to Einstein (and Sigmund Freud). 


It wasn’t clear that Richman could prevent the use of a legally obtained photograph on a book jacket, so I phoned the agency and asked point blank whether use of Einstein’s photo on a book fell under its “jurisdiction.”  The spokesman asked if the book was about Einstein.  “In part,” I answered, to which he replied that I should follow the publisher’s attorneys’ advice, whatever that happened to be.  I interpreted this to mean he didn’t know.  While the cover was still in flux, I had the editors send Richman the chapters devoted to Einstein.  What was there to lose?  Eventually Richman answered: “We do not wish to participate in the publishing of the book entitled Everything’s Relative,” which presumably translates as, “we’re not sure we can sue you now; just wait.”


At issue, you see, is what has become known as the “right of publicity,” a.k.a. “right of celebrity.”  A celebrity is entitled to financial gain from use of his or her image or likeness, and photographers must obtain releases before publishing such images.  Complicating matters is that state, not federal, law governs publicity.  In most common-law states publicity rights die with the celebrity, but in other states the rights are “descendible.”  In California, post-mortem rights extend for seventy years (Hollywood) and in Tennessee, in perpetuity (Elvis).  New Jersey, where Einstein resided, is a common-law state, which apparently means that New Jersey is so short of celebrities that no one has bothered writing down any statutes.  There are, however, precedents.  The most widely cited case took place in 1984 when a New Jersey court held that an Elvis impersonator violated the rights of Elvis Presley Enterprises.  Celebrities and their rights do exist in NJ; the state, though, has not yet established a duration for prohibition on impersonating the King.


What of the Person of the Century?  Several lawyers gave me several opinions: 1) The use of Einstein photos on a book jacket could be considered advertising (magazine covers are) and Richman might stake a claim;  2)  As long as the book was about Einstein (several chapters) there should be no problem; 3) As long as the copyright was cleared from Corbis we were cleared too.  A representative from Corbis said that a photo used inside the book was safer than on the cover.


Confused?  Welcome.  To muddy the waters further, there is the question of jurisdiction.  Richman, in an attack on a website posting Einstein quotations, acknowledged that New Jersey law is the relevant one, but one lawyer thought that California law might apply because California is the location of the executor.  Actually, Jerusalem is.  I eventually asked the Einstein Archives of the Hebrew University by what law they could restrict the use of Einstein’s image, and received no reply.


Richman’s tactic, of course, was one of “virtual litigation”: The agency flexes its muscle, publishers get cold feet and back off rather than fight out a real court battle.  As a result Richman and HUJ accrue a monopoly on Einstein.  In the case of my publishers it succeeded, but not without a large dose of irony: Shortly after my book appeared, the Richman agency was bought by Corbis, the firm that had authorized the use of the photo to begin with.


The battle over Einstein is a mere skirmish next to the one over Marilyn Monroe.   Originally handled by Richman, Monroe’s images until recently were licensed through the agency CMG Worldwide and Monroe’s estate, MMLLC.  The latter is controlled by Anna Strasberg, widow of Marilyn’s acting coach Lee Strasberg, to whom she willed the bulk of her property.  Strasberg and CMG have accumulated over $30 million marketing Marilyn items.


In 2004, CMG and MMLLC sued the children of four photographers who had taken photographs of Marilyn during her lifetime; the children continued to license the photos on various items, including calendars, handbags and wine bottles.  CMG, headquartered in Indianapolis, alleged that the sale of t-shirts bearing Marilyn’s image violated Indiana statutes, which recognize publicity rights for a full 100 years after death.  One inevitably wonders whether CMG’s presence in the state and the nearly infinite monopoly period were coincidental.


But the photographers’ heirs did not roll over.  They countersued in New York and California, claiming that Monroe was a New Yorker and her publicity rights expired upon death.  In May 2007 judges in both states found for the photographers on the grounds that publicity laws simply did not exist in New York, California or Indiana at the time of Monroe’s death.  Property that did not exist at the time of death cannot be transferred, including publicity rights. 


CMG and MMLLC struck back.  Since no law existed at the time in question, why not create one?  In 2007 state senator Sheila Kuehl (a former actress) spearheaded a move in the California legislature to gut stem-cell bill SB 771 and replace its contents with a bill whose avowed purpose was to “abrogate” the NY and CA decisions.  The legislature passed the law, retroactively assigning publicity rights to celebrities who died before 1985.


The saga didn’t end there, however.  In September 2008, a NY federal judge summarily ruled in favor of the photographers’ children.  The court recognized that the California law specifically allowed retroactive transfer of publicity rights to spouses and children of the deceased but not to other beneficiaries.  The bottom line: Strasberg and CMG do not own the publicity rights to Marilyn.


That’s where things stand at the moment; stay tuned. 


So, where does all this leave us?  About the only thing that is clear is that money, and money alone, is the name of the game.  According to a lawyer quoted by Discover magazine about my case,* “Every living person has the right to protect his or her own image,” but according to other legal eagles that right is reserved for celebrities, in which case, if a colleague snapped a photo of me and Al debating quantum mechanics at a scientific conference, I might have to license his picture from Corbis, but he wouldn’t have to license mine.  For that matter, does Corbis own the rights to the photos in the Lotte Jacobi Archive at the University of New Hampshire, which Einstein gave to Jacobi?  With the CA law in force, if impersonating Elvis Presley is illegal, should director Roeg be sued retroactively, and are playwrights henceforth banned from writing comedies about Bill Clinton?  It is left as an exercise for the reader to construct further legal absurdities.


The one other thing that is clear is that laws of the CA type are dragging us toward the “French” model, where an executor, nine times removed from the deceased, owns all the rights.  Long ago we entered the realm of the ludicrous.  If one believes the declarations page of the Dover Memoirs of Hector Berlioz, the copyright is owned by the executor of the estate of the fellow who in 1932 published a revision of the 1884 translation of the 1870 original, which appeared after the composer’s death and for which he never received a sous.  Is this to be the right of celebrity, which unlike a memoir is not even the creation of an artist but conferred upon the personality by the public?  The thought that a century from now a Paris Hilton impersonator could be sued by her estate is not only peculiar but frightening.



* An earlier version of this post was published by US #1 newspaper, Sept. 14, 2005.

* Discover magazine, online version, March 5, 2008.

METAPHYSICS – Japanese Temple Geometry, by Tony Rothman

Japanese Temple Geometry*

Tony Rothman


            With the American daily news having transformed itself into an incessant trumpeting of economic Armageddon, it might seem the height of escapism, not to mention irrelevance, to contemplate a vanished Japanese mathematical tradition known as temple geometry.  Escapism it surely is; irrelevant, maybe.

            In the year 1600 Tokugawa Ieyasu defeated his rival warlords at the battle of Sekigahara and three years later became shogun of Japan.  The Tokugawa family ruled Japan for the better part of three hundred years.  Immediately upon Ieyasu’s death in 1616, his successors, having had their fill of Jesuits and Franciscans, began the systematic expulsion of foreigners from the country.  By 1640, all Western missionaries and merchants had been forcibly evicted, and it became a crime punishable by death for Japanese to travel abroad.  For the next two centuries, Japan’s sole contact with the West was through the Dutch East India Company, which was permitted to occupy a small island called Deshima  in Nagasaki harbor.

            Drastic without a doubt, the Tokugawa’s isolationist policies nonetheless produced a brilliant cultural renaissance.  The late seventeenth century saw the flowering of many of the traditions for which Japan is famous: tea ceremonies, garden architecture, haiku, noh and kabuki drama, the ukiyo-e or “floating world” prints.

            Histories of the Tokugawa shogunate do not mention that the seventeenth century also witnessed the rise of a homegrown mathematics, one largely uninfluenced by Western developments, such as the invention of calculus by Newton and Leibniz.  The samurai warriors had been pacified; many became highly educated government functionaries and, to supplement their meager salaries, some moonlighted as math teachers, fanning out into the countryside to teach reading, writing and ’rithmetic—and geometry—in small private schools.

            It had long been a Japanese tradition to hang votive tablets in Buddhist temples and Shinto shrines.  Such tablets might display a picture of a horse, the image being a substitute for the offering of a genuine horse to the temple.  But around 1660—we don’t know the exact date—a strange and wonderful custom emerged.  Lovers of mathematics began to hang wooden tablets engraved with geometry problems under the eaves of religious buildings.  These sangaku, a Japanese word that literally means “mathematical tablet,” were often large—a meter wide and three or four meters long.  Typically a sangaku contained a dozen problems, with a description, diagram and answer for each—but rarely full solutions.  The tablets themselves are works of art.  One of the most beautiful is framed by a pair of carved dragons, another is gilded; all are adorned with attractive, brightly colored figures drawn to resemble fans or kites, or whatever everyday object inspired the particular problem.

During the Tokugawa period, thousands of tablets appeared all over Japan.  Only 910 survive today.  We are certain that over 1,700 have been lost or destroyed, and from mentions of tablets in old books the 910 could represent as few as two percent of the original number.  From the inscriptions, we know that people from all walks of life created sangaku—samurai mathematicians, merchants, farmers, women and children.  We have problems from twelve-year-old boys and sixteen-year-old girls.  Many sangaku exercises are elementary, similar to those we encounter in a high school geometry course.  Others are astonishingly difficult, and a handful anticipate theorems discovered in the West.  Sometimes the methods the Japanese employed were unwieldy, especially in matters requiring calculus, but in other respects they were simpler than those taught today.  In yet other cases we cannot be certain exactly how the folk mathematicians solved a given problem.

It is clear that sangaku were hung both as challenges to other devotees and as thanks to God for mathematical progress.  Westerners often refuse to concede that mathematics can constitute a form of worship, can be sacred.  These are merely math problems, nothing more.  But whether they are math problems or concert oratorios or church relics or rocks, the sacredness of objects is never intrinsic; their status is conferred by the believer.  Moreover, the motto of Zen Buddhism might be taken to be “discipline in the service of enlightenment.”  What could be more disciplining—and enlightening—than mathematics? 

The practice of hanging sangaku gradually died out as the Japanese adopted Western mathematics after the collapse of the Tokugawa shogunate in 1868, and today the custom is virtually unknown, even in Japan.  For the past twenty years a high school teacher, Fukagawa Hidetoshi, has been instrumental in bringing sangaku to a wider audience, and recently I collaborated with him on a book about the subject.**

For me, a physicist, it is pleasing to realize that there have existed societies less math-phobic than our own.  Japan is not alone, although America might be.  At the Princeton physics department, where I teach, it has become a running joke that the best incoming students are foreigners, and not long ago I quipped to a colleague that anyone whose surname ends in “ovich,” “adzic” or “escu” should be placed in the honors section without discussion.  The Eastern Europeans’ sterling performance is due to the remnants of the high-powered, no-nonsense Soviet educational system.  Asian and Asian American students, no longer to anyone’s surprise, also outperform their “native” counterparts.  They are hardly genetically superior; they are imbued with the Buddhist work ethic.

Contrast this with the American attitude toward math, a schizophrenic one embodied in the catch phrase “Do the math,” which rarely refers to anything more advanced than counting.*  On the one hand, Americans acknowledge math’s utility, if not its beauty; on the other, we refuse to train our students adequately, believing that one must be a born “rocket scientist” to be competent.  A fourth-grade teacher recently told me that in her school district there is no science or math curriculum—it is every teacher for herself, and most of them do not know how to plot a straight-line graph or compute the number of seconds in a year.  Even among my Princeton students, the mindset persists that one can do physics and engineering without calculus.  And how often have instructors heard non-majors complain, “I’m interested in the concepts, not the math”?  In science, unfortunately, the two are impossible to disentangle.

With their “show me” attitude, Americans are repeating the age-old question, “What is the point of higher mathematics?”  Mathematicians tend to reply by quoting Benjamin Franklin’s version of their perennial answer: “What science can there be more noble, more excellent, more useful for men, more admirably high and demonstrative than mathematics?”

Yet even in Franklin’s response, there is more than a hint of the utilitarian.  Certainly the technological civilization we inhabit would collapse were there not a sufficient number of people versed in higher mathematics.  All modern physics, chemistry, electronics, engineering and computer security rely on more algebra, calculus, number theory and geometry than anyone who says “Do the math” could possibly imagine.  The development of alternative energy strategies will not be carried out by people inadequately trained in mathematics.

Unfortunately, even from the utilitarian point of view the prospects do not appear promising.  In a recent review of our book, the critic lamented that the study of geometry has decayed to such a primitive state in Britain that students under forty would find the book a severe challenge.  The same might be said of this country.  But the reviewer was not bemoaning the lack of appreciation of geometry’s utility.  Real mathematicians tend to be insulted if you insinuate that their work has any practical value.  Neither are young people attracted to mathematics because of its practical applications.  They like math because it opens up a universe of infinite possibilities.  The reviewer was speaking of the loss of cultural heritage.

It is here that the feudal Japanese farmers can offer their services.  They did not solve their temple geometry problems because they were practical, but because they were beautiful.  It would be naïve to suppose that every student will become a competent mathematician, but emphasizing the utility of mathematics is perhaps the wrong strategy.  Emphasize its beauty, its pleasures, and the skills developed in the attainment of this beauty will follow.

* A shorter version of this post was broadcast by the Australian Broadcasting Corporation on their program “Perspective,” Wednesday, November 19, 2008.

** Sacred Mathematics: Japanese Temple Geometry (Princeton University Press, 2008).

* “Do the Math,” Posted November 17, 2008.

Introducing Tony Rothman’s new column “METAPHYSICS: The World at Large Through the Eyes of a Scientist”

I’m pleased to introduce a new semi-monthly column by writer, physicist, and Princeton University lecturer Tony Rothman.  His most recent book, with Fukagawa Hidetoshi, is called SACRED MATHEMATICS: Japanese Temple Geometry.  Please enjoy his inaugural post!

“Do The Math”

Tony Rothman

The word “metaphysics” derives from the Greek meta ta physika. It was originally used by Aristotle’s Hellenistic editors merely to refer to his books that came after the books on physika—the things of nature. Thus “metaphysics”—after the things of nature. In this series I do not intend primarily to discuss the things of nature, the latest and most dazzling scientific discoveries, trends and fashions. I would like instead to explore how our world looks through the eyes of a professional physicist, one trained in mathematics and steeped in analytical habits. My particular area of expertise is cosmology, the study of the early universe, but like any physical scientist I value facts and data over opinion, pay close attention to the logic of an argument and show an appreciation for a carefully designed experiment or an elegant mathematical demonstration. To those of us raised in the scientific community such an outlook seems reasonable. When we listen to the news, we realize we do not think much like journalists, talk show hosts or politicians. Sometimes we wonder whether we are space aliens.

Hearing “Do the math” does frequently make me ask what planet I inhabit. Over the past few years, “Do the math” has become an American catch phrase. As far as I can tell, it usually refers to counting: “Do the Democrats have enough votes to pass this bill in Congress? You do the math.” It is a sad commentary on twenty-first century America that an activity human beings are supposed to have mastered five or six thousand years ago is considered higher mathematics. Rarely do I hear “Do the math” applied to something as advanced as multiplication; division is out of the question.

I speak seriously. For the past several years I have taught introductory physics at Princeton University. Two years ago we gave our usual final exam at the end of the second semester, which is devoted to electricity and magnetism. Princeton freshmen are easily the best undergraduates I have taught in twenty-five years of teaching and they are far better trained than I was at their age. By the end of the course we have covered some sophisticated material, including an introduction to Maxwell’s equations and even a nontrivial topic in calculus known as surface integrals. On the final exam we decided to give them a break with an easy problem. From some basic quantities they needed to arrive at an equation that amounted to A = B²C. Almost everyone got that far. We next asked: if B were lowered by a factor of one thousand, how much would C have to change to keep A the same?

Of the two hundred or so students who took the exam, approximately twenty used their brains. If B goes down a thousand times, B² goes down a million times; therefore to keep A constant C must increase by a factor of one million. That is all that was required. The vast majority of students went back to their calculators, numerically recomputed B and C from the information provided and found a new value for A. With all the arithmetic mistakes that could and did occur by doing the problem on a calculator, about 60% of the students arrived at the wrong answer. Those who got it right often said that C had to increase by a factor of 999,999.9998 and rounded their answer off to one million. This is a little like the engineer who said that two plus two equals four to within a tolerance of .0002, and got the job.

A formal way of stating the problem is to say that since A remains constant, the ratio A/A = 1. Therefore, the ratio of the old value of B²C to the new value of B²C must also be 1, and consequently the old value of B²C must equal the new value of B²C. Reasoning by proportions, or ratios, was known to the ancient Chinese and is something I was taught in sixth or seventh grade. It is one of the basic tools in the arsenal of any scientist. Not only has the tool been lost to the calculator generation but so has the concept of an exact answer.
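The whole argument fits in one line of symbols; here is a restatement, with “old” and “new” subscripts added only for clarity:

```latex
\frac{A_{\text{new}}}{A_{\text{old}}}
  = \frac{B_{\text{new}}^{2}\,C_{\text{new}}}{B_{\text{old}}^{2}\,C_{\text{old}}} = 1
\quad\Longrightarrow\quad
C_{\text{new}} = \left(\frac{B_{\text{old}}}{B_{\text{new}}}\right)^{2} C_{\text{old}}
  = 1000^{2}\,C_{\text{old}} = 10^{6}\,C_{\text{old}}.
```

No calculator required, and the factor of one million is exact.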

The act of dividing one quantity A by another quantity B to get a ratio that compares the sizes of A and B may be the single most important act a numerate person can perform. Yet the ability to make that comparison is vanishing before our eyes. My favorite recent example is the Cingular/AT&T ad “Fewest dropped calls.” The ad is great because it is meaningless. Clearly the company with zero customers will have the fewest dropped calls. It goes without saying that the entire advertising industry is based on ignoring standards of comparison, in other words, by omitting the denominator of a fraction. “Doctors recommend…” How many doctors? What percentage?

Less amusing examples of denominator omission now affect us all. Before me is an article on plug-in hybrid cars that claims 100 mpg for the prototype. It is true that, by the odometer, one can drive 1,000 miles on 10 gallons of gasoline: 100 mpg. What the claim ignores is the energy needed to charge the battery and the energy lost in transmission from the power plant, both of which are significant. Without entering the debate on the merits of plug-in hybrids, what is clearly called for is a proper ratio to measure vehicle efficiency: the number of miles driven per total amount of energy required, perhaps with an adjustment for emissions.
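Restoring the denominator is a one-line division once the hidden energy is counted. Here is a minimal sketch in Python; every number below (battery draw, grid loss, the energy content of a gallon of gasoline) is an assumed, illustrative figure, not data about any actual vehicle:

```python
# The advertised "mpg" counts only gasoline; a fair efficiency ratio
# divides miles by *total* energy used.  All figures are illustrative.

GALLON_KWH = 33.7        # approximate energy content of a gallon of gasoline, kWh

miles = 1000
gasoline_gal = 10        # what the odometer-based "100 mpg" claim counts
battery_kwh = 250        # assumed electricity drawn from the wall over those miles
grid_loss = 0.10         # assumed fraction lost in transmission from the power plant

advertised_mpg = miles / gasoline_gal            # → 100.0

# Total energy: gasoline plus wall electricity, grossed up for grid losses,
# expressed in gallon-of-gasoline equivalents.
total_gal_equiv = gasoline_gal + battery_kwh / (1 - grid_loss) / GALLON_KWH

true_mpg_equiv = miles / total_gal_equiv         # roughly half the advertised figure

print(advertised_mpg)
print(round(true_mpg_equiv, 1))
```

With these made-up inputs the "100 mpg" prototype delivers about 55 miles per gallon-equivalent of total energy, which is the kind of correction the denominator makes.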

More seriously, about a month ago the New York Times reported that the total money worldwide tied up in derivatives is roughly 550 trillion dollars. Whether this number is accurate, I don’t know, but I do know that it is about ten times the world’s gross product. One does not have to be a scientist to realize that there is simply not enough money in the world to cover such bets and that the system must collapse.

The moral of these anecdotes is that one number in isolation means little. Only when it is compared with a standard does it give us knowledge. Unfortunately, this elementary truth is ignored on a daily basis, not only by Princeton students, proponents of future cars and financial wizards, but by the news media as a whole. NPR, the BBC and the NY Times routinely present figures without comparison. “The number of unemployed Americans this year has increased by two million.” “Six thousand million tons of carbon were released into the atmosphere in 2006.” Are these large numbers or are they small numbers? How does one know?

Only when we learn what fraction two million people is of the total workforce does the fact acquire meaning. The simple act of quoting numbers as percentages rather than absolute figures, a procedure known to Chinese peasants three thousand years ago, would instill a great deal of numerical hygiene into public discourse.
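The “simple act” amounts to a single division. A minimal sketch, where the workforce total is an assumed round figure in the neighborhood of the U.S. labor force at the time, used only for illustration:

```python
# Converting an absolute figure into a percentage of a standard of comparison.
newly_unemployed = 2_000_000    # "increased by two million"
workforce = 155_000_000         # assumed total U.S. workforce (illustrative round number)

pct = 100 * newly_unemployed / workforce
print(f"{pct:.1f}% of the workforce")   # → 1.3% of the workforce
```

Reported as “about 1.3 percent of the workforce,” the same fact carries its own standard of comparison.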