Monday, July 9, 2018

review -- Threads in the Sash: The Story of the Métis People by Fred Shore


Every Canadian knows one thing about the Métis: that they are descended from the union of French fur traders and Indigenous women. That bare little fact usually snoozes in the Canadian mind for a lifetime, perhaps alongside a hazy image of Louis Riel and the factoid of his hanging. But the Métis story deserves a fuller fleshing out, not only because the details are interesting and, in this era of reconciliation, highly relevant, but also because something remarkable, something uncommon, lies at its heart. In the space of only two centuries, a new people was born, a new nation, a unified society with its own language, economy and culture. Between the early 1600s and 1800s the Métis developed from non-existence to a golden age.

For many years no one quite understood what was happening. Children who were born to French fur traders and Indigenous women were regarded simply as French or Indigenous, depending on where they lived. Those who were sent away to be raised in French settlements were accepted there as fully French, with no special meaning attaching to their parentage. Those who stayed with their parents in Indigenous communities were accepted as fully Indigenous. The idea of Métis did not exist.

However, as the fur trading system expanded further and further into the Great Lakes region, the great distances became problematic for the fur companies. A single season was no longer adequate to ship trade goods from headquarters in Quebec City, Montreal and Trois-Rivières to the French and First Nations trappers. As a result, the companies set up depots to store supplies over the winter and staffed them with fur traders and their Indigenous wives and families. Those families living in small, secluded depots, separated from both French and First Nations communities, quietly, unwittingly, laid the foundations of the future Métis nation. A new culture began to take shape, unique to the depots, primarily a blend of French and Indigenous cultures but with the addition of distinctive new practices. Interestingly, the depot people still did not see themselves as having their own identity.

When the French trading system reached beyond the Great Lakes in the late 1700s, into what was then known as Rupert’s Land, the isolation ended and the Métis nation was born. In the open lands of the west the depot people gathered and realized their commonality. Rupert’s Land was English, under the governance of the Hudson’s Bay Company. The Métis stood out distinctly because of their language, their Catholicism, virtually every part of their culture. Their self-awakening had begun.

By about 1820 they began a period of rapid self-development and growing self-confidence, an era they later came to regard as their golden age. Recognizing the vast demand for pemmican to supply the hundreds of canoe brigades fanning out across the enormous distances of western and northern Canada, the Métis seized the opportunity and created a powerful new economy based on the buffalo hunt. The effort of organizing hunts on a massive scale inspired them to develop organizational skills to a high level and to institute rules and practices that were later transferred successfully to the military and political spheres. After a hunt, the meat was processed into pemmican, which the Métis sold, insisting always on cash. By refusing to accept the traditional HBC scrip, they freed themselves from the HBC monopoly and were able to buy their trade goods more cheaply south of the border. In an effort to protect its monopoly, the HBC issued laws and regulations, but these had no effect on the Métis, who continued to operate as they liked, whether as trappers, farmers or hunters. Their best years, these years of prosperity and self-assertion, lasted until the process of Confederation began.

The rest of Métis history is much less upbeat. As Confederation approached, English newcomers from Ontario arrived, greedy for land and power, hostile to the Métis, openly racist, and ready to use fraud and violence to achieve their ends. They were largely supported by John A. Macdonald and the central Canadian government. In less than a generation the Métis lost their economic base and were driven off their lands around the forks of the Red River into the outer fringes of the west. They were hardly alone in suffering and being dispossessed during the expansion of Canada. All Indigenous people suffered. As the “taming” of the West continued, many of the smaller, marginalized groups, those without a clear national identity, such as “non-status Indians,” were eventually absorbed into the Métis nation.

The largest of these other groups were the descendants of HBC employees, some of whom had, like the French fur traders, married Indigenous women. However, their experience was very different from that of the French. Because the HBC system focused everything on the fort — trappers came to the fort rather than the company going out to the trappers — Indigenous women had to live in the fort with their husbands, and children were raised in the fort, where they were given a British upbringing. But the HBC officials brought deep-seated racist attitudes from Britain, and although the children lived in British settlements and were raised in British ways, they were never accepted in society, always cruelly stigmatized as “half-breeds”. Instead of recognizing their potential and recruiting them as valuable employees, the HBC exploited them only as cheap labour. As their numbers grew, they moved out of the forts and took up farming in the surrounding areas, rejected, yet always thinking of themselves as British. Thus they were never able to develop an independent sense of identity. Eventually, however, their position as permanent exiles became untenable, and they too merged into the Métis nation.

The final chapters of Fred Shore’s Threads in the Sash spell out the details of the promises made to the Métis, the promises broken, the treaties ignored, the abuses perpetrated, the theft of land and rights, all aimed at crushing a proud and enterprising people. The book also presents the current legal and moral case against the Canadian government and the grounds the Métis have for seeking compensation. Starting in the late 1960s, a century after their dispossession at the Red River, the Métis have re-awakened as a people, regrouped, reorganized, and determined a way forward that they hope will lead to a restoration of some of their former independence. As the largest of Canada’s Indigenous groups, numbering almost half a million, one-third of all the Indigenous people of Canada, the Métis must arrive at a satisfactory settlement with the rest of Canada if Canada’s ambitious reconciliation process is to have any chance of success.


Saturday, May 12, 2018

review -- The Infidel and the Professor: David Hume, Adam Smith, and the Friendship That Shaped Modern Thought


The Infidel and the Professor: David Hume, Adam Smith, and the Friendship That Shaped Modern Thought
by Dennis C. Rasmussen


For historical writers, a feud is a gift—lurid attacks, entertaining insults, factions, court cases—all guarantee drama. Love affairs are even easier with ecstasies, cris de coeur, torrents of letters, the language heated and glowing, rising sometimes into poetry. But friendship, what is there to write about in a friendship? What will the emotional highlight be, a letter of congratulation, a heartfelt testimonial? The author can note the times and places of their meeting, but even if Boswell were on the sideline recording every word, it would likely consist of little more than good-humoured banter mixed with shop talk. In The Infidel and the Professor political science professor Dennis C. Rasmussen tries, quite successfully, to bring to life the friendship between the two premier figures of the eighteenth-century Scottish Enlightenment: David Hume, about whom Isaiah Berlin wrote “no man has influenced the history of philosophical thought to a deeper and more disturbing degree,” the infidel, the great skeptic, whose far-reaching doubts unnerved the religious establishment; and Adam Smith, the professor, the placater of the establishment, author of The Wealth of Nations, which has been described precisely as “the one book between Newton’s Principia and Darwin’s The Origin of Species that actually, substantially, and almost immediately started improving the quality of human life and thought” and extravagantly as “probably the most important book that has ever been written.”

Hume, twelve years senior to Smith, is remembered today for his philosophical works, with John Locke and George Berkeley as a founder of British empiricism, for his famous analysis of causality as merely the “constant conjunction” of events, and for his skeptical examination of arguments for religion. But in his essays and in his six-volume History of England he wrote about much more than philosophy, including political economy, where Smith would make his mark. Rasmussen spends a good deal of time tracing—with a light touch, thankfully—the pervasive intellectual influence Hume exerted on Smith, presenting the first of Smith’s two books, The Theory of Moral Sentiments, as a quiet dialogue with Hume. Even The Wealth of Nations, Smith’s magnum opus, his most original work, was influenced in critical ways by Hume’s writings.

 Intellectual influences aside, however, depicting the friendship remains problematic. Because Hume was a regular letter writer, Rasmussen does what he can by quoting Hume’s invitations to Smith to join him in Edinburgh or meet him in London or Paris. And he reveals in detail their itineraries as they move about, even from house to house, showing how their paths crossed, or might have crossed, even when he can offer nothing about whether they did in fact meet or what happened if they met. He quotes their expressions of esteem and describes the small favours they did for each other in the business of publishing and book promotion. It is all gentle and civilized, entirely fitting for major thinkers of the Enlightenment. But it does not stir the blood.

 To add pizzazz to the book, Rasmussen makes set pieces of the few incidents which were dramatic(-ish). The oft-told story of Hume’s encounter with Jean-Jacques Rousseau, for example, gets its own chapter. Rousseau, fleeing arrest warrants in France and Switzerland for his radical ideas, and frightened by the mobs that had stoned his house, accepted an offer from Hume to help get him out of France and find him shelter in England. Friends of Hume warned “you are warming a viper in your bosom,” and, sure enough, the edifying spectacle of a great, disruptive writer coming to the aid of a fellow great, disruptive writer soon fell to pieces. Rousseau, unstable and delusional, accused Hume of leading a conspiracy to silence him and bury him in obscurity. Upon Rousseau’s returning to France incognito, Hume published documentation of the affair and, in his private correspondence, made several waspish comments about Rousseau, probably, given his genial personality, the only remarks of that nature he ever made in his life.

For twenty years after finishing his History of England, Hume stopped publishing. “Too old, too fat, too lazy, and too rich,” he explained to friends. “When I see my bulk on a shelf, as well as when I see it in a glass, I would fain prevent my growing more corpulent either way.” It was during and after his final illness that Smith’s part in the friendship was put to the test. Hume asked Smith to oversee the publication of his as yet unpublished Dialogues Concerning Natural Religion. It would be his most thorough (and skeptical, of course) discussion of the rational arguments for religion. Despite their many years of friendship and despite the poignant timing of the request, Smith refused. Although Smith was probably a Deist, if not an atheist, almost certainly not a Christian, he had lived his whole life hiding his true beliefs to avoid the hornet’s nest of the religious establishment. Hume did not press him and found another acquaintance to take on the job. In the end, however, Smith redeemed himself somewhat and was roundly condemned as a result. While the establishment eagerly waited to hear of Hume’s deathbed conversion to Christianity or, just as good, of his dying in spiritual agony, Smith published a tribute that described Hume dying serenely and concluded with a sentence that caused an uproar: “Upon the whole, I have always considered him, both in his lifetime and since his death, as approaching as nearly to the idea of a perfectly wise and virtuous man as perhaps the nature of human frailty will permit.” It generated the eighteenth-century equivalent of a Twitterstorm.

The Infidel and the Professor may offer no new revelations or overturn big theories, but it makes for a thoroughly enjoyable read. Rasmussen’s prose is transparent and easy, free of the usual academic clunkiness. And, luckily for the reader, we hear much more from witty and good-humoured Hume than from serious and reserved Smith, giving us, for example, this account of Hume’s visit to Maria Theresa, the Holy Roman Empress: “After we had a little conversation with her Imperial Majesty, we were to walk backwards, through a very long room, curtseying all the way. And there was very great danger of falling foul of each other, as well as of tumbling topsy-turvy. She saw the difficulty we were in and immediately called to us, Allez, allez, messieurs, sans ceremonies. Vous n’etes pas accoutumés a ce mouvement et le plancher est glissant. ("Go on, go on, sirs, without ceremony. You are not accustomed to this movement and the floor is slippery.") We esteemed ourselves very much obliged to her for this attention, especially my companions, who were desperately afraid of my falling on them and crushing them.”

Tuesday, April 17, 2018

review -- Empress of the East: How a European Slave Girl Became Queen of the Ottoman Empire by Leslie Peirce


Empress of the East: How a European Slave Girl Became Queen of the Ottoman Empire
by Leslie Peirce


The sultan’s harem was an object of constant fascination for the West. For European women it was probably a nightmare, a symbol of horror and sexual degradation; for men it seemed a pleasure garden, a perfect setting for erotic fantasies. For the sultan himself, however, it was neither. While hardly a palace of pain, the harem had a serious purpose that was not about indulging his lusts, for above all else it served a dynastic function. It was a factory for male heirs. No real importance attached to the sultan’s momentary feelings (never mind the concubine’s) provided the essential duty was performed — one sultan put the empire in jeopardy because he was not attracted to women — and once the woman became pregnant the erotic relationship ended. She was banned from his bed and withdrew permanently into the women’s quarters of the palace.

Unlike the royal houses of Europe, where primogeniture was designed to ensure orderly succession after the death of a king or queen, the Ottomans developed a very different system. The sultan would create several potential heirs from among the concubines. After giving birth to a son, the concubine, now a royal mother, would separate from the sultan and devote her life to raising her son. All her efforts were directed toward inculcating in him the character and the political and military skills required to outmaneuver, when the sultan died, all the sultan’s other sons, so that he could seize the throne, whether through skill or violence. (Female children, free from a future of this high-stakes competition, were prized and raised with great affection.)

In the early years of the empire, before concubines were turned into royal mothers, sultans had married princesses of other nations, as European royalty did. This practice, however, did not last, the Ottomans judging that the danger presented by foreign-born wives with divided loyalties outweighed the benefit of creating political alliances. The marriageable daughters of leading Ottoman families were also ruled out as potential mates on the grounds that this, too, might encourage challenges to the ruling dynasty. Thus concubines and the dynastic harem came into being. But where could a steady supply of concubines come from? Because the women would live as slaves, Islamic law presented a sticking point: it forbade a Muslim from enslaving another Muslim. The solution was for the Ottomans to buy Christian slaves from Crimean Tatars, whose periodic raids virtually emptied villages over wide areas in the nearby countries to the north, in what are now the Balkan states, Ukraine and southern Russia.

One such slave was a clever girl of seventeen, probably from southern Russia. Despite the horrific experience of enslavement, she must have impressed her captors, for they named her Hürrem, Persian for “joyful” or “laughing.” She probably began in the household of a high official before being given as a gift to the sultan. “Young but not beautiful, although graceful and petite” was how she was described in a report written by a Venetian ambassador.

Suleyman the Magnificent, aged twenty-six, had already sired a son with another concubine before he encountered the girl. Then, as tradition dictated, the mother and the boy were shunted aside, and in 1520 Suleyman moved on to Roxelana, as she was then known, the next womb in the assembly line. That’s when something unaccountable happened between the two, something completely out of the norm, something that in time overturned many of the empire’s precedents and traditions. Many people thought she exercised black arts over the sultan and called her a witch. Today there would be people pointing to the Stockholm Syndrome. But the most credible explanation is that Suleyman and Roxelana simply fell deeply, passionately in love.

What alerted the rest of the world was that, after giving birth, Roxelana continued as Suleyman’s mistress rather than being banished to a remote nursery. Suleyman did not turn his attentions to another concubine. In fact, it was later said that he was never unfaithful to her in all his life. Within a short space of time, the couple had five more children. Nothing like that had ever happened in Ottoman history. More and more, Roxelana appeared with him in the main palace, normally off limits to women except for brief conjugal visits. Finally she moved there permanently into her own apartments. And after his mother died, Suleyman freed her from slavery, and the two married, forming a monogamous, nuclear family in the sultan’s palace, mother, father and children. Throughout the empire and throughout Europe people were amazed.

If Roxelana had done nothing else, her transformation from slave girl to sultan’s wife would have ensured her a place in history. But after she settled into her new position, she proved to be much more than a clever girl who merely knew how to take advantage of opportunities. Her intelligence she placed in the service of the sultan and the empire, keeping herself up to date on political affairs, advising him, even engaging in diplomacy of her own by corresponding with and sending envoys to influential figures in foreign countries. Other powerful women were of special interest to her, such as Princess Bona of Milan, who married the King of Poland, and her daughter, Isabella, Queen of Hungary (a letter to Isabella began, “We are both born from one mother, Eve …”). She also made a name for herself through many charitable works, starting with a large complex in a neighbourhood in Istanbul known for its women’s market. It was the third largest complex in the city — no one could mistake her power — consisting of a mosque, two schools, a fountain, and a hospital for women. In cities and towns across the empire and beyond she founded soup kitchens, hostels, baths, and mosques, including in the holiest cities of Islam, Mecca, Medina, and Jerusalem. After her death, succeeding generations of royal Ottoman women looked to Roxelana as an example and made themselves wielders of real political power, serving as advisors to their sons and sometimes serving as regents.

Empress of the East is an excellent biography of Roxelana, covering most of what it is possible to know about this extraordinary woman. It is especially strong in explaining Ottoman traditions and putting Roxelana into her political context. It is not, however, a vivid portrait. Contemporary sources are scant. Some of her letters to Suleyman have survived, but they consist largely of effusive missives telling him how much she misses him. Some European diplomats sent reports on what they observed about the Ottoman court. But mostly the author tries to put Roxelana's life together through circumstantial evidence. It produces truth and precision at the expense of drama. For drama one could turn to the Turkish television series "Magnificent Century," available on Netflix (sometimes), in which Roxelana plays a big part. It contains a lot of historical fiction, of course, but it makes a welcome complement to this more serious biography. Afterwards one might listen to the second movement of Haydn’s Symphony No. 63, which was written about 1780 as incidental music for a stage work featuring Roxelana.

Sunday, January 7, 2018

review -- I Am Not a Brain by Markus Gabriel and Consciousness: a very short introduction by Susan Blackmore

For many people, philosophy is a swamp, a madhouse, an ear-splitting cave filled with pointless, hair-splitting, logic-chopping argumentation. That may be true, but periodically we all find ourselves falling into that swamp, that madhouse, that cave. In the early days of computers, digital technology lured us in, asking us to ponder whether robots would one day look us in the eye with human-like intelligence. Much hair-splitting and argumentation ensued, with no clear results. Today the philosophical abyss opens again, thanks to sweeping claims made by neuroscience, which believes it has found the answers to some big questions. Does a mind exist or only a brain? What exactly is our inner life, our consciousness? Where does consciousness take place? How can something immaterial make connections with the material body? Is there any reality to the self, the ego, our feeling that we are the agents of our own actions? With recent discoveries about how our thoughts and actions depend on the brain, how can we believe that we have free will? Most of these questions go back many hundreds of years, and the philosophers who took them on (Descartes, Leibniz, Kant, Russell, and others) disagreed with each other in a thousand ways. Perhaps we are nearing a day now when science will provide the final answers.

In her clear and concise monograph, Consciousness: a very short introduction, psychologist Susan J. Blackmore outlines the scientific findings on consciousness. The experiments and case studies she presents are entertaining and thought-provoking, as they seek evidence in dreaming, synaesthesia, multiple personalities, Ouija boards, out-of-body and near-death experiences, animal consciousness, and so on. Even ordinary experiences seem to yield insights. For example, since people can commute to or from work for half an hour or more and have no memory of doing it, they must have been conscious in some way in order to navigate and obey traffic signals. Yet a consciousness that leaves no trace in memory immediately afterwards is hardly the normal sort. In the 1960s one of the most startling discoveries was made when the brains of subjects performing very simple physical actions, such as moving an arm, were scanned. Parts of the brain associated with preparing for physical movement were found to be activated a full half-second before the subjects themselves thought they had decided to move the arm. This raised grave doubts as to whether we are really agents of our own actions, whether the self exists, and whether we have free will.

Dubious Conclusions

Philosophical questions are ubiquitous in Blackmore’s account. Again and again as she describes what science says about consciousness, she bumps into the problem of explaining the connection between what scientists observe from the outside and what we as individuals experience from the inside, and she admits repeatedly that she cannot give an answer. She uncovers indications that seem to undermine the common-sense view but never quite gets to proofs. To cover the gap she invokes future discoveries—coming “soon,” she says—when our technologies will be more advanced. Perhaps we will even discover the elusive “neural correlates of consciousness.” In the meantime, however, Blackmore does not bind herself to the evidence at hand, instead deeming the preliminary indications to be, likely, the whole truth. This leads to some peculiar theses. The mind, she believes, that is, the self, the entire mental world, is an illusion, and we ought to live dutifully keeping in mind that it is all an illusion: "This is tough, but I think it gets easier with practice," she says, without offering any tips on how it is to be done. Free will is also an illusion, according to Blackmore, but, since studies have shown that rejecting belief in free will increases one’s tendency to depression, we should live "as if" we believe in free will. Her book is an excellent introduction to the science of consciousness, but when it addresses ancient philosophical questions, it founders badly.


A Critique of Neurocentrism

In the recently published I Am Not A Brain: Philosophy of Mind for the Twenty-First Century German philosopher Markus Gabriel takes the philosophical questions head-on in a multi-pronged attack on what he calls “neurocentrism,” the blurring, even the identification, of mind and brain. The claims of today’s neuroscientists and psychologists, he believes, are riddled with omissions, incoherence, and bad logic. In a book sometimes dense with argument, sometimes light and spacious, making its points with references to Fargo and Doctor Who almost as often as Kant and Hegel, Gabriel subjects the pillars of neurocentrism to close scrutiny, picking away at their logic, exposing their presumptions, and investigating alternative explanations.

One of the most powerful images driving us toward the neurocentric, materialistic view is that of a machine-like universe consisting of nothing but particles and energy, all locked into a chain of causality stretching from the beginning of the universe until its end. Because our brains belong to that realm, and because the brain is the originator, apparently, of all thought, it is argued that we are mere automata (as are all other conscious creatures), our entire mental world functioning beyond our control, inescapable and foreordained. Gabriel attempts to weaken the force of that image, pointing out, for example, that the current state of physics is not a closed, finished system: it cannot yet integrate gravity with quantum mechanics, it has no account of dark matter and dark energy, and causality seems vitiated by probabilities. However, he accepts the validity of determinism as applied to the world of matter; to take it further, he says, is to over-extend one model of explanation over the entire cosmos.

The World Does Not Exist

Gabriel’s most unusual argument was presented at length in a previous book (and TED Talk) entitled Why the World Does Not Exist. It is not possible, he believes, to step outside everything that exists, comprehend it all at once in a God-like glance, and thus see both the entire contents and the absolute limits of “the world” or “the universe.” That’s what materialists think they have done when they declare that only matter and energy exist. But why only matter and energy? Do numbers not exist? The rules of logic? How about facts? Or the Federal Republic of Germany, Hamlet (the play, not the physical words printed on the page), relationships, democracy, love? Gabriel contends that it is a mistake to assert that everything that exists belongs to a single class; everything that exists, he argues, cannot be comprehended in a single frame of reference. Instead, he wants us to accept a countless number of what he calls “fields of sense,” so that, just as it is meaningful and true to say that chairs and rainbows exist, in other fields of sense it can be meaningful and true to say that principles exist, or friendship or even Ebenezer Scrooge. Of course, Gabriel’s main interest in this argument is to clear the way for the mind and its cognates to be recognized as existing just as surely as atoms.

A chapter is devoted to each of consciousness, self-consciousness, the self, and freedom, as he both deals with various reductionist views of the mind and develops his own position, which he calls New Realism. Much of it is the common sense view of the mind — that it is real (although not a mysterious ‘substance’ and not existing apart from the brain); the true originator of many, but not all, of our actions; and operating with free will, even as it is subject to unconscious processes. He offers a tricky notion, though, for the defining function of the mind, which he takes to be its ability to think creatively about itself, ceaselessly to form conceptions of itself. At least part of what this means is our ability to imagine our own identities, as a Christian, for example, or a German, a patriot, a gift to the opposite sex (the mind can make errors about itself, of course) or a plaything of fate. This feature he takes to be absolutely crucial: “The human mind does not have a reality that is independent of its self-images.” Because a self-image has consequences in action and engenders a multiplicity of further thoughts, Gabriel believes it is important to push back against neurocentrism’s false image of the mind as illusory and unfree.

Here and there Gabriel raps the knuckles of some incidental figures, such as Richard Dawkins for his thesis that the human is no more than an elaborate biological mechanism devoted to the single purpose of passing on genes, Freud for his idea that the mind is enslaved to the libido, and Silicon Valley types who anticipate cyborgs and a future when an individual’s human experience can be uploaded to a computer, a network, or a USB stick. “Darwinitis” comes under fire for invoking a remote, mythical past to explain concepts such as egoism, altruism, good, and evil in terms of the struggle for survival and genetic transmission rather than accepting the historical development of these concepts, already so well documented in culture.

The Other Sciences of the Mind

“Nothing is more human than the wish to deny one’s humanity,” wrote philosopher Stanley Cavell. Again and again Gabriel sees attempts to reduce our humanity to something other than, and always less than, human. The German word for the humanities, he points out, is Geisteswissenschaften, “sciences of the mind,” and the field comprises subjects such as philosophy, history, musicology, linguistics, and theatre studies. There, he thinks, is where we learn the most about the human mind. Neuroscience undoubtedly helps us understand the biological phenomena without which, of course, there is no mental life. But it has not proven that we are identical with our brains or provided satisfactory explanations of mental phenomena. More important, it seems unlikely ever to provide the level of insight into ourselves that we find beyond the sciences in figures like Sophocles, de Tocqueville, Proust, or Niebuhr.


I Am Not A Brain could be much better focused. As it shifts from topic to topic, the connections can be fuzzy, sometimes leaving the feeling of a miscellany, as if portions were patched together from notebooks. Nevertheless, it is very stimulating, invites repeated readings, and provokes hours of reflection. Written with the lay reader in mind without sacrificing intellectual rigour, it offers a bracing reminder to keep our guard up against, not neuroscience itself, but its philosophical pretensions.

Thursday, May 18, 2017

review -- Hitler's American Model: The United States and the Making of Nazi Race Law by J. Q. Whitman

In a memorable scene from the movie "Judgment at Nuremberg," the defence lawyer played by Maximilian Schell reads a legal opinion to the court: “We have seen more than once that the public welfare may call upon the best citizens for their lives. It would be strange indeed, if it could not call upon those who already sap the strength of the state for these lesser sacrifices in order to prevent our being swamped by incompetence. It is better for all the world if, instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent their propagation by medical means in the first place. Three generations of imbeciles are enough.” Snapping the book closed, Schell continues, “The opinion upholds the sterilization law in the State of Virginia, of the United States and was written and delivered by that great American jurist Supreme Court Justice, Oliver Wendell Holmes.” It is an unsettling moment in the film. Although the American precedent is not developed any further, it hints at a disturbing reality.
Could Oliver Wendell Holmes really have written such a thing? Could the words have been taken out of context? Could it have been merely a rhetorical flourish? While racism was there for all to see in the American South with its segregationist Jim Crow laws, putting America side-by-side with Nazi Germany sounds almost obscene. And even if we must acknowledge that the Nazis regularly quoted U.S. eugenicists and U.S. race laws as precedents, we want to believe that such efforts were sheer propaganda, a shabby effort to put a veneer of respectability on their own odious regime.
Unfortunately, as J. Q. Whitman shows in Hitler's American Model: The United States and the Making of Nazi Race Law, the truth is much uglier than this. The truth is that the Nazis undertook a deep and sustained study of the laws of America as they were designing their infamous anti-Semitic Nuremberg Laws, because America was for them, as Hitler himself said in Mein Kampf, the one state that had made progress in developing a “healthy racial order.” To be sure, Britain was no slouch, along with its colonies and dominions, when it came to racist immigration preferences, treatment of non-whites, and so on. But America appealed to the Nazis more by being explicit in its laws—and harsher. Until at least 1936 Nazi Germany remained hopeful that it could “reach out the hand of friendship” to the U.S. on the basis of a shared commitment to white supremacy.
This may seem at first to be an extreme interpretation, but doubts quickly disappear as Whitman, Professor of Comparative and Foreign Law at Yale Law School, offers copious quotations from German texts of the 1930s. With chilling effectiveness, Hitler's American Model reveals how deeply a second current runs in the American system, counter to its high ideals of freedom, equality, and the rights of man, a current of white, even Aryan, racism.
While today Jim Crow laws probably come to mind most easily for us, they were not the focus of the Nazis who, after all, did not plan to create an apartheid regime. Their aim was—before the 1942 Wannsee Conference and its genocidal Final Solution—to drive non-Aryans out of the country and create a racially pure state. Their tools would be new laws on citizenship and sexual relations.
Citizenship law in America drew a clear race line as far back as 1790 when the Naturalization Act allowed citizenship to “any alien, being a free white person.” In the following century denying citizenship to Asians became the focus. Indigenous peoples were marginalized by being deemed “nationals” but not citizens. And when the Spanish-American War brought new non-white peoples into the American system, the U.S. Supreme Court allowed the creation of second-class citizenship for Puerto Rican and Filipino subjects, a disempowered status of “non-citizen nationals,” “foreign to the United States in a domestic sense.”

African-Americans presented a special difficulty as a result of the Fourteenth Amendment of 1868, which gave them citizenship rights; however, the Nazis were careful to note that, especially at the state level, “all means are used to render the Negro’s right to vote illusory” through petty measures such as poll taxes, literacy tests, etc. Most states, too, had laws to restrict African-Americans in their freedom of movement and career possibilities. The few Asians and Mexicans who had made it into the country had their voting rights blocked with similar legislation. Legal ingenuity such as this was appreciated by the Nazis, and in the case of the Czechs they did use a second-class status similar to the American example. But within Germany they were determined to be direct in their purposes by instituting straightforward racist laws.
According to Whitman, from the late nineteenth century to the 1920s and 1930s the United States came to be regarded not just by the Nazis but throughout Europe as “the leader in developing explicitly racist policies of nationality and immigration.” As the National Socialist Handbook put it, until the coming of Hitler, the United States had held “the leadership of the white peoples” in the “Aryan struggle for world domination”—although it had merely groped its way toward the historic mission to be undertaken by Germany.
The passing of the U.S. Immigration Act in 1924 delighted Hitler. He took as a given its Asian Exclusion Act (an extension of the 1922 Cable Act which revoked the citizenship of American women who married an Asian), but the National Origins Act struck him as especially revealing, for it favoured immigrants from the “Nordic” countries while limiting immigrants from southern and eastern Europe. For him it was a prime example of völkisch citizenship legislation—in fact, the only one in modern times. Hitler spoke of it in combination with the earlier genocidal wars on indigenous peoples, believing that it showed clearly that the U.S. was “the model of a state organized on principles of Rasse and Raum,” that is, on the principles of race and the seizure of territory for a Volk defined by race.
But immigration and citizenship laws are not enough to create a racially pure nation. There had to be metrics to determine the degree of acceptable racial purity and laws to prevent racially mixed births ("mongrelization" in Germany, "miscegenation" in America). Here again America provided the precedents for Germany, and in particular for the Nuremberg Blood Law where, according to Whitman, the American model is seen at its most influential. In many societies mixed marriages have been discouraged through social constraints and sometimes they have been annulled as a matter of civil law, but historically bigamy has been the only form of marriage subject to criminalization and prosecution. In their review of American legislation, the Nazi researchers found that thirty states had passed criminal laws against miscegenation, some of them with penalties as severe as ten years imprisonment. (Virginia continued to enforce its miscegenation statute until 1967, when the Supreme Court, in the case of Richard and Mildred Loving, struck it down. See the 2016 Hollywood film, “Loving.”) The Nazis passed their own criminal laws against race mixing, but the Americans, they thought, had been too harsh, especially with the “one-drop” rule that some states used to define Negroes. As a result, the Reich Citizenship Law of 1935 was milder, defining a Jew as a person having at least three Jewish grandparents; and it allowed as a mitigating factor the degree of a person’s assimilation into non-Jewish society. The Jim Crow laws, too, were seen by the Nazis as going too far; German laws prohibited German women from consorting “indecently” with black men in public, but they did not place sanctions on private behaviour, as in the U.S.
Nazi Germany showed racism in its ugliest, most murderous form. However, a clear-eyed look at the historical record, such as we get in Hitler's American Model, is a healthy reminder that the nations that banded together to destroy Nazi Germany were also infected by the same disease, the difference being simply in the degree of toxicity.

Saturday, April 8, 2017

Review -- Heisenberg’s War: The Secret History of the German Atomic Bomb

Heisenberg’s War: The Secret History of the German Atomic Bomb
by Thomas Powers

A Founder of Modern Physics
By the late 1930s Werner Heisenberg’s fame as one of the founders of modern physics was firmly established. His paper on the “uncertainty principle,” published when he was 26, revealed an amazing, fundamental feature of the subatomic world. Our knowledge of the physical world, he announced, would never be complete, because in principle it is not possible to know both the position and momentum of a particle (or better, that the more precisely we know its position, the less precisely we can know its momentum, and vice versa). In 1932 he was awarded the Nobel Prize for Physics "for the creation of quantum mechanics." His work, combined with that of his mentor and friend, the towering figure of Niels Bohr, formed the background to a series of historic debates Bohr held with Einstein about how to make sense of the bizarre findings of modern quantum mechanics. While Einstein argued powerfully for the idea that fundamentally the physical universe is comprehensible, governed by causality and predictability, that “God does not play dice with the universe,” Bohr, to the satisfaction of most physicists, refuted him, arguing for the “Copenhagen interpretation” of quantum mechanics, an idea he shared with Heisenberg, the view that the physical world is indeterminate, governed not by certainties and traditional causality but ultimately by probabilities.
On a tour of America in the summer of 1939 Heisenberg was repeatedly invited by various scientists he met there to emigrate, as many of them had done, and make a clean break from Hitler’s Germany. While despising Nazism, Heisenberg felt bound by a deep loyalty to his native country and replied rather optimistically that he was needed at home to offer a voice of reason, to “create islands of decency,” to protect young German scientists from conscription, and to keep German science on the right path in the face of appalling nonsense about “Jewish science.”
The Most Dangerous Man in Germany
A month after his return home hostilities broke out, and soon thereafter Heisenberg was made leader of atomic research in Germany. Although scientists outside Germany struggled to maintain contacts, friendships, and a semblance of internationalism, their efforts wobbled under the pressures of war. To some, Heisenberg, who they feared might put an atomic bomb in the hands of Hitler, became “the most dangerous man in Germany because of his brain power.” Ever since, suspicion has hung over his name. How morally compromised was he by working under the Nazis? How committed was he to developing a German atomic bomb? How far did the project get? And even—was the failure of Germany to produce an atom bomb the result of his scientific incompetence?
In Heisenberg’s War: The Secret History of the German Atomic Bomb, a work of extraordinarily detailed research, Thomas Powers sorts through a vast range of sources to answer those key questions. The weighing of evidence is subtle and complex, but the conclusions Powers comes to are clear.
Hero or Traitor?
Powers provides evidence that Heisenberg did deliberately prevent Germany from working on an atomic bomb, although after the war he was not entirely forthcoming about his role. In the early years of fission research, soon after it was discovered in 1939 by Otto Hahn, all scientists acknowledged that, if a bomb were feasible, developing it would require a gargantuan, resource-draining effort. As the war began, German scientists agreed that such an effort would take too long to produce results before the war was over. In June 1942 Albert Speer called a crucial meeting between government officials and scientists to determine whether Germany should pursue an atomic bomb (about the same time, incidentally, that the Manhattan Project got underway). Heisenberg addressed the meeting, clearly spelling out all the difficulties. Soon afterward, all work on a German atomic bomb came to an end, with nuclear research restricted to small-scale work on an “energy machine” or reactor. Heisenberg had managed, in Powers’s words, “to guide German atomic research into a broom closet where scientists tinkered until the end of the war.” Germany’s stockpile of uranium was used to make armour-piercing shells.
Was Heisenberg simply being realistic in his assessment for Speer, or was he deliberately discouraging bomb development? After the war Heisenberg maintained that he had honestly believed a bomb to be impractical at the time, adding that German scientists were thus “spared the moral decision” of whether to work on a bomb. (On August 6, 1945 he and the other German scientists being held in detention in Britain were stunned by the news of the Hiroshima attack, shocked that American scientists had worked on the bomb and horrified that President Truman had used it to destroy a city.) Yet Powers points out that Heisenberg’s post-war account omitted important details. What he had told Speer may not have been untruthful, but it was carefully crafted to dampen interest. While outlining the extreme difficulties of separating the fissionable U-235 isotope from uranium, Heisenberg did not mention the relatively easier path to a bomb using plutonium. Asked whether a nuclear explosion might set the entire world on fire, he did not rule it out as a possibility. His talk emphasized all the difficulties and, most important of all, buried in technical language the most alarming possibility, the mere fact that there was an outside chance that someone—if not Germans, then Americans—might build a bomb for use in the war. Phrased differently, his presentation could have put Speer on high alert and ensured that Germany would set to work full bore on an atomic bomb. The crucial moral decision had, in fact, been made by Heisenberg and his colleagues, and they had chosen to block Hitler. His fudging of that fact after the war was probably an attempt to escape being branded a traitor by some of his countrymen.
During the war Heisenberg made a number of very risky efforts to let Allied scientists know that Germany was not working on a bomb. His intention—his “vague hope,” as his wife described it—was that he might inspire all scientists everywhere to refuse to work on a bomb, thus preventing the horror of a world armed with nuclear weapons. The hope, however, was as naive as it was vague. Americans, especially the émigré scientists who had fled Nazism, distrusted him for remaining behind, wondering whether his assurances were aimed at stopping Allied efforts, thus clearing the way for Germany to become the sole nuclear power. The mere mention of a bomb by Heisenberg prompted the Allies to undertake the most intense efforts to find information about a German bomb program—the American effort, under General Groves, chief of the Manhattan Project, never relaxing an iota until Germany was occupied and its last laboratory was inspected.
Kill or Kidnap?
In its 500 pages Heisenberg’s War covers a great deal of material other than Heisenberg and the moral culpability question. The SS came close to arresting him twice. There is the adventure-movie story of the attacks on the heavy water production facility in Norway (“The Heavy Water War,” a Norwegian series on Netflix, gives a too-dark portrayal of Heisenberg). Heisenberg’s famous visit to Bohr in 1941 is given close scrutiny. The visit led to such a chill in their relations that it never fully thawed; it has been the subject of endless comment, including in the 1998 play “Copenhagen.” Powers’s conclusion is that the break occurred, first, because Bohr was shocked to hear his old friend speaking of atomic bombs and, second, because he was deeply angered by several pro-German remarks Heisenberg made about the necessity of occupying various European countries, including Denmark, and the great good Germany was doing for Europe by attacking Russia. With supreme insensitivity he once remarked how much better off Europe would be dominated by Germany than dominated by Russia.
One thread that runs through many chapters is the deep concern the Allies had over what to do about Heisenberg. A first step was to determine the exact nature of his research projects, leading to several fascinating cloak-and-dagger operations involving more than one outlandish character. By the summer of 1943 the British reached the conclusion that there was no German atomic bomb program, mainly on the basis of information coming from their contacts with scientists in neutral countries who were still in touch with sympathetic German scientists, but also because the code breakers at Bletchley Park had never deciphered a single message referring to atomic bombs or Heisenberg. Because the Brits shared their conclusion but not their sources, they failed to convince the Americans, who remained implacable. Robert Oppenheimer’s opinion was that Heisenberg should be killed. General Groves considered recommending a bombing raid on Heisenberg’s lab in Berlin with the object of killing the scientists. More elaborate and enduring were several plans to kidnap Heisenberg as he attended a conference in Switzerland. At one point, an agent with a pistol in his pocket sat in the front row of a lecture hall as Heisenberg talked, ready to shoot him if he dropped the slightest hint about a German atomic bomb.
After Germany was occupied, the Allies were stunned to realize the primitive stage of German nuclear research, and the story was born that Heisenberg and his team had made grave scientific blunders. On the night of Hiroshima, Otto Hahn, in detention with them, called them “second-raters.” (Hahn himself was on the verge of suicide over his discovery of fission.) Powers spends about ten pages considering the evidence before finally concluding that Heisenberg’s apparent mistakes were really a ruse to prevent the development of a bomb. He knew the world’s most dangerous secret, and he kept it from even his closest colleagues.

Wednesday, January 4, 2017

Review -- Moscow Nights: The Van Cliburn Story—How One Man and His Piano Transformed the Cold War by Nigel Cliff

Khrushchev’s desire to mark a new era in Soviet life had a drastic effect on an American musician, enmeshing concert pianist Van Cliburn so thoroughly in Cold War politics that to the end of his life he seldom broke free. In 1958, at the inaugural International Tchaikovsky Piano Competition in Moscow, Khrushchev learned that the judges were in a quandary. The rules of the game had been rigged to ensure a Soviet performer would win, but the judges, including greats such as pianists Sviatoslav Richter and Emil Gilels and composers Dmitri Kabalevsky and Aram Khachaturian, agreed that the 23-year-old Texan clearly outshone all the others. His grand, expressive, Romantic style was more Russian than the Russians’. Khrushchev said simply that if Cliburn was the best, the gold medal should go to him. When the win was announced, it created a sensation in the midst of Cold War hostilities, with Cliburn gaining instant, world-wide fame on a rock-star scale. In New York he was given a ticker-tape parade. In the Soviet Union, where, even before the win, his warm, open-hearted personality had charmed the public, waves of adulation swept across the country, never really to subside.

Moscow Nights by Nigel Cliff tells the story of Van Cliburn’s Russian connections. The book begins, perhaps appropriately but very oddly, with a chapter on Stalin, Molotov, Beria, and the internal Soviet politics that led up to the Tchaikovsky Competition. Although Cliburn had not traveled outside the U.S. before arriving in Moscow, Russian music was in his blood, inherited directly from his mother and teacher, a pianist whose great boast was that she once met Rachmaninoff at a minor concert in Louisiana. And at the Juilliard School in New York Cliburn was immersed even more deeply in Russian Romantic pianism by the great Russian teacher, Rosina Lhévinne. The Tchaikovsky competition was a natural for him.

As its title suggests, Moscow Nights presents Cliburn’s life with a strong Russian filter. Given the source of his fame, this is not so objectionable. It certainly makes for a pleasant and readable book, sure to be treasured by his die-hard fans. Anecdote piles upon anecdote, sometimes amounting to an hour-by-hour narrative of a concert and reception or a weekend in Washington or Moscow. At the same time the Russian filter is constraining. We learn little about his repertoire beyond Tchaikovsky and Rachmaninoff or his career outside the U.S.-U.S.S.R. nexus or his personal life. Of his homosexuality there is only the barest mention, even though it must have been difficult for him in his early years as a good, church-going boy, as well as later when he was being watched closely by both the FBI and the KGB. While much more than a fanbook, Moscow Nights is less than a full biography. And the non-fanatic might wish the editors had rejected some of the anecdotes that did not really make a point.


Although he lived for 55 years after the Moscow triumph, Van Cliburn’s fate was forever wedded to his one greatest moment. His later career, almost to the end of his life, seems to have consisted in being trotted out to perform for every black-tie event held for Russian leaders.