Tom McMillan, Not My Party: The Rise and Fall of Canadian Tories, from Robert Stanfield to Stephen Harper. Halifax: Nimbus Publishing, 2016. 600 pages.

If you know someone who thinks he or she is an expert on Canadian politics, here is a question to stump him or her with: Name any three persons who have represented Prince Edward Island in the federal cabinet, and the portfolios that they held.

Okay, it’s not an easy question. PEI’s cabinet ministers tend to have somewhat similar Celtic names, and few have made a major impact on the federal scene. However, one of them has written an interesting book. He is Tom McMillan, who served in two portfolios during the early years of the Mulroney government but lost his parliamentary seat in the 1988 election.

McMillan’s book is called Not My Party. On the back of the dust jacket he laments that “the new Conservative Party is nothing I recognize as part of either my own political tradition or that of my family. Now, it is no longer my party. I am an orphan; I no longer have a political home.” This lament may lead readers to expect a critical analysis of the “new” Conservative Party that was founded and led by Stephen Harper, but in fact there is not much about Harper in this rather long book. Practically all of it is devoted to the years from 1967, when Robert Stanfield became leader of the Progressive Conservatives, to 1988, when the author lost his seat and retired from politics to serve, for the remainder of the Mulroney era, as the Canadian Consul General in Boston. The heroes of the book are Stanfield, “the best prime minister Canada never had”; Tom Symons, who was the first president of Trent University and assisted Stanfield in redefining the party’s program and policies; and Brian Mulroney.

Stanfield and Symons were natural partners – both born to wealthy families, both somewhat shy, thoughtful and serious, and both, as McMillan tells us, motivated by a sense of noblesse oblige. Reading McMillan’s very detailed account of their partnership reinforces my wish that Stanfield really had become prime minister. Mulroney is a very different personality, but earned the author’s respect as a capable politician who was able to win elections and put much of the Stanfield-Symons program into effect, as well as making the party a serious force in Quebec. McMillan especially praises Mulroney for his interest in the environment, which is the portfolio that McMillan held for most of his four years in the government and about which he has a lot to say in his book. This is a side of Mulroney’s career that deserves to be remembered.

McMillan is less kind to other recent Conservative chieftains, with the exception of Jean Charest, whom he likes. He despises John Diefenbaker as a “populist” with (according to McMillan) no ideas about policy, thinks Joe Clark was a nice guy but not much of a leader, and ridicules Kim Campbell (although acknowledging that she was “winsome”) for a trivial incident involving an ice cream cone. However, his real grievance against Campbell, which he rants about for several paragraphs, is that she promised to re-equip the Canadian armed forces with new helicopters. Rather oddly for a Tory, McMillan seems to think that the armed forces are a waste of money and that any politician who takes them seriously is somewhat un-Canadian.

Peter MacKay, of course, is denounced as the man who sold out the party to Stephen Harper. Harper himself, although not discussed in any great detail, is the villain whom the audience is supposed to hiss when he appears on stage – the man who stole the party of John A. Macdonald from the “classic Canadian conservatives” (McMillan’s description of himself) and who Americanized Canadian politics.

The book discusses many of the author’s other political contemporaries in a chapter entitled “Rogues’ Gallery.” Despite the title, not all of his comments are critical, most are free of partisan bias and almost all are interesting. A few could have been omitted without any loss to the quality of the book; for example he says that John Turner was old-fashioned, citing as evidence, among other things, the fact that Turner preferred to say “aircraft” rather than “planes.” (Lest I be accused of hiding my own bias, I will confess that I also prefer “aircraft” to “planes.”)

An entire chapter is devoted to a critique of Elizabeth May, the present leader of the Green Party. Before she became a politician, May worked for McMillan as a departmental contract employee when he was Minister of the Environment. While he acknowledges her ability and hard work and her sincere interest in the environment, he accuses her of being indiscreet and of repeatedly sharing confidential information with people who were not supposed to know about it. The title of the chapter sums up his feeling about her: she was “my mixed blessing.”

McMillan himself comes across as a somewhat odd and ambiguous character. He writes a lot about his Catholic background, but constantly attributes lucky incidents in his life to “the gods,” a Neopagan metaphor that becomes tiresome and somewhat offensive after the first few times it appears. He writes about his love for Prince Edward Island, but admits that he was so embittered by losing his seat in Parliament in 1988 that he never wanted to live there again, and apparently never did. (In 1997, when his friend Charest was leader of the party, McMillan was an unsuccessful candidate in Peterborough, Ontario.) Echoing many defeated politicians, he thinks that the island media treated him unfairly. (He doesn’t comment on the irony that most Maritimers, after complaining for a century about the National Policy tariffs, voted in 1988 against the party that offered them free trade with the United States.)

He makes invidious comments about American society and politics, but decided to spend his retirement in Boston after the end of his term as Consul General and was still living there when his book was published. He writes the usual nice things that politicians write about their wives, but admits near the end of the book that he and his wife divorced each other, for a reason that seems rather trivial, soon after his political career ended.

There is quite a lot in the book about the bridge that now connects Prince Edward Island with the mainland or, as it is called in Islandspeak, the “fixed link.” Like Mulroney in his own memoirs, McMillan regards this piece of infrastructure as one of the Mulroney government’s greatest achievements, and even compares it to the building of the CPR. (Ironically the government that built the “fixed link” was also the one that ended passenger train service on the CPR’s main line.) He ridicules those islanders, of whom there were many, who opposed the project. Both Mulroney and McMillan claim that it did a lot for the island’s economy, although the evidence for this is rather sketchy. Mulroney even says in his memoirs that it became “a big tourist attraction in itself,” an accolade that might be more fairly bestowed on the ferry service which it replaced. Whether it was really worth a billion dollars at 1980s prices, at a time when the federal debt had reached a dangerous level and was growing rapidly, to build one of the world’s longest and highest bridges to serve fewer people than live in Oakville, Ontario, not all of whom thought it was a good idea, is a question readers may decide for themselves.

McMillan does however have a valid point when he laments the disappearance of the old Progressive Conservative Party. He approvingly cites Hugh Segal’s 2011 book, The Right Balance: Canada’s Conservative Tradition. Segal differs from McMillan (and sounds more authentically Conservative) in arguing that Canada should have strong military forces, and he is more charitable in his assessment of Diefenbaker, but in other ways their views are similar.

In the world in which Segal, McMillan and I grew up, one sometimes heard complaints that “the two old parties” were too much alike, but there was some virtue in the fact that they had much in common. Both supported our traditional institutions, including first-past-the-post voting and the appointed Senate. Both favoured a gradually expanding welfare state, to which both contributed; an economy with a mixture of state-owned and private enterprise; and a foreign policy backed by meaningful armed forces. Both thought the state should avoid cultural and “social” issues, apart from supporting two official languages. While they disagreed on the details, they were at least on the same wavelength, and talked about the same things.

Today our politics are totally polarized and the parties not only disagree; they talk past each other. While the Liberals plumb the depths of cultural radicalism and political correctness, the Conservatives respond by chanting monotonously about lowering taxes and balancing the budget, two objectives that are not always compatible. As I pointed out a year ago in Inroads,1 essentially the same thing has happened in the United States. The political trends, fashions and practices in that country always appear in Canada within a few years, as has been true for most of our history.

Whether we can ever return to the good old days is questionable, but at least books like Tom McMillan’s can remind us of what we have lost.

In the two years since I last wrote a column about Ireland,1 important developments have taken place on the island of Ireland, both north and south.

Enda Kenny, the Taoiseach (prime minister) who led his Fine Gael party into the inconclusive election of 2016, managed to cling to power for a while after the election by putting together a minority coalition government in May of that year. However, his days were numbered and in June 2017 he was succeeded as party leader by Leo Varadkar. The new government was still a minority dominated by Fine Gael, confirmed in office by a vote of 57 to 50 with 47 abstentions. Fianna Fáil, the other major party in the 26-county state, abstained on that vote.

Varadkar, a medical doctor by profession, represents a new generation and a new style of political leadership in Ireland. He was only 38 years old when he took office, the youngest person to lead an Irish government since Michael Collins almost a century earlier. His father was an immigrant from India, while his mother is Irish. He is also openly gay, having come out publicly two years before he became the leader of the government. In January of this year he announced that his government would hold a referendum in May on the question of whether to delete the eighth amendment to the Irish constitution, which “acknowledges the right to life of the unborn.” Added to the constitution in 1983, that amendment made explicit what was already true at the time: abortion in Ireland was illegal. Polls suggest that the deletion of the amendment will be approved by a substantial majority of the voters. Subsequently the Dáil voted in favour of holding the referendum, although most of the opposition Fianna Fáil members were opposed.

Meanwhile, even more dramatic developments were taking place in the six-county state north of the partition line. The consociational executive, a coalition between the Democratic Unionist Party (founded by the late Ian Paisley) and Sinn Féin, collapsed in January 2017 when the two parties were unable to agree on a number of issues, including legislation to protect the Irish language and to legalize same-sex marriage, both of which were opposed by the DUP. This has left Northern Ireland with no executive, although the assembly and the public service continue to exist. Martin McGuinness, the deputy leader of Sinn Féin and deputy first minister of the executive, was in poor health at the time and resigned his leadership, to be succeeded by Michelle O’Neill. McGuinness died two months later.

Sinn Féin also experienced a change of leadership south of the border when Gerry Adams, the party’s national leader since 1983 and leader of the Sinn Féin members in the Dáil since 2011, retired from both positions in February 2018 and was succeeded by Mary Lou McDonald. Thus the two most important positions in Sinn Féin are now held by women. McDonald – in contrast to Adams, McGuinness and O’Neill – is a Dubliner who has never lived north of the partition line. Sinn Féin has been for some time the third most important party in the 26-county state, and has recently stated that it would be willing to enter a coalition government with one of the two major southern parties.

However, dramatic as they are, these developments have all been overshadowed by an issue that has grave implications for both parts of Ireland: the decision by British voters in June 2016 to withdraw the United Kingdom from the European Union.

Being formally a part of the United Kingdom, Northern Ireland participated in that referendum. Like Scotland, it disagreed with the decision of the English-dominated majority: 56 per cent of the Northern Irish voted to remain in the European Union, and 44 per cent voted to leave. The geographical distribution of the votes for and against makes it clear that Catholic and Nationalist voters voted overwhelmingly to stay in Europe while Protestant and Unionist voters, still a majority in the six-county state, voted mainly although not as overwhelmingly for Brexit. This outcome reflects the position of the two main parties on the issue, with Sinn Féin in favour of remaining while the DUP is in favour of Brexit.

The Good Friday Agreement of 1998, which ended three decades of the “troubles” north of the border, contains a provision that no change can be made in the constitutional status of the six-county state without the approval of a majority of its people. The GFA also includes references to the European Union. Irish nationalists argue, correctly, that withdrawing Northern Ireland from the European Union would be a major constitutional change that a majority of its people explicitly rejected in the Brexit referendum, and thus would violate the terms of the GFA.

Irish interest in Europe is not new. Ireland tried to join the European Community at the time of the first British application to join in 1961. Ireland withdrew its application after President de Gaulle vetoed the British one, since Ireland at that time was heavily dependent on its economic ties with the United Kingdom. Ireland and the United Kingdom finally joined the European Community in 1973. Ireland felt confident enough in 1979 to break the link between its currency and the British pound. Ireland also adopted the euro in 1999 and, when euro notes and coins arrived in 2002, discarded the attractive coins (whose designs were chosen by a committee chaired by William Butler Yeats) that it had used since the foundation of the state in the 1920s. If the British had accepted the euro, both parts of Ireland would have used the same currency, but this did not happen. Ireland obviously felt that the advantages of using the euro nonetheless outweighed the disadvantages.

Originally, Ireland’s entry into Europe may have been a purely pragmatic decision, following in the wake of the large neighbour that, at that time, accounted for most of its external trade. But ties with Europe soon became much more meaningful than that, for a variety of reasons. For a country that was still mainly rural at that time, the European Union’s agricultural policy was a source of major benefits. Access to the whole European market and, later, adoption of the European currency also made it possible to attract a lot of foreign (especially American) direct investment by firms that wished to sell their goods and services in different parts of that market, stimulating the economic boom that made Ireland a rich country, the so-called “Celtic Tiger” of the 1990s and 2000s. As it became rich, Ireland, whose population had scarcely increased in size during the 20th century because of emigration (mainly to the U.K.), began to attract immigrants, especially from other parts of Europe, so that the inflow of population exceeded the outflow for the first time in centuries.

Moving out from under the shadow of its large neighbour, Ireland increasingly perceived itself as a proud part of a larger entity called Europe, rather than merely an appendage of the United Kingdom. The National 1798 Visitor Centre at Vinegar Hill, the site of the bloodiest battle in the unsuccessful rebellion that marked the emergence of Irish republicanism, opened in 1998 to mark the bicentennial of that event. The commentary that visitors to the museum can read asserts that the European Union today represents the same ideals that the Irish rebels of 1798 were fighting for.

Northern Ireland’s experience was somewhat different, if only because it continued to use the British pound sterling instead of the euro and because many of its people regarded themselves as “British.” But even there joining Europe represented a sort of opening to the world. Major firms including Bombardier, the Canadian manufacturer of aircraft and other transportation equipment, opened factories in the six-county state to serve European markets. The European Union allowed both goods and people to move freely between the two parts of the island, so that the partition line between north and south became almost as invisible to travellers by car, train or bus as the border between Quebec and Ontario. As the “troubles” subsided, the six counties also became attractive to immigrants. Following the Good Friday Agreement, when the northern government made a serious effort to attract Catholic recruits to what had been an overwhelmingly Protestant police force, it found that many of the people who applied were actually Polish!

For most people in the 26-county state, and for those northern Irish who had voted against Brexit in the British referendum, the outcome of the referendum was a shock and a source of great anxiety for both economic and political reasons. The economic disruption that quickly and predictably followed the referendum in Britain might spread to Ireland, both north and south. Also, Brexit would drive the two parts of the partitioned island of Ireland further apart. (Presumably most DUP voters, who still wave the Union Jack and regard southern Ireland as a foreign country, supported Brexit for that very reason.) In particular, it seemed almost unavoidable that if Brexit took place the almost invisible partition line would become a “hard” border like that between Canada and the United States, with customs and immigration controls on both sides. When the subsequent British election left the DUP holding the balance of power at Westminster and propping up Theresa May’s precarious government, it soon became apparent that the British government would have difficulty supporting any effort to make the border less visible.

Efforts at damage control began when the Irish cabinet, still headed by Enda Kenny at the time, held an emergency meeting within two days of the counting of the referendum ballots. North of the border a cross-community coalition of interest groups filed a request for a judicial review of the legality of Brexit in the High Court of Northern Ireland. Sinn Féin suggested that a referendum on Irish reunification should be held on both sides of the border, which is allowed under the terms of the Good Friday Agreement. In April 2017, in the Council of the European Union, all 27 member governments other than the U.K. supported a statement that if Northern Ireland voted to join the rest of Ireland it would automatically be allowed to stay in the European Union rather than having to seek admission as a new member. However, Prime Minister May had already stated in March that her government would not allow a referendum on either Irish unification or Scottish independence to take place.

Meanwhile the British Conservatives were struggling with their own problems, since the party and the government were seriously divided on whether there should be a “hard” or a “soft” Brexit, or something in between, or perhaps no Brexit at all, as some of the cooler heads in the party were beginning to suggest. Could the United Kingdom enjoy free trade in goods and services with the European Union without being part of it, as Switzerland and Norway do, albeit in slightly different ways? Could it continue to harmonize all its regulations with those of the EU and thus save itself the trouble of changing them, but still terminate its membership? Could Brexit be made compatible with its obligations under the Good Friday Agreement? Could it be made compatible with a “soft” (i.e. invisible) border between the two parts of Ireland, and if so, how? Could Northern Ireland have some kind of special status or associate membership where it remained in the EU while being part of a former member state that would be a nonmember? “Squaring the circle,” as Leo Varadkar aptly described these intellectual exercises, has occupied a lot of time as the clock ticks inexorably towards the planned date of British withdrawal.

In December 2017, agreement seemed near on a joint United Kingdom–European Union statement that reaffirmed the commitment of both entities to respect the Good Friday Agreement, called for “regulatory alignment” between British and European laws, and promised a “soft” border between the two parts of Ireland while continuing to give Northern Ireland full access to the United Kingdom market. Whether the last two commitments were compatible with each other was perhaps questionable, but the question became an academic one when at the last minute the DUP, which holds the balance of power at Westminster, prevented the British government from endorsing the agreement.

A month later, in January 2018, a public opinion poll in the 26-county state asked voters to choose between two evils: following the United Kingdom out of Europe by having its own “Irexit” or accepting a “hard” border between north and south. The response was overwhelming: 78 per cent would sacrifice the “soft” border with the north in order to stay in the European Union, 10 per cent would not and 12 per cent were undecided.

It is possible, however, that the choice will not have to be made. To the chagrin and apparent surprise of the British, the European Union and its principal member states are sympathetic to Ireland’s dilemma and are insisting on the preservation of a “soft” border as part of any Brexit agreement. For the British government this is bad news, because a “soft” border means that Ireland could be a back door for unwanted immigrants to enter the United Kingdom. The only way to prevent this would be to harden the border between Northern Ireland and Great Britain, which would be unthinkable for a British government that depends on the ten DUP members of Parliament to stay in office. On March 19 Prime Minister May reluctantly conceded that the text of the draft withdrawal agreement would include the so-called “backstop”: a pledge that Northern Ireland would remain aligned with EU rules after Brexit unless and until some other solution is agreed. A legally binding plan to this effect, including the promise of a “soft” border, will be worked out through negotiations in Brussels.

The proclamation that Patrick Pearse read in front of the General Post Office in Dublin during the Easter Rising of 1916 referred to “gallant allies in Europe” who were allegedly supporting the Irish struggle for freedom. At the time it was written this reference was, to say the very least, an exaggeration. Today it seems that the “gallant allies” may at last be giving Ireland some real support.

In September 2016 I visited the Yasukuni Shrine, located in a pleasant Tokyo neighbourhood. Had I been a Japanese politician, this would have caused a commotion in some parts of the world.

The shrine was established in 1869, the year after the Meiji Restoration that is usually considered the foundation of the modern Japanese state. It is surrounded by a park where vendors of military memorabilia and various antiques often display their wares. Other people stroll through the park, walk their dogs or enjoy various kinds of exercise, as they do in most parks throughout the world.

Shinto, the traditional – although no longer the official – religion of Japan, is largely based on reverence for ancestors. Shinto shrines are found throughout the country, and Yasukuni is by no means the oldest of them, but it is probably the most famous. It would not be stretching a point very much to say that Yasukuni is to Japan what Westminster Abbey is to England, a place of worship but also a place to remember the honoured dead who served their country. The Yasukuni Shrine enshrines the spirits, and is dedicated to the memory, of almost two and a half million Japanese who died in military service between 1868 and 1945. Those commemorated include not only human warriors but some of the dogs and horses employed by the Imperial Japanese Army, an idea that I find rather touching.

The reason why visits to Yasukuni bother some people (although more outside of Japan than within it) is that the two and a half million (more or less) include 14 individuals who were charged as major war criminals after the Pacific war, seven of whom were hanged by the victorious Americans. General Hideki Tojo, the most famous of these, was depicted by American wartime propaganda as a dictator comparable to Hitler, Stalin or Mussolini.

Obviously he was not one, since his forced resignation in 1944, unlike Mussolini’s the previous year, had little or no impact on his country’s conduct of the war. The Americans magnified his role because they needed someone to be a target of popular hatred and anger, like “Goldstein the enemy of the people” in George Orwell’s 1984. Hitler served that purpose very well in the European war, which is probably why the allies made no effort to assassinate him, but the Americans needed an Asian equivalent. They did not want that equivalent to be Emperor Hirohito, the obvious choice, because they had already decided that they would keep the Emperor in place as a constitutional monarch after the war ended.

Since I am an unconditional opponent of capital punishment, my own view on the fate of Tojo and his colleagues is essentially that expressed by Richard Dudgeon in George Bernard Shaw’s The Devil’s Disciple: “You talk to me of Christianity when you are in the act of hanging your enemies. Was there ever such blasphemous nonsense?” Winston Churchill privately had misgivings about the practice of war crimes trials, although he deferred to his American and Soviet allies.1 For one thing, it is obvious that only the losers of wars are hanged, not the winners. George W. Bush started a war, but Saddam Hussein was hanged after it ended. The Nazis’ foreign minister, Joachim von Ribbentrop, was hanged for plotting to wage an aggressive war, but his Soviet partner in crime, Vyacheslav Molotov, enjoyed a lengthy and peaceful retirement since his country ironically found itself on the winning side when the dust had settled.

The Tokyo trial at which Tojo was convicted was controversial even at the time it occurred. Like Nuremberg but even more so, it was essentially an American show, but in contrast to Nuremberg (where only the four occupying powers were represented) all the other countries that had been at war with Japan, large and small, were invited to appoint judges to the tribunal. The Australian judge was unhappy that the Emperor was not included among the defendants and there is some evidence that Tojo collaborated with the prosecution in his own trial to protect the Emperor’s reputation. The Indian judge complained that Asian imperialists were being judged by a different standard than European imperialists, a point that was not without merit. By the time the death sentences were carried out, in December 1948, the Pacific war was yesterday’s news and some of the victors were preparing to make war on one another instead.

Getting back to the shrine and my visit to it, I felt no discomfort about being there. In Paris I have visited the elaborate tomb at Les Invalides of Napoleon Bonaparte, who was certainly a major war criminal by any standard. The Arc de Triomphe, which commemorates Napoleon’s victories, is the centrepiece of the annual celebrations on November 11, one of which I attended during the reign of General de Gaulle. I have twice visited the embalmed remains of V.I. Lenin, also a mass murderer of considerable notoriety. In Tennessee I have driven past the state park named after Nathan Bedford Forrest, who systematically massacred all the African-American soldiers captured by his Confederate forces and who later helped to found the Ku Klux Klan. In London I have seen the memorial to Air Marshal Arthur Harris, who admitted that his Bomber Command deliberately targeted residential areas of industrial cities so as to kill as many working-class Germans (the Germans least likely to have voted for the Nazis) as possible.2 There was some controversy in 1992 when the late Queen Mother dedicated the Harris memorial, but not much. More than a fifth of Bomber Command’s aircrews, incidentally, were Canadians.

In any event the Yasukuni Shrine, or at least the museum attached to it, is very interesting and well worth a visit. In it you can see a surviving example of the iconic Mitsubishi Zero fighter aircraft, and also a Japanese dive bomber. There are two impressive artillery pieces (both used in the Philippines, according to the captions) and even a steam locomotive from the railway in Thailand whose construction inspired The Bridge on the River Kwai. You can watch old newsreels of the Imperial Japanese Army on a movie screen, and you can read a rather poignant letter that Admiral Rinosuke Ichimaru, shortly before his death at Iwo Jima, addressed to President Roosevelt. (Roosevelt, who died two months later, almost certainly never had a chance to read it.) Japan’s earlier wars are represented by a large assortment of firearms, swords, helmets and other paraphernalia.

But perhaps just as interesting as the artifacts are the displays that present Japan’s impression of its own history. The story begins with a display about Commodore Matthew Perry’s “opening” of Japan in 1853. Like most North Americans, I always thought of this American exploit as a harmless expression of curiosity, or at worst an effort to open a new market for trade, but from Japan’s viewpoint it was an aggressive act and a wake-up call showing the country’s weakness. The conclusion drawn from it was that Japan must become an industrial state and develop a modern army and navy like those of the Western powers.

Over the next half century it did so, a more impressive achievement, and at far less cost in human suffering, than the much ballyhooed industrialization of Russia under Stalin. Perhaps inevitably, Japan imitated the Western powers (remember the Spanish-American War and the Boer War, among others) by adding Taiwan and Korea to its empire. It was rewarded with a seat at the high table, signing an equal treaty of alliance with the United Kingdom in 1902. Twenty years later the British renounced this alliance, which had served them well between 1914 and 1918, under pressure from a more important ally, the United States.

Thereafter the Japanese seem to have regarded the hostility of the Anglosphere as unavoidable. Whether this was true or not, Japan’s rapid conquest of Manchuria in 1931, its attempt to conquer the rest of China starting in 1937 and its fruitless alliance with Hitler (a white supremacist who secretly preferred the British to his Japanese allies) eventually made it so. By the time the Americans cut off their access to oil in 1941, the Japanese had run out of options. Any attempt to withdraw from China, as the Americans demanded, would have provoked a military coup and even endangered the life of the Emperor.

In contrast to Germany’s attack on the USSR, which matched roughly equal forces and had a reasonable chance of success, the attack on Pearl Harbor was a gambler’s last desperate roll of the dice rather than the opening move in a serious game plan. Japan could never have conquered the United States, which had half the world’s industrial capacity and produced two thirds of the world’s oil, and the Japanese knew it. At most they might have conquered Hawaii if they had won the battle of Midway, but what good would that have done them?

During the war Japanese propaganda claimed, as the displays at the Yasukuni museum do today, that they were fighting to free Asia from Western colonialism and racism. No doubt many Japanese sincerely believed this, and some still do. In any event it was quite widely believed in other parts of Asia, despite the atrocious behaviour of the Japanese army toward civilians in China and the Philippines. Even in those two countries the Japanese found many collaborators. Several hundred thousand Chinese migrated to Japanese-controlled Manchuria in the 1930s, suggesting that life was better there than under Chiang Kai-shek.3 An important faction of the Indian nationalist movement, led by Subhas Chandra Bose, supported Japan and provided some military assistance in its fight against the British in southeast Asia.

In 1935 an American diplomat, John V.A. MacMurray, had warned in a thoughtful memorandum that an American war against Japan, even if it resulted in total victory, “would be no blessing to the Far East or to the world” and would benefit only the Soviet Union, which would fill the vacuum created by Japan’s defeat.4 Sixteen years later, when George F. Kennan cited the MacMurray memorandum in his celebrated lectures on American diplomacy at the University of Chicago, MacMurray’s prophecy had been fulfilled.

The last military events described at the Yasukuni museum are the atomic bombings of Hiroshima and Nagasaki, but they are mentioned only briefly and without much emotion. The question of whether they were necessary to end the war is not addressed at Yasukuni, nor is an even more delicate and controversial question: was Emperor Hirohito really a harmless figurehead, as the Americans maintained at the time, or had he actually planned and conducted the war, as some later American historians have alleged?

Atomic bombs or no, Japan was treated far better in 1945 than Germany. Harry Truman, an avid student of history who had fought in World War I and whose relatives had supported the Confederacy in the American Civil War, knew that today’s enemy can be tomorrow’s friend, and vice versa. He abandoned Roosevelt’s policy of unconditional surrender and ended the Pacific war before the Soviets could threaten the Japanese homeland, and before American troops had landed there. Japan was spared the horrors of invasion and partition, and the atrocities that the Soviet army inflicted on eastern Germany. The American occupation was peaceful, the Emperor kept his throne, a new parliament was promptly elected, and the peace treaty signed in 1951 made Japan a fully sovereign state again and an American ally, although a Soviet veto kept it out of the United Nations for another five years.

Looking at Tokyo today, when it is cleaner, safer, more efficient and more attractive than most American cities (which admittedly wouldn’t be difficult), it is hard to believe that it was virtually laid waste by incendiary bombs in March 1945. It is even harder to believe that when I was born Japan ruled an empire that briefly stretched from Myanmar to the Aleutian Islands.

“The War,” as my parents always called it (as though there had never been any other wars), is fast receding into the mists of time. Soon the people with actual memories of it will all have vanished from this world. Who knows what our descendants will think about it a few decades from now, if they think about it at all? Will it all seem too remote and exotic to be believable, or will it simply be forgotten? Or will their views resemble those the young Robert Southey expressed about the battle of Blenheim, in lines that he wrote less than a century after the event?

“And every body praised the Duke,
Who this great fight did win.”
“But what good came of it at last?”
Quoth little Peterkin.
“Why, that I cannot tell,” said he,
“But ’twas a famous victory.”

Does history repeat itself? Toward the end of the 19th century the United States, having survived a terrible civil war, made the transition from an agrarian to an industrial economy and emerged as a serious rival to the great powers of Europe. In the first half of the 20th century it grew to overshadow them all, economically and militarily, and become the richest, most powerful and most influential country in the world.

A century after the American Civil War ended, the People’s Republic of China, still a very poor country in the 1960s and something of an outcast in international politics, endured a so-called “Cultural Revolution” that threatened to tear it apart. Immediately afterwards, in the 1970s, it achieved a rapprochement with its bitter enemy, the United States, was seated at the United Nations and began to move away from a centrally planned economy toward a form of capitalism. Like the United States a century earlier, it then entered a Gilded Age of unprecedented economic and industrial growth, continuing to the point where it will soon displace the United States as the world’s largest economy, if it has not already done so.

As a result of these developments, few would dispute the proposition that a Sino-American bipolarity in international relations has replaced the Soviet-American bipolarity known as the Cold War that lasted for almost half a century, from 1945 to 1991. This in turn raises two important questions. In a bipolar world, is hostility between the two greatest powers inevitable? And if it is, will the hostility inevitably lead to a major war between them, which nowadays would exceed in death and destruction the so-called “world wars” of the first half of the 20th century?

The Soviet-American precedent might suggest that the answers are yes and no, respectively. The Cold War was certainly marked by plenty of hostility between the then two superpowers, but a major war was averted, although the world came close to one in 1962 and although some Soviet military personnel did fight, secretly, against the Americans in both Korea and Vietnam.

The Cold War was originally a contest about the future of Europe and especially of Germany, which had been left in dispute after the defeat of the Nazis, but its most dangerous moment came when the Soviets penetrated the American sphere of influence in the Caribbean. The Sino-American rivalry (Cold War II, so to speak) seems likely to find its most dangerous flashpoints in two messy pieces of unfinished business left over from the American war against Japan: the unnatural division of Korea and the ambiguous status of Taiwan.

Viewed in retrospect, both situations are the result of serious mistakes made by the United States between 1941 and 1945. Korea was divided because the Americans thought, erroneously, that they would need Soviet help to defeat Japan. Taiwan, which should have become an independent state after it was detached from the Japanese Empire, was instead promised to the moribund regime of Nationalist China and became the last stronghold of that regime after it lost the civil war on the mainland.

Perhaps in the 1950s a tradeoff could theoretically still have been arranged: give Taiwan to the Chinese Communists in return for giving North Korea to South Korea. That is a solution that might have appealed to Richelieu, Metternich or Bismarck. But now that Taiwan, despite unpromising beginnings, has become a successful democracy, it would be unthinkable for the Americans to abandon it to its fate. Meanwhile, China covets both South Korea and Taiwan as part of its natural sphere of influence or, in the case of Taiwan, as part of Chinese territory.

In his classic account of the war between Athens and Sparta, Thucydides, an Athenian who lived through that war, asserted that the danger of war between two dominant powers is greatest when the balance of power between them is shifting from one to the other, a situation that has acquired the label of “Thucydides’s trap.” Since China is obviously gaining power today while the United States is (relatively) losing it, this seems to suggest that we are entering the period of maximum danger. Another conclusion Thucydides drew is that the situation becomes particularly dangerous when one of the two great powers seems to threaten a small ally of the other great power. Taiwan and the two Koreas come to mind.

Graham Allison, an American political scientist who made his reputation in 1971 by publishing an analysis of the Cuban Missile Crisis, has tried in a recent book to apply these insights to the Sino-American situation.1 With some of his colleagues and students at Harvard, he has undertaken a research project in comparative history, focusing on the problem of “Thucydides’s trap” as illustrated by 18 case studies, from Athens and Sparta in classical times to China and the United States today.

His book summarizes the results of the other 16 case studies, 12 of which led to war, with the originally dominant power winning five wars and the rising power winning seven. The originally dominant power also won in the Soviet-American Cold War, which was decided without a major war, while the rising power won in the three other cases that were settled without a major war, so the actual score is ten to six for the rising powers (ten to seven if the war between Sparta and Athens is counted, since Sparta, the originally dominant power, won). Allison seems to think that China’s rise to dominance is inevitable but he thinks that war between China and the United States may (or may not) still be avoided.

All of this is interesting. However, the number of cases is too small to produce a scientifically valid result, in some cases the outcome was more ambiguous than Allison suggests, and his list of cases is not exhaustive. (One interesting case that he fails to mention is the American Civil War, which the previously dominant South started as a last desperate and unsuccessful effort to prevent the rapidly growing North from taking over control of the country.) Allison also fails to distinguish clearly between cases where the rising power really was overtaking the dominant power, or had already done so, and cases where the previously dominant power overestimated the threat to its position. Not surprisingly, the rising power is more likely to win contests of the first kind, whether through war or otherwise, than of the second.

Two of Allison’s cases deserve more careful study than most of the others, since their circumstances have the most striking parallels with the present Sino-American contest. The first is the Anglo-German rivalry prior to 1914, which led to a major war, and the second is the American-Soviet Cold War, which did not. In the first case the two rivals had much in common. Both were in gradual transition from a premodern form of government to modern democracy – the British were slightly further along the road to democracy than the Germans, but not as much further as wartime propaganda would later suggest. Their languages and their established Anglican and Lutheran churches were similar, their royal families were closely related and they had similar problems with ethnic (and Catholic) minorities: the Irish and the Poles.

Yet they went to war. The British worried about the rise of Germany and felt the need to protect France and Belgium; the Germans worried about the rise of Russia and felt the need to protect Austria-Hungary, their only ally; and the Russians felt the need to protect Serbia, which was involved in the assassination of the heir to the Austro-Hungarian throne.

In the second case the Americans and Soviets had little in common, culturally, politically or economically, yet they avoided going to war. Perhaps their leaders were more skilful or luckier than their British and German counterparts in the earlier contest. A more likely explanation is that the existence of nuclear weapons, and personal experience by the policymakers of what total war had meant in practice, even without nuclear weapons, made them more cautious. Nuclear weapons of course are even more terrible today than during the Cold War, so we can hope that the caution is still there. But the Chinese leaders today were born after the end of their civil war, and the last four American presidents have had no experience of real war, in contrast to most of their predecessors who held office during the Cold War.

Fortunately it is hard to imagine present-day American leaders, armed with thermonuclear weapons, starting a war in an effort to reverse the apparent course of history, as the Confederates did when they fired on Fort Sumter in 1861. On the other hand the sinister regime in North Korea seems crazy enough to risk such a war, and South Korea and Taiwan are vulnerable hostages to fortune today, as Serbia, Belgium and Austria-Hungary were in 1914. So there is definitely no reason for complacency.

Few events in American political history have inspired more fear, dismay and anxiety than the election of Donald Trump as the 45th president. Never since the 19th century have so many Americans (and non-Americans too) regarded the outcome of a presidential election as illegitimate, or viewed the successful candidate as morally or intellectually unfit to hold the office.

Thomas Mulcair, the leader of Canada’s NDP, has even called Trump a fascist, a label that is unsuitable for several reasons. Fascism was a movement of young men (Mussolini took power at 39, Hitler at 43) who rose from obscurity in the aftermath of a world war, who founded new parties and whose followers dressed up in coloured shirts and fought brawls in the streets against communists and socialists. In countries with many Jews it was anti-Semitic. Its foreign policy was based on territorial expansion and imperialism; fascists claimed that their country needed more space to accommodate its population.

Trump, by contrast, is an elderly millionaire who became the presidential candidate of a long-established party. His followers don’t wear uniforms or fight brawls in the streets. His favourite daughter married a Jew (who is an important adviser to the President) and is herself a convert to Judaism. Trump’s foreign policy promises were isolationist rather than expansionist (although in practice his foreign policy has not been as radical a break with the past as some people expected), and his vast country has plenty of room for a population that is growing rather slowly.

That Trump is not a particularly nice man may be conceded without attaching an exotic label to him. His crude speech and behaviour rival those of Lyndon Johnson, who had a strange obsession with the word piss and forced members of his entourage to watch him sitting on the toilet. Judging by his first few months, Trump will not be a great president, but he will probably be no worse a president than the sad consecutive trio of Fillmore, Pierce and Buchanan, who dithered their way into the Civil War, or than Andrew Johnson, who succeeded Lincoln. In the 1920s the mediocrity of Harding and Coolidge inspired H.L. Mencken to write that maybe European countries had a better idea when they chose their heads of state by hereditary succession rather than allowing the people to elect them.

Trump’s slogan “Make America Great Again,” which seems to alarm some people although I find it perfectly innocuous, has echoes of Lyndon Johnson (“The Great Society”) and John F. Kennedy (“Get This Country Moving Again”). His views on policy also have roots in American history. Anti-Mexican sentiment goes back almost two centuries to the siege of the Alamo. Hostility to immigration led to the “Know-Nothing” party in the 1850s, and later to the quota system that virtually ended immigration from Europe and Asia between 1920 and 1965.

Economic protectionism was Republican Party orthodoxy for almost a century after the Civil War; the Smoot-Hawley tariff act of 1930 carried protectionism to a level unlikely to be reached in any conceivable future. Isolationism, meaning a lack of interest in Europe and its problems, was a sentiment shared by most Republicans and many Democrats until the 1950s. Even Trump’s cosy relationship with the Russians, although genuinely disturbing, has a precedent in Franklin Roosevelt’s administration, in which sentimental Russophiles like Harry Hopkins and Henry A. Wallace, and even card-carrying Communists like Alger Hiss and Harry Dexter White, held important positions.

Donald Trump is a populist, and the President whom he most resembles, Andrew Jackson, is often considered the founder and prototype of American populism. Like Trump, Jackson was a wealthy man who posed as the tribune of the common people and the enemy of elites. Like Trump he was already elderly when first elected president. Like Trump he was hot-tempered and impulsive, and seemed to be perpetually angry about something. Alexis de Tocqueville, who visited the United States during Jackson’s administration, was not favourably impressed by Jackson and would probably not be surprised by the emergence of Trump.

William Jennings Bryan, the Democratic Party’s presidential candidate in 1896 at the young age of 36, was another populist who left a mark on American history, although he never became president. The most flamboyant and colourful American populist of the 20th century, and perhaps the most dangerous, was Huey Long of Louisiana, the model for Buzz Windrip in Sinclair Lewis’s novel It Can’t Happen Here. In the novel, Buzz replaces Franklin Roosevelt as the Democratic presidential candidate in 1936, wins the election and establishes a dictatorship. In a very Long-like gesture of contempt for the elites, he offers his predecessor the post of ambassador to Haiti, which is politely declined. In real life Long, who had established a virtual dictatorship within his own state and was contemplating a run for the presidency in 1936, was assassinated in September 1935, just before the novel was published.

These examples are exceptions; there have been few populist presidents or even populist presidential candidates. A Republican was likely to win in 2016, since it is rare for one party to hold the White House for three consecutive terms and Hillary Clinton was a controversial candidate for several reasons. But why did the Republicans nominate Trump? The short answer is that none of the alternatives who sought the nomination was particularly inspiring. But their choice of Trump also reflects recent changes in the party system.

The founders of the American republic were suspicious of parties. They devised a system that would dilute the influence of parties by separating the legislative and executive branches of government, and they designed institutions, the Senate and the Electoral College, that would emphasize divisions among the several states. In response, the parties became decentralized organizations that emphasized territory rather than ideology and brokered a variety of interests through compromise rather than adopting a rigid party line. There was no need for party discipline since the president held office for a fixed term regardless of what went on in Congress. As a result there was much overlap between the parties. In the 1950s Democrat Lyndon Johnson helped Republican Dwight Eisenhower get his program through the Senate.

As recently as the 1960s neither “liberal Republican” nor “conservative Democrat” was an oxymoron. In those days “liberals” and “conservatives” (terms rarely used by Americans before the New Deal) had opposing views about taxation, spending and the relationship between business and labour, the issues that had dominated the agenda in the 1930s, but such issues could usually be resolved through compromise. Senators and representatives in Congress could vote against their party colleagues, or against a president of their own party, as long as they looked after the interests of their state or district.

Over the last half century all this has changed. Cultural and “social” issues that appeal to people’s emotions and are not easily resolved through compromise have largely replaced the traditional politics of who gets what, when, how. In response the parties have become more distinct from each other, more ideological and more centralized. A “liberal” nowadays means someone who has no religious faith (unlike Franklin Roosevelt and Harry Truman, who were devout Christians), believes abortion is a “right,” thinks homosexuality and lesbianism are normal and is convinced that straight men of European ancestry are the source of all the world’s problems. A “conservative” means the opposite. Most “liberals” live in large metropolitan areas close to the east or west coast and have been to university. Most “conservatives” live in smaller towns, rural areas and the inland states, and have not been to university. As “liberals” and “conservatives” thus defined have little in common with one another, the “liberals” have almost all become Democrats and the “conservatives” Republicans.

The “liberals” fired the first shots in the culture war by persuading the courts to adopt questionable interpretations of the Constitution that overrode tradition, custom and public opinion. In Engel v. Vitale (1962) the judges decided that a brief nonsectarian prayer recited in schools at the behest of a local school board was an unconstitutional “establishment of religion” even though the Constitution explicitly states that the prohibition of establishing religion applies only to Congress, not to state and local government. In Roe v. Wade (1973) they used an alleged “right to privacy” to allow an abortion to a woman whose life was not endangered by her pregnancy, although neither privacy nor abortion is mentioned in the Constitution.

The “conservatives,” not getting much satisfaction from the judicial branch, responded by turning the Republican Party into a centralized European-style political machine. The “liberals,” subsequently and less successfully, tried to do the same to the Democratic Party. Both parties nowadays choose their presidential candidates through primary elections or caucuses in which only the ideologically committed are likely to vote, rather than through the traditional politics of brokerage. As the parties have become more distinct from each other and more centralized, the separation of powers between president and Congress has become unworkable. And since the courts seem to be where the action is, judicial appointments have become a tough game in which both parties are willing to play hardball.

In 1992 a precursor of Trump, another populist millionaire named Ross Perot, ran for president as an independent candidate. He took enough normally Republican votes to deny the incumbent, George H.W. Bush, a second term that he well deserved for his achievements in foreign policy. Bill Clinton won the election with 43 per cent of the popular vote. That must have persuaded many Republicans that (to borrow a phrase from Lyndon Johnson) it was better to have a populist inside the tent pissing out than outside the tent pissing in.

Having learned that lesson, the Republicans in 2016 found their ideal candidate in Donald Trump, who, unlike Perot, was already a celebrity when he began his campaign. If the hated liberal elites made fun of him, all the better. If his opponent was a charter member of the elites and the wife of a controversial ex-president, better still. In 2016 Donald Trump was just the candidate the party needed.

The colourful new president of the Philippines, Rodrigo Duterte, has attracted attention around the world since he took office in June 2016. Few Canadians know much about his country, although for the last several years it has been the principal source of new immigrants to Canada. In 2015 it accounted for 50,812 new permanent residents admitted to Canada, or almost 19 per cent of the 271,660 admitted from all sources. In spite of the close historical ties between the United States and the Philippines, about as many Philippine people now move to Canada as to the United States, which admits more than a million new permanent residents each year from all sources.

The Philippines suffers from a great many problems, including extreme vulnerability to both typhoons and earthquakes; poor communications between the numerous islands of which the country consists; an exceptionally high birth rate; a multiplicity of dialects and languages; a high level of crime and violence, some of it linked to political unrest; and a very unequal distribution of land and other assets among its people. Various insurgent movements, some Marxist and some Islamist, have persisted for decades on Mindanao, the second largest island. Nonetheless, the country’s rate of economic growth in recent years (about 6 per cent per annum) is one of the highest in the world, although admittedly economic growth per capita is less impressive. In addition to those who move permanently to North America, large numbers of Philippine people live and work temporarily around the world, and the money they send home helps to keep the economy afloat.

The name Philippines derives from the king of Spain who claimed sovereignty over the islands in the 16th century. The Spanish converted most of the people to Catholicism and made them adopt Spanish surnames, but made little effort to teach them the Spanish language, which is virtually unknown there today. A sense of national identity did not develop until the end of the 19th century, and was largely the work of the multitalented José Rizal (1861–1896), who was executed by a Spanish firing squad because he had written (in Spanish, ironically) two anticolonial and anticlerical novels. The gigantic monument over his grave, Manila’s most conspicuous landmark, is guarded 24/7 by Philippine soldiers, and the study of his life and works is a compulsory subject in Philippine schools.

Rizal’s death helped to inspire a rebellion against Spanish rule which in turn led to intervention by the United States. The Americans annihilated the feeble Spanish navy in Manila Bay in 1898, and a future president, William Howard Taft, was sent to Manila as the first American civil governor. The United States occupied the Philippines for nearly half a century, apart from a brief and unpleasant interval of Japanese rule during World War II, and both Douglas MacArthur and Dwight Eisenhower spent a large part of their military careers in the country. The Americans established a public school system, built railways (now largely abandoned), made English the lingua franca, created and trained a Philippine army (mainly the work of MacArthur), and prepared the country for self-government. After 1934 (except for the Japanese interlude) the Philippines had about the same degree of autonomy as Canada enjoyed in the days of Macdonald and Laurier, and in 1946 it became fully independent and joined the United Nations.

Since independence the population has increased from about 18 million to more than 100 million. About half live on the main island of Luzon. The most widely spoken indigenous language is Tagalog, often referred to as “Filipino” by people who speak it, but there are many others. Most middle- and upper-class people on all of the islands can speak at least some English, which is extensively used in the media, government, the armed forces and business.

The political institutions of the Philippines are modelled after those of the United States, with a president and vice-president, a bicameral congress and a supreme court. In contrast to the United States, the president and vice-president are elected separately and may belong to different parties. Under the post-Marcos constitution of 1987 a president may be elected only once, for a term of six years, but, as in the United States, if a president dies the vice-president completes his or her term and remains eligible for election in his or her own right.

Philippine politics is dominated by large landowners. The parties, which do not seem to differ much in ideology or program, are riddled with factionalism, patronage and corruption. Observers familiar with American politics before the New Deal, or Quebec politics before the Quiet Revolution, may not find the level of corruption particularly exceptional, but it is increasingly resented by middle-class Philippine voters. The traditional main parties were the Liberals and the Nationalists – perhaps vaguely analogous to U.S. Democrats and Republicans respectively – which dominated the scene until 1972. Several other party labels have become prominent in recent years and the country now has a multiparty system.

Although democracy has been more durable in the Philippines than in most Third World countries, an exception is the long era of Ferdinand Marcos. Elected president in 1965 and again in 1969, Marcos proclaimed martial law in 1972, citing the threat of Communism as justification, and held office as a dictator until 1986. His regime was supported by the United States, which initially needed its large military and naval bases in the Philippines to conduct the war in Indochina. As the dictator’s health began to fail, opposition to him began to grow. The most prominent opposition leader, Benigno Aquino, Jr., who had been exiled to Massachusetts, returned to Manila in 1983 but was assassinated within minutes of stepping off the airplane. The crime was widely attributed to Marcos, or perhaps to his wife Imelda, who wished to succeed him, but there are other suspects as well and nothing was ever proven.

In 1986 a “People’s Power” movement took to the streets and overturned the dictatorship with minimal violence. Marcos escaped to Hawaii, where he died in 1989. Aquino’s widow, Corazón Aquino, became president, presided over the adoption of a new constitution and served until 1992. The American bases were closed. She was succeeded by Fidel V. Ramos, a professional soldier and the only Protestant ever to hold the office. The next president, Joseph Estrada, was impeached on suspicion of massive corruption after serving half his term and forced out of office. He was sentenced to life imprisonment but promptly pardoned by his successor, Gloria Macapagal-Arroyo, who had been his vice-president. She was then elected in her own right and served a total of nine years. Soon after leaving office she was accused of electoral fraud and then of embezzling funds from the national lottery. After several years of legal battles and some time in jail she was acquitted on all charges in July 2016.

Benigno Aquino III, the son of Corazón and Benigno, Jr., was elected president as a Liberal in 2010. Under his administration the economy flourished. Capital punishment remained abolished (it had been abolished during his mother’s administration, restored in the 1990s and abolished for a second time in 2006). The armed forces were strengthened and military ties with the United States were partially restored in response to territorial disputes with China. However, his failure to deal with corruption, crime and the ongoing insurgencies on Mindanao, an island that some foreign governments have advised their citizens not to visit, reduced his popularity.

The stage was set for the election of Rodrigo Duterte to the presidency in 2016. Duterte had been the mayor of Davao, the country’s second largest city, for more than two decades, off and on, and boasted that he had eradicated crime in the city. He promised to do the same at the national level if elected. He won an impressive plurality in a race among several candidates, and an overwhelming majority of the votes cast by Philippine citizens living overseas. Some of his support was based on nostalgia for Marcos, whose son came very close to being elected vice-president at the same time and whose body Duterte wants to bury in a military heroes’ cemetery. However, Duterte is more impulsive and outspoken than his hero and is a populist rather than a true conservative. The temptation to compare him with Donald Trump is almost irresistible. When a visit by Pope Francis made the traffic jams in Manila even worse than usual, his response was “Pope, son of a whore, go home. Don’t visit any more.” More recently, he used the same epithet in relation to President Obama, who was definitely not amused. In foreign policy he is abandoning the pro-American orientation of the younger Aquino and leaning more toward Russia and China. Several years ago he said that he “hated” the United States.

However, he is notorious in other countries, and popular in his own, primarily because of the methods by which he seeks to eradicate crime, and particularly the drug trade. Before being elected he promised that the fish in Manila Bay would grow fat on the bodies of criminals whom he would kill and throw into the harbour. Several thousand people suspected of drug dealing and other crimes have actually been killed without trial by police and other agents of the state since he took office, a fact that has led to protests from the United States, the United Nations and the European Union. In September, four months after the election, he proclaimed a “state of lawlessness,” which he insisted was not the same as the martial law proclaimed by Marcos. Fears have been expressed that foreign investors and tourists will avoid the country if the mayhem continues.

It remains to be seen whether Duterte will become more conventional as he continues in office and depart peacefully at the end of his six-year term or whether his raucous beginning is the prelude to another dictatorship like that of Marcos, or even worse. At the time of writing all bets are off.

For Ireland, 2016 is an important year, the centennial of the Easter Rising that led to a war for independence, the partition of the country, a civil war and, eventually, a new independent state comprising 26 of Ireland’s 32 counties. The Rising also inspired one of the greatest poems by the 20th century’s greatest poet, William Butler Yeats. On February 26 this year, two months before the anniversary, the 26-county state conducted a general election that may mark the end, or at least the beginning of the end, for a two-party system that has lasted through most of the intervening century.

The philosophical differences between Ireland’s two main parties, Fine Gael and Fianna Fáil, are subtle and not easily perceived by outsiders. Both could be classified as brokerage parties and might be labelled, at least by observers accustomed to the more ideological politics of other European democracies, as moderately right of centre. Perhaps the closest parallels to them are the Progressive Conservative and Liberal parties in Canada’s Atlantic provinces, which seem to alternate in office without making much difference to the political agenda. As in Atlantic Canada, party loyalties are strong and durable, often passing from one generation to another, even if outsiders find this difficult to understand.

The origins of the two parties can be traced to the Irish Civil War of 1922–23, which is not to be confused with the war of independence against the British a few years earlier. The issue in the civil war was whether Ireland should accept the status of a British dominion with a governor general representing the Crown, as provided by the Anglo-Irish peace treaty of 1921, or whether it should insist on becoming the republic envisaged by the Proclamation of 1916. (Both sides knew and accepted that the partition of the island was a fait accompli, at least in the short term.) The civil war was on a very small scale, with fewer than a thousand people killed, and the result was a foregone conclusion since the British aided the pro-treaty side. Michael Collins had argued that the treaty was “not the freedom we desire but the freedom to achieve it.” He was at least partly right, since Ireland replaced the governor general with a president in 1937 and withdrew from the Commonwealth in 1949. However, his signature on the treaty, as he himself had predicted, was probably the motive for his assassination.

Fine Gael, whose pro-treaty predecessors held office without interruption from 1922 to 1932, descends from the winning side in the civil war; Fianna Fáil, which was founded by Éamon de Valera in 1926 and has held office for 62 of the last 84 years, represents the anti-treaty side. Fine Gael is more pro-British than its rival, slightly more liberal on “social” issues that challenge the teachings of the Catholic Church, but somewhat more conservative on economic issues. It appeals most to relatively affluent voters, particularly in Dublin, and to the small Protestant minority. Fianna Fáil is more nationalist and more populist. Both now accept the republican constitution of 1937 (largely drafted by de Valera) and neither seriously pursues the goal of regaining the six counties that comprise “Northern Ireland.”

Unlike most European countries, Ireland is blessed by the absence of a right-wing anti-immigrant party, but two other parties are worth mentioning. The Labour Party, founded in 1912, was originally a party of the left, as its name suggests, but supported the pro-treaty side in the civil war. It has frequently, since 1948, held office as the junior partner in a coalition with Fine Gael, which shares its social liberalism to some degree. It has coalesced with Fianna Fáil only once, in a government that lasted less than two years. Sinn Féin, the only party that contests elections in both northern and southern Ireland, is further to the left than Labour and is the only party seriously committed to the reunification of Ireland. It has never held office in the 26-county state but shares power in the north, where it is the largest nationalist party.

Fianna Fáil held office for most of the years of the so-called “Celtic Tiger,” when Ireland finally became rich, thanks to American direct investment and the European Union. After the economic boom collapsed the European Union forced it to impose austerity measures in return for economic aid. The election of 2011 relegated Fianna Fáil to third place, for the first time in its history, while the Labour Party won 37 seats, an all-time record. Fine Gael and Labour formed a coalition government, with an overwhelming majority. That government in turn gradually lost support when it continued the austerity measures, while the economy recovered only slowly. The nationalization of water utilities, which had previously been a local responsibility, and the imposition of water charges based on the amount of water used helped to make the coalition government unpopular, with Sinn Féin leading the offensive against these measures.

Such was the background to the election of 2016, but before analyzing the outcome it remains to explain the bizarre electoral system, which may be a warning to Canadians of what the Liberals have in store for us. Dáil Éireann, analogous to our House of Commons, has 158 members, a rather large number for such a small country. They represent 40 ridings, each of which elects either three, four or five members depending on its population. In each riding a party may present as many candidates as there are seats, or fewer if it wants to avoid spreading its support too thinly. Each voter has only one vote, regardless of the number of seats in the riding, and may rank all of the candidates, or as many of them as he or she cares about, from first to last. A candidate is elected on reaching the “quota,” which is calculated by dividing the number of valid votes cast in the riding by one more than the number of seats and adding one. In the recent election only 23 out of several hundred candidates were the first choice of enough voters to do this on the first count. At each subsequent count, either the surplus votes of elected candidates or the votes of the lowest-ranked candidate, who is eliminated, are reallocated to the next preference marked on each ballot. Counting continues until the necessary number of candidates (three, four or five) have achieved the quota or only that many remain. Typically seven or eight counts are enough, but in the riding of Longford-Westmeath a total of 14 counts were necessary to fill all the seats, and the result could not be determined until six days after the election.
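
For readers who want to see how such a count unfolds mechanically, here is a minimal sketch in Python of the procedure just described. It is an illustration only, not the official counting rules: it uses the quota defined above but, for simplicity, does not transfer the surpluses of elected candidates (which real Irish counts do, often fractionally), and the riding, candidates and ballots are invented.

```python
# A much-simplified illustration of the counting rule described above.
# It uses the quota defined in the text but, unlike a real Irish count,
# does not transfer the surpluses of elected candidates; it only
# eliminates the lowest-ranked candidate at each count. Candidate names
# and ballots are invented.

from collections import Counter

def quota(valid_votes, seats):
    # Votes divided by one more than the number of seats, plus one.
    return valid_votes // (seats + 1) + 1

def count_stv(ballots, seats):
    """ballots: lists of candidates in order of preference; returns winners."""
    q = quota(len(ballots), seats)
    elected, eliminated = [], set()
    while len(elected) < seats:
        # Each ballot counts for its highest-ranked surviving candidate.
        tallies = Counter()
        for ballot in ballots:
            for candidate in ballot:
                if candidate not in eliminated:
                    tallies[candidate] += 1
                    break
        hopefuls = [c for c in tallies if c not in elected]
        if not hopefuls:
            break
        reached = [c for c in hopefuls if tallies[c] >= q]
        if reached:
            # Declare elected anyone at or above the quota.
            elected += sorted(reached, key=tallies.get, reverse=True)
        elif len(elected) + len(hopefuls) <= seats:
            # Too few candidates left to drop anyone: the rest are elected.
            elected += hopefuls
        else:
            # Otherwise eliminate the lowest candidate and count again.
            eliminated.add(min(hopefuls, key=tallies.get))
    return elected[:seats], q

# A hypothetical three-seat riding with twelve valid ballots.
ballots = [["A", "B"], ["A", "C"], ["A", "D"], ["A", "B"],
           ["B", "C"], ["B", "A"], ["B", "D"], ["C", "B"],
           ["C", "D"], ["D", "C"], ["D", "C"], ["D", "B"]]
print(count_stv(ballots, seats=3))  # quota = 12 // 4 + 1 = 4
```

In this toy example the quota is four: candidate A is elected on the first count, C is eliminated on the second, and B and D reach the quota on the third.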

For the first time in history the two major parties between them received (very slightly) less than 50 per cent of the first preference votes in 2016. Fine Gael, which had won 77 seats in 2011, won only 50, but remained the largest party. Fianna Fáil, reduced to an all-time low of 20 seats in 2011, rose to 44 seats, still a below-average performance for that party. The Labour Party suffered a catastrophic decline to only seven seats, and seems to be on the brink of extinction. Like the British Liberal Democrats and the German Free Democrats, it learned that being the junior partner in a coalition can be hazardous to a party’s health. Sinn Féin increased from 14 seats to 23, its best-ever showing south of the partition line. Three minor parties of the left took 11 seats. Independent candidates, most of them defectors from the two major parties, took 23, an unusually large number.

A coalition between the two major parties would seem to be the logical response to this outcome – unless you are Irish. In fact the historic animosity between these two tribes makes it highly unlikely. Neither major party will accept Sinn Féin as a coalition partner, and the other parties, including Labour, seem to have no desire to enter a coalition.

On March 10 the Dáil met and managed to elect a Fianna Fáil member as Ceann Comhairle (speaker of the house), the first time this office has been filled by a secret ballot. As predicted, however, it failed to elect a Taoiseach (prime minister). Neither of the two major party leaders won any votes outside of his own party, apart from the seven Labour members who voted for the incumbent, Enda Kenny of Fine Gael. At the time of writing (March 25) it seems unlikely that a government will be formed before the middle of April, and perhaps not even then. If all else fails the president of the republic, who is elected by the people but has mainly ceremonial duties, will ask Kenny to stay on as a caretaker Taoiseach heading a minority government until a new election takes place, either this year or next. What would happen if that election produced a similar result to the previous one remains to be seen.

When the Soviet regime collapsed in 1991 and the “republics” that had been ruled by that regime for most of the 20th century became sovereign states, the response in much of the Western world was a mixture of incredulity, relief and euphoria. Although the growing weakness of the regime had been known to Western intelligence agencies for several years, it had not been known to the general public. Suddenly the regime was gone, and the “Cold War” between it and the Western democracies, a conflict that had dominated the agenda of international relations for four and a half decades, seemed to have vanished also.

Even if Francis Fukuyama’s announcement of “the end of history” was an exaggeration, there was a general perception that everyone could relax, forget about international relations for a while and concentrate on problems closer to home. After a presidential election campaign that, for the first time since 1936, largely ignored international issues, Americans responded to the good news in 1992 by choosing as their president a pleasant young man who had been governor of one of the most insignificant states in the Union but had no experience in Washington. George H.W. Bush, who had presided over the Western victory and deserved at least some of the credit for it, was dismissed by the voters of his country much as Winston Churchill had been in 1945.

But unfortunately, the end of Communism did not mean the end of Russian expansionism, which long predates the revolution of 1917. Alexis de Tocqueville did not foresee the revolution, but he predicted in 1835 that Russia and America would eventually dominate the world. De Tocqueville also observed that while American conquests are “gained by the plowshare,” those of Russia are gained “by the sword.”1 Given the successful American attacks on Mexico and Spain subsequent to this comment, he was not entirely right, but neither was he entirely wrong.

The Russians had occupied all of Siberia before the end of the 17th century, and added what are now Belarus, most of Ukraine, the Baltic states, part of Poland and Alaska to their empire in the 18th. The Caucasian lands between the Black and Caspian seas had been absorbed into the Russian Empire by 1830. The Monroe Doctrine in the United States (1823) was largely inspired by Russian activities on the west coast of North America. The British and the Russians engaged in a cold war for most of the 19th century, which led to actual fighting in the Crimean War of 1854–56 but did not prevent Russia from taking most of Central Asia. The sale of Alaska to the United States in 1867, perhaps the only occasion on which Russia voluntarily withdrew from one of its possessions, was an anti-British and anti-Canadian move, intended to ensure that British Columbia would be hemmed in by American territory on both sides. Czar Nicholas II, now widely regarded in the Western world as an innocent victim of the Communists, told one of his ministers in 1903 that he had designs on Manchuria, Korea, Tibet, Iran and Turkey.2 Nicholas II has been described as “wise and great” by none other than Vladimir Putin.3

A recent work by a young British historian provides strong evidence that the outbreak of war in 1914 was as much Russia’s fault as it was Germany’s, if not more so.4 Russia’s immediate aim in that war was to gain control of Constantinople (Istanbul) and the straits connecting the Black Sea with the Mediterranean. Extreme Russian nationalists also dreamed of acquiring all of Europe east of a line from Stettin to Trieste. Having decided, for better or for worse, that Germany was a more immediate menace than Russia, the British promised Constantinople and the straits to their new Russian ally in a secret agreement during the war. The overthrow of the czar and the victory of the Bolsheviks made it unnecessary to carry out that promise. The Bolsheviks gleefully revealed this and many other secret treaties to the world after they took power.

So the end of the Soviet regime in 1991 did not mean the end of Russian nationalism and expansionism – in fact quite the reverse. It is true that under Boris Yeltsin, its president from 1991 through 1999, Russia enjoyed a decade as a peaceful and democratic member of the community of nations, much like Germany’s Weimar Republic in the 1920s. But like the Weimar Republic, the Russian democracy, partly because of economic problems, was undermined and eventually destroyed by the rise of extreme right-wing nationalism and revanchism, under a leader who made no secret of his geopolitical ambitions.

Vladimir Putin, who either as president or as prime minister has dominated Russia since 2000, stated in 2005 that the collapse of the USSR should be acknowledged as “a major geopolitical disaster of the 20th century.”5 It is important to interpret this somewhat ambiguous remark. Putin’s regime is socially conservative, allied with the Russian Orthodox Church and heavily influenced by a coterie of millionaire tycoons who have profited greatly from its existence. What Putin laments is not the collapse of Marxist ideology or the centrally planned economy, neither of which he has made any effort to restore, but the loss of the extended physical boundaries that the Russian empire (a.k.a. the USSR) enjoyed between 1945 and 1990.

Although Putin’s Russia is smaller and weaker than the old USSR, and lacks the “soft power” that the USSR enjoyed in the days when Communism was still taken seriously, it is in some respects more dangerous. The post-1945 USSR was a satisfied power with all the territory that it wanted. Its faith in the eventual triumph of its ideology made it patient and cautious, most of the time. (Nikita Khrushchev’s placement of missiles in Cuba in 1962 was a conspicuous exception, and probably contributed to his removal from office two years later.) Putin’s Russia is a defeated power that resents its diminished status in the world and is motivated solely by nationalism and the desire for revenge. Putin wants to win back the lost territories in his own lifetime. The analogy with Hitler’s Germany, although unfair in some respects, cannot be entirely dismissed.

Putin’s first target was Georgia, from which two ethnic minorities, the Abkhazians and the South Ossetians, had managed to secede, de facto, shortly after Georgia won its own independence in 1991. Neither secessionist regime gained international recognition, although the Abkhazians revealed an unexpected sense of humour in 1995 when they issued postage stamps in honour of (Groucho) Marx and (John) Lennon, thus earning a certain amount of hard currency from Western philatelists. In 2008 Putin attacked Georgia, occupied both of the secessionist enclaves and recognized them as sovereign states. Most of the world regards them as Russian dependencies.

Ukraine, the most populous of the post-Soviet states after Russia itself and the main victim of Stalin’s ill-advised collectivization of agriculture, was the next target. Many Russians claim that Ukrainians are only a branch of the Russian people, since their language is similar to Russian, but few Ukrainians would agree. The far western part of the country, to which most Ukrainian Canadians trace their roots, was never under Russian rule before 1945. Ukraine does, however, have a large Russian-speaking minority, mainly in the industrialized eastern part of the country, some of which was part of the Russian Empire as early as the 17th century. Ukraine’s situation is also complicated by its fairly recent acquisition of Crimea, a strategically important peninsula that was impulsively given to it by Khrushchev in 1954. In pre-Stalinist times the Crimean Tatars, a Muslim people, were the largest ethnic group in the peninsula, but Stalin deported them to Central Asia, making ethnic Russians a majority, although many Tatars returned after Stalin’s death.

As long as Ukraine was part of the USSR, Khrushchev’s unexpected gift had little practical significance. Once Ukraine became independent in 1991, some Russians may have regretted Khrushchev’s action. However, in 1994 Russia, along with the other permanent members of the UN Security Council, signed the Budapest Memorandum, pledging not to threaten or use force against Ukraine’s independence and territorial integrity. In return Ukraine surrendered the stockpile of nuclear weapons that had been based on its territory by the USSR.

Ukraine continued to suffer from tension between the minority of Russian speakers in the east, some of whom had never fully accepted the country’s independence, and the Ukrainian-speaking majority in the rest of the country, who would like to join the European Union and possibly NATO. Following the presidential election in 2004 the so-called Orange Revolution prevented the pro-Russian candidate, Viktor Yanukovych, from taking the office which he had allegedly won on a second ballot between the two leading candidates. The runoff election was widely deemed to have been corrupt and fraudulent. Yanukovych did win the next presidential election, in 2010, but resigned and fled to Russia in February 2014 after mass demonstrations in Kiev against his pro-Russian and anti-Western policies. Three months earlier, he had refused to sign an agreement with the European Union, which he had earlier favoured, after the Kremlin publicly warned him not to sign it.

Putin attributed the demonstrations to “fascists” (a remarkable case of the pot calling the kettle black) and promptly responded by occupying and formally annexing Crimea, an operation for which plans had obviously been prepared in advance. Later in the year fighting broke out in eastern Ukraine between pro-Russian separatists, armed by the Russians and assisted by Russian regular forces, and the Ukrainian army. The Russians at first denied that their regular forces were involved but later admitted it. Much of eastern Ukraine was occupied by the Russians and their local allies. Casualties on both sides numbered in the thousands. Presumably by accident, the separatists also shot down a civilian airliner in flight from the Netherlands to Malaysia, killing everyone on board. Two separatist enclaves, styling themselves the people’s republics of Donetsk and Lugansk respectively, effectively seceded from Ukraine with Russian help and remain in existence under the protection of the Russian army.

In September 2014 a ceasefire agreement was signed in Minsk, the capital of Belarus, by representatives of Russia, Ukraine and the two enclaves. Although it reduced the intensity of the fighting for a time, the ceasefire broke down completely in January 2015, when the separatists launched a new offensive and succeeded in occupying the Donetsk international airport. This led to a diplomatic intervention by President François Hollande of France and Chancellor Angela Merkel of Germany. After 16 hours of negotiations a second agreement, very similar to the first, was signed at Minsk in February by the leaders of France, Germany, Ukraine and Russia. “Minsk II,” like its predecessor, reduced the intensity of the armed conflict without really ending it. Its terms have been the target of criticism by both the separatists and some Ukrainian nationalists, which is not surprising since they are deliberately ambiguous. The two “people’s republics” have both announced plans to hold local elections, with or without Ukraine’s permission, and have stated their desire to join the Russian Federation.

Reaction in the Western world to these ominous events has been lukewarm, except in Poland, Estonia, Latvia and Lithuania, which are understandably fearful that they may be next on Putin’s list of potential victims. France and Germany have tried to bring a negotiated end to the conflict, as noted above, although with little success. Other European countries have imposed “sanctions” on Russia with varying degrees of enthusiasm, but almost a century of experience with such measures suggests that they rarely have much effect. It is worth noting also that the status of Crimea has not even been discussed in the Minsk negotiations. The Russian annexation of the peninsula seems to be regarded in Europe, and in North America also, as a fait accompli.

The Americans have recently begun helping to train and arm the rather ineffective Ukrainian army. They have also sent token forces to Poland and the Baltic countries, which are members of NATO, as has Canada. However, the Ukrainian crisis seems to be having little or no impact on the American public, and is conspicuously absent from the discourse of the various would-be successors to President Obama. This indifference to a major European crisis is in sharp contrast to the almost hysterical American obsession with Iran, a weak state that has not started any wars or annexed any territory for nearly 1,500 years.

In Canada the Ukrainian situation also seemed to be a non-issue in the federal election campaign, surprisingly so in view of the substantial number of Canadians whose ancestors came from western Ukraine. Some left-wing Canadians have taken an openly pro-Russian and anti-Ukrainian position, even though Putin’s domestic policies are the antithesis of what they would want to see in this country. A notable example is the President of the Ontario Federation of Labour, Sid Ryan. Mr. Ryan is on record, and rightly so, as an opponent of the British military presence in partitioned Ireland, and one might have hoped that he would appreciate the obvious analogy with Russian military intervention in Ukraine. Like the Russians in Ukraine, the British in Ireland claim to be there to protect a minority, and they share the same habit of slandering their nationalist opponents in the former colony as “fascists.”

No one wants a major war against a Russia armed with nuclear weapons. However, the alteration by force of the boundary between two European states is a serious issue, and one without precedent since 1945. Furthermore, there is no moral or legal reason why Ukraine should be prevented from joining the European Union or even NATO, if that is what its people want. The argument that Russia has some inherent right to retain the former borderlands of the Russian/Soviet empire as a sphere of influence is nonsense. All such claims, if they ever existed, became null and void when the USSR collapsed and the new Russia recognized the independence of its neighbours. History suggests that the appeasement of tyrants and bullies rarely buys peace for very long, if at all. It may be hoped that the Western world has not entirely forgotten that lesson.

Conrad Black, Rise to Greatness: The History of Canada from the Vikings to the Present.
Toronto: McClelland and Stewart, 2014. 1,106 pages.

Black is a Canadian phenomenon. Media mogul, confidant of prime ministers (two of whom share in the dedication of this book), biographer of American presidents, sometime resident of an American prison (albeit on questionable charges), member of the British House of Lords and frequent commentator on Canadian issues, he is a bête noire to many people. However, even his detractors have to concede that he is more interesting than most businessmen, and certainly far more than any of his predecessors as the prototypical Canadian celebrity entrepreneur: Hugh Allan, Herbert Holt and E.P. Taylor.

Less well known than other aspects of his curriculum vitae, at least to people who don’t frequent bookstores, is that Black is a distinguished historian. His first effort, a biography of Maurice Duplessis based in part on his master’s thesis, appeared in 1977 and was reviewed by the present reviewer in the now-defunct Canadian Forum. Despite certain flaws, which I noted at the time, it has stood the test of time and is still the best and fairest treatment of its subject. Since then he has produced a history of the United States, massive biographies of Franklin D. Roosevelt and Richard M. Nixon, and now the present volume. In total, these five books encompass well over 4,000 pages. Many academics, who complain about “publish or perish” but who manage to avoid doing either while spending only nine or ten weekly hours in the classroom during an academic year of less than eight months, could be put to shame by this performance.

There have not been that many one-volume surveys of Canadian history. The two that dominated the field for many years were Arthur Lower’s Colony to Nation and Donald Creighton’s Dominion of the North, both first published more than 70 years ago and long out of print. More recently, Robert Bothwell’s Penguin History of Canada (2006) has been the most prominent example. Black’s book is twice as long as any of the above and is a worthy successor to all of them.

Apart from a brief introduction, which provides an unsentimental overview of the indigenous peoples in their original state, and a brief conclusion, Rise to Greatness is divided into three main parts. The first, entitled “Colony,” covers the years from 1603 to 1867, the second and longest part, called “Dominion,” carries the story to 1949, and the third part, “Realm,” brings it up to the present. The rationale for the division between parts two and three is not made explicit, but is probably based on Newfoundland’s entry into Canada and Canada’s entry into NATO, which occurred almost simultaneously in the spring of 1949. The third significant event of 1949, the abolition of Canadian appeals to the Judicial Committee of the Privy Council, is not even mentioned in the book.

This is very much a political history, and readers who expect to find much cultural or social history, the fields that seem to be preferred by most present-day historians, will be disappointed. (This comment is not intended as a criticism, but merely as a warning to readers who might expect a different kind of book.) Stephen Leacock appears only once, as a professor and not as a humorist. Tom Thomson and the Group of Seven are not mentioned at all, nor are Robertson Davies, Hugh MacLennan, A.M. Klein or Mordecai Richler. There is nothing about sports, apart from the odd revelation that Mackenzie King despised basketball. The only mention of organized labour is one reference to the Winnipeg General Strike. The social consequences of industrialization, the cultural impact of television and later of computers, the transformation of cities into “census metropolitan areas” dominated by their suburbs, the emancipation of women and the growing prominence and political influence of gays and lesbians are likewise ignored, as they perhaps had to be if the book were not to become too large to fit within a single volume.

Part one of Rise to Greatness is in some respects the most interesting, and covers ground that is least likely to be familiar to most readers. Black places Canada’s origins in the context of the rivalry between Britain, France and other European powers, later joined by the United States. His grasp of this complicated story is effective and his writing engages the reader’s interest. Important figures like Champlain, Talon, Frontenac, Carleton and Durham come to life in these pages and are treated with the respect they deserve. Due importance is given to the achievement of responsible government. (However, the most egregious error in the whole book occurs in this section when Black asserts that the Saskatchewan River “flowed west from Lake Winnipeg.” As every Canadian should know, it flows east into Lake Winnipeg from its source in the Rocky Mountains of Alberta.)

Part two is dominated by the personalities of prime ministers, especially the big three of Macdonald, Laurier and King. Black seems to regard Macdonald (rightly in my opinion) as the greatest of the three, but concedes that the other two also had claims to greatness. Black’s touch is less sure when he deals with certain issues: the second and third transcontinental railways, and Catholic education in Alberta and Saskatchewan. He is probably wrong to assert that Reciprocity would not have affected Imperial Preference and he does not seem to have read the Statute of Westminster very carefully, if at all. In fact constitutional issues, such as the judicial interpretations of the British North America Act, are largely ignored. Watson and Haldane, the two law lords whose opinions from the 1890s through the 1920s significantly broadened the scope of provincial jurisdiction, are not even mentioned.

On the other hand, one subject with which Black seems excessively preoccupied is the Second World War, which accounts for 96 pages of text, or almost one tenth of the book. (Bothwell, although a specialist in foreign policy, covered the war in 16 pages, and Black himself manages to cover the First World War in 20.) There is far more in Rise to Greatness about Churchill, Roosevelt, Stalin, Hitler and Mussolini, not to mention Ribbentrop, Ciano and Molotov, than is appropriate in a survey of Canadian history, even though Black’s judgements about the principal actors are generally shrewd and accurate.

In part three my interest began to flag, in part because (like the author) I remember most of the events and have read too much about them already. (I did learn the meaning of the word callipygian, which appears on page 882, but only after I looked it up on Google.) Younger readers who did not live through the era of Diefenbaker, Pearson and Pierre Trudeau will find much enlightenment in these pages. Hopefully, however, they will not really believe that Jack Pickersgill at the age of 52 looked “like a young Hermann Göring,” especially considering that the real Göring died at 53. Saint-Laurent, Pearson and Mulroney are given good marks (rightly in my opinion) as prime ministers.

Black devotes much of part three to relations between Quebec and the federal government, and gives Pierre Trudeau perhaps more credit than he deserves for preventing the secession of Quebec. He has little to say about the western alienation, particularly directed at the Liberals, that was so significant in the second half of the 20th century. There are also a few errors of fact that should have been avoided: David Lewis was not the leader of the NDP in 1966; the revolt against the Official Languages Act in Robert Stanfield’s caucus was in 1969, not 1972; the Liberals won only two seats west of Ontario in 1980, not two seats west of Manitoba; and the Meech Lake Accord would not have given the provinces a veto over all kinds of constitutional amendments.

In the brief conclusion, Black sums up his evaluation of the various prime ministers and notes some recent trends, such as the declining influence of Quebec and the virtual disappearance of Canada’s traditional inferiority complex vis-à-vis the United States. He recommends “a return to great visions and projects” and a revival of “the creative tradition of public-private sector cooperation” which dates from the French regime but has been allowed to lapse recently. (Both suggestions might be seen as subtle criticisms of Stephen Harper.)

This is a useful and well-written book that should be in every Canadian public or university library, and in many private libraries as well. Perhaps Black will be remembered as a historian long after other aspects of his career have faded from the nation’s memory.

If you live in Canada and have not recently returned from a long sabbatical on another planet, you have probably heard the phrase missing Aboriginal women more often than you wished to hear it. If you drove along Highway 401 in eastern Ontario you could even see a billboard placed by the local Aboriginal band referring to the “missing women.” If you took the train from Toronto to Montreal instead of driving, you may have learned about them in a different way. On more than one occasion the passenger train service between our two largest cities has been suspended because the Aboriginals blocked the Canadian National’s main line in an effort to force the government to launch an “inquiry” into the alleged disappearances. People who are into this issue often allege that more than a thousand, or even “thousands,” of Aboriginal women have mysteriously disappeared.

There is something very Canadian about the belief that social problems, real or imaginary, can be solved by the government conducting an inquiry. Our national addiction to Royal Commissions, task forces and so forth is well known. But the demand for an inquiry into this particular matter seems to have acquired an unusually large and vociferous following. Amnesty International, an organization that used to specialize in helping people who were imprisoned for their religious or political beliefs, has made the “missing” women one of its major priorities for the last several months, although situations closer to its original terms of reference are certainly not in short supply these days. Both of the major opposition parties in Parliament have called on the federal government to conduct an inquiry into missing Aboriginal women. The provincial premiers, always happy to find a stick with which to beat the federal government, have done the same, joined by a large part of the media.

Not to be outdone, the Globe and Mail on February 14 devoted the entire back page of its Saturday “Focus” section to a photograph of a woman’s dress hanging from a tree and an upper-case headline that reads “IMAGINE 1,181 DAUGHTERS NEVER RETURNING HOME. NOW IMAGINE THAT NO ONE CARES.” In the three paragraphs that followed, the newspaper rather pompously announced that its “drive for justice will not vanish” until the matter is cleared up. Two weeks later the Globe repeated exactly the same advertisement in case anyone had missed it the first time.

Prime Minister Harper, to his credit, continues to assert that such an inquiry would serve no useful purpose, and that the police and courts are quite capable of dealing with the problem, as indeed they are doing. A few lonely voices have agreed with him, notably Jeffrey Simpson in a column published last August 27 under the headline “Posturing is the Only Reason for a Missing Women Inquiry.” As a matter of fact, and as Simpson’s column points out, in 2013 the Commissioner of the RCMP actually initiated such an inquiry, which was published the following year under the title Missing and Murdered Aboriginal Women: A National Operational Overview. The inquiry and report were in response to representations by the Native Women’s Association of Canada (NWAC).

The RCMP report covers the years from 1980 to 2012 inclusive, a time frame chosen to correspond with that used by NWAC in its efforts to collect data on the subject. It is filled with data that deserve careful reading (see box). Over that period of time, 164 Canadian Aboriginal women, or about five every year, had gone missing in the sense that they were never found either dead or alive. (One of them turned up alive in Tennessee in December 2014, several months after the release of the report.) The police suspected foul play in only 44 of these cases, but could not prove it in the absence of a body. The rest were presumably victims of accident or suicide unless, like the one found in Tennessee, they are actually still alive.

During the same period of 33 years another 1,017 Canadian Aboriginal women were known victims of homicide, or an average of about 31 each year. The people who demand an “inquiry,” including the Globe and Mail, usually combine the two numbers to produce a total of 1,181 “missing” women, although only about one seventh of the total number are actually missing in the normal sense of the term.

According to the RCMP data, Aboriginal women accounted for almost 19 per cent of female homicide victims in Canada over the years 1980 to 2012 inclusive, which is four to five times the Aboriginal percentage of Canada’s population. Clearly Aboriginal women are at greater risk of falling victim to homicide than other Canadian women. Among the population of Aboriginal women the annual homicide rate is between 5 and 6 per 100,000. This is high by Canadian standards, but is about the same as the average rate for both sexes and for all races in the United States.
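
As a rough check on these figures, the arithmetic works out as follows. The population denominator in the sketch below is an assumed round number chosen for illustration, not a figure taken from the RCMP report.

```python
# Arithmetic behind the rates quoted above. The homicide count and time
# frame come from the RCMP report as cited in the text; the Aboriginal
# female population figure is an assumed round number for illustration.

homicides = 1017              # Aboriginal female homicide victims, 1980-2012
years = 33                    # 1980 to 2012 inclusive
per_year = homicides / years
print(round(per_year))        # ~31 victims per year, as stated above

assumed_population = 550_000  # illustrative Aboriginal female population
rate = per_year / assumed_population * 100_000
print(round(rate, 1))         # ~5.6 per 100,000, consistent with "between 5 and 6"
```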

The great majority (89 per cent) of these homicides were solved by the police in the sense that the killers were arrested, charged and convicted. The solution rates for homicides with female victims were identical for Aboriginal and non-Aboriginal victims. The majority (52 per cent) of the murdered Aboriginal women were killed by their husbands or other family members, another 10 per cent by close friends, and 30 per cent by acquaintances. Only 8 per cent were killed by someone not known to the victim. Physical beating and stabbing were the main causes of death for murdered Aboriginal women, together accounting for nearly two thirds of all cases.

The data suggest that the great majority of the victims were almost certainly killed by fellow Aboriginals, either male or female, a fact that tends to be downplayed or denied by most of those people who demand an inquiry. The Harper government’s observation that the problem is mainly one of domestic violence is not appreciated by those who prefer to attribute the deaths to racism, colonialism, imperialism and the other sins to which white men are supposedly addicted. The fact that one of the most publicized criminal trials in recent years, that of British Columbian mass murderer Robert Pickton, involved a white man many of whose victims were Aboriginal women, has of course provided some grist for their mill.

Although declining to launch an inquiry, the Harper government eventually agreed to participate in a one-day roundtable to discuss the issue. This event took place in Ottawa on February 27, and the federal government was represented by the Minister of Aboriginal Affairs, Bernard Valcourt, and the Minister responsible for the Status of Women, Kellie Leitch. The premiers of Ontario, Manitoba and the Northwest Territories also put in an appearance, as did an assortment of Aboriginal people from various parts of the country.

The day before the roundtable the Aboriginals released a document entitled Violence against Indigenous Women and Girls in Canada: Review of Reports and Recommendations. The authors of the document claimed to have reviewed no fewer than 58 studies, reports and inquiries into the issue, not counting their own report, a claim that would seem to cast doubt on the need for still another inquiry. This did not stop the Globe and Mail from printing an editorial that same day entitled “Not Enough Data, Not Enough Answers, on Missing Aboriginal Women.”

The roundtable predictably failed to produce a meeting of minds between the government and its Aboriginal and non-Aboriginal critics, although a few of the Aboriginal attendees (not named in the Globe and Mail’s account the following day) apparently agreed with the government that yet another inquiry would not be useful. The two federal ministers in attendance were on record as saying that “the solution to the violence rests largely with changing the behaviours and attitudes of men on reserves who they say are often the perpetrators of the crimes.” This attitude, according to the Globe and Mail, angered some (male?) Indigenous leaders.

On March 20 Aboriginal Affairs Minister Valcourt elaborated on these comments in a private meeting with some Aboriginal chiefs in Calgary. He stated that “up to” 70 per cent of murdered Aboriginal women were killed by people in “their own communities” and that the RCMP had data to support this claim. In response several chiefs demanded the minister’s resignation. Steve Courtoreille, the Grand Chief of Treaty 8 First Nations, sent a letter to Prime Minister Harper in which he described Valcourt’s comments as “disrespectful” and “offensive” and stated that the members of his organization would no longer communicate with the federal government until the minister was replaced.

Following these events the RCMP, which had originally chosen not to include data on the ethnicity of the killers in its public report, reversed its decision and promised to release another public report which would confirm the data cited by the minister. Meanwhile the Commissioner of the RCMP indicated in a letter to the Grand Chief of Treaty 6 First Nations that the 70 per cent figure was accurate. The second report was expected to be made public in mid-May, but had not yet been released at the time of writing.

So what more would we learn if the sought-after government “inquiry” actually took place? Apart from providing a platform on which the usual suspects could sound off about treaties, constitutional rights, residential schools, the settlement of the prairies, the Royal Proclamation of King George III and various other matters, what would be the point of having one? I think these questions must be answered with regard to the larger political context.

In recent years Canadian courts have indulged in more and more generous interpretations of the “aboriginal and treaty rights” referred to in section 35 of the Constitution Act, 1982. Although some provincial premiers at the time of patriation tried to avoid this outcome by insisting that the word existing be inserted before “aboriginal and treaty rights,” this modification has had little or no effect. “Existing” rights are whatever the Supreme Court says they are, even if their existence cannot be proven by the rules of evidence used in most judicial proceedings. Indeed the Court has even ruled that stories passed down orally from generation to generation can be cited as evidence in cases involving section 35.

These broad interpretations have produced, predictably, a revolution of rising expectations. The stakes are enormous, because the future development of Canada’s resources, the improvement of its infrastructure and the expansion of its major metropolitan areas inevitably involve the use of land, almost all of which might be claimed by some group of Aboriginals, supported by their army of lawyers and consultants, as part of their domain. Much of the energy of the political and intellectual left in Canada has already been diverted from its traditional concerns into the support of such claims. The environmental movement, which has gained great strength in recent years, has formed a strong alliance with the Aboriginal “rights” movement, since they seem to share a common interest in obstructing the development of Canada’s lands and resources. (Of course whether this is really in the interest of Aboriginals, many of whom work in the resource industries and many more of whom could potentially do so, is quite another matter.)

The one element that is still needed to create an almost unbeatable left-wing coalition in support of Aboriginal claims is the feminist movement. Historically the relations between feminists and Aboriginal elites have not been smooth. Despite much of the romantic nonsense about Aboriginal culture that one reads or hears nowadays, Aboriginal men have not been renowned for their sympathy or respect for women. In the Lavell case (1973), the Supreme Court upheld a provision of the Indian Act which deprived Aboriginal women, but not Aboriginal men, of their Indian status if they married non-Aboriginals. Male Aboriginal elites welcomed this unfair decision because they feared an influx onto the reserves of white men, who would compete for the more desirable women. In the 1992 referendum on the Charlottetown Accord, most Aboriginal women (and a majority of all voters on reserves) apparently voted against Aboriginal self-government, although the Aboriginal elites had insisted on its inclusion in the document.

However, the campaign for an inquiry into missing and murdered Aboriginal women has the potential to bring the Aboriginal establishment and feminists into the same camp. Note that there has been no mention of missing or murdered Aboriginal men, although presumably they have existed. Judging by the media, this strategy of alliance-building is already starting to pay off.

No one should doubt that Aboriginal Canadians face many serious problems. Most of these problems have been created or at least exacerbated by the reserve system, well-intentioned though it may have been in the beginning. Encouraging people to live in overcrowded and mainly remote locations with minimal access to health care, education and employment is not doing them a favour. While this policy provides wealth and power for a small Aboriginal elite, it is also a source of high rates of violence, exacerbated by alcohol and affecting both men and women.

The contrast between the lives of most Aboriginal people in Canada and the relative success achieved, often in less than a generation, by most immigrants of all races, languages and creeds is painfully obvious. It is at least arguable that Pierre Trudeau and Jean Chrétien were on the right track in 1969 when they proposed, in the notorious White Paper, that the Indian Act be abolished and that Aboriginals be treated the same as all other residents of our country. As we know, they quickly retreated from that policy and moved in the opposite direction with section 35.

Romanticizing Aboriginal folklore and traditions, as has become fashionable nowadays, and lamenting the very fact that Europeans ever came to North America, as increasing numbers of young people (themselves of European ancestry) seem to be doing, is not the way to resolve the problems of Aboriginal people. They need more integration into Canadian society, not less. Distracting people with red herrings and mouthing fashionable slogans will not bring that about.
