Archive for the ‘history’ Category
An informative Project Syndicate article, here, delves into the reasons behind the current surge in interest in Scottish Independence, emphasising the independent institutions already in place. Its conclusion is surprising, but well thought-out: Scotland is in an unusual situation for a country seeking independence, with general approval of the status quo, despite the surge in voting for the Scottish National Party.
Perhaps because it is an American publication, the article does not look into the reasons behind this SNP surge: general dissatisfaction with the ruling Labour Party in Westminster, with the Conservative Party seen as English and foreign. After the elections earlier this year, the SNP is the largest party in the Scottish Parliament, and is set to lead a majority coalition, with negotiations in progress at the time of writing. Its stated aim, after that, is to open negotiations to repeal the Acts of Union (1706-7) and hold a referendum on independence.
Following independence, what is the next step for Scotland? At the risk of stating the obvious, we can look forward to closer ties with Europe including adoption of the single currency, and an increase in trade with other European neighbours. Having seen the generally positive results of this in Ireland, I have no problem with any of that, though the article does pour cold water on any hopes for the level of Brussels largesse that Ireland has enjoyed.
Nevertheless, Scottish Independence is now firmly on the political agenda, and an exemplary continuation of the Scots realpolitik we read in the history books. In the eighteenth and nineteenth centuries, a Scotland that had made its peace with an imperial England was pioneering Enlightenment thinking and the Industrial Revolution. Instead of kicking England out of Scotland, they took Edinburgh to Westminster, to the extent that Scots have been in charge of the ruling Labour Party for the last 15 years. When John Smith died in 1994 he handed over the reins to Tony Blair (who was originally from Edinburgh); Blair became Prime Minister in 1997, and is soon to retire in favour of Gordon Brown (from Glasgow). Having a Caledonian in Number 10 will surely weigh heavily on the Scottish Independence process.
With the hard work done and less remaining to fight over, with an awareness of British Imperial history, and an emphasis on political structures and cultural identity, the kind of “struggle” that went on in Ireland looks like a colossal waste, counter-productive and unnecessary in the Scottish context. My nation of skinflints knows that there are cheaper ways of getting the job done!
Since the middle of last week I’ve been feeling a lot better, and as the previous post hinted, the belated arrival of Spring in Dublin is also serving to lighten the gloom. So, let me take a little time to put down a controversial idea I’ve had for some time, but which I need to express carefully.
In my view of the world, one where religion and other beliefs are no justification for anything that harms anyone else, Israel is a major destabilizing force in the Middle East today. It is held together by the sheer will of a vigilant Israeli people, who have resisted onslaughts from all sides – military, political and economic – with the material support of the United States in particular. It is a country in which most young people – men and women alike – serve in the military, actively and in reserve.
After centuries in the wilderness of Diaspora, the state of Israel was founded in 1948, and since then has been the focal point of Islamic aggression. America’s support for Israel is an oft-given reason for the rise of Al-Qaeda terrorism. I have no patience for Islamic theocratic imperialism, the Allah-given drive to subjugate the world under the Mullahs. Though I am not keen on Nationalism in any form, I fully support the rights of the Israeli people, as any people, to self-determination, independence, and a homeland they can call their own.
But why, oh why, did they have to put the homeland there?
The answer is, of course, religion. One of the founders of Zionism in the United Kingdom, Dr. Chaim Weizmann, was a chemist whose process for the mass production of acetone made a huge difference to British arms production in World War I: acetone was a key solvent in the manufacture of cordite, the smokeless propellant. It gave Weizmann friends in high places, and direct influence over David Lloyd George (Minister of Munitions 1915-16, then Prime Minister from 1916) and Lord Balfour (former Prime Minister, and Foreign Secretary 1916-19).
The Balfour Declaration of 1917, produced after a decade of urging by Weizmann, expressed Britain’s support for a “National Home” for Jewish people in what was then called Palestine. As reported in Lord Balfour’s biography (quoted in the Wikipedia article), Balfour had actually asked Weizmann, back in 1906, “why there?” His reply cited the historic connection of the Jewish people to the region, and he also said “anything less would be idolatry”. A curious turn of phrase: “idolatry”, as in “false worship”? As in Islam, this reverence for a mere piece of land explains much.
The wording of the Declaration is cautious, even conservative, insisting that no harm be done to the existing non-Jewish communities in the region. The idea of a sovereign state was played down at the time. Palestine was a British Mandate from 1920 to 1948, but Britain gradually lost control as its tacit approval of a Jewish state led to mass immigration. In the immediate aftermath of World War II, the fallout from the Holocaust, the further migration of Holocaust survivors into the region, and Jewish paramilitary attacks on British forces led Britain to call in the newly-formed United Nations to manage its abdication of control over Palestine.
The 1947 UN Partition Plan map is, to be blunt, a mess: a compromise that tried to please everyone and ended up pleasing no-one. The wars of the next 30 years were the obvious instances of trouble, but there is a different kind of bomb ticking in Gaza: a demographic one. The Gaza Strip has a very high birth rate, and extrapolation of the 2005 UNESCO figures predicts a 44% population increase in 10 years, to over 2 million, with a population density approaching Hong Kong’s (5,700 per km²).
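As a back-of-envelope check on those figures, the implied growth rate is easy to work out. The sketch below assumes a 2005 base population of about 1.4 million and a land area of roughly 360 km²; both are illustrative round numbers of mine, not figures from the article.

```python
# Back-of-envelope check on the Gaza extrapolation quoted above.
# Assumptions (mine, not from the source): 2005 population ~1.4 million,
# land area ~360 square km.

base_pop = 1.4e6        # assumed 2005 population
growth_10yr = 0.44      # 44% increase over 10 years, as quoted

# Implied compound annual growth rate
annual_rate = (1 + growth_10yr) ** (1 / 10) - 1
print(f"implied annual growth: {annual_rate:.1%}")   # ~3.7% per year

# Projected population and density after 10 years
pop_2015 = base_pop * (1 + growth_10yr)
density = pop_2015 / 360
print(f"projected population: {pop_2015:,.0f}")      # just over 2 million
print(f"density: {density:,.0f} per square km")      # near the quoted 5,700
```

A steady 3.7% a year compounding to 44% in a decade is the kind of growth rate that doubles a population in under 20 years, which is the point of the “demographic bomb” image.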
Today, I am concerned that the United States, having squandered most of its political capital in the Middle East, will leave Israel more exposed to attack. I thought the Hezbollah attacks on Israel in 2006 were insane, unrealistic, poorly planned and totally counter-productive; but they happened anyway. Israel will not be seriously endangered by such tactics any time soon.
No, my real concerns are long term; 10, 20, 50 years from now, when the USA may be hampered by oil shortages and domestic turmoil, and politically estranged from its allies far away. What happens when Egypt’s swing to the right puts an anti-democratic caliphate in place? When Saudi Arabia, its crude oil pipeline to the USA drying up, no longer needs to curry favour in Washington DC? When Lebanon becomes an extension of Syria, and Palestinian extremism distracts Jordan?
The fate of a small nation, isolated among enemies, without powerful allies, is a game that has been played out many times before, on paper, in computer simulation, and on the cold ground. The resolute Allies saw to Germany in World War II, but a more apt example is the Roman destruction of Judea in 66-73 CE: the impersonal, crushing response to a Jewish rebellion over religion.
I don’t know what the answer is; but if I were in charge of Israel’s long-term defence, I would be looking at every option, and a strategic withdrawal of the Jewish people from the region would be such an option. Then again, I am not one to invest a piece of ground with holy provenance; I would be left with mere history, and “I was here first” is no defence against an enemy who is equally tied to the same ground, for equally religious (i.e. irrational) reasons. An enemy who, by sheer birthrate and irrational blindness to consequences, has much to gain from Israel’s removal. I don’t like it – but that is no shield against reality, when it arrives.
If, like me, you have an interest in demographics and the state of the world, Google has just the tool for you: The Gapminder. Basically, it plots demographic data on a chart that is animated to let you plot changes over time.
For a sample of what makes this an engrossing tool, try the following:
- select Population on the x-axis and Life Expectancy on the y-axis;
- hit Play to animate the chart over the period 1960-2004;
- watch what happens during the early 1990s: a little dot plummets to the bottom of the chart, then pops back up again;
- what country is that? Scroll the chart till the dot reaches the bottom, and select it;
- the country is Rwanda; the stats for the point you select are shown on the axes;
- play the chart again: Rwanda’s basic demographics are plotted as a line that bucks the expected upward trend;
- not only does Life Expectancy plummet to just 24 in 1992, but between 1990 and 1995 the population drops from around 7 million to under 5½ million.
The dip in Rwanda’s population is, of course, the Rwandan Genocide; that is now part of history, but Zimbabwe’s Life Expectancy has been in the news. Mugabe’s repressive regime puts the Leader and his Ideology over all other concerns, including the basic health of Zimbabwe’s people. Sure enough, selecting Zimbabwe on the map lets you follow the country down, to a Life Expectancy of just 34 in 2004.
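The kind of anomaly the animated chart makes visible can also be picked out programmatically: just scan a time series for its sharpest year-on-year fall. The sketch below uses only the rough values quoted in this post, padded out with purely illustrative in-between numbers; it is not real UN data.

```python
# Find the sharpest year-on-year drop in a time series -- the numeric
# equivalent of watching a Gapminder dot plummet off the chart.

def sharpest_drop(series):
    """Return (year, fractional drop) for the largest year-on-year fall."""
    years = sorted(series)
    return max(
        zip(years[1:], (1 - series[b] / series[a]
                        for a, b in zip(years, years[1:]))),
        key=lambda pair: pair[1],
    )

# Rwanda's life expectancy: the 1992 figure of 24 is from this post;
# the surrounding values are illustrative guesses, not real data.
life_expectancy = {1985: 48, 1990: 45, 1992: 24, 1995: 32, 2000: 43}

year, drop = sharpest_drop(life_expectancy)
print(year, f"{drop:.0%}")   # 1992, a 47% collapse
```

The same scan over the population series (7 million down to under 5½ million) flags the same window, which is exactly what the animation shows as a dot falling and bouncing back.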
There are more stats in there now, and surely more to follow. I ought to dig up some positive stats too, just to stop myself getting too fatalistic, though positive stats are going to be hard to find in there. OK, Ireland now has the highest per capita earnings of any country in the world – but do I see any of that bounty?
This morning features the opening of the new Scottish Parliament building in Edinburgh: a remarkable building, beautifully designed, but a project marred by massive cost overruns. Queen Elizabeth II is attending amid some controversy: some wish she wasn’t there at all, while others expressed dismay over the II in her title. Queen Elizabeth I of England was, after all, never recognized in Scotland, which had its own Stewart monarchs: James V died in 1542, when his daughter Mary was just six days old. The story of Mary, Queen of Scots, is one I’m not all that familiar with, but the short version is like something out of a soap opera.
A Catholic, Mary was first married to the future King Francis II of France, but he died not long after taking the throne, so Mary then married her Catholic cousin, Henry Stewart, Lord Darnley. Mary may have had an affair with one of her Italian advisors, David Rizzio, and he didn’t last long once Darnley found out: Rizzio was murdered. In 1567, Darnley tried to take over the line of succession for his heirs; soon afterwards his house was blown up by Protestant saboteurs and he was found strangled. The main Protestant conspirator was James Hepburn, 4th Earl of Bothwell, whom Mary married soon after he divorced his wife.
The outcome of all this drama was as serious as it could possibly be: Mary had offended and alienated anyone who might have supported her, and the Scottish nobility raised an army against her. Before 1567 was over, she had been defeated and forced to abdicate the Scottish throne in favour of her son, James VI. Mary sought refuge in the English court of Elizabeth I, but was effectively a prisoner for the next 20 years. Because Elizabeth was Protestant, many Catholics believed that Mary should be Queen of England but, after several abortive conspiracies, Elizabeth had had enough and signed Mary’s execution warrant in 1587.
In a suitably ironic coda, Mary’s son James VI succeeded to the throne of England as James I when “Virgin Queen” Elizabeth died in 1603. Despite his Catholic heritage, James’ bloodline could be traced directly back to King Henry VII, and he had already succeeded in quelling Catholic ambitions, and married the Protestant Princess Anne of Denmark, despite having been kidnapped by Protestant militants earlier in life. With Britain largely united, the sectarian conflict was mostly over, and the focus shifted outward, leading to war with Spain over their refusal to let the Spanish infanta marry James’ son, the later Charles I. I could go on, but that will do for one day.
Sir Sean Connery is in attendance, and has naturally managed to offend a few people already. Strangest sight of the morning: the Queen entering the building and being greeted by officials, while the brass band plays, of all things, Mark Knopfler’s “Going Home”, the theme from Local Hero. Um…?
When I wrote about Japanese a few weeks ago, I was worried about the fact that any Kanji can have two different readings in everyday use: on-yomi (Chinese Reading) and kun-yomi (Japanese Reading). (There is a third reading, nanori, which is used for names, which I will get to later!) A “reading” goes further than pronunciation: you have different spoken words for the same written Kanji. The choice of reading appears to depend on convention, history, even regional variation between east and west Japan. I don’t think I was overstating the seriousness of the situation for someone trying to learn to read written Japanese.
Take the following two Kanji as an example:
- kanji: 切
- meaning: to cut
- on-yomi: セツ setsu
- kun-yomi: き(り) ki(ri)
- kanji: 腹
- meaning: abdomen, stomach
- on-yomi: フク fuku
- kun-yomi: はら hara
Put the two kanji together in one order and the on-yomi readings are used, not the kun-yomi; reverse the order and it is the kun-yomi readings that apply:
- セツ + フク = セップク
- setsu + fuku = seppuku
- はら + きり = はらきり
- hara + kiri = hara-kiri
So, since the order of the Kanji does not affect the overall meaning, we have two different words for the same action, one based on on-yomi, the other on kun-yomi.
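The two compounds above can be modelled with a toy lookup table. The consonant doubling (sokuon) and the f-to-p hardening are real features of Japanese compound readings, but the rule sketched below is a deliberate simplification that handles only this one pair:

```python
# Toy model of the two readings above. Real Japanese sound changes
# (sokuon, rendaku, etc.) are far more complex; this covers one example.

readings = {
    "切": {"on": "setsu", "kun": "kiri"},   # to cut
    "腹": {"on": "fuku",  "kun": "hara"},   # abdomen, stomach
}

def on_compound(first, second):
    """Join two on-yomi: a final 'tsu' becomes a doubled consonant,
    and 'f' hardens to 'p' after the doubling (setsu + fuku -> seppuku)."""
    left, right = readings[first]["on"], readings[second]["on"]
    if left.endswith("tsu"):
        left = left[:-3]
        if right.startswith("f"):
            right = "p" + right[1:]      # fuku -> puku
        right = right[0] + right         # gemination: puku -> ppuku
    return left + right

def kun_compound(first, second):
    """Kun-yomi compounds here just concatenate."""
    return readings[first]["kun"] + readings[second]["kun"]

print(on_compound("切", "腹"))    # seppuku
print(kun_compound("腹", "切"))   # harakiri
```

The point the table makes is that which spoken word you get depends on both the order of the kanji and the reading family chosen, even though the written meaning is the same.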
You may have heard both of these at various times; they mean ritual disembowelment, usually suicidal. I found this example when I was preparing to watch Mishima: A Life in Four Chapters, a US-produced biography of Mishima Yukio, the writer, and wondered what the difference was. I’d heard musical references to his work in various places, such as the work of Sakamoto Ryuichi and David Sylvian, whose Forbidden Colours collaboration was named after one of Mishima’s novels.
I didn’t see anything obviously false in the film’s portrayal of his life. A homosexual in post-war Japan had it even worse than the Great American Queer. Paradoxically, Mishima became obsessed with the apparent decline in Japanese moral values over the next twenty-five years, using it as a tangential theme in much of his work.
In 1970, Mishima and a few followers apparently tried to stage a military coup, by dressing up in self-designed uniforms and attempting to take over a military base near Tokyo. After tying up the commanding officer, Mishima addressed the gathered troops (and Press), exhorting them to reject the modern commercial softness of Japan, and regain their Imperial heritage. They all laughed at him, so, after saluting the Emperor, Mishima went back inside and committed seppuku.
Now, that puts a whole new spin on Japanese culture, doesn’t it? I can see where Takeshi Kitano gets some of his ideas from, and then there’s the Manga, which I hope to read in the original Japanese one day.
The name Mishima Yukio was a 仮名 (kamei), a nom de plume or pen-name, chosen deliberately by the young Kimitake Hiraoka; so what does it mean, if anything? This is where the nanori comes in: the third type of Kanji reading, after on-yomi and kun-yomi. He actually has an entry in the JWPce Japanese Dictionary – 三島由紀夫 (1925 – 1970) – so let’s look at the individual Kanji:
- 三 = Three
- 島 = Island
- 由 = Reasons (why)
- 紀 = Chronicle, History
- 夫 = Man, Husband
I don’t mind saying I’m none the wiser: a reference to Japanese history, perhaps? Maybe I’ll get it later.
Sixty years ago, soldiers waited in the dark, on airfields, in barracks, aboard ships, with Normandy on their minds. The remembrance ceremonies for the 60th anniversary of D-Day yesterday included the dropping of a million poppies into the sea off the coast, and will continue later today on the beaches and at the Ranville military cemetery, with the current leaders of the Allied nations all present.
I was not around to witness it, of course; my father was only ten years old, and my paternal grandfather, as a baker in Glasgow, would have been considered exempt from service as an essential civilian worker. There was much pondering, today, on how to ensure that the next generations will be made to understand and remember the sacrifices made on and after that day, and what it meant. I don’t know: history will fade, and perhaps the specific details can be forgotten. As long as the essential lesson remains: whenever human freedom is truly threatened, as it was sixty years ago, we can not look away or stand aside.
The other main story this evening was the death of former US President Ronald Reagan. His vice-president and successor, George Bush Sr., spoke eloquently about his friendship with Reagan on camera, but did not seem unduly upset. Reagan had been ill with Alzheimer’s disease for a decade, so it hardly came as a surprise, but Bush had more important things on his mind (paraphrased):
“That it? OK? Now, turn off the cameras, we don’t need to be formal anymore… look, out on the lake, the bass are jumping!”
When I returned from South Africa in late 1991, the country was still an international pariah state, thanks to “apartheid”. I could compare the way the country was portrayed in the media from the inside with the view from the outside. While I learned a lot, I came to some conclusions that might surprise some.
In the English-speaking community, not only were we well in touch with the way South Africa was viewed, we didn’t think or act in a racist fashion, and we could see the inevitability of change. The “white” community in South Africa was never united in favour of apartheid; my last two years of school were in a large town, big enough for multiple schools, and mine was definitely British, both in language and culture. Even among Afrikaners, the divisions were clear, with a liberal Afrikaans press taking regular pot-shots at government policies, though it was rare to hear openly anti-apartheid rhetoric there.
The business community can take more credit for the end of “apartheid” than any number of politicians or protesters. Money talks, and business interests such as the Anglo-American Corporation – which I worked for, and in which I still hold a few shares – were publicly pushing for change. By 1992, President F.W. de Klerk had released all the political prisoners, notably Nelson Mandela, and pushed through the repeal of the laws that formed the legal basis of apartheid. By 1994, the ANC was in charge, after the first elections to include all citizens.
My work for Anglo-American (late 80’s) was for the Highveld Steel company and subsidiaries, and the workforce was divided; but it was more a class division than a racial one, at that time. We had three obvious working classes – management, skilled, and unskilled – which you could take as a reasonable model of the country’s population. The first two were mostly, but not exclusively, white, but improvements in education meant that my fellow apprentices were of all colours, and management was starting to go that way too. Meritocracy was, in principle, the order of the day.
The unskilled majority were unionised, and walked out a few times, once with fairly serious consequences. The rest of us weren’t unionised and didn’t understand what the strike was about, so we were quite happy to keep things going, scoring overtime and bonus pay as a result. Over one memorable fortnight, just two of us ran a whole division of the factory at night, with a manager occasionally dropping in to check on us. It was mostly manual work, mainly controlling conveyors for loading of coal into coking furnaces, that produced carbon monoxide to fuel the steel furnaces. We didn’t find it too difficult, and actually got in a few hours sleep in the middle of the shift – but it took about 20 workers under normal conditions.
The experience brought home an essential point about the role of the company as an employer: these unskilled workers needed the work, and far more of them were employed than were actually needed. The company played a larger social role, offering adult education, health care, and other social benefits. It’s not surprising, looking at it from that angle, that their wages were low; yet their grievances were, if I recall correctly, related to pay and employment security.
This to me is the true legacy of apartheid, one that will take many more years to correct: a huge under-educated majority is not something that can be sustained in a modern economy, and the last decade has not seen sufficient improvement in education standards and availability – the major challenge facing South Africa today. Factories such as the one I worked in are huge concentrations of employment, so much so that many workers are migrants, far from home; they were once prevented by law from settling in “white” areas, and even though the legal restrictions are now gone, the economic problems barely make life any easier today. The fact that South Africa has not yet gone up in flames, due to economic unrest, is laudable, but the long honeymoon is almost over, with so much more still to do.
Back in London, however, I found an incredible wilful ignorance about the complexities of the South African situation, and the ties to British colonial history. (For example, the word “kaffir” is a racial pejorative, yet it comes from the Arabic “kafir”, meaning “unbeliever (of Islam)”. It was once benign, and was used in an official capacity by the British government long before it became insulting.) Instead, all I found was blind prejudice and soundbites, and an assumption that anyone who lived there, even a Brit like myself, had picked up racism and carried it with them like a virus. The opposite was true: I was not brought up as a racist, and didn’t feel I had to go to extraordinary lengths to fight apartheid visibly, or preach loudly against it. We just got on with life, did the right things, and that was enough.
Today brought the news that South Africa is to host the 2010 World Cup Soccer tournament: a signal that South Africa is now truly accepted in the international community, in a way they haven’t been for as long as SA has been a country at all. The year is significant and may have played a part in the decision, even though I didn’t hear it mentioned: 2010 will be the centenary of the Union of South Africa, the first time that all the major provinces came together as one country.
Perhaps, in a few years, I will feel ready to go back there for a visit, preferably once I have learned to drive. Like America, the cities alone are not the main attraction, and though I have seen the Rockies and the Alps, I will have no trouble recognizing the Drakensberg when I see them again. JRR Tolkien, who lived in the area as a child, hadn’t seen them for years before writing The Hobbit, yet the “Dragon Mountains” are a clear inspiration for his descriptions of landscapes.