Archive for the ‘history’ Category
An informative Project Syndicate article, here, delves into the reasons behind the current surge in interest in Scottish Independence, emphasising the independent institutions already in place. Its conclusion is surprising, but well thought-out: Scotland is in an unusual situation for a country seeking independence, with general approval of the status quo, despite the surge in voting for the Scottish National Party.
Perhaps unsurprisingly for an American publication, the article does not look into the reasons behind this SNP surge: general dissatisfaction with the ruling Labour Party in Westminster, with the Conservative Party seen as English and foreign. After the elections earlier this year, the SNP is the largest party in the Scottish Parliament, and is set to lead a majority coalition, with negotiations in progress at the time of writing. Its stated aim, after that, is to open negotiations to repeal the Acts of Union (1706-7) and hold a referendum on independence.
Following independence, what is the next step for Scotland? At the risk of stating the obvious, we can look forward to closer ties with Europe including adoption of the single currency, and an increase in trade with other European neighbours. Having seen the generally positive results of this in Ireland, I have no problem with any of that, though the article does pour cold water on any hopes for the level of Brussels largesse that Ireland has enjoyed.
Nevertheless, Scottish Independence is now firmly on the political agenda, an exemplary continuation of the Scots realpolitik we read about in the history books. In the eighteenth and nineteenth centuries, a Scotland that had made its peace with an imperial England was pioneering Enlightenment thinking and the Industrial Revolution. Instead of kicking England out of Scotland, they took Edinburgh to Westminster, to the extent that Scots have led the ruling Labour Party for the last 15 years. When John Smith died in 1994, the reins passed to Tony Blair (born in Edinburgh); Blair became Prime Minister in 1997, and is soon to retire in favour of Gordon Brown (born in Glasgow). Having a Caledonian in Number 10 will surely weigh heavily on the Scottish Independence process.
With the hard work done and less remaining to fight over, with an awareness of British Imperial history, and an emphasis on political structures and cultural identity, the kind of “struggle” that went on in Ireland looks like a colossal waste, counter-productive and unnecessary in the Scottish context. My nation of skinflints knows that there are cheaper ways of getting the job done!
Since the middle of last week I’ve been feeling a lot better, and as the previous post hinted, the belated arrival of Spring in Dublin is also serving to lighten the gloom. So, let me take a little time to put down a controversial idea I’ve had for some time, but which I need to express carefully.
In my view of the world, one where religion and other beliefs are no justification for anything that harms anyone else, Israel is a major destabilizing force in the Middle East today. It is held together by the sheer will of a vigilant Israeli people, who have resisted onslaughts from all sides – military, political and economic – with the material support of the United States in particular. It is a country in which most young people – men and women alike – serve in the military, actively and in reserve.
After centuries in the wilderness of Diaspora, the state of Israel was founded in 1948, and since then has been the focal point of Islamic aggression. America’s support for Israel is an oft-given reason for the rise of Al-Qaeda terrorism. I have no patience for Islamic theocratic imperialism, the Allah-given drive to subjugate the world under the Mullahs. Though I am not keen on Nationalism in any form, I fully support the rights of the Israeli people, as any people, to self-determination, independence, and a homeland they can call their own.
But why, oh why, did they have to put the homeland there?
The answer is, of course, religion. One of the leaders of the Zionist movement in the United Kingdom, Dr. Chaim Weizmann, was a chemist whose process for the mass production of acetone made a huge difference to British arms production in World War I: acetone was a key solvent in the manufacture of cordite, the smokeless propellant. It gave Weizmann friends in high places, and direct influence over David Lloyd George (Minister of Munitions 1915-16, then Prime Minister from 1916) and Lord Balfour (former Prime Minister, and Foreign Secretary 1916-19).
The Balfour Declaration of 1917, produced after a decade of urging by Weizmann, expressed Britain’s support for a “National Home” for the Jewish people in what was then called Palestine. As reported in Lord Balfour’s biography (quoted in the Wikipedia article), Balfour had actually asked Weizmann, back in 1906, “why there?” His reply cited the historic connection of the Jewish people to the region, and he also said “anything less would be idolatry”. A curious turn of phrase: “idolatry”, as in “false worship”? As in Islam, this reverence for a mere piece of land explains much.
The wording of the Declaration is cautious, even conservative, insisting that no harm be done to the existing non-Jewish people of the region. The idea of a sovereign state was played down at the time. Palestine was a British Mandate from 1920 to 1948, but Britain gradually lost control as its tacit approval of a Jewish state led to mass immigration. In the immediate aftermath of World War II, the fallout from the Holocaust, the further migration of Holocaust survivors into the region, and attacks by Jewish paramilitary groups on British forces led Britain to call in the newly-formed United Nations to manage its abdication of control over Palestine.
The 1947 UN Partition Plan map is, to be blunt, a mess: a compromise that tried to please everyone, and ended up pleasing no-one. The wars of the next 30 years were the obvious instances of trouble, but a different kind of bomb is ticking in Gaza: a demographic bomb. The Gaza Strip has a very high birth rate, and extrapolation of the 2005 UNESCO figures predicts a 44% population increase in 10 years, to over 2 million, with a population density approaching Hong Kong’s (5,700 per km²).
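Those numbers are easy to sanity-check. Here is a minimal back-of-envelope sketch in Python; the 2005 baseline of roughly 1.4 million people and the land area of roughly 360 km² are my own round-figure assumptions, not taken from the article:

```python
def annual_growth_rate(total_growth: float, years: int) -> float:
    """Convert a cumulative growth factor into the equivalent annual rate."""
    return total_growth ** (1 / years) - 1

# Assumed round figures: ~1.4 million people in 2005, ~360 km² of land.
pop_2005 = 1.4e6
area_km2 = 360

pop_2015 = pop_2005 * 1.44           # the quoted 44% increase over 10 years
rate = annual_growth_rate(1.44, 10)  # implied annual growth rate

print(f"2015 population: {pop_2015:,.0f}")           # 2,016,000
print(f"annual growth rate: {rate:.1%}")             # 3.7%
print(f"density: {pop_2015 / area_km2:,.0f} per km²")  # 5,600
```

On those assumptions the article’s figures hang together: a 44% rise over a decade implies about 3.7% annual growth, and a population just over 2 million gives a density of about 5,600 per km² – close to the Hong Kong comparison quoted above.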
Today, I am concerned that the United States, having squandered most of its political capital in the Middle East, will leave Israel more exposed to attack. I thought the Hezbollah attacks on Israel in 2006 were insane, unrealistic, poorly planned and totally counter-productive; but they happened anyway. Israel will not be seriously endangered by such tactics any time soon.
No, my real concerns are long term; 10, 20, 50 years from now, when the USA may be hampered by oil shortages and domestic turmoil, and politically estranged from its allies far away. What happens when Egypt’s swing to the right puts an anti-democratic caliphate in place? When Saudi Arabia, its crude oil pipeline to the USA drying up, no longer needs to curry favour in Washington DC? When Lebanon becomes an extension of Syria, and Palestinian extremism distracts Jordan?
The fate of a small nation, isolated among enemies, without powerful allies, is a game that has been played out many times before: on paper, in computer simulation, and on the cold ground. The resolute Allies saw to Germany in World War II, but a more apt example is the Roman destruction of Judea in 66-73 CE: the impersonal, crushing response to a Jewish rebellion over religion.
I don’t know what the answer is; but if I were in charge of Israel’s long-term defence, I would be looking at every option, and a strategic withdrawal of the Jewish people from the region would be one of them. Then again, I am not one to invest a piece of ground with holy provenance; I would be left with mere history, and “I was here first” is no defence against an enemy who is equally tied to the same ground, for equally religious (i.e. irrational) reasons. An enemy who, by sheer birthrate and irrational blindness to consequences, has much to gain from Israel’s removal. I don’t like it – but that is no shield against reality, when it arrives.
If, like me, you have an interest in demographics and the state of the world, Google has just the tool for you: Gapminder. Basically, it plots demographic data on an animated chart that lets you watch changes over time.
For a sample of what makes this an engrossing tool, try the following:
- select Population on the x-axis, and Life Expectancy on the y-axis;
- hit Play to animate the chart over the period 1960-2004;
- watch what happens during the early 1990s: a little dot plummets to the bottom of the chart, then pops back up again;
- what country is that? Scroll the chart till the dot reaches the bottom, and select it;
- the country is Rwanda; the stats for the point you select are shown on the axes;
- play the chart again: Rwanda’s basic demographics are plotted as a line that bucks the expected upward trend;
- not only does the Life Expectancy plummet to just 24 in 1992, but between 1990 and 1995 the population drops from around 7 million to under 5½ million.
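For the record, the size of that drop is simple to work out from the quoted figures; a quick sketch, using the rounded numbers above rather than exact UN data:

```python
def percent_change(before: float, after: float) -> float:
    """Percentage change from `before` to `after` (negative = decline)."""
    return (after - before) / before * 100

# Rounded figures read off the chart: ~7 million in 1990, ~5.5 million in 1995.
drop = percent_change(7.0e6, 5.5e6)
print(f"population change 1990-1995: {drop:.0f}%")  # -21%
```

A loss of more than a fifth of the population in five years, visible as one falling dot on a chart.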
The dip in Rwanda’s population is, of course, the Rwandan Genocide; that is now part of history, but Zimbabwe’s Life Expectancy has been in the news. Mugabe’s repressive regime puts the Leader and his Ideology over all other concerns, including the basic health of Zimbabwe’s people. Sure enough, selecting Zimbabwe on the map lets you follow the country down, to a Life Expectancy of just 34 in 2004.
There are more stats in there now, and surely more to follow. I ought to dig up some positive stats too, just to stop me getting too fatalistic, but positive stats are going to be hard to find in there. OK, Ireland now has some of the highest per capita earnings in the world – but do I see any of that bounty?
This morning features the opening of the new Scottish Parliament building in Edinburgh: a remarkable building, beautifully designed, but a project marred by massive cost overruns. Queen Elizabeth II is attending amid some controversy: some wish she weren’t there at all, while others expressed dismay over the II in her title. Queen Elizabeth I of England was, after all, never recognized in Scotland: throughout Elizabeth’s reign, Scotland had its own monarchs, beginning with Mary, who came to the throne in 1542 when her father James V died and she was just six days old. The story of Mary, Queen of Scots, is one I’m not all that familiar with, but the short version is like something out of a soap opera.
A Catholic, Mary was first married to the future King Francis II of France, but he died not long after taking the throne, so Mary then married her Catholic cousin, Henry Stewart, Lord Darnley. Mary may have had an affair with one of her Italian advisers, David Rizzio, and he didn’t last long once Darnley found out: Rizzio was murdered. In 1567, after Darnley had tried to secure the line of succession for his own heirs, it was his turn: his house was blown up and he was found strangled. The chief suspect was the Protestant James Hepburn, 4th Earl of Bothwell, whom Mary soon married after he divorced his wife.
The outcome of all this drama was as serious as it could possibly be: Mary had offended and alienated anyone who might have supported her, and the Scottish nobility raised an army against her. Before 1567 was over, she had been defeated and forced to abdicate the Scottish throne in favour of her son, James VI. Mary sought refuge in the English court of Elizabeth I, but was effectively a prisoner for the next 20 years. Because Elizabeth was Protestant, many Catholics believed that Mary should be Queen of England but, after several abortive conspiracies, Elizabeth had had enough and signed Mary’s execution warrant in 1587.
In a suitably ironic coda, Mary’s son James VI succeeded to the throne of England as James I when the “Virgin Queen” Elizabeth died in 1603. Despite his Catholic heritage, James’ bloodline could be traced directly back to King Henry VII; he had already succeeded in quelling Catholic ambitions, and had married the Protestant Princess Anne of Denmark, despite having been kidnapped by Protestant militants earlier in life. With Britain largely united, the sectarian conflict was mostly over, and the focus shifted outward, leading to war with Spain over their refusal to let the Spanish Infanta marry James’ son, the future Charles I. I could go on, but that will do for one day.
Sir Sean Connery is in attendance, and has naturally managed to offend a few people already. Strangest sight of the morning: the Queen entering the building and being greeted by officials, while the brass band plays, of all things, Mark Knopfler’s “Going Home”, the theme from Local Hero. Um…?
When I wrote about Japanese a few weeks ago, I was worried about the fact that any Kanji can have two different readings in everyday use: on-yomi (Chinese reading) and kun-yomi (Japanese reading). (There is a third reading, nanori, used for names, which I will get to later!) A “reading” goes further than pronunciation: you have different spoken words for the same written Kanji. Which reading applies seems to vary according to convention, history, even regional differences between east and west Japan. I don’t think I was overstating the seriousness of the situation for someone trying to learn to read written Japanese.
Take the following two Kanji as an example:
- 切, meaning: to cut
- on-yomi: セツ setsu
- kun-yomi: き(り) ki(ri)
- 腹, meaning: abdomen, stomach
- on-yomi: フク fuku
- kun-yomi: はら hara
Put the two Kanji together in one order and the on-yomi readings are used, not the kun-yomi; reverse the order, and vice versa:
- セツ + フク = セップク
- setsu + fuku = seppuku
- はら + きり = はらきり
- hara + kiri = hara-kiri
So, since the order of the Kanji does not affect the overall meaning, we have two different words for the same action, one based on the on-yomi, the other on the kun-yomi.
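The pattern lends itself to a toy lookup table. The sketch below uses the two Kanji in question, 切 (cut) and 腹 (belly); the single sound-change rule (the doubled consonant that turns setsu + fuku into seppuku) is a deliberate over-simplification of Japanese phonology, just enough to cover this one example:

```python
# Toy table: each Kanji maps to its romanized on-yomi and kun-yomi.
READINGS = {
    "切": {"on": "setsu", "kun": "kiri"},
    "腹": {"on": "fuku", "kun": "hara"},
}

def on_compound(first: str, second: str) -> str:
    """Join two on-yomi readings, applying one simplified sound change:
    a final 'tsu' before f- collapses into a doubled 'pp' (sokuon)."""
    a, b = READINGS[first]["on"], READINGS[second]["on"]
    if a.endswith("tsu") and b.startswith("f"):
        return a[:-3] + "pp" + b[1:]
    return a + b

def kun_compound(first: str, second: str) -> str:
    """Join two kun-yomi readings with no sound change."""
    return READINGS[first]["kun"] + READINGS[second]["kun"]

print(on_compound("切", "腹"))   # seppuku  (切腹)
print(kun_compound("腹", "切"))  # harakiri (腹切り)
```

Real compound readings involve many more sound changes (rendaku, sokuon and so on), so this table is illustrative only.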
You may have heard both of these at various times: they mean ritual disembowelment, usually suicidal. I found this example when I was preparing to watch Mishima: A Life in Four Chapters, a US-produced biography of Mishima Yukio, the writer, and wondered what the difference was. I’d heard musical references to his work in various places, such as the work of Sakamoto Ryuichi and David Sylvian, whose collaboration Forbidden Colours was named after one of Mishima’s novels.
I didn’t see anything obviously false in the film’s portrayal of his life. A homosexual in post-war Japan had it even worse than the Great American Queer. Paradoxically, Mishima became obsessed with the apparent decline in Japanese moral values over the next twenty-five years, using it as a tangential theme in much of his work.
In 1970, Mishima and a few followers apparently tried to stage a military coup, by dressing up in self-designed uniforms and attempting to take over a military base near Tokyo. After tying up the commanding officer, Mishima addressed the gathered troops (and Press), exhorting them to reject the modern commercial softness of Japan, and regain their Imperial heritage. They all laughed at him, so, after saluting the Emperor, Mishima went back inside and committed seppuku.
Now, that puts a whole new spin on Japanese culture, doesn’t it? I can see where Takeshi Kitano gets some of his ideas from, and then there’s the Manga, which I hope to read in the original Japanese one day.
The name Mishima Yukio was a 仮名 (kamei), a nom de plume or pen-name, chosen deliberately by the young Kimitake Hiraoka; so what does it mean, if anything? This is where nanori comes in: the third type of Kanji reading, after on-yomi and kun-yomi. He actually has an entry in the JWPce Japanese dictionary – 三島由紀夫 (1925 – 1970) – so let’s look at the individual Kanji:
- 三 = Three
- 島 = Island
- 由 = Reasons (why)
- 紀 = Chronicle, History
- 夫 = Man, Husband
I don’t mind saying I’m none the wiser: a reference to Japanese history, perhaps? Maybe I’ll get it later.
Sixty years ago, soldiers waited in the dark, on airfields, in barracks, aboard ships, with Normandy on their minds. The remembrance ceremonies for the 60th anniversary of D-Day yesterday included the dropping of a million poppies into the sea off the coast, and will continue later today on the beaches and at the Ranville military cemetery, with the current leaders of the Allied nations all present.
I was not around to witness it, of course; my father was only ten years old, and my paternal grandfather, a baker in Glasgow, would have been exempt from service as an essential civilian worker. There was much pondering, today, on how to ensure that the next generations will be made to understand and remember the sacrifices made on and after that day, and what they meant. I don’t know: history will fade, and perhaps the specific details can be forgotten, as long as the essential lesson remains: whenever human freedom is truly threatened, as it was sixty years ago, we cannot look away or stand aside.
The other main story this evening was the death of former US President Ronald Reagan. His vice-president and successor, George Bush Sr., spoke eloquently about his friendship with Reagan on camera, but did not seem unduly upset. Reagan had been ill with Alzheimer’s disease for a decade, so it hardly came as a surprise, but Bush had more important things on his mind (paraphrased):
“That it? OK? Now, turn off the cameras, we don’t need to be formal anymore… look, out on the lake, the bass are jumping!”
When I returned from South Africa in late 1991, the country was still an international pariah state, thanks to “apartheid”. I could compare the way the country was portrayed in the media, first from the inside, then from the outside. While I learned a lot, I came to some conclusions that might surprise some people.
In the English-speaking community, not only were we well in touch with the way South Africa was viewed, we didn’t think or act in a racist fashion, and we could see the inevitability of change. The “white” community in South Africa was never united in favour of apartheid; my last two years of school were in a large town, big enough for multiple schools, and mine was definitely British, both in language and culture. Even among Afrikaners, the divisions were clear, with a liberal Afrikaans press taking regular pot-shots at government policies, though it was rare to hear openly anti-apartheid rhetoric there.
The business community can take more credit for the end of “apartheid” than any number of politicians or protesters. Money talks, and business interests such as the Anglo-American Corporation – which I worked for, and still hold a few shares in – were publicly pushing for change. By 1992, President F.W. de Klerk had released the political prisoners, most notably Nelson Mandela, and pushed through the repeal of the laws that formed the legal basis of apartheid. By 1994, the ANC was in charge, after the first elections to include all citizens.
My work for Anglo-American (late ’80s) was for the Highveld Steel company and its subsidiaries, and the workforce was divided; but it was more a class division than a racial one, at that time. We had three obvious working classes – management, skilled, and unskilled – which you could take as a reasonable model of the country’s population. The first two were mostly, but not exclusively, white, but improvements in education meant that my fellow apprentices were of all colours, and management was starting to go that way too. Meritocracy was, in principle, the order of the day.
The unskilled majority were unionised, and walked out a few times, once with fairly serious consequences. The rest of us weren’t unionised and didn’t understand what the strike was about, so we were quite happy to keep things going, scoring overtime and bonus pay as a result. Over one memorable fortnight, just two of us ran a whole division of the factory at night, with a manager occasionally dropping in to check on us. It was mostly manual work, mainly controlling the conveyors that loaded coal into the coking furnaces, which produced carbon monoxide to fuel the steel furnaces. We didn’t find it too difficult, and even got in a few hours’ sleep in the middle of the shift – but it took about 20 workers under normal conditions.
The experience brought home an essential point about the role of the company as an employer: these unskilled workers needed the work, and far more of them were employed than were actually needed. The company played a larger social role, offering adult education, health care, and other social benefits. It’s not surprising, looking at it from that angle, that their wages were low; yet their grievances were, if I recall correctly, related to pay and employment security.
This to me is the true legacy of apartheid, one that will take many more years to correct: a huge under-educated majority is not something that can be sustained in a modern economy, and the last decade has not seen sufficient improvement in education standards and availability – the major challenge facing South Africa today. Factories such as the one I worked in are huge concentrations of employment, so much so that many workers are migrants, far from home; they were once prevented by law from settling in “white” areas, and even though the legal restrictions are now gone, the economic problems barely make life any easier today. The fact that South Africa has not yet gone up in flames, due to economic unrest, is laudable, but the long honeymoon is almost over, with so much more still to do.
Back in London, however, I found an incredible wilful ignorance about the complexities of the South African situation, and the ties to British colonial history. (For example, the word “kaffir” is a racial pejorative, yet it comes from the Arabic “kafir”, meaning “unbeliever (in Islam)”. It was once benign, and was used in an official capacity by the British government long before it became insulting.) Instead, all I found was blind prejudice and soundbites, and an assumption that anyone who had lived there, even a Brit like myself, had picked up racism and carried it with them like a virus. The opposite was true: I was not brought up as a racist, and didn’t feel I had to go to extraordinary lengths to fight apartheid visibly, or preach loudly against it. We just got on with life, did the right things, and that was enough.
Today brought the news that South Africa is to host the 2010 World Cup: a signal that South Africa is now truly accepted into the international community, in a way it has never been in its entire history as a country. The year is significant, and may have played a part in the decision, even though I didn’t hear it mentioned: 2010 will be the centenary of the Union of South Africa, the first time that all the major provinces came together as one country.
Perhaps, in a few years, I will feel ready to go back there for a visit, preferably once I have learned to drive. As with America, the cities alone are not the main attraction, and though I have seen the Rockies and the Alps, I will have no trouble recognizing the Drakensberg when I see them again. J.R.R. Tolkien, who lived in South Africa as a child, hadn’t seen them for years before writing The Hobbit, yet the “Dragon Mountains” are a clear inspiration for his descriptions of landscapes.
Well, that was the week that was. Not very productive, as it turned out, but I think I’m fairly chilled out by now, which makes it worth it.

Today I finally succumbed, and have spent half the day watching The West Wing from Series 1, Episode 1 on. It is seductive in a particular way, portraying a world in which real people do real things, but also act in ways not normally seen in this world: they say what they mean, do what they say they will, follow through on commitments and actually think about what they’re doing. Pure indulgent fantasy, in other words.
The ideas I wrote about eight days ago are still setting sparks off in my head, and there are more where they came from. I have a theory I want to expand on later, but I’ll put it down here for future reference. It involves the basic principle behind technology in all its forms: starting with how the harnessing of energy led to the use of force to reshape the world. Our modern world runs on one major principle: the targeted and controlled release of energy, whether in the internal combustion engine, or just by flicking a switch to let electricity flow through a light bulb. Progress has also put force in the hands of almost anyone who might want to use it against anyone else, in the form of weapons such as guns and explosives.
My point is that I see no way to get off this slippery slope of technological progress: even without ethnic, religious and cultural differences, there would still be population pressures to cause friction, with any one person now able to cause a frightening amount of death and destruction.
There is a surplus of young men in the world, and what are they all good for? We don’t need the muscle; technology has repackaged the energy they used to provide into machines. Only the best are needed for breeding purposes, the rest, like me, are surplus to requirements. Taking a hard view, that leaves one outlet for all this frustration: War. It worked for all the ancient cultures, after all, and the basic principle hasn’t changed.
What a great thought to end a holiday with, eh? Back to work.
Just out from a combined shave/shower/scrub-the-bathroom session, with a few interesting ideas to write about. These are the kinds of ideas that might escape me if I read them in a book – which I probably have, at some point – but they seem to have popped up of their own accord now, and are thus more likely to stick around.

Psychologists and counsellors have been known to advise people not to place too much of their self-esteem on, or expect validation from, other people. What is bugging me, as I turn onto final, cleared for another birthday circuit-and-bump, are the links between self-esteem, age, work, and happiness.
The things you do when you are young are often highly dependent on others: celebrity, social gatherings, even work. It’s not surprising to me that celebrity is a young person’s game, from the point of view of the celebrities or the public who follow them in the newspapers.
A broad generalization: as people age, their interests change and deepen. Where a driver once focused on a car’s looks and speed, later he may learn to appreciate the engine as more than just a source of power: witness the hot-rods and choppers made in the USA, where the engine is shined up and lovingly tuned. A young woman becomes a mother, investing love and attention in a single infant, where she once followed the celebrity scene. Even an interest in other people can drift from the external – their hair, their bodies, what they’re wearing – to fundamental animal behaviours (sex and violence), relationships, life and death. (Soap operas have all of the above, and appeal to all age groups.)
So far, so obvious: but I have a very personal angle on this, as I consider a career change. It makes sense, as I get older, to look beyond other people for interests and validation, to try to find out what I fundamentally enjoy and want to do. A key point is sustainability: when you become good at something, your reward is … to keep on doing it, again and again. It follows, therefore, that it’s a good idea to become good at something you enjoy. Obvious, again.
Looking at it the way I have above, it’s clear that I can rule certain things out.
- music: I’ll keep on doing it for fun, but I’m not good enough to put together a sustainable career. There is no chance of pop success: I can’t say I want that, since it would be too fickle (unless you’re Status Quo) and I see many bands stuck in endless touring and regurgitation of their previous work, struggling to recapture the spirit that got them there in the first place (e.g. Yes). In the professional audio fields, the doors are now closed to all but the luckiest and most qualified.
- computers: I’ve grown to differentiate between the use of a computer as a means to an end, or as an end in itself. Even though most of the things I do involve computers – keeping this website, making music, communicating with others via the Internet – I have to realize that I don’t make a very good geek, by geek standards. The people who contribute to Linux, write application programs or program synthesizers in depth are the real geeks. Those are things I’ve been OK at, but never that good, and that was when I was younger.
What does that leave?
- management: I don’t like the idea, frankly, but I would be foolish to rule it out altogether. It’s a natural progression for aging office workers, qualified or not. There is zero chance of management advancement in my current job, since there are already too many managers and too many better-qualified colleagues waiting for an opening.
- teaching: this is something I didn’t consider at all until I noticed my growing pedantic streak, and the enjoyment I get from imparting knowledge when I have it (and sometimes when I don’t). “Those who can, do; those who can’t, teach” is the old cliché – but “doing” is not that black-and-white, and teaching something you’re good at is an option worth considering when you’re not quite good enough to lead your particular field.
It’s taken me till after midnight, on and off, to get that much thought in order, on a topic I’m sure I’ll be returning to this week.
Earlier this evening I went to an open audition for movie extras, for the forthcoming film King Arthur, chunks of which are being filmed in Ireland in the second half of this year. It wasn’t much of an audition – a few details, some measurements, and a photo. Still, they wanted people with long hair and beards; I haven’t had a haircut since last December, and as for the beard, well, I heard about the audition two weeks ago and stopped shaving. That’s all it takes for me to look like Cousin Itt. The beard’s coming off in a few minutes’ time: even if I get the part, filming only starts in June at the earliest, more likely August.
The film, apparently, is aiming for a realistic portrayal of King Arthur and his times; which means 5th or 6th century, Saxons, and the dregs of the Roman Empire. There will be no magic, but I suspect there will be some tribal aspects to the story, and apparently the Knights of the Round Table were Russian mercenaries. Currently, Clive Owen (Croupier, Gosford Park) is cast as King Arthur.
The director? Antoine Fuqua, who made the acclaimed Training Day a few years ago. Here’s his take on his new project:
“I don’t want to piss anybody off, but it’s going to be edgy! This is Arthur before Camelot. It’s before he became the king so it’s going to be a much more realistic version of what Arthur really was about, much more human. We’re going to have thousands and thousands of Saxon warriors and big battle scenes like “Braveheart” and “Gladiator”. I think choreography and horses and safety are going to be the biggest headache! It’s a fantastic project to do. It’s a dream to make a big epic film about a great legend”
I guess I must be on a mild Shakespeare bender at the moment, since I spent yesterday evening watching A Midsummer Night’s Dream. It took longer than it should have, because I was following the text at first. I soon gave that up, because some bits are shifted around and others left out. Some of the soliloquies are cut off at their peak, not allowed to wind down as per the text – but they’re the better for it.
Ebert and other critics have commented that the four “young lovers” in the play are almost interchangeable; so here the director has chosen distinct character actors for the roles, including Christian Bale (Empire Of The Sun, American Psycho) as Demetrius, and Calista Flockhart (Ally McBeal) as Helena, in an enjoyably over-the-top portrayal of a highly dramatic character. Not that it matters when they’re all wrestling in a pool of mud, after Puck (Stanley Tucci) confuses their emotions by drugging them.
Helena spends much of the time passionately chasing Demetrius around the forest on a bicycle, then clashes with Hermia (Anna Friel), who has eloped with her lover Lysander (Dominic West), to escape her father, who insists she marry Demetrius, the son of the local godfather Theseus (David Strathairn), who’s about to get married to Hippolyta (Sophie Marceau). Hermia naturally tries to encourage Helena to win Demetrius over, as a way of resolving the conflict:
Helena: O! teach me how you look, and with what art
You sway the motion of Demetrius’ heart.
Hermia: I frown upon him, yet he loves me still.
Helena: O! that your frowns would teach my smiles such skill.
Hermia: I give him curses, yet he gives me love.
Helena: O! that my prayers could such affection move.
Hermia: The more I hate, the more he follows me.
Helena: The more I love, the more he hateth me.
Hermia: His folly, Helena, is no fault of mine.
Helena: None, but your beauty: would that fault were mine!
I’m not going to describe the whole plot here – but the fact that I could is a compliment to the writers, who turned a slightly confusing play into a well-plotted film. As with Hamlet, Shakespeare uses a “play within a play” as a structural device, this time played as a farce, since Bottom (Kevin Kline) and his friends aren’t very good actors, frankly. Pity the poor actor playing “The Wall” on that stage.
This was my first experience of the play in any form – I’ve neither read it before nor watched it staged. I can see the fascination it holds for many, and how its ideas have been used elsewhere. I can imagine a gritty modern version, where the “fairy dust” spread around by Puck is replaced by Rohypnol…
It’s bad enough trying to figure out who to listen to when it comes to real-world events, but if you have an interest in art and culture, well, it gets worse. I’ve had related arguments about this problem with a colleague from work; he takes the extreme position that you cannot trust anything written by anybody, whether it’s culture, history, or current events. When we discussed some of the history of India, he nearly blew his top when I consulted the Encyclopaedia Britannica, because in his view even an encyclopaedia is written by people and reflects only one version of events.
I substantially agree with that position, but that doesn’t mean the encyclopaedia should be entirely dismissed. It was the one source available to me at the time, and adequate for a conversation, but he took that to mean that I go to the opposite extreme – gullibly accepting whatever I read and hear. The answer lies somewhere in between, and there are two methods I can use to get what I hope is a balanced view of a topic.
The first is the balancing of multiple sources. Some sources carry more weight than others, though none can be taken as absolute authorities. In the case of the Encyclopaedia Britannica, it helps to remember that the articles were written by individuals, and contain some strong opinions, but it’s not hard to see when that happens. The articles have been subjected to peer review and editing before they end up in print, so some of the work has been done for me. The Internet alone doesn’t carry much weight, and I’m loath to mention it at all unless it’s backed up by something more substantial. I mentioned it once, in a pub discussion on the Masons – just for fun – and my colleague nearly jumped down my throat. I thought the accuracy requirements were loose enough, at that point, to use a risky source, but I guess I was wrong.
Multiple news sources are difficult to find with regard to current events; in Iraq, we have the US communications centre, giving their own presentations and reports on what is happening, but they only say what they have to. Then we have the journalists: shocked when the war comes to them personally, awed by everything they see, and horrified when the military doesn’t toe the line and follow their journalistic or nationalistic agendas.
The second method is more useful when it comes to culture, politics, or any field where opinions are more important than facts. So much is written on such topics, and it’s impossible to keep up on it all, even if you wanted to. The trick is to leverage your own judgement and experience in detecting and compensating for the bias of a few particular sources. Allow me to use film as an example:
- The Internet Movie Database is a good source for the facts behind a movie – crew, release dates, etc. Its subscribers also weigh in with their opinions and ratings, and they use a well-documented formula to combine these ratings into a single overall score. The results are quite useful, especially when it comes to the great films; there is little disagreement over the position of Citizen Kane in their Top 250 list, though it does contain populist favourites such as The Shawshank Redemption and E.T.
- Roger Ebert of the Chicago Sun-Times is probably the most famous living film critic, and the most consulted, since the Sun-Times posts his reviews on their website at no charge. He watches and writes about movies on a full-time basis, six a week or more, and the experience shows, but so does the fatigue. I pretty much agree with his general assessment of movies I’ve seen, though he can pick up positive factors that I miss, and he can thus seem a little too lenient at times. His “Great Films” list, on the other hand, is where he gets highly selective and analytical, and he thoroughly justifies his selections.
- Metacritic is a useful service that I subscribe to on my PDA: it simply combines comments and ratings from multiple sources into a single score – a good way of getting a “quick fix” on a film. It has no review staff of its own, but its entries are based on the US box office, so the offline version isn’t much help with films released here in Ireland months later.
- For offline use, I have a copy of Halliwell’s Film & Video Guide. I find this book to be highly reactionary: not only do its reviewers totally miss the point of some films, giving them low ratings and missing their positive points, it also practises snobbery, by apparently discounting films based on popularity or box-office success.
Its index of “four star” films also exposes its “golden era” prejudices: the numbers follow a downward trend over the years, from a peak of eleven in 1940, to just one per year since 1997 and none in 2000 or 2002. Has film-making, as an art form, really deteriorated that much over the years? Where’s Kieslowski’s Three Colours, or Speed, Heat, or The Man Who Wasn’t There? The early forties may have produced Fantasia, Casablanca, The Maltese Falcon and Citizen Kane, but Halliwell’s four-star list for that period also includes some very dubious choices. This means that I take Halliwell’s advice with a large dose of salt and compensate accordingly.
Obvious, eh? It is, in a way, when I lay it out like this. As I’ve said before, one of the reasons I write this blog is for the sake of clarifying my position on a topic, for myself primarily, but also for anyone else who may be interested. I think it’s always useful to lay out even basic principles, if it helps maintain a consistent approach.
There are two ways to slide easily through life; to believe everything or to doubt everything. Both ways save us from thinking.
– Alfred Korzybski
Update, April 2009: This was obviously written before the rise of Wikipedia, and the “one-stop fact shop” culture it has spawned, but I haven’t changed my mind on these basic principles. I still prioritize questions by importance, and find that Wikipedia is a suitable resource for most general questions I encounter. If I’m not satisfied with what I find there, I can look elsewhere, perhaps starting with the links it provides. Yet I still encounter people who dismiss it entirely for cynical reasons, unable to judge when it’s appropriate or not.