Monday, November 16, 2015

Heinlein's Rules of Writing - the Amendment

A recent discussion on a writers' forum reminded me of Robert A. Heinlein's five rules for writing. These were put forward in his 1947 essay On the Writing of Speculative Fiction, and are still fiercely debated, some considering them the elixir of life, others a poisoned chalice.

Robert A. Heinlein (1907-1988, left) is considered one of the three masters of "Golden Age" SF, along with Isaac Asimov and Arthur C. Clarke. He wrote some of the great works of the genre — but do his rules hold up sixty-eight years after he wrote them?

1. You must write 


Well, this seems the most obvious of the rules, though I'd possibly phrase it more like If you don't write, you're not a writer. You choose what you do, of course, but a writer who doesn't write is an oxymoron.

What this rule ducks, though, is exactly what "writing" means. For a full-time author like Heinlein, it might mean an eight-hour day in front of the typewriter (or its modern equivalent), but most of us have to juggle many other calls on our time: work, family, Facebook… No, scratch that last one, it's not an excuse. Perhaps the rule should be Use whatever time you can possibly spare to write — even if that only means squeezing in ten minutes a day.

2. You must finish what you write


As with many of these rules, my reaction to this is "yes and no". Its value is as a counter to something I see in a lot of beginning authors: project-hopping. I was guilty of this myself in my teens, rarely finishing a project before I lost interest and went on to something else.

What it meant was that, until a little later, I didn't get enough practice at writing something complete — in particular, at learning how to end a story. It's certainly good discipline to keep going with your current story and leave that shiny new project on the back burner till you've finished.

That doesn't mean, though, that every verbal doodle you put down has to be seen through. Many of my stories come from a weekly exercise I take part in, writing for an hour to a prompt. Several of my published works started that way, including the two stories I published with Musa, The Treason of Memory and The Lone and Level Sands, but equally I've put down many of these pieces as "fun but no potential". If I felt I had to finish everything I started, I'd be inhibited from taking part in these exercises and miss out on some great ideas.

Perhaps this one should be Finish any project you commit to, unless you have very good reason to think it's not working.
 

3. You must refrain from rewriting, except to editorial order


This is perhaps the most controversial of Heinlein's rules. Taken literally, it's perhaps the worst advice since George Lucas was told everyone would love Jar Jar Binks. It's very rare for a first draft to be fit to send to any agent or editor, and even if you revise as you go (which I don't), chances are you'll still need to revise again in the light of the finished product.

Some people (including, again, me as a teenager) believe that revision spoils the originality and spontaneity of the initial concept. It's not true. Spontaneity, like comedy ad-libbing, takes an enormous amount of work to get right, while actual first drafts tend to be way off the mark.

Many people, though, interpret this rule differently, as advising against constant fiddling. That's far better advice. Getting hung up on making a story perfect is a recipe for never finishing. No story is ever perfect, and I'm not even sure it should be. If, like Oscar Wilde, your day's writing consists of inserting a comma in the morning and removing it in the afternoon, it's a sign the story's good enough to go out into the wide world.

Even so, the rule isn't perfect. Not all rejections are accompanied by rewrite suggestions, or any feedback at all, but occasionally time leads you to recognise one particular aspect of the story that keeps getting it rejected. If you have really good reason for believing a rewrite will fix that, go ahead. In any case, computers have made revision infinitely easier than when I was first pounding a manual typewriter — let alone when Heinlein was writing.

On a few occasions, I've had a response to a submission saying that the editor can't accept the story as it stands, but will reconsider it if I change X, Y or Z. Heinlein probably wouldn't consider that an "editorial order", but perhaps he had plenty of alternative markets lined up where he knew the editor. I assess requests like that on merit. They've ranged from tightening up the opening scene to changing the gender of the protagonist — it's perhaps not surprising that I agreed to the first (and the story was accepted afterwards) and refused the second, though not without considering it.

So perhaps I'll make that one When you've got your story good enough, leave it alone unless you have a concrete reason to change it.

4. You must put the work on the market


Ultimately, I write to communicate. Of course I write largely what I'd want to read, but if the stories were purely for myself, there'd be no point in actually writing them down. If I want anyone to read my work, I need to get it published, whether with a professional publisher or self-publishing.

So yes, if you finish a story, it's worth nothing unless you're trying to get it read. If you're making a living from your writing, that has to mean "putting it on the market", but few of us are in that league. In any case, there are more options than in 1947, and a story can be published, self-published, or even given away free in a calculated effort to generate interest.

So this one, perhaps, could be When you've finished a story, do everything you can to get it read as widely as possible.

5. You must keep the work on the market until it is sold


This seems to me an over-insistent rule that nevertheless contains a truth. It's very easy to get discouraged if a story is continually rejected, but all a rejection is saying — unless it's accompanied by feedback — is that that particular editor can't use it at that particular time. It doesn't necessarily mean the next editor won't snap it up.

We all know anecdotes about how such-and-such a bestseller was rejected X number of times before someone took a chance on it. I certainly haven't published a bestseller, or anywhere close, but I did have one story that was accepted on the eighteenth time of asking. And, having been rejected by a number of fairly small markets, it was eventually published by a professional-rate magazine. It was simply a good fit with them at that time.

On the other hand, there comes a point where further submissions seem merely cruel and unusual punishment of a deceased equine. That's nothing to do with the number of rejections a work has garnered. A story may be very specialist, for instance, with only a few markets you can reasonably submit it to, and when those are exhausted, there's nowhere else to go. Or the editor may give you damning feedback which, after the permitted tantrum, you have to admit is reasonable.

This leaves you with three options. You can say sod them all and self-publish. You can decide you'll probably never get a paid acceptance for that story, and stick it on your blog for free. Or you can invoke the final sanction and mark it as "retired" on your database.*

So we'll make this rule Don't give up trying to get a story published unless you're absolutely certain you're wasting your time.

So where does this leave us? I have no great hopes that the Blatchley Amendment to Heinlein's Rules for Writing is going to replace the original, but here it is.

1. Use whatever time you can possibly spare to write.
2. Finish any project you commit to, unless you have very good reason to think it's not working.
3. When you've got your story good enough, leave it alone unless you have a concrete reason to change it.
4. When you've finished a story, do everything you can to get it read as widely as possible.
5. Don't give up trying to get a story published unless you're absolutely certain you're wasting your time.

* Or "semi-retired" in my case. Never say never.

Image of Robert A. Heinlein: "Heinlein-face". Licensed under CC BY-SA 3.0 via Commons - https://commons.wikimedia.org/wiki/File:Heinlein-face.jpg#/media/File:Heinlein-face.jpg

Sunday, November 1, 2015

Stardust by Neil Gaiman - The Book and the Film

I saw the film of Stardust not long after it came out in 2007, and loved it. A grown-up (but not especially "adult") fairy story, it was exciting, magical, beautiful and funny. I was aware that it was based on a book by Neil Gaiman, but for some reason it's taken me till now to get around to reading it.

I often approach film adaptations of books with some dread, although that's by no means always justified, but it's a much rarer experience to do it backwards. I read Iain Banks's The Crow Road after seeing the TV serial (a well-adapted version). Much longer ago, as a young child, I recall reading Dodie Smith's The One Hundred and One Dalmatians after watching the Disney animated version and being highly confused about the differences.

Whether it's due to the order of exposure, or simply because it works, I found that the considerable changes made for the film version of Stardust didn't bother me. I suspect it's the latter reason. Books and films are very different media, and changes are sometimes necessary, as long as they have a good reason¹.

In any case, a bit of variation somehow works better in this case than most. The story of Stardust has so much the feeling of an old folk-tale that it's almost as if, rather than the film being based on the book, both are retellings of an older story, and have chosen to interpret its core events in slightly different ways.

The story tells how a young man, Tristran Thorn (Tristan in the film, but I'll refer to him throughout by his original name, to avoid confusion), lives in an ordinary 19th century English village that just happens to be a short stroll from the wall that divides the mundane world from the world of Faerie². Desperate to win the love of the village beauty Victoria, he vows to bring her back a star that's fallen far beyond the Wall.

Tristran isn't an ordinary young man, since his mother belonged to the magical realm, and he finds no shortage of friends there. However, when he reaches the place where the star has fallen (by instantaneous travel using a "Babylon candle"), he discovers something he hasn't expected. Far from being a lifeless lump of rock, this star proves to be a young woman called Yvaine.

Their return journey to the Wall is marked by encounters with a unicorn and a flying ship, among other things, but also by pursuit from a powerful witch and several ruthless princes, each of whom wants the magic of the star for their own ends.

The book and the film follow similar plots, but the film cuts out a good deal of the travelling (Tristran and Yvaine's journey back to the Wall is reduced from months to a week) and substitutes several more visual and dramatic sequences.

That's perfectly reasonable. The book is very much about the journey, a relatively calm affair (with a few notable exceptions) in which Tristran and Yvaine's enemies essentially neutralise one another, and it works beautifully that way. A film, on the other hand, needs focus, action and spectacle, and this is achieved by additions like a climactic action scene that has no parallel in the book. It's also achieved through Robert De Niro.

The book features a very brief sequence in which Tristran and Yvaine are rescued by a flying ship on a lightning-gathering voyage from the cloud where they happen to have become stranded. It's under the command of a fairly unremarkable character called Captain Alberich, who gives them passage for a while.

In the film, this has changed to a larger-than-life pirate vessel under the command of a transvestite pirate captain (De Niro) called Captain Shakespeare. Their time on board is now full of events, and the captain and crew end up playing a major part in the story.

This may have no sanction in the book, but it works spectacularly, with De Niro really hamming it up (in the best possible sense of the phrase). Similarly, the big fight near the end in the castle of the three witches, to save Yvaine from having her heart cut out, is a fitting climax to the film.

In general (apart from the flying ship) the various elements are explored more thoroughly in the book than the film. The central plot, of Tristran and Yvaine's relationship mutating from enmity to love, works in both, although it's perhaps a little more hurried in the film. Still, having to pack it in more tightly leads to some great one-liners from Yvaine, who's splendidly played by Claire Danes.

The book explores the village and explains its relationship with Faerie quite extensively, whereas it's glossed over in the film. We also learn a lot more in the book about the enemy princes, although the film versions still work well on their own terms. Coming from a kingdom where succession conventionally goes to the last prince standing, the seven brothers are down to three by the start of the story, with two eliminated along the way, leaving only the most ruthless to pursue Yvaine — though constantly accompanied by the ghosts of his dead brothers.

The three witches who seek the star's heart to restore their youth are more straightforwardly presented in the film, without the tantalising hints at their bizarre nature that we get in the book. Nevertheless, the chief of them (unnamed in the book, Lamia in the film) is played with huge gusto by Michelle Pfeiffer, both as stately beauty and old hag. The climactic scene, as Tristran tries desperately to save Yvaine before the witches cut out her heart, is very effective, even if it's a far cry from the witch's final, rather pathetic scene in the book.

So which is better, the book or the film? To be honest, I don't think I could answer that. If the film had been more of a straight adaptation of the book, it probably wouldn't live up to the original. As it is, what we have is a superb book and a superb film, telling different versions of the same tale. And that, I think, makes a perfect adaptation.


 1 That doesn't mean all changes are forgivable, of course. See my recent rant on the subject.

2 There's a very obvious parallel here with Lord Dunsany's classic novel The King of Elfland's Daughter, which is probably not accidental. Gaiman is well versed in the classics of fantasy, and the fact that he uses Dunsany's favourite phrase, "the fields we know", to refer to the mundane world is a bit of a giveaway.


Tuesday, September 29, 2015

Doctor Who - Review of the New Series Opener

Doctor Who is back on TV, with series 35 — or, as it's officially known, series 9¹. I've been a fan since the first episode, back in 1963. I still remember settling down as a nine-year-old to watch the new show with the exceedingly weird title sequence. It's changed almost beyond recognition in the nearly 52 years since then, yet managed at the same time to remain exactly the same. It's a rare trick.

The new series (whatever you call it) started with a two-part story, so I thought I'd wait till I'd seen both parts to give a reaction. But, before we start, let's get an extremely subtle warning out of the way:

HERE BE SPOILERS

So, if you haven't had a chance to watch it yet, bookmark this page and come back when you have.

Last year was all about establishing Peter Capaldi as the Twelfth Doctor², and, while the standard of the stories varied, from the excellent finale down to the highly questionable Kill the Moon, Capaldi was extremely impressive in his simultaneously familiar and different interpretation of the character.

This year, he starts as Doctor-in-residence and needs great stories to continue making his mark. The first episode, The Magician's Apprentice³, started with a bang: a young boy caught in a minefield in a long, dirty war is offered help by the Doctor, who's landed by accident and has no idea which planet he's on. When the child gives his name, the Doctor realises this is Davros, who'll go on to create the Daleks.

Thus, before the titles, we're plunged straight into the story's main theme, a continuation of a moral dialogue the Doctor's been having on and off since 1975. That year, one of the all-time-great Doctor Who stories, Genesis of the Daleks, chronicled how the Doctor failed to prevent Davros from creating the Daleks. Faced with the opportunity of destroying all the embryonic Daleks, he hesitates and asks:

Listen, if someone who knew the future pointed out a child to you and told you that that child would grow up totally evil, to be a ruthless dictator who would destroy millions of lives, could you then kill that child?⁴

Now, with those words coming back to haunt him, the Doctor has three options for the child Davros: save him, kill him, or leave him to his fate. It's not till the story's final scene that we find out the choice he ultimately makes.

The heart of the story is a series of scenes between the Doctor and the adult Davros, last seen in 2008, in which they continue their debate about the ethics of the Daleks. Davros's position has always been that compassion is weakness and only through strength and ruthlessness can the Daleks survive, while the Doctor continues to argue the case for compassion. Taunted by Davros that "[Compassion] will kill you in the end," he retorts, "I wouldn't die any other way."

Now, though, Davros is dying and seems not only to be questioning his position but even wanting the Doctor's approval. It's all a ruse, of course — this is Davros, after all — and the Doctor seems caught in the snare of his compassion. But this is the Doctor, too, and he's one step ahead.

It's great to see ethical debate at the heart of things, as it was in so many of the great Doctor Who stories of the past, but this is anything but a talky story. Missy is back (and yes, we do find out how she survived her death last year), forming an unlikely alliance with Clara to find and help the Doctor.

In her female form, Missy seems better able to express her very contradictory feelings about the Doctor — constantly trying to kill him, she insists, doesn't alter the fact that he's her best friend and always has been. She talks constantly about memories of their childhood (just as we also get a glimpse of Davros's childhood) although she hints that not everything she tells Clara is true. Certainly referring to when the Doctor was a little girl seems implausible, in the light of last year's Listen.

Of course, Missy too has her own agenda, and in the end she almost succeeds in tricking the Doctor into killing Clara. As in Logopolis, any apparent alliance with the Master/Missy isn't going to last a moment longer than s/he chooses.

There's a lot else packed into the story: planes stuck in the sky, UNIT headquarters, a space bar containing many familiar aliens, a strange mediaeval scene with the Doctor playing an electric guitar (the one weak section of the story, in my opinion), Dalek "sewers" containing not-quite-dead Daleks that the Doctor brings to life (setting up my favourite line, when he tells the Supreme Dalek "Your sewers are revolting") and Clara pretending to be a Dalek.

This last is interesting, since the first time we met Clara, in Asylum of the Daleks, she actually was a Dalek, so this brings her full circle. The whole process of pretending to be a Dalek is radically different from when Ian did it in The Daleks or Rebec in Planet of the Daleks, but that makes sense. The Daleks have been upgraded, redesigned and re-bioengineered so often since then, both by Davros and the Emperor⁵, that I wouldn't expect their mechanisms to be the same.

Clara's "training session" by Missy is fun and makes some intriguing suggestions about the Daleks (their guns are fired by emotion, and shouting exterminate is their way of "reloading) but what we learn from it also turns out to be crucial.

Perhaps the most arresting figure in the story, apart from Davros and Missy, is the wonderfully sinister Colony Sarff, Davros's messenger and bodyguard, whom the Doctor describes as "a nest of snakes in a dress". A scary-looking figure in its assembled form, it can collapse into a mass of individual snakes, each of which appears to have an equal part in the whole ("We are a democracy," it explains at one point). It's regrettable that Colony Sarff is destroyed at the end — but maybe there are more of its kind out there.

The whole story is an old fan's delight, full of back-references. The ones relating to Davros are the most important, but there are others. When Clara, Missy and Kate Stewart of UNIT are trying to work out where/when the Doctor might be hiding, they identify a whole series of historical locations where he's been. These include San Martino (The Masque of Mandragora), Troy (The Myth Makers), "multiples for New York" (The Chase, Daleks in Manhattan/Evolution of the Daleks, The Angels Take Manhattan) and "three possible versions of Atlantis" — presumably those he visited in The Underwater Menace and The Time Monster, together with the Atlantis Azal claimed to have destroyed in The Dæmons.

There's been some criticism of recent Doctor Who that the Doctor has often taken second place in the stories to whoever his companion was at the time. Personally, I don't feel this has gone too far, and anyway it was by no means unknown in the classic series for stories to focus more on the companions. This was especially true in the early days, when we were seeing much more from Ian and Barbara's perspective than from the Doctor's. Still, it's good to have a high-class story where, although Clara has plenty to do, the focus is solidly on the Doctor. And on ethics, which have always been at the heart of Doctor Who.

I'm really looking forward to the rest of series 35.


1 Series 9 of Doctor Who was broadcast in 1972, starting with Day of the Daleks and finishing with The Time Monster. This is the 35th series overall.

2 Again, the numbering is questionable. What about the War Doctor? What about the Tenth Doctor Mark 2? For that matter, what about the Watcher, the Valeyard, the Dream Lord…? Still, convention is convention.

3 Maybe it's just me being slow, but I still haven't worked out the rationale for either title in the two-parter. That doesn't spoil anything, though.

4 This and several other clips from past confrontations with Davros are played during the story. Personally, I'd prefer them to be left implicit, but I recognise not all viewers remember the classic stories as well as I do.

5 Assuming they're different, of course. The whole question of the identities of the various Dalek Emperors, of whom Davros was at least one, is rather complex.



Wednesday, September 16, 2015

So What Is This Thing Called Point of View?

To the novice entering into the esoteric world of writers discussing writing, few things are more confusing than point of view. We tend to talk as if everyone knows what we mean by point of view (known as POV to its friends) and throw around enigmatic terms like "limited" and "omni".

At its simplest, POV is about whether the main character of a story is referred to as I, you or he/she (or conceivably it), using the grammatical concept of person. Referring to the speaker is known as first person, referring to the person or thing addressed is second person, and referring to anyone or anything else is third person. In some languages, like Latin, this is built into the grammar, and the persons have to be used in the correct order in the sentence.¹

Most stories are written in first or third, with second being kept mainly for experimental work or choose-your-own-adventure books. It isn't easy to do well, though it can be highly effective in the hands of a master, such as in Italo Calvino's wonderful novel If on a Winter's Night a Traveller.

There's more to POV, though, than which person it's in. It can, quite separately from this, be limited or omniscient, deep or shallow, distant or immersive. And it's important to remember that these aren't absolutes, but sliding scales where specific stories (or even specific scenes in a story) are defined by positions on numerous axes.

It's also important to bear in mind that none of these POVs is wrong. Some are unfashionable and will be much harder sells to editors or agents (second person is certainly one of these), and some are easier than others to get wrong. Every POV, though, suits some stories and can be written well — and if so, it's perfectly valid.

First Person


First person is increasingly popular, especially in YA fiction, although it's never been uncommon. There's a popular idea that a first-person story has to represent a supposed memoir by the character, or that they're telling the tale to someone. This is certainly an option, but by no means the only one.

First person narratives can range from immersive to distant. In the former case, the character is essentially passing on the experiences as they're happening, and will only write about what they're experiencing or thinking at the time.² It's becoming quite common now to reinforce that with present tense, though past tense narrative is familiar enough not to feel wrong in such a context — I think of past tense in this situation as events being processed a second or two after they've happened.

At the other extreme, the character can be looking back on past events, whether writing, telling or just remembering it. This sacrifices the immediacy of the immersive approach, but it allows the narrator to reflect on their actions and introduce elements of the story or setting they couldn't have known at the time.

There are positions in between these extremes, and it's also not essential to write an entire story from the same position. In At An Uncertain Hour, for instance, I used the technique of switching between action the main character is immersed in as it happens and memories of his past life, written in a more distant style. I'm by no means the only author who's written like this — check out Iain Banks for a master of the technique.

Third Person


Third person, where every character is referred to as he, she or it, is perhaps the most natural style of storytelling, but it comes in three broad forms: objective, omniscient and limited.

Limited third is perhaps the most popular POV today, although its popularity is maybe exaggerated somewhat by authors and editors. This is where, in any given scene, everything is filtered through one specific person's perceptions and inner thoughts. The greatest sin, if (but only if) you're writing in limited third, is head-hopping, where the POV changes from one paragraph to the next, or even within the same paragraph. Even this can be done effectively, but only if it's deliberately calculated for a specific effect, rather than to make life easier for the author.

The POV may be limited, but that can be handled in a number of ways. It can be deep and immersive, to a point where it's almost indistinguishable from immersive first person, or the "camera" can move out to show us details that aren't being directly noticed by the character. At its shallowest, for instance, you might describe the character's appearance as if from outside — without resorting to the dreaded mirror scene.

That level more or less merges into omniscient, which is where the POV is the author (or at least an authorial presence) who can show us whatever is necessary to tell the story, including the thoughts of multiple characters, general facts about the setting, and pithy comments about life, the universe and everything. It's distinct from head-hopping, though, because the revelations made by an omniscient narrator are made from outside rather than inside, reminding us that this isn't the character telling us what they're thinking, but the narrator knowing it.

The most extreme omniscient approach is the storyteller style, where the author is a direct presence addressing the reader. This is particularly common in a traditional style of children's story, where the author might interrupt the narrative to say something like "I expect you're wondering how he's going to escape from this. Well…"

Occasionally, the omniscient POV can also be a character within the story, usually playing a minor role, who either for specific reasons or just as a device has access to all the necessary information about the story and its setting. This character can be presented as either first or third person.

Objective is superficially a little like omniscient, since the POV is the author, but where omniscient is an active POV, objective is passive. Here, we're only told what can be seen and heard, not what any character makes of it or what they're thinking. Not much fiction is written this way nowadays, partly because it's incredibly difficult to write it effectively (I know, I've tried) but the mediaeval Icelandic sagas³ handled it brilliantly, using it to give a kind of dead-pan insight into the behaviour of the characters by inference, not by revelation.

This doesn't mean that a writer has to choose one of these precise positions and stick with it fanatically. Many writers who use limited third, for instance, will vary the depth from scene to scene, depending on what's needed at the time. Others operate right on the edge between shallow limited third and omniscient, sometimes straying to one side of the border, sometimes to the other — the Harry Potter books are an example of this. The key, as with so much in writing, is always to know precisely what you're doing and why. It's possible to make your variations seem natural, rather than careless.

Who Should Be Your POV?


Some stories can be told exclusively through one pair of eyes. This is the default for first person, and many limited third stories also don't need more than a single POV, if the story is about that person. Other stories aren't about any specific individual and need a wide range of viewpoints to show the reader everything that feeds into the edifice the author's constructing.

George R.R. Martin is one of the best-known authors using a POV cast of thousands, but many stories require three or four POVs to cover everything. These POVs (usually in limited third person, though multiple first person novels are found⁴, and even switching between first and third for different POVs) will change either at the beginning of a new chapter, or else at the beginning of a discrete scene within a chapter. The technique is completely different from head-hopping.

A common piece of advice in choosing which character stands as your POV for a scene, a chapter or an entire novel is that it should be the person most involved in what happens. That's often good advice, but not always. Sometimes the person most genuinely involved in a scene may be standing back and watching events unfold. Someone else may take centre-stage, but this is the person whose motives and assumptions are being challenged by what happens, which will affect their role in the rest of the story.

An extreme case is when the story is most effectively told by an observer, giving us a separate mind to filter the action through. The best-known examples of this are Dr Watson in the Sherlock Holmes stories and Nick Carraway in The Great Gatsby. Though not at all uninvolved, Watson stands a little back and provides us with the eyes to watch the enigma that is Holmes, while Carraway allows Fitzgerald to avoid having to reveal too much about Gatsby.

In the end, the choice of POV — the character, the pronoun used, the limit or omniscience, the immersion or the distance — is all about how to tell the story best, and that will be different every time. As it should be. The point of being a writer isn't to endlessly tell the same story.


1 This caused some problems for Henry VIII's minister Cardinal Wolsey, who grammatically but undiplomatically referred in his Latin letters to "ego et rex" (I and the king). This gave ammunition to those who accused him of arrogance.

2 In fact, there's a little wiggle-room here, since applying that too literally results in the extreme immersive first person style known as stream-of-consciousness — a valid approach, but one that isn't suitable for the majority of first person stories. As long as what you write is more or less in the moment, readers will normally ignore any slight discrepancy.

3 No, the sagas weren't oral tales told round the fire of a mead-hall. They were actually sophisticated literary works, produced by and for a surprisingly literate society.

4 Disclosure: I'm writing one myself at the moment, and finding the process fascinating.

Tuesday, August 25, 2015

The Castle of Otranto by Horace Walpole

In 1765, a book was published called The Castle of Otranto. The title page proclaimed the text as translated by "William Marshal, Gent. from the Original Italian of Onuphrio Muralto, Canon of the Church of St. Nicholas at Otranto." The preface discussed various issues, such as dating the original (the style of the Italian suggested the Renaissance, the subject-matter clearly belonged to the period of the Crusades) and the author's habit of interrupting the high doings of the main characters with the commonplace absurdities of the servants.

The book was a great success, prompting a second edition in which the writer and politician Horace Walpole (left), son of Britain's first prime minister¹, admitted to being the author of the work. In a new preface, he apologised to the reader for the deception, claiming that "As diffidence of his own abilities, and the novelty of the attempt, were his sole inducements to assume that disguise, he flatters himself that he shall appear excusable."

In fact, Walpole was far from being the only author of the period to foist such a deception on the public, literary follies being almost as common as architectural ones in the later 18th century. The most famous is the poet Thomas Chatterton, who committed suicide in 1770 at the age of seventeen², having published "translations" of an imaginary 15th century poet called Thomas Rowley. There was also James Macpherson, who similarly claimed to have translated the works of an ancient Scottish bard called Ossian. It's generally accepted that Macpherson used genuine fragments of ancient Highland poetry, but most of Ossian's poems were his own work.

The Castle of Otranto was immensely popular and kicked off the tradition of gothic fiction, which Walpole implicitly defined in his second preface as a combination of the supernatural elements found in mediaeval romance with the realism of modern novelists such as Fielding and Richardson. At first, the genre consisted of lurid novels by authors such as Matthew Lewis and Ann Radcliffe³, but it strongly influenced the Romantic poets, especially Coleridge and Keats, and found a more sophisticated voice in authors like Mary Shelley, Edgar Allan Poe and, to some extent, the Brontës.

Quite early on, the genre began splitting. Lewis's bestseller, The Monk, and many others of its kind, featured magic, ghosts and other unearthly elements, whereas authors like Radcliffe played down the supernatural in favour of the over-the-top love stories the genre also featured. In time, one strand came down through Poe, Stoker and many others as the horror and paranormal tradition, while Radcliffe's fainting heroines and mysterious heroes evolved into romantic fiction⁴. Both traditions are alive and well today.

A little while ago, I picked up a Penguin Classics volume called Three Gothic Novels, featuring Shelley's Frankenstein, Beckford's Vathek and The Castle of Otranto. I'd read and loved the other two many years ago, but I'd never read Otranto, so I recently decided to remedy that.

It's described as the first gothic novel, but it's really no more than a novella, about a hundred pages in the standard-sized paperback. Still, it takes a while to get through. Those hundred pages consist of mammoth paragraphs (I counted one at seven pages), many of them made up of back-and-forth dialogue without breaks. And, to make matters worse, Walpole uses absolutely no punctuation to indicate either the start or finish of speech (not even the French-style dashes), making it sometimes very difficult to work out who's supposed to be saying what.

The story is a bizarre one, which Walpole claimed came to him in a dream. Manfred, Prince of Otranto, has a son and daughter, but his son is killed on his wedding day by a gigantic helmet falling from the skies. Cheated of his chance for an heir through the male line, Manfred decides to divorce his saintly wife Hippolita and marry his would-be daughter-in-law Isabella.

Appalled, Isabella flees to the nearby abbey run by Father Jerome, and is helped on her way by a young man called Theodore, who just happens to be Jerome's long-lost son. Then, as strange happenings intensify, a mysterious knight turns up, silent and anonymous in his armour, and the tale proceeds towards its tragic conclusion.

This is a story full of portents, hauntings and miracles, generational curses and staggering coincidences, noble heroes, swooning heroines and dastardly villains — not to mention the endlessly prattling servants. It also has not a little absurdity — the giant helmet falling on Manfred's heir wouldn't have been out of place in Monty Python and the Holy Grail. It certainly isn't especially frightening to the reader.

It's been suggested that Walpole was well aware of the absurdities in the story, and that he didn't actually take his work very seriously. Certainly, he doesn't seem to have anticipated the birth of an immensely successful literary genre that spread not only throughout the English-speaking world, but all across Europe, and whose literary heirs have ranged from Charlotte Brontë to the Marquis de Sade, and, in the 20th century, from H.P. Lovecraft to Barbara Cartland.

Ultimately, I wouldn't rate The Castle of Otranto as a great work, certainly not in the same league as Vathek, let alone Frankenstein. Perhaps, like many trail-blazers, it suffers from the fact that the novelty alone was enough in its day. From a modern perspective, the plot comes over as incredibly clichéd, and it progresses largely by coincidences and non-sequiturs. Though the shortcomings in the writing may be partly put down to the conventions of the time, the two masterpieces that share the volume read much more easily and naturally.

Nevertheless, I'm glad I've read it, and I'd suggest it would be worth anyone's while to try it. It isn't very long, and you'll be witnessing the exact point at which the modern horror tradition and the modern romantic tradition both had their birth.



1 Robert Walpole never actually held the title of prime minister, which didn't become an official government position till the early 20th century, but he originated the concept of one minister dominating the government, in place of the reigning monarch actively heading it. Walpole held the posts of First Lord of the Treasury, Chancellor of the Exchequer and Leader of the House of Commons, with his allies in other key roles. "Prime Minister" was a description applied sarcastically by his enemies.

2 Or not. There appears to be some evidence that, far from deliberately taking poison, he accidentally ODed on prescription medicine.

3 Jane Austen had a love-hate relationship with these authors. She read them voraciously, but also mercilessly parodied the genre in Northanger Abbey.

4 The original meaning of "romantic" was resembling the mediaeval romances, so called simply because they were written in the "romance languages" (ie those derived from Latin). The word was almost interchangeable with "gothic", but the two terms became applied to the two diverging branches of the genre and are now poles apart.



The images of Horace Walpole, painted by Joshua Reynolds in 1756, and an illustration from a 1794 German edition of The Castle of Otranto are both in the public domain.

Friday, August 21, 2015

Reblog: On Dothraki and House Elves: Developing Fantasy Cultures, from Dan Koboldt

My attention was drawn today to an excellent guest blog on Dan Koboldt's site, written by sociologist Hannah Emery, explaining why cultures don't just sit still so that fantasy writers can present each culture as a single, unchanging entity.

This is a message I've been trying to get across to fantasy writers for many years in blogs like Messy Worlds Rule OK — here expressed with far more eloquence and a great deal more expertise. I'd encourage everyone interested in fantasy cultures (or real-world ones, for that matter) to read the article, and then to explore this excellent blog further.

The only point I'd add to it, as I've pointed out in a comment, is that another crucial driving-force of cultural development is always going to be trade, which seems to be a fundamental human instinct.




Monday, August 10, 2015

Which World Is Your Story Set In?

People who don't like fantasy often base their objections on the claim that they prefer to read books or watch films set in the real world. For these people, the dichotomy is obvious. Fantasy is set in an invented secondary world, which obviously makes it trivial and irrelevant, whereas good fiction (that is, whatever they happen to like) is set in the real world, which automatically makes it superior and relevant.

Leaving aside the fact that many of the books, films and TV shows ostensibly set in the "real world" are neither superior nor particularly relevant (the James Bond stories are nominally real-world stories, for heaven's sake), this attitude shows a fundamental lack of understanding about the nature of fiction — not to mention the nature of reality.

My contention is that every story ever written is actually set in an invented secondary world, and fantasy (as well as some SF) is the label given to those that are upfront about it. It doesn't matter how uncompromisingly gritty a slice of social realism a story might be, it's set in a fictional reality, not an objective reality.

Consider two authors both writing stories about a maverick cop who rides roughshod over the rules and procedures. In one, he might be the hero who nails the bad guys that would get away if he played by the book. In the other, he might end up destroying innocent lives the rules were there to protect.

This isn't just a matter of attitude. Depending on their views or agendas (often, but not always, the same thing), each author will create a reality in which their take on the story is objectively true. The first story will quite genuinely take place in a world in which bleeding-heart liberals are letting the crooks get away to prey on their victims. The second will just as genuinely take place in a world where the rule of law is the only thing separating the good guys from the bad.

Of course, a reader who entirely agrees with one or the other point of view will interpret that fictional reality as objectively true, but another will see the opposite as being true. The point is that the difference isn't between the attitudes of the characters within the story, but lies in the author's primary worldbuilding. This is analogous to the way Tolkien writes about a world in which morality has the force of a law of nature and can affect the outcome of events just as surely as gravity or the weather. The differences can be a lot more subtle, though.

Soap operas* are generally presented as ultra-realistic slice-of-life drama, but actually they tend to take place in an odd half-reality. Besides obvious anomalies like location (EastEnders, for instance, is set in a rearranged version of London) there are usually odd social habits that are unlike anything you'd actually find, simply to facilitate the dramatic necessities. There isn't necessarily anything wrong with this, but it's not the real world.

Most of all, perhaps, the fictional reality of a story will be determined by selecting what to put in and what to leave out. The complete reality of our society contains everything from cosy village life to inner-city gang warfare, but the reality in which a story takes place rarely includes all this. The author will select what's relevant to go into the story, and the rest won't exist.

This kind of selection, like the two ways our maverick cop can go, largely reflects the author's views and/or agenda. The fictional reality of a story isn't the world as it objectively is, but the world as the author wants it to be — not necessarily wants as a good thing, but wants in order to make a point. It's set in a custom-made world, just as a fantasy story is, but masquerading as the real world.

Does any of this matter? I think it does. Fantasy is often accused of portraying unreality, but it doesn't pretend otherwise, concentrating instead on using that unreality to shine a light on the world around us.

The more the fictional reality looks like our own world, though, the harder it is to make that distinction. I recall an argument I had once with a work colleague — I can't remember the exact topic, but I think it may have been about the precise effects of particular illegal drugs. What I do remember, though, is that the killer argument presented by this otherwise intelligent person was "Of course it's like that. Didn't you see EastEnders last week?" To which I gently explained that it had been that way in EastEnders because that was how some author had written it, not because it was necessarily true.

Fictional reality isn't restricted to fiction. Each of us sees the world in a slightly different way from anyone else, selecting what we admit and what we don't, explaining events according to our own assumptions and interpretations of reality. Most of the conflicts in the world are due to the fact that we do this unconsciously and assume our own fictional reality, whether individual or broadly shared, is objectively true.

If we could learn to understand how fiction works, and critique it not in absolute terms but in terms of its unique fictional reality — its own secondary world — maybe we'd be better at understanding our own and others' unique inner worlds.

And what better place to learn how to do that than fantasy?


* The term soap opera is used with different meanings in different parts of the world. I'm using it in the usual UK sense of a continuous series (ie no breaks or seasons) about some kind of community that takes place in real time, so that, for instance, the characters are preparing for Christmas or anticipating the Cup Final at the same time the viewers are.


Monday, July 20, 2015

Indo-European Languages and the Aryan Fallacy

In 1938, J.R.R. Tolkien (left) received a letter from a German publisher who proposed to publish a translation of The Hobbit. In accordance with the laws of the Nazi government, they asked him to confirm that he was "of Aryan origin".

Tolkien was livid. He made his opinion clear in a related letter to his British publisher, where he said, "I have many Jewish friends, and should regret giving any colour to the notion that I subscribed to the wholly pernicious and unscientific race-doctrine."¹

In his reply to the German publisher, he feigns innocence at first. "I regret that I am not clear as to what you intend by arisch. I am not of Aryan extraction: that is Indo-Iranian; as far as I am aware none of my ancestors spoke Hindustani, Persian, Gypsy, or any related dialects."²

Tolkien had no truck at all with the Aryan Fallacy, but many of his contemporaries embraced it enthusiastically, including several other fantasy writers active at a similar time — very notably Robert E. Howard. So how did it come about?

The word Aryan correctly refers to a group of nomadic tribes that moved into northern India about three thousand years ago. It's also used for the linguistic family descended from Sanskrit, deriving ultimately from the language(s) spoken by these tribes. Lastly, and with extreme caution, it can be applied to the people who speak these modern languages (most of the languages of Pakistan, Bangladesh, Nepal, northern and central India, and the majority language of Sri Lanka), as long as it's understood that this in no way represents a racial identity.

The family, as Tolkien indicated, also includes Romani ("Gypsy"), its original speakers having migrated from north-west India. Ironically, in Hitler's time (i.e. before the mass migrations from the sub-continent after World War Two) the only substantial ethnic group in Europe who could legitimately call themselves Aryan were the Romani — who were persecuted by the Nazis.

The most obviously related group of languages is the Iranian family, together forming the Indo-Iranian group (Iran clearly derives from the same root as Aryan). Today, the Iranian group covers most languages of Iran and Afghanistan, together with Kurdish, but in earlier historical periods Iranian-speaking nomads lived on the steppes north of the Black and Caspian Seas: Scythians, Sarmatians, Alans etc.

In the late 18th century, linguists began to recognise a clear relationship between the earliest forms of Sanskrit, Greek and Latin, and postulated that they might all have derived from a common source. With this leverage to build on, the Germanic, Celtic, Slavonic and other language groups were added to the growing super-family, nowadays known as Indo-European. An extinct family of Indo-European languages, Tocharian, was even spoken in north-west China during the Han era.

In the 19th century, it was proposed that the whole family should be called Aryan, in the erroneous belief that this was the earliest name of a people speaking an Indo-European language and therefore the most likely to be the original. Though mistaken, this was originally innocuous, but a tide of racial supremacism gradually saw the label become more and more equated with the "German Race". Max Müller, one of its originators, was later scathing about colleagues who confused linguistic and racial characteristics, suggesting that "Aryan race, Aryan blood, Aryan eyes and hair" were as absurd as "a dolichocephalic dictionary or a brachycephalic grammar".³

The tide was against him, though, and the concept of the Aryan Master Race became more and more widespread, eventually finding its lunatic home in the warped mind of Adolf Hitler. In fact, there have been suggestions that the characteristic differences between the Germanic languages (including English) and all other Indo-European languages may have been the result of an unrelated people abandoning their own language and taking up a broken form of Indo-European. In which case, the Germans have even less right to the name Aryan than most other Indo-European speakers.

The clear pattern of divergence among the languages through various periods has always suggested that it should be possible to trace them all back to a time when a single language, from which they are all descended, was spoken by a community in a relatively small area. The favourite explanation nowadays is the steppes of south-eastern Europe somewhere between 4500 and 2500 BC, but other propositions have ranged from the eccentric (such as the North Pole) to the more plausible (such as Anatolia or the north-western European plains). The image to the left shows one proposed model of expansion from the homeland.

It shouldn't be assumed, though, that these original Indo-Europeans represented a race, or even a "people" as we'd understand it. Archaeological evidence suggests that the communities who probably spoke Indo-European were actually made up of several distinct groups — a people who'd come down from the north, another who'd come up from the Mediterranean, and possibly others from Central Asia or the Caucasus — who can all be seen from their remains to have been very different physical types.

Nor are they likely to have had much in common, including a political or tribal structure, except the language which allowed ideas and customs to spread. They constituted a culture area, but no more.

In terms of the later spread of Indo-European, too, we can't assume any racial connection. It's not unusual for peoples to adopt the language of either the dominant or the "cool" culture, and this often means communities speaking a language aren't racially connected with those who spoke the language's distant ancestor. If we didn't accept this, we'd have a hard time explaining how an African American comes to speak English, a language whose source lies in north-western Europe.

The subsequent evolution of the languages is likewise anything but straightforward. Linguists use the family tree model (eg Latin is the "parent" of French, Spanish, Italian etc), and this is useful — a gorgeously artistic interpretation of it by Minna Sundberg is shown right. But the influence of other, often unrelated languages is important too. This can be down to extensive borrowing of vocabulary for social or political reasons, such as how English, fundamentally a Western Germanic language, has a vocabulary heavily derived from Latin.⁴ It can also be explained, though, by the wave theory of language change.

The wave theory is a model which suggests that a specific change, whether a sound-change such as a final t changing to s or a grammatical change such as the development of grammatical gender, spreads from an epicentre and affects both related and unrelated languages. The next major change will have a different epicentre, and/or the speakers will have moved, so it won't be the same set of languages affected.

Ultimately, this will create a patchwork, in which it can be difficult to reconcile a language's position on its family tree with apparent similarities to more distantly related (or unrelated) ones. It's inconvenient, but we're talking about human behaviour. What do you expect?
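
For anyone who likes their models concrete, the wave idea is simple enough to sketch in a few lines of Python. Everything below — the languages, their positions, the changes, epicentres and radii — is invented purely for illustration, not real linguistic data; the point is only to show how successive changes with different epicentres leave overlapping feature sets that refuse to nest into a neat tree.

    # A toy wave model: languages placed along a one-dimensional dialect
    # continuum. All names, positions and radii are invented for illustration.
    languages = {"A": 0.0, "B": 1.0, "C": 2.0, "D": 3.0, "E": 4.0}

    # Each change spreads from an epicentre out to a given radius, affecting
    # every language within reach regardless of genetic relationship.
    changes = [
        ("final -t > -s", 0.5, 1.0),
        ("grammatical gender", 3.0, 1.5),
        ("loss of case endings", 2.0, 1.2),
    ]

    features = {lang: [] for lang in languages}
    for change, epicentre, radius in changes:
        for lang, position in languages.items():
            if abs(position - epicentre) <= radius:
                features[lang].append(change)

    # The feature sets overlap without nesting (B shares one change with A
    # and a different one with C) — exactly the patchwork described above.
    for lang in sorted(languages):
        print(lang, features[lang])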

So where did the original Indo-European language come from? If it was being spoken a mere five or six thousand years ago, it obviously can't have sprung from nowhere. The reason it's recognised as the "original" is that it marks the latest point that all Indo-European languages can be traced back to, but its history must have gone back a long way up an unknown family tree.

Various propositions have been made as to what Indo-European might ultimately be related to, the most widespread being a super-family known as Nostratic. At its most ambitious, this hypothesis includes Uralic, Altaic, Kartvelian, Afroasiatic, Elamo-Dravidian and Eskimo-Aleut, along with a few others, though some more cautious proponents restrict it to the first three plus Indo-European.⁵

There's very little evidence for hypotheses such as this, and what has been put forward is strongly disputed. The problem is that it becomes progressively harder to be sure of connections between languages the further back the proposed connection is. It's probable that some, at least, of these connections are correct, but nothing's likely to ever go beyond speculation.

To speculate, though, how far might this process actually go? Recent evidence suggests an origin for language considerably further back than was believed even a decade or two ago, certainly when our ancestors were living in a fairly small area of Africa.⁶ Maybe the invention of human language really was a single event, and we're all speaking variants of the same language.

The Aryan Fallacy should have died in the bunker with the Führer, but unfortunately bigots aren't famous for their intellectual rigour, and there are still white supremacist morons who use it as a keystone for their disgusting creeds.

So take a leaf out of Tolkien's book. Next time a white supremacist proudly claims to be Aryan, point out that they're actually claiming to be Indian. Or even Iranian. You won't change their views, but see how they like it.

 

1 The Letters of J.R.R. Tolkien, ed. Humphrey Carpenter, 1981, p.37

2 Ibid. p.37

3 Quoted in In Search of the Indo-Europeans, J.P. Mallory, 1989, p.269

4 That the roots of English are Germanic can be illustrated by taking a passage of average English and identifying where each word came from. It's likely that more would be Latin-derived than Germanic-derived; but if you were to redo the exercise counting each word each time it's used, you'd find the count overwhelmingly Germanic. This is because the most common, infrastructure words (the, and, to, for and the like) are direct descendants of the words that came over with the Anglo-Saxons.
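
Out of curiosity, the counting exercise in this note is easy to automate. Below is a minimal sketch in Python; the ten-word etymology table and the sample passage are toy data invented purely for illustration (a real version would need a full etymological dictionary), but it shows the difference between counting each distinct word once and counting every occurrence. With real data, the distinct-word count typically tips towards Latin while the running-text count stays overwhelmingly Germanic, as described above.

    from collections import Counter

    # Toy etymology lookup — invented for illustration; a real version would
    # need a proper etymological dictionary covering the whole passage.
    etymology = {
        "the": "Germanic", "and": "Germanic", "to": "Germanic",
        "for": "Germanic", "of": "Germanic", "speak": "Germanic",
        "people": "Latin", "nation": "Latin",
        "language": "Latin", "common": "Latin",
    }

    passage = ("the people of the nation speak the language "
               "common to the people and the nation")
    words = passage.split()

    # Count each distinct word once ("the" counts once)...
    by_distinct_word = Counter(etymology[w] for w in set(words))
    # ...versus counting every occurrence ("the" counts five times).
    by_running_text = Counter(etymology[w] for w in words)

    print("by distinct words:", dict(by_distinct_word))
    print("by running text:", dict(by_running_text))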

5 Uralic includes languages like Finnish and Hungarian; Altaic includes the Turkic languages, Mongolian, probably Korean and possibly Japanese; Kartvelian is a small group whose main member is Georgian; Afroasiatic is a huge group, ranging from Hebrew and Arabic to Hausa, the main language of northern Nigeria, and taking in ancient Egyptian along the way; Dravidian was the main language group in India before the coming of the Aryans, and is still dominant in the south; Eskimo-Aleut — yes, I know Eskimo is non grata, but there's actually no other word for the whole linguistic group, the Inuit actually being just the largest ethnic group.

6 Remains have indicated that Neanderthals shared the same deformity of the larynx that allows us to manipulate complex sounds, suggesting that the mutation took place at least 150,000 years ago. On the "use it or lose it" principle of evolution, it's difficult to interpret this any way other than our ancestors already using language from that point.
 
Images reproduced under Creative Commons licence:
 
Tolkien: Proyectolkien
Dancers of the Shuvani Romani Kumpania: James Niland
Indo-European expansion according to the Kurgan hypothesis: Dbachmann
Minna Sundberg Language Tree: Tom Wigley
 
 
 

Sunday, June 28, 2015

Let's Just Make Up Our Own Story

I was, I think, eleven when the Disney cartoon of The Jungle Book came out. Kipling's original book was among my favourite reading at the time, and when I saw what Disney had done to it, I was outraged in the way only a child can be — starting with the mispronunciation of Mowgli ("first syllable to rhyme with cow," Kipling had specified) and carrying on into the total reinvention of the characters and tone.

At a maturer age, of course, I can accept it's probably an extremely good film, if you don't compare it with the original story, but in a way that makes it worse. Generations of kids have now grown up believing that is The Jungle Book, and I can imagine them dismissing Kipling's version as "not the proper story."

Disney have done the same to others of my childhood treasures (notably Winnie the Pooh), but of course they're far from the only ones. I've seen film "versions" of books that have almost nothing in common with their sources except for a few names. It makes me wonder why film and TV companies bother to pay for the rights when all they're going to do is write their own stories.

The particular case that prompted this blog was the BBC's recent two-part adaptation of Stonemouth, one of the last novels by Iain Banks, one of my favourite authors. Now, this can't have been an easy book to adapt: a very immersive first-person narrative that wanders between present events and memories of the past as the thoughts ramble through the mind of the narrator, Stewart.

In fact, they handled that aspect quite well, although the flashbacks were sometimes too heavily cut to make sense. The problem was that the plot and characters were radically changed. The first part wasn't too bad, although the funeral that prompted Stewart to return home was for a different person, and the friend whose funeral he was now attending was made a far more pleasant person than in the book.

It was the second part that really blew it. The book involves a typically Banksian uncovering of various mysteries, ending in an explosive scene that's all the more effective for being unlike the rest of the book. Instead, among numerous other changes, the adaptor inserted an entirely superfluous action scene, along with the bewildering revelation that a relatively minor character had actually been behind everything, in place of the more complex and believable outcome in the book.

Now, I understand that it isn't always possible to adapt a novel literally for the screen. They're different media, and sometimes it's necessary to find different solutions for the narrative. But there's a world between changes to account for the medium and arbitrary differences because someone thinks they know better than the author.

Peter Jackson's very variable Tolkien films offer both kinds of example. In The Two Towers, the scene where Faramir finds out about the Ring is done completely differently in the novel and the film. The problem here is that the scene Tolkien wrote consisted of three characters sitting around talking for a long time. That can work in a book, of course, and Tolkien did a great job of building up the tension, but a film needs visual tension as well as dialogue. I'm not sure that Jackson's solution would have been what I'd have chosen, but it made sense.

Contrast that with his totally irrelevant introduction of Tauriel in the films of The Hobbit and her absurd romantic subplot with Kili. I can certainly see that the lack of female characters might have needed to be addressed, but honestly, couldn't they do better than "Female character? Love interest, obviously"?

Perhaps the most traumatic experience I've had since The Jungle Book was Troy. Again, this may well be a good film taken in isolation, but the problem is it's not in isolation. This is supposedly a retelling of one of the world's greatest stories, and IT'S WRONG FROM BEGINNING TO END. This might not worry some people, but it should be a concern that, as with The Jungle Book, large numbers of people are being completely misled about the 3000-year-old story of the Trojan War.

Why do adaptors insist on doing this? In the case of Troy, it would have made more sense to change a bit more, give the characters and places different names, and put it out as an original story. It's even more mystifying when they have to pay for the rights, as with the American TV "remake" of The Prisoner a few years back, which had only a handful of cosmetic similarities to the original series. Why not save money and call it something else, since it's a completely different story?

This isn't a new problem. It's been going on for as long as there've been films, and there's no prospect of it ending any time soon. And, as I said, stories do have to be tweaked to fit a different medium — but tweaking isn't the same as wholesale butchery.

Instead of obsessing with buying up the rights to an original, why not just pay the scriptwriters to make up their own story? That's what they do, after all.
 
 
The illustration of Mowgli from The Jungle Book by Rudyard Kipling is by Cesar Ojeda, and reproduced under Creative Commons.