Tuesday, June 2, 2015

Word up

I can't help it. I focus on detail, and have always been able to spell well, and that causes me to really pick out the errors in work I've read. It really grates on my sensibilities to see inappropriate word-use, grammar, syntax, spelling, punctuation...

Oh, and the rampant use of cardinal numbers when ordinals should be chosen is literally ire-inducing to me. It seems more common these days than it used to, as far as my memory goes - but then I'm only 29, so my personal purview may not, shockingly enough, encompass the entirety of modern written English. I see it a lot, though - and hear it. Air New Zealand has YouTube ads which utilise the cardinal (1, 2, 3, 4...) instead of the ordinal (1st, 2nd, 3rd, 4th...) when describing dates of expiry of current flight deals - and while the narrator for such ads is clear enough in his intent, I don't believe being clear is an acceptable substitute for being correct.

And it's that which irks me so: certainly I can understand someone when they say "which I don't even know if that's a good idea"; they clearly think something may not be a wise decision. However...what sticks with me is the inappropriate appearance of "which". It doesn't make sense. For it to do so the rest of the sentence would require modification: "which I don't know is a good idea".

It boils down to pronoun redundancy: if you use one pronoun ("which"), you do not then need another ("that"). You could use both if one functions as a pronoun and the other a determiner (which both are): "Which is the one that you like?" is a clunky, rough-edged but grammatically-sound sentence. "Which is that you like?", or "which that you like" (as a statement, not a question) are perhaps more archaic versions of saying the same thing - the odd phenomenon of words losing their broader communicative ability in more modern times, requiring the addition of more words to say the same thing. But that's an aside - one I may come back to, because it really is interesting. Back to the topic at hand, though, "which that you like" can also be said "that which you like" - a seeming reversal in word order that means the same thing, but actually a nice little trade-in: the sentences both begin with pronouns and have determiners as their second words. Because "which" and "that" are both pronouns and determiners, and can be used in combination provided one acts in one function and the other takes on the empty role, either can occupy first or second position. The actual word order itself isn't reversed - just the choice of words used changes. It's quite a nice example of how things aren't so cut-and-dried on the surface but actually still are if one cares to think about them.

(And I know, few people bother. There are other, arguably better things to do with one's time than argue about how to speak.)

Of course, I realise that spoken, unrehearsed language is subject to errors, and its inclusion of them is defensible - it's not proofread and approved before going live. It still irks me, but I'm more able to understand mistakes made in the moment - unless they're derogations relating to groups of people. I'll point out the error with that in a forthright way, obviously with the understanding that because pop-culture has made it "acceptable" to say "don't be a girl", "men are stupid", "that's gay" and so on, people don't always think that what they're saying has impact on others. Such understanding of the trend doesn't mean I think any of these things are okay to say, though - I just don't assume someone meant to be offensive, but rather was uninformed. If they then choose to be a total jerk about it and not modify their choice of words thereafter, then I'm left with no assumptions but the definite knowledge that they are indeed meaning to be offensive and seem to think that, while they make a definite choice as to which words they use, they bear no responsibility for that choice. But here's the key point to life in general: a person may have certain freedoms, such as the freedom of speech - and that's great. Freedom of speech doesn't mean freedom from responsibility for what is said, however, and it doesn't mean freedom from someone else exercising their freedom of speech by verbally tearing you to pieces for being insensitive.

That's a really good point for writers to remember, though: writers must know when to make it clear that their character thinks a certain way, rather than that they as the author do. As said above, while spoken words are not proofread (but should be chosen wisely anyway), written words most certainly are, or should be. It's easy to make a mistake by using the incorrect voice; if it's not clear you're describing something from the perspective of a character, you as the author risk describing it in the voice of the narrator - which is you.

I'm a type 1 diabetic, and have been since I was 11. I'm consistently amazed that people think diabetes in any form is caused by "eating too much sugar" - roughly akin to victim-blaming, for example: "if diabetics hadn't eaten so unhealthily they wouldn't have diabetes now, would they?" As a nurse I can unequivocally say that no, it isn't such a cause-and-effect situation. To say diabetes is caused by eating too much sugar ignores, for one thing, the sheer variety of diabetic conditions (type 1 is not type 2, gestational diabetes is not diabetes insipidus. Diabetes actually means "passer through", and medically communicates nothing more than the idea that large volumes of urine are being excreted); it also ignores the fact that our society/societies are not health-oriented, and don't encourage balanced lifestyles; it also ignores the fact that stress hormones decrease the efficacy of insulin, and stress also tends to be "coped with" by engaging in activities which further increase risk of ill-health: smoking, food-cravings, drinking, etc., etc.

So it'd be infuriating for me to read a book whose narrative voice described the eating habits of someone as those of a person heading for diabetic status. Luckily I never have read the words - it's just an example. Yet I'd be completely fine with a character who was otherwise presented as having little health-related knowledge saying such a thing - because the character is demonstrably ill-advised and ignorant. Neither being ill-advised nor ignorant is a good thing, of course - but nobody has all information and knowledge. Even with the internet so readily available, knowledge is only accessible if it's meaningful and is sought out. An everyperson (which is a bit of a dismissive statement to those of us who are not average in every way possible) may not know what diabetes actually is - and so statements of ignorance are permissible because they demonstrate no truth other than that of that person's ignorance. The trick is to have that be a defining feature of that character. A nurse or doctor should not be so characterised, but a politician, a stay-at-home parent, an artist or (ironically) a writer could be - because knowledge of diabetes would not be relevant to them. In any event, the author of the story is not permitted to be so uneducated, if they choose to write such things.

It's a matter of details. I might be focused on them, as I said earlier, but I would like to think that anyone trying to communicate in the wider world would be, too. It's just sensible to think that if one is going to write (or speak), one has some knowledge of what to write (or say), and how to phrase it.

I live in hope!

Thursday, May 7, 2015

Some considerations for self-publishing

A couple of months ago I attended a talk by a local meet-up co-ordinator on self-publishing. It was an interesting semi-seminar; it didn't really tell me anything new, but it was still worthwhile to hear about this person's experiences with self-publishing.

I have mixed feelings about the process and the practice, I have to admit. The positives would include that I would retain creative control of all aspects of publication: if I chose a specific word with a specific spelling (or a specific mis-spelling) then I wouldn't have an editor making changes based on assumptions; I would be able to choose the typeface used (an incredibly important part of story-telling); and I would be able to decide on format, cover design, etc., etc. The negatives are that, depending on whom you speak to or listen to, self-published books are deemed "lesser" by many because they have achieved publication without jumping the numerous hurdles put in place by publishing houses which frankly don't want to spend money on publishing books; and a self-published book is far more limited in promotional scope, because the author, as the publisher, won't generally have the same access to the same channels as, say, Harper Collins.

I already know how I'll style my novella, when I finally get my act together about it: I've designed the cover and the custom lettering for the title, and I've formatted the text in both typeface and layout, eliminating such grievous features as orphans and widows. It's basically 100 pages of story, ready to go...unless I rewrite sections. And that could absolutely happen. I don't want it to - I feel it's completed. But it could still happen. But it's difficult: I know exactly how I want it to be crafted, but if it were deemed worthy of publication by a publishing house, I'd likely lose that creative control. I'd have to choose: exposure versus creative say-so.

I read a blog recently that espoused the belief that if you're an author, no matter how good you think you are, you should never design a book cover for yourself. A respondent said they'd trained in design and were confident in their own abilities, and the blogger pretty much told them that it didn't matter: they weren't trained in book cover design, and therefore shouldn't design their own cover. It's an interesting perspective; the blogger - who has self-published two novels which have done reasonably well, much to their credit - seemed somehow certain that designers weren't appropriately trained in book cover design and therefore should leave this area of design well enough alone. And sure, such a perspective is valid if the designer in question is a clothing designer. Or an industrial designer, perhaps. But the comment-making designer led with information which strongly implied they were trained in graphic design. There was no reason to presume otherwise, and I wondered at the sense in the blogger saying "oh no, don't. Hire someone else", as if book cover design were a separate skill set altogether. It's not, in case you were uncertain!

As it turned out, the blogger then went on to speak highly of using sites which basically offer designers the chance to use their expertise to design for clients...who don't want to pay a whole lot for the work the designer will do. There are a few sites like that around: they ask designers who are often actually struggling to build a design portfolio or who are freelancing and don't have work to pay the rent to sign up, and then let people in need of design select the lowest amount to pay for the designers' skills and time. It's like saying to a teacher or a nurse "Okay. You want to be able to eat this week? I'll give you an amount I deem your skills are worth and you give me the best design. I might not select your design in the end, as I'll have others competing with you, too, and if so you'll have worked for nothing - all because I don't really want to go to a design agency and have to pay for the hours devoted to this project and engage professionals in their abilities and skills".

Does it sound as if I have an issue with this? I sure do. As someone who has worked for over a decade, I wouldn't ever wish to request that anyone, trained or otherwise, work for me for free. It's unfair. Yet sites like these (and logo design competitions, as another example) basically encourage the diminution of skills that the person needing a designer doesn't have. It strikes me as a bit similar to the so-called zero-hour contracts so many people have to deal with: you're hired to work but given no guarantee of hours, so if you're offered no hours there's no breach of contract on your employer's part. However, if you can't work a set of hours of which you might have had a day's notice, you can be fired because you're not meeting their expectations. It's a very underhanded way to commit someone to a job you can't be bothered giving them any security in.

So no, I'm not a fan of sites that basically engage in crowd-sourcing and inspired competition. The practice diminishes the hard work, time and skills a designer puts into even having the ability to provide a service. It's a system of undervalue. And it really irks me.

You can possibly imagine how unenthused I was when the speaker at the self-publishing semi-seminar promoted the use of such sites. And of course I understand that often a self-publishing author won't have much money to spare in pursuit of their dream of getting their work out there. Except...well, you have to commit money to such a pursuit. You can't take shortcuts. Or, you know, you shouldn't. If you have the skills yourself, then use them (you could of course hire someone. There'd be nothing wrong with that) - but if you don't have the skills you shouldn't be offering some paltry sum that you think is appropriate. You're not paying for a pre-finished product whose creation costs have been factored into the price you pay; and you don't know the extent of the work, the time, or the value of any other resources the designer may need to utilise. In reality, you pay for the resources, the time, and the expertise as well as the final product.

It's funny how you learn about such perspectives. People are more concerned with how they can benefit than they are with how those providing the services they need might benefit. But also, people aren't concerned with making their work appear the best it can: I remember the speaker at this meet-up saying "fully-justify your text", which is all well and good if you just happen to have a multiple-tens-of-thousands-of-words-long piece of text with just the right rhythm character-wise to have all lines optimally filled with whole words. In the far more likely case that a story doesn't lend itself so graciously to such a format, hyphenation is the answer - preferably after at least the first three characters, and before the last four. An author may have to slightly alter words in order to eliminate large gaps between words so that spaces are consistent line-to-line. And on top of the typographic concerns, if stylistic choice dictates the use of certain glyphs to denote the end of a section or a chapter, they should be used at 300dpi - or the image becomes pixelated and loses its definition.
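To make those two rules of thumb concrete, here's a minimal sketch - Python purely for illustration, with the function names and the half-inch glyph size being my own assumptions rather than anything said at the meet-up, and reading the hyphenation guideline as requiring both conditions at once:

```python
def hyphen_points(word, min_before=3, min_after=4):
    """Candidate break positions for a word: keep at least `min_before`
    characters before the hyphen and `min_after` after it, per the
    rule of thumb above. Real hyphenation also needs a dictionary of
    legal break points; this only checks positions."""
    return [i for i in range(1, len(word))
            if i >= min_before and len(word) - i >= min_after]

def glyph_pixels(print_size_inches, dpi=300):
    """Pixel dimension needed for a section-break glyph to stay crisp
    when printed at the given size, assuming a 300dpi print target."""
    return round(print_size_inches * dpi)

print(hyphen_points("hyphenation"))  # [3, 4, 5, 6, 7]
print(glyph_pixels(0.5))             # 150 - a half-inch ornament wants ~150px
```

None of which replaces an actual typesetter's judgement, of course - it's just the arithmetic behind the advice.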

I hadn't actually intended this to be such a rant. Well, I suppose I had, really, actually, because such things really do annoy me; but I suppose the benefit to this rant is that it's a bit like a "what not to do" advisory. I hadn't wanted to engage in advisories in this blog, so much - so it's a non-advisory advisory, I guess. I hope at the very least it's something to think about - because realistically this sort of stuff should be thought about, whether for the writer's benefit or for that of the people employed to help craft the writer's vision. It shouldn't be about cutting corners or undervaluing. It should be about getting something out into the world with the care and attention it deserves - and that means recognising that it's far more than just a casual affair, and that the hardest, most important part isn't over just because the writing is done. As it turns out, what gets written isn't the whole story at all.

Thursday, April 30, 2015

A reading and an entry

I did a short reading the other evening. Quite short. And to a very small audience - just two others at a local writers' meet-up which, due to the weather being terrible this time of year, only had a total population of five this month - but worth mentioning nonetheless.

It was a good experience in general, really: I got generally good reviews, and even the "critical" aspect of it wasn't particularly critical, taking the form of nothing more than "you've set the scene with the voice, which perhaps could be pared back just slightly, but what you've written is really effective". That's a paraphrase, of course - but specifically the words "pared", "back" and "slightly" did occur in association with each other.

And I'll be the first to admit it: the manner in which I tell stories is a little bit off-kilter with normal, familiar everyday vernacular. I don't have any issue with that; while I wouldn't set about writing a story set in the Dark Ages in language specific to the Dark Ages verbatim, I would want the language I used to fall within the perceptive bounds of the setting. The mode by which people communicate has as much impact on a story as what is being said, albeit in a different way; I'm certainly not suggesting a story can be told merely on the back of how someone speaks, rather than what they say. But it makes total sense: you wouldn't pick up The Odyssey and expect phrases like "How's it going?" to be all too common. Linguistic elements passively shape the perception the reader has of the world they're reading about. I'm sure I've touched in some way on this before, so I won't bother rerunning that race. But I'm open to that kind of critique; I'd rather know how people find reading it than pretend I don't need to know.

It was the first reading I'd ever done, I have to admit, so it was a good learning experience. I was aware the whole time of the speed at which I was speaking, making sure not to fall into the trap of going too fast which I know a lot of people find themselves doing. I've used Audioboo before to do a non-live reading of a poem-story I wrote (and which I'd like to do something with in the future), and I found myself struggling to speak slowly and breathe regularly due to the pressure of having a "perfect" recording. I didn't have that pressure when I did the live reading, oddly enough; I suppose the reality of the situation is that you can always stop and gather yourself while reading aloud in person, but if all you're leaving is a recording...you don't so much have that "hold on a moment" ability.

Anyway, I left the meet-up feeling really proud - the feedback was altogether very positive. The world in the story was correctly judged to be one of foreboding, and of a lurking danger, and of dread - and that really is a key psychological setting of the story. That I had been able to communicate that in the scene I read - or rather, that it had been espied - was really gratifying, particularly since the motive when writing the scene was not specifically to underscore those feelings at all, but to give a voice to a character to whom reference had been made but of whom no real exploration had been done. I feel really glad that the scene has proven itself well-situated enough for there to be congruence between its greater context and the message the scene itself in isolation communicates.

In an unrelated update, except regarding the shared medium (writing) and the notion of paring something down, I entered a severely shortened version of a story about the sea I'm likely to be perpetually writing the longer version of for that "short short story" competition I mentioned a couple of posts ago. I managed to get something several thousand words in length down to 300 (the absolute limit for any valid entry) and then got rid of one word for a grand total of 299. How much of a story can you tell in 300 words? Not much. Well, no, you can tell the entire story, but you can't tell much of it. In any case I'll wait to see whether I'll even generate attention. I have no idea of the calibre of other entries. I'd like to think mine's up there but it really might not be at all!

So, yes - two things writing related. It's good fun.

Tuesday, April 14, 2015

Lore in absentia

Something that I've learnt is that lore, whatever its form, has to be tangible. I say that because I think at some stage every storyteller has a bit of a moment wherein they think "well, this is how this story is because this is how I say it is"...and while that's true on the most basic and general level (a story is a story because the teller is telling it, and whatever happens does so because the teller says it does), it doesn't ring true when the reader suspends disbelief and lets the world being spoken of become a temporary reality.

We've all (I say, applying broad supposition to my actual and potential audience) seen movies, and we've all watched tv shows - dramas, comedies, mysteries, horrors, etc., etc. And books, too - I'd say most of us have read at least one book, start to finish. And the thing is, unless we're either gifted with the most powerful of imaginations or cursed with no imagination at all, what happens within the story of a movie, a tv show, a book, or anything else has to make sense. There has to be a progression from A to B that can be demonstrated on some level that doesn't require too much effort to make plausible. The underlying lore of the tale, or the event, or even the characters must be capable of reduction to its most basic form and still make sense. There are notable exceptions, but these exceptions are the ones that lead the viewers or readers (or players) to search for more and to ultimately find it, not search and be given holes in plots and histories that just don't follow through.

A great example of one of these notable exceptions is the Dark Souls series (if a franchise of two games can be deemed serial): the lore is present, but hard to find, in most parts at least. There are jumps one has to make, but only insofar as things are all but said, so the logical conclusion is never stated outright for the sake of confirmation but the signs all point to it anyway. The games have earnt many people fame, at least in their respective circles, and have led to jobs in game guide content creation, as well as self-employment opportunities in YouTube content creation. The presentation of lore in the games and supporting material is done in such a way as to have it in some form of "plain sight", there to be found, but never outright confirmed. Tolkien's works are a bit antithetical to these games in this regard, in that Tolkien didn't seem to want any guesswork being done - or at least, didn't like the idea of not having resources available to those who wished to engage in further research. The numerous appendices after The Return of the King tell the reader a more complete history of things related to The Lord of the Rings, and that has allowed the various multimedia works built on the backs of the books to be wrought with exceptional detail and richness. A sword is not merely a sword. But then, a sword is not merely a sword in Dark Souls, either; to know the absolute history of it, though, you might find yourself having to interpret hints and suggestions, rather than being told that this person son of that person son of another first obtained it from a smith who had beaten into the blade three different kinds of ore sacred to this race, in order for it to be significant and have myriad different abilities. But the case remains that the lore is still there, just dependent on your ability or motivation to find it.

A bad story is, among other things, one that presents information without having set the stage for its inclusion. Well, perhaps not a bad story, but certainly a bad choice made by the author to depend so utterly on something without provenance. Deus ex machina, it's called: the phenomenon of an event or plot point that is necessary for the story's progression coming out of nowhere. In terms of the author or creator, it amounts to something akin to throwing one's hands in the air and saying "this is the story because I say it is, and that's all there is to it". In one respect it might be deemed similar to one core aspect of the Anthropic Principle: that the Universe is as it is because if it weren't we wouldn't be here to observe it. That seems to amount to a big fat nothing, and indeed it does in fact amount to a big fat nothing: because it doesn't describe why the Universe is the way it is, but rather that the Universe can be observed as it is. In reality the Anthropic Principle has two major variants (and possibly others I'm not studied enough to be aware of): the Strong Anthropic Principle, which says that the Universe is the way it is and we can observe it that way because it is compelled to align itself with conditions that encourage or necessitate the evolution of life complex enough to exist within it and observe it; and the Weak Anthropic Principle, which says simply that if the Universe were any other way, we wouldn't be around to ask why it is the way it is, and that because we're in a version of the Universe that does allow us to exist then it's obviously a stable, realistic version, and we don't know of any other Universes that don't support life capable of observing it. It's all an interesting notion, but I have a bit of trouble seeing what use it has when trying to actually assess why the Universe has found itself in its current state unless we are to rely on saying "because we are reason enough", or at least, "because it must be observed by something". It's a bit cart-before-horse-ish, equivalent to a reversal of the question "how can we be here?" and its answer "because the Universe is hospitable to life in our form (in a general sense)".

With stories this doesn't really work. You can't just write something and say "well it had to happen that way because if it didn't, this wouldn't have happened, and the whole story would have fallen flat"; you can't say "the ends justify the means". The means have to lead directly to the ends, in summation at least, even if that means you have to invent means to get to an end that are beyond expectation. The means beyond expectation have to make sense in the world wherein they occur, of course - you can't go from everything being normal to suddenly everything being on its head at the end, as the crisis involved in this sudden change is really where a story might begin, or is the consequence of preceding events; deus ex machina isn't really the note to leave a story on.

It's a difficult one, though: how much of a plot does one give away without giving away too much or too little? I guess there's a real technique to that. I'm not quite there with the story I'm working on at the moment, but I will be...at some stage. But I've fallen into the trap of the "well, it doesn't have to make sense given the amount of information presented, so it's fine" thought process, too, particularly in my novella manuscript. I'm glad I saw sense regarding it, even if it doesn't change the ultimate outcome, and even if it's a minor set of details - because what if I were asked about it? What could I say? "I'm not sure myself"? Well...perhaps that would actually be a legitimate reason, provided it's not an excuse: some storytellers (Tolkien) take the position of being the be-all-and-end-all of knowledge on a particular story, while others (From Soft with Dark Souls) take the position of being privy to some information but not all, and thus the reliance on the "who knows?" excuse leads fan speculation, research and debate onwards without any yes or no from on-high. It's a bit of a ruse, of course - for a franchise like Dark Souls, so story-dependent, to make sense, an over-arching story must be fleshed out. Individuals within the story-telling team may or may not know all of the details, but the story is detailed, even if not shared. To be honest I do quite like the idea of a writer taking the perspective of "I am narrating a story, but what I know or don't know doesn't affect the story itself. The story is as the story is" - but the issue again is that when the story is supported only by itself and there is no tangible lore extra to the story...it begins to feel limited.

As I said, I had a case of this with my novella manuscript - not because it felt limited, and not because I took the "the story is as the story is" approach on any superficial level, but because what I did supply story-wise created the impression that the experiences of a certain character were the result of something in her past which she was only indirectly involved in, having been a child at the time. It was at that point I was prepared to say "that's all she knows, and since this is her (part of the) story, that's all the information I might have to go on in relating her thoughts and feelings". But...I thought about it, and realised that while on the one hand that's perhaps justifiable, on the other it perhaps isn't: I'm not writing an entire story from her perspective, as if I were her, or filtered only through her thoughts and feelings; I'm writing about her, her thoughts and feelings, and importantly about her history as well. Even if the reality of the story is never made clear, because she never knows it clearly herself...that doesn't mean the story doesn't exist by itself. So I had to actually decide what the details of that story were, and as a result I managed to alter what we do find out about the character from her own thoughts and feelings - and it gives us a greater appreciation for the environment she's in, too.

That's probably what it boils down to, really: not so much that lore need be readily accessible, or that it need be accessible in any significant way, but that it be accessible through its existence in absentia. If lore is non-existent, it can't tell anybody anything; whereas if it exists but isn't available, what it doesn't tell an investigator can be as important to the impression left or the information gleaned as what is told. It needs to be tangibly unavailable, I suppose. It can't be a case of "well, nobody knows", because if nobody knows then the story runs no deeper than the paper it's printed on, or the screen it flickers into movement across.

Again I guess this is an issue of how much of a story one needs to reveal without it being too much or too little. And I guess the answer is some, provided there's more of the story that is actively being withheld as part of the storytelling process; some, as long as it's enough.

How's that for a definite answer? Appalling. But at least you know there's more to it than that.

Saturday, March 21, 2015

How many words?

This is my typical thing when it comes to blogs: write a few posts, then take a break, and find that it's been a month or so since I last updated. It's a pattern I've been perfecting for years; with it I can create the illusion that I have a very busy existence. And everybody knows that a busy existence is an interesting existence.

Actually I have been a bit busy, but the aspect of interest is debatable. I've applied and interviewed for a few jobs recently; the first I was offered but turned down, having realised my aims and goals would not have made me the best fit for the role; the second I didn't get (and that's quite okay), while the third...well, the third I've only just applied for. I had told myself that if I weren't to get the second job (an alternative role in my current place of work) then it'd be enough of a spur to get me into job searching again, from which I'd taken a break - and so here I am again, much like last year, on the quest for new career options.

With that, I've managed to take a bit of a break from writing. I've done a little bit of editing regarding my current project, but I haven't really increased the 15,000 words I'd reached around the time I last updated. And of course I write "15,000 words" as if that total in this first draft will translate to the same total in the final version. It most likely will be pared down or reworked - and so it should. Compared to my novella manuscript (that nebulous object I keep mentioning and never actually describing), I don't feel enough has happened to justify 15,000 words - and I certainly don't want this story to be a struggle to get into. We've all read the beginnings of stories like that and invariably the beginnings are as far as we've bothered to go: a story should tell itself and do so in a compelling way, not bludgeon its audience into continued reading. I don't, though, want to go back and do proper editing right now - that can wait for the end of the draft. It's just a good thing for me to know, right now, that I need to focus not merely on telling my story and building a word count, but that those words I use must be meaningful and the story well-paced.

There's a "short short story" competition currently open with North & South magazine here that I believe I'd like to enter, but the catch is the story, being suitably short, must be no greater than 300 words. That's going to have to be an excessively well-paced tale without too much waffle, and given I have 15,000 words of (maybe not enough) substance, working to that kind of limit is going to be difficult.

But I do have an idea as to what it is I'd like to submit: a short story I've been working on for a while now, actually, which I keep picking up and putting down, in large part because I know how it's going to end but the path it must take to get there is relatively undefined (I've mentioned before that stories being told without a pathway for me tend to become waffle-shops, and even if that isn't readable to an outside mind it remains quite prominent in my own). It'll give me a chance to condense the text a bit, or a lot, and perhaps be a forerunner to an actual published version of the story in general one day. It'd be good exposure, if I were to win - though I'm sure the competition is stiff and the odds of beating my competitors are probably ridiculously against me. But, as the notion goes, if you don't move you go nowhere.

Wish me luck!

Monday, February 16, 2015

Overwrought

If you've read this blog at all before, perhaps you might have noticed the quiet appearance of the title banner up there. I have to say I'm pleased with it; I feel it's not absolutely and utterly perfect, but I'm pleased with it nonetheless and to be honest the overwhelming portion of me thinks that no matter how perfect it might appear to anyone else, I'll likely remain more critical anyway. If you've ever read designer-written discourses on how perfection is never perfection when you're the creative party, you'll understand; I don't mean to suggest it actually is perfect, but at some point in the creative process you have to just take a step back and recognise that a piece is finished, even if it doesn't represent or match closely enough the image that had formed in your mind about how a concept should look. The audience is never going to know how it was at first imagined to look, either: they're going to see it as it is and, unless it's bad, they won't be as critical as you, the designer, are being at all. And really, to be perfectly honest, it does actually look very much how I had imagined it would. The background to the banner is a texture I painted based on cork oak bark (there's relevance to that, as I may expound upon), and the hand-rendered lettering is the most successful version of a few experiments I did in order to get the imperfect, almost-filamentous look of the (completely unnecessary but, in my opinion, completely spot-on) ligatures that give it an almost seal- or cartouche-like appearance.

In short...I'm happy with it. Photoshop and Illustrator skills are indeed well-employed when you're a graphic designer!

That whole idea of imperfect completion, though, is I think what a lot of authors - established or aspiring - struggle with. For instance, I wrote the first draft of my novella (which I'm still strangely reticent to name "out loud", as if by doing so I'm communicating the entire plot and reading it word-for-word for someone to then convert into their own manuscript and promote as their own work. Yes, I'm fully aware that this presents me as decidedly paranoid) just over five years ago and, after basically doing a re-read through once, put it down and let it sit for a couple of years at a length of 32,000-ish words. When I received positive feedback on it even in that state, and after having devoted the entirety of 2013 to producing a graphic design portfolio while studying the subject (go check it out! http://be.net/bysimonrandell - self-promotion self-promotion self-promotion), I was ready, I think, to return to it and edit it again, this time incorporating new character development for otherwise plot-device personalities (a truly awful thing. If someone isn't there by merit of their own presence...the story is too dependent on them. I realise most stories are dependent on the characters living them, but if the only reason to include a woman named Marjory is to bridge the gap between one part of the story and the next...it's too tenuous. What if Marjory suddenly died? What if Marjory hadn't noticed whatever it is she noticed in order to propel the plot forwards? Why am I discussing the merits of Marjory? But seriously: Marjory either has to be a character already present and developed in order for her role as a plot-propulsion device to be acceptable, or the plot needs reworking. Marjory is either a character by her own merit, or she's not. And from a more personal perspective, I know what it's like to be treated as if [and told so] I were a plot device; it's not fun. Don't do that to poor, poor Marjory) and to concern myself more with themes which had been alluded to but which I'd up until that point not really included by intent. The story is now over 38,000 words - and is complete in itself, as I believe I've mentioned before.

However...I do keep wondering whether I should go back and add more stuff in. Should I mention how the antagonist had to survive by eating things it otherwise would have preferred to avoid, in order to make it an even more well-developed character? Should I speak of the moon, indicating clearly by implication and inference how much time is passing? In short, I believe the answer - despite continuing to consider such things - is no, I shouldn't. In the first case it's largely because I don't necessarily want the antagonist to be an open book. In the second, again, it's because I don't want things to be set out so definitively. And also, realistically, the mechanic of time is a passive one - there is no role that the hours play beyond the fact that time advances, as time does, and that doesn't really need more reference than the occasional allusion to the environment. Really I've already arrived at the conclusion that the story is complete in its imperfection - as all stories are, as all designs are. There comes a point at which the process of creation is finished: anything beyond that could further reinforce the object created, but without that reinforcement it is not lacking anything. Sometimes, in fact, the aspects which are not further developed become part of the intentional form or purpose. Basically, the idea of "less is more" is, generally applied, true, and there is no point or benefit gained from having been overwrought.

And that, weirdly, brings me to something else I'd been considering earlier today (and have considered many times before now). I make no secret of being a bit of a gameplay-walkthrough-junkie, as in, I'll spend much time watching YouTube videos of people playing games I (mostly) haven't played. It really is entertaining. But, strangely, there's also that definite sense of being privy to a story: we can all read books or watch movies and experience being an audience, and watching games being played from start to finish has become a roughly similar experience over the past ten years. It wasn't, way-back-when - if you weren't playing the game then you were the unfortunate audience who was entirely lacking agency within the story and therefore had limited investment and limited enjoyment. Nowadays you don't have to exert agency within the world of a game in order to still be invested in its progression from start to finish. That, I think, has a lot to do with the sophistication of storytelling that began to become core to games' reason for existence; you can't just throw someone into a world without a reason or a cause for their presence and expect it to all make sense, though some notable older games did just that and at the time were highly successful. The trouble is, though, that sometimes you notice things that leave you wondering.

For me, such a thing is the dialogue of certain fantasy games these days. The storylines are meant to be dramatic, and to a point I can understand that - for why else would people play, unless caught up in the promise of some epic adventure with grand consequences and dire risks? And, of course, because of the influence of such things as The Lord of the Rings: things have to, somehow, live up to the grandeur of that kind of Truly Epic, particularly when they could be said to be hijacking a lot of the theretofore non-existent standards present in Tolkien's works. Unfortunately - as I've mentioned before - the ideas of so many of the present-day fantasy worlds created as if somehow new or different recycle a lot of the same stuff: elves are Humans Plus, orcs are (evil) Humans Plus, Green Edition. But that's to some extent understandable - if something works, why change it to sit outside expectations? And of course, it's not as if I don't like games that have elves and orcs - I absolutely adore the Elder Scrolls games, though those prior to Morrowind I haven't seen much, if anything, of.

It's the dialogue, really, that bothers me. Dialogue that is supposed to take the listener or reader out of the modern context is one thing, and often dialogue needs to be replete with flourish and redundancy - after all, that's how people communicate, and that's how stories which could be 32,000 words become 38,000 words...haha. But there's a certain amount of cliché which I think becomes too overpowering. If a character says "I will let nothing get in my way", for instance, that's really all that need be said. Yet one game I can think of in particular would have the character say "I will let nothing get in my way, do you hear me?! Nothing!" On its own this isn't so bad. You can tell merely from the words that this particular character would be quite impassioned and set in their stance. Yet because those words are typically all grouped together as a verbal cliché, they lose their impact and become predictable and empty - so much so as to make them formulaic. Anyone who has read Homer's Iliad and Odyssey probably has some understanding that the reason Homer, or the poets collectively called Homer in today's regard, utilised formulae ("wine-dark sea") was because by using such predictable and set word patterns the rhythm of the recited poems (both of these surviving epics were initially recited by bards, only later being written down and cemented as singular versions) could be maintained. In that regard the predictability of the words was sensible because it gave the poet a sense of where they were in their recitation, and allowed proper pacing...but the constant reference to the wine-dark sea likely did nothing to really inform the audience of the sea itself. An epithet is only meaningful if it is meant to retain its meaning, and like any word if it's said too often it becomes a mere collection of syllables. So too do over-used statements in dialogue, even if they're not over-used in the game but in the genre or in the wider culture (storytelling culture, for instance), and over-used formulae in internet fora ("Well, you could do that. Or you could do this. Hell, you could even..." - the latter sentence loses its emphasis, rather than gains it, purely because of how over-used the "hell, [extra option]" structure is).

I don't mean to criticise so much, really. It's just something I've noticed in some media, particularly games. It doesn't mean it's bad; it just means I find the impact of the words lost, and that's a sad thing. I think some cliché is unavoidable, of course: nobody's inventing language from scratch, and there are cultural trends at any given moment that will impact how something is communicated. It's just a great pity when the power of an object is lost because the way it is described uses someone else's words. Dialogue can be wrought without being overwrought, and if the level of work put into it suits the medium...then it fits. It's when that work transitions into over-work that something loses its meaning. Probably, at this stage, a bit like this blog entry.

Friday, February 13, 2015

Unintentional detour, but in a good way (perhaps?)

Perhaps a week ago I passed the 10,000 word mark. I'd really intended to find out by then how I could further customise this blog to make it less bore/snore-inducing when viewed...and so far haven't found an opportunity to really sit down and sort it out. That's not to say, of course, that I haven't been doing stuff - it just means I haven't been doing that.

But what stuff have I been doing? Well, for longer than a week I've been working on a small title banner to sit in place of the current text title up there - and that's been going well enough, in fits and starts. The problem I have is that I have many things I want to do simultaneously and sometimes it's frankly the easier thing just to procrastinate (have I written on this before?) and watch a YouTube video (or several) than engage in creativity when I feel I shouldn't be neglecting the things I'm not doing - because clearly it's better to neglect everything equally than do one thing over another. I've also been up the line to see my sister's three-week-old daughter, and since that's a trip that takes nearly three hours to complete in one direction (not unpleasantly so, though my old car invariably returns home at the end of it with a new shake or rattle to let me know I'm pushing it a wee bit) and once there much of my attention is on the people I'm with, it means I lose most of two days (if I stay overnight. Which I did, because, frankly, 5.5 hours in a car in a single day when you haven't travelled 5.5 hours away from home seems like a bit of a task!).

Not that two days means much in the long-run, of course. It's quite easy just to lose days to nothingness, to hum-drum status quo activities like housework. I notice it's been half-a-month since I last blogged - again, much because of the hum-drum status quo. I shouldn't pretend as if I've done nothing constructive - I really have engaged in different things. Currently I have a design I'm working on for a set of three posters with a typographic focus and an association with the sea; and, as I mentioned, I've been at work on a customised banner for this blog, which I believe I've either finished or almost finished, and for which I'll have to find out the relevant procedure to put it in place.

And also: I saw the final installment of The Hobbit, which was an achievement for me as I didn't see the first in a cinema (and I'm almost glad I didn't. I have to say that as much as I'm inspired and intrigued by many of the story-telling elements shown in the movies, there are a lot of time-wasting aspects, too, as well as interpretations of events that seem somewhat odd: the weird plate-throwing game, for instance, that the dwarves set themselves to in the first movie. Certainly they, from memory, did set to washing the plates in the novel, but did they make a great show of being reckless? I don't remember that. While I understand the need for visual entertainment in a story that otherwise communicates much of its message in text in its original version, there's a certain amount of...silliness, I think, that maybe just didn't need to be in there. Of course we want to see the dwarves as merry, rough and finely skilled in what they do - not dropping a single plate - and certainly Peter Jackson may have wanted the film to appeal to children, perhaps [though oddly so, for while The Hobbit can be argued from its written voice to have been directed towards the literate child or pre-teen when it was first penned, I'm not certain one can present a movie the way these were and still aim to be inclusive of a younger audience], but I found it oddly patronising. This may have been my issue, though...) and I'd almost missed seeing this last one on a big screen as well. At first I thought I'd made a mistake by seeing it in the high-frame-rate version; the second installment I recall as having that quintessential movie flatness, languor, and warm blurriness that allows supreme escapism (I've always felt): the story is presented in an other-than-realistic way, and so you can leave criticism behind and be transported elsewhere, into a realm unlike the one you're currently sitting in. Super high-definition almost, for me, makes something too accessible and too real - there seems no drama or production about the way people move or interact, and instead it brings it all back into the hum-drum. Not quite like housework, but I'm sure there'll be some of you who get what I mean.

Anyway, it turns out it merely took my eyes and mind some adjustment; soon enough it seemed just as inaccessible and just as unreal, which, oddly, is what I feel is needed in movies. A good example of this is a show in New Zealand called Shortland Street. It's bad. It's really, really bad. I don't just criticise it because it's a completely unrealistic hospital drama and I'm a healthcare professional myself; it's truly, truly awful. Part of its awfulness - not an influential part, but a part nonetheless - is how "real" the settings are: how plain, how bland, how reach-out-and-touchable they are. How parochial the characters are, and how outlandish the stories. In comparison to another soap - Home and Away, based in Australia - Shortland Street's stories are perhaps only slightly more outlandish (at least Summer Bay in Home and Away is a community, a town; Shortland Street revolves solely around the hospital and satellite venues like a very small number of people's homes, and its stories are invariably about which of the staff has murdered which of their coworkers, who's sleeping with whom, or some crazed individual seeking to bomb the hospital. I work in a hospital; some of my coworkers are friends, others aren't, and that's generally as far as it all goes. We have lives outside of the hospital and away from each other. None of my coworkers (or me!) has killed another, they're not sleeping with each other willy-nilly, and in the five years I've been there nobody's pulled any stunts to place the hospital - a major tertiary centre - at risk...and yet that stuff happens on a consistent basis in Shortland Street. I suppose it has to if the show is to remain interesting to its audience?), but Home and Away has the blurry, inaccessible quality which transports the audience at least far away enough from their own presumably sharply defined lives to suspend disbelief. Movies have typically had this as well, and I've always found it makes them more appealing: the inaccessibility implied by it makes the world the story is occurring within that much more tantalising. Perhaps it's the same sort of thing that having to turn pages in a book implies, even subconsciously - you may be so very invested in a story, to the point maybe of being able to see it occurring as you read its progression...but then you have to turn the page and find out what happens next. The glass barrier between you and the various undersea creatures at the local aquarium may be a boon to your safety, but doesn't it all look so tremendously inviting all the same? Yet if you were in it, wet and cold and accompanied by the myriad other beings you might be able to see on the other side of the glass...you may not enjoy it so much. It's a perceptive issue, methinks. At any rate, that was all to say...: I enjoyed the high-frame-rate in the end. It's weird how unusual and noticeable it was to begin with, and then how normal and insignificant it was thereafter - and how quickly my perception adjusted.

And here I am, having waxed prosaic about high frame rates and film, when what I had really meant to write about was the feeling I (and no doubt almost every person ever interested in the world created by Tolkien) was left with having been reimmersed in the realm of The Hobbit. I won't now - I'll leave that till next time - and I certainly won't be "reviewing" the movie. I don't consider myself a movie critic, nor really do I see the value in critics when opinions and perceptions are so individual and subjective that a self-professed expert may have an opinion that they will stand by till the dire end but that makes no sense to me (Lady in the Water, for instance: I loved it. It was such a beautiful, magical movie...and yet it was generally panned by people who regard their opinions as being more valid and necessary than those of actual movie-goers. I can't help thinking this was at least in part because it didn't have guns and war, but also because a lot of adults these days have forgotten they're grown-up children, not an entirely new species. Magic may appeal to children, but why should it not appeal to adults? That's why we read books, watch movies, play games, listen to music, is it not? To have stories told and memories and feelings conjured into existence and experience. No wonder there are so many people who play games and get into fantasy stuff. If adult life is all about a desk job that's slowly raising your blood pressure, stifling your creativity and otherwise wringing the neck of your enjoyment of life...maybe it's not really how adult life should be). What I want to do is maybe just touch on all of that stuff in more detail - the escapism, the need for stories, and the importance of allowing those stories to be created. But we'll see. First I need to do some more designing, some more writing, some more reading, and probably more procrastination. YouTube ahoy.