I'd quite like to secure for myself a copy of Moominland Midwinter by Tove Jansson. It was one of my mum's childhood books; both she and the book were published in 1957 and both have withstood the test of time pretty well, I think.
In seriousness, though, and as I think I've mentioned before, the thing about Moominland Midwinter is that it really gave me this sense of wonder as I read it. I suppose it sat well with some kind of not-yet-identified affection for weird and magical things that I'm quite well aware I have these days. Magic and weirdness are amazing things, all in all, and even in my earlier years - when I was hyper-sensitive about my red hair and never had confidence in myself as a person who was (and is) allowed to be as I wished myself to be - I guess magic and weirdness were already fitting pretty well into my worldview.
The problem is that I'm looking for one edition in particular - one with a light purple-blue cover and the image of Too-ticky (one of the book's characters) sitting upon a bridge over a small stream, and Moomintroll standing on the snow nearby looking up at her. That was the edition I first read when I was perhaps 8, a full 36 years after first publication of the title, and I have sentimental attachment to it. The story itself wouldn't have altered at all, I realise. But, you know, sentimentality.
But you know, I reckon it's that fondness for weirdness and magic that drives me to draw and paint and write these days. I have a set of kids' books that I'd love to actually finish illustrating, which focus on magic as a normal part of the world - as in, freeing the mind from the prescriptive and proscriptive ways we're all trained to think and having a sense of wonder. We all tend to take for granted that we know things, and we understand how things are, and that's all there is to them - but that's a very adult way of being, and kids aren't born with that need to categorise and describe and be correct according to observation. Don't get me wrong, I'm not saying that there's anything amiss with being accurate - but striving only for accuracy tends to result in a loss of subjectivity and an inability to enjoy the wonder going on around us.
A good example is the idea of plant neurobiology. For quite a while I've held a fairly alternative perspective that while science is a grand thing and can explain a lot, we shouldn't forget that it should be challenged. Most people would assume that because there are no nerves in plants, plants have no neurobiology to speak of - and sure, if one demands that animal nerves be present for nervous impulses to travel through in order for an organism to qualify as innervated, one might say so. But who says an animal nerve is needed to carry nervous impulses? I've thought on this a fair bit over the past several years and really can't see how a narrow set of criteria is supposed to reflect a broader image of reality. Many scientists (far from all, of course) are challenging the idea that plants have no nervous systems. A key piece of evidence is the alteration in behaviour observed in the roots of germinating seeds when the root cap is removed: when it is intact, the root grows and every so often seems to pause and apply pressure to the substrate to assess how appropriate it is to grow into, whereas without the root cap the root is less organised, growing faster but without seeming direction or drive to find good growing conditions. Does this indicate a nervous system akin to a mammal's? No, I'd say not - but it does indicate that there are alterations in behaviour when the root cap is injured which may be broadly similar to the alterations in behaviour seen in animals whose nervous systems are compromised. In short, a brain and its constituent nervous system aren't necessarily the only form of nerve web an organism can have in order to have nervous sensation. Science can't see animal nerves in plants, for sure - but perhaps it needs to broaden its definition of what counts as a nerve, and how signals might be relayed.
In any case, imagination is a great thing, and I think we become a little too focused on describing how things are or must be at times, and this leads us to lose our ability to describe how things might be. I don't necessarily envy children their ability to present a coherent version of the world that doesn't really possess cohesion (the sum of the parts is greater than all of the separate parts put together is maybe a good way to interpret it), because I think understanding the way things are as well as we can is important...but I do think that taking some time to forget how things are and imagine how things might be through art and through storytelling is important as well. Storybooks are really only regarded as childish because they simplify things and propose nonsensical explanations for the world, but who said that nonsense and simplicity have to be childish? Nonsense and simplicity sound fun. Why promote serious adulthood vs. irrational childhood when we could promote fun personhood instead?
And if anyone knows where I can get a copy of Moominland Midwinter with the cover I fairly generally described, it'd be swell if you could let me know!
Wednesday, September 23, 2015
Confusion; otherwise known as intellectual property rights and copyright law
Recently I've been operating under my designer/artist role more than my aspiring-author one, and for good reason: I've been working with a group of people to get a map designed that provides both the local and tourist populations with a free guide to some pretty cool stuff in Wellington. The city is pretty well-known for its eccentricity in some ways - born in part of its diminutive size and the "you can walk anywhere you want to go" reality that this lends, meaning that everything is readily accessible and people are able to actually interact with each other face-to-face rather than through the windows of their cars. It's got a great creative scene and what's known here as a "café culture"; the city is about people and people interacting with one another, at least in some ways, and the experiences that this interaction enables. The chance to do some design work focusing on the local atmosphere has been quite enjoyable, to say the least.
This focus has sapped a lot of my freer time and energy, though - to the point of having to write a list to tick things off just to make sure I'm getting somewhere with the numerous things I want to do. Have I mentioned that already? Maybe so. One of those things has been the legal registration of intellectual property rights to a design of mine - something I, and many others, might not think to do as the first step in a creative process, but one which must surely be easy enough to complete.
As it turns out...no, this isn't necessarily the case. I found out today from a friend I'd been discussing it with that registration of intellectual property rights over a design in New Zealand is, in fact, only possible if the design has not already been made public. According to the report they were issued recently, even if something is original and its only instance of publication has been by the individual or company seeking to register it as their intellectual property, the fact that it has been published at all negates any ability to register it as such.
It's really quite strange - especially considering that on the great wide internet the only instance of their design ever having been published was on a profile associated with them alone. I suppose I understand it, in that once something is out there in the world it's seen and interpreted by many eyes and minds, and so loses its absolute "newness" - but it doesn't really make any sense, given that they, the designer and publisher of the design, were the one seeking to register the intellectual property rights they already had (and have) over it. Part of me is led to think that paying to register one's intellectual property rights is held in some way to be equal to the publication of an idea under one's name - i.e., once the design is published under one's name the intellectual property rights are asserted.
Whether this is the case, I'm not sure. My friend possesses those intellectual property rights, absolutely, just as I do over my unregistered design. And they possess a report saying that the reason they cannot register those rights is because they've already published the design under their own name - meaning that a search was conducted and the design was found to exist under their name already. The report might be seen thereby to reinforce those rights. They just can't register those rights under New Zealand law. It all seems a little weird.
But this leads me towards another conundrum regarding legal possession of original information: the New Zealand stance on copyright.
The crux of the situation is this: if you possess the manuscript of a written work, then you hold the copyright in that work. There's no need to register it or complete a formal process of declaring your possession of copyright, because possessing the physical or electronic manuscript is enough.
But...is it? For instance, if you send a piece of work to someone for review, do they then possess copyright to it as well? Or does it come down to keeping track of all letters, emails and other forms of communication which establish who sent what when? It all seems a bit easily subverted, really. And then of course one must consider that if one does not wish to put stock in such a process lending any legal support to one's actual claim of possession of copyright, in New Zealand one doesn't actually have any other option by way of legal process: there is no way to register copyright in New Zealand to any written work. As stated by the Copyright Council of New Zealand,
"Copyright comes into existence automatically under the Copyright Act 1994, when a work is put into material form e.g. manuscript, audio/video recording. No registration is necessary (or even possible), nor is any other formality required for securing copyright protection."
As I'm sure anyone out there who has written anything ever might wonder: how does non-registration establish possession of copyright, exactly? The simple explanation, as far as my worry dictates, is that it doesn't...maybe. Actually, this is what leads me to think publication of a design under one's name asserts intellectual property rights: if you can demonstrate you did it first, then you did it first, at least according to New Zealand law. Whether it actually does is a worry for the silly people out there (like me) who want to be able to create things and not have someone swoop in and try to take those creations away.
In the UK things are done one step better: a commonly used way to evidence copyright is to take the completed manuscript of something, mail it to oneself and then never open the parcel. The date printed by a simple ink stamp at whatever mail hub processed your parcel that day is enough to establish that you are the originator of the contents of the package and when you completed its creation. That's what I've done with my novella manuscript. Someone I know overseas possesses a physical copy of an early draft I sent them back in 2012 (I believe), but I have the email trail saved to establish origin. The next step for me is registration in the US - where there is a way to register copyright.
It really does make one think, though. What are the laws that protect intellectual property rights and copyright doing if they preclude or make impossible any registration? The mind boggles.
Thursday, August 27, 2015
Gatekeepers, part 2: yes vs. no
My last post was timed perfectly to coincide with the recent news that Authonomy.com - a site owned and operated by HarperCollins as a means to encourage community between aspiring authors (and to give the publisher exposure to those authors without the need to rely so heavily on literary agents, which more and more people are finding to be an impasse these days due to the "gatekeeper" phenomenon I mentioned last entry) - is to close its metaphorical doors on the 30th of September this year.
It's a pity; I joined Authonomy many months ago, but hadn't found myself in a position to submit anything up till now (I'd like to blame a lack of time and energy for this, as trying to gain a foothold in graphic design in a place like New Zealand, where the population and therefore the opportunities are rather limited [those gatekeepers again...], plus work as a neonatal intensive care nurse, are both time-consuming and require a lot of physical and mental investment...but it's at least as much to do with my fear that my work just isn't good enough, and that hurdle is a far greater one to clear), and now I've missed the boat. As a means of access to opportunities for both HarperCollins and aspiring authors, Authonomy is an excellent idea - a community site actively run and engaged with by a publishing company in an attempt to overcome the roadblocks and naysayers that would otherwise keep good creative minds and skills shut away from the world seems perfect.
The trouble is, though, that as HarperCollins stated in their blog, over time community engagement has waned, and the number of new titles that began as uploads to the Authonomy site and have become real-life books has decreased along with it. HarperCollins clearly isn't finding that the community fosters the opportunities it desires, and as any well-designed and well-helmed business knows, if needs aren't being met by an aspect of the business model but resources are still being invested into it, then it really is throwing good money after bad to keep it going. I'd like to assign human emotions to the corporate entity that is HarperCollins and imagine it actually doesn't like the idea of cutting Authonomy off - I truly believe it doesn't. But alas, even the most sentimental company has to be somewhat willing to dock useless appendages before they turn septic and start slowly poisoning everything else.
So where are the authors going? Or where are they not appearing from? Where's the new work? There's no way people just aren't writing, not telling stories, not putting fingers on keyboards and pens on paper. Stories are definitely being told. I can't help wondering whether it might just be that the sheer amount of energy that seeking acceptance from a publishing company takes, as well as the overwhelming reality that most people will not be published by a reputable publishing house regardless of how well-written their story is (for reasons like demographic appeal, for instance; remember the Harry Potter reference I made last time? If the story is true, it literally took a child saying they wanted it published for the book to be published, but prior to that it was assessed as not being appropriate for the target market. Unbelievable, right? I think this is the fourth or fifth reference I've made to the gatekeeper phenomenon in this entry so far), is working against the awesome efforts HarperCollins went to in establishing a community-focused method of finding new material and new content creators - and is certainly working against those publishing companies who deem themselves more exclusive and don't have such initiatives on the go. Put it this way: sure, receiving a "thanks, but no thanks" in isolation isn't necessarily too hard to deal with, but in the face of rejection after rejection after rejection, not seeking that rejection (because that's what it starts boiling down to) is the far more sensible option.
And you know what doesn't offer only the discouragement of countless rejection letters? Self-publishing. There are drawbacks, as I touched upon last entry. There are the attitudes many people have regarding it, there are the limitations of exposure, there are the risks that a story that could be great if just given the right critique from industry experts will be sent out into the world in merely good form instead, and there are the tribulations of design (which is where someone such as myself would come in, just as a suggestion...), among many other things to consider when making the decision whether to self-publish or not - but in the end, the idea of feeling as though one has made a contribution to collective culture and the idea that one is at least marginally successful creatively are far more positive for a great many people. It's totally understandable.
For a moment there I was thinking to myself "yes, but if everyone self-publishes, then the checks and balances won't be in place. Horrific grammar and bad writing will proliferate. Surely society will collapse!" - and then I remembered that people communicate horrifically all the time, and bad writing already proliferates even with the current system in situ. And who's to say that writing a story and having it published need be a mark of brilliance? Nobody takes the attitude that someone who can paint a mediocre picture is somehow marring art; the entire point is that sometimes art is appealing, and sometimes it isn't, and whether it is or isn't depends on who views it and what they both give to and take from it. Maybe writing is the same, or it could be.
At any rate, the main point here is that I'm disappointed that Authonomy is shutting down, and I hadn't predicted it - but I'm not surprised about it. It makes sense. Being the determining factor in one's own success is of huge appeal to people, and the undeniable truth is that by engaging with someone (or something, like a publishing house) who is 99.99% certain to tell you that what you're doing or have done is not good enough, success is turned from attainable to unachievable. Success is a totally subjective thing, of course - one person's success may be selling 10,000,000 copies of their work, while another person's success may be just seeing a bound copy of the work they poured their heart and soul into while working crappy hours with high stress just to make ends meet. Who doesn't want success in life? Who wants to be told "no" time and again? Nobody. Kids hate being told no, and as it turns out so do adults, even if they understand it far better than they would have growing up. And rather than putting themselves out there to be told no - even if they might actually be told yes - they seek yes.
It's sad that the yes that Authonomy might have led to for a lot more people is being turned into a "this site no longer exists". But that's what happens when people don't want to read another no. I'm admittedly assuming Authonomy is a casualty of the leap many people have taken into self-publishing, and that may not be the entirety of the story - but it's a definite factor. If you seek yes and have a way to find it, that's the way you're likely to go, even if your yes is a lot smaller than that of a publishing company. A little yes is a great deal more success than none at all, isn't it?
Tuesday, August 18, 2015
Gatekeepers: the trouble with publishing (and self-publishing)
I may have mentioned it before (and also I may not have...), but recently I designed the cover for the book of a friend of mine, The Good Slave. It's going to be self-published, which I realise many people remain critical of because for some reason the business model of publishing, which is irrevocably bound up in the business of money-making, is taken to be bound up in the business of publishing good literature.
Not that it isn't, in some way - as in, certainly one could equate a book being published by a company that wishes to make money and therefore will only invest in quality products with a book being published because it is quality; and conversely, there are many self-published books that really might have benefited (greatly or otherwise) from being critiqued a bit more strongly and perhaps more objectively before they were published.
However...there is a certain amount of error in assuming that having money thrown at a product is the same as that product being certified as quality. We've all read books that serve a purpose and perhaps are over-filled with clichés which frankly wouldn't have made it through editing if the endeavour weren't part of a greater promotional model. I've read a couple of books associated with a certain franchise of games that has itself gained notoriety and a reputation for producing things of quality, give or take the different definitions of "quality" used for each different game; the books themselves are entertaining enough, but they suffer the same predictability of phraseology that riddles so many so-called epic storytelling genres today. It detracts, at least for me, from the story itself - one can't roll one's eyes and read simultaneously, right? Not that I actually do roll my eyes, and I hope I don't sound like I'm being too negative - it's more that it's something I notice, and because I notice it I'm taken out of the story.
There's a lot of wiggle-room, of course - if a book is set in a certain time or is supposed to conjure up certain imagery, then use of language to paint that imagery and be indicative of the time makes total sense. It's when it comes down to a stilted sense of drama that it becomes an issue. It's when the word patterns become clichés in themselves - something which I've spoken about before. Resorting to the same old, same old by way of how someone speaks or how a secret is revealed isn't something that stories should do.
Ultimately I think that's part of the reason people shouldn't just assume a book is of lower quality because it's self-published, and it's certainly a great part of the reason people shouldn't just assume that a book that has been professionally published is good.
I'll absolutely offer the disclaimer that one person's good is another person's bad, and vice-versa. I'd never want to pretend that my opinion is more important or more justified than anyone else's - why would I? It's pointless trying to tell someone who literally perceives something in an alternate way to me that their perception is one of error - because, for one thing, they could say the same of my perception, and for another, diversity of experience and perception is something that I don't feel is celebrated enough. But then...that's the point really: the old view that being published by a company of repute is what marks a book as a thing of quality may mean some people won't regard a self-published piece of work as comparable...but there are plenty of people out there who aren't so hung up on the status symbol of a publisher's logo on the spine or on the publishing information page.
Don't mistake me: I'd love to be published by a reputable publishing house, as it would mean the greatest possible exposure of my work to eyes and minds that might want to read it. Who wouldn't want a wide net cast on their behalf? The troubles with this are several, though, including that it's about as easy to win the lottery as it is to have something you've possibly laboured over for years deemed worthy enough of publishing. As I said, a publisher is only partially motivated by promotion of the literary arts in engaging in actual publishing; there's a large amount of economic toing-and-froing that must go on, and in the end if there's any doubt that a book will be a good investment, the publisher just won't invest. Everybody's heard of the Harry Potter books, and almost as many people have heard of the struggle J.K. Rowling went through before a publisher's daughter requested of her father that he publish the first instalment of the series. He wasn't going to, otherwise, because he didn't view it as a wise investment - and neither did the however many publishers (many, as my understanding goes) the manuscript had been sent to prior to that. Imagine if the series had had to be self-published. Would it have had the same success? No, probably not - but not because of the quality of the work; because of exposure, or lack thereof.
Beyond this concern, though, is the fact that once one entrusts a work of one's own creation to a faceless publishing house (for unless they're small-time they are indeed faceless, as any multi-national corporation becomes regardless of how much it might want to remain seen as a caring, personable entity that wants your money only because it costs money to love you so much), one actually loses creative control of the work. The words can't be changed (though changes can be requested), but the visual representation of the work in question becomes the task of a graphic design and marketing team, and is only partly, if at all, the business of the author. I'm all for giving graphic designers work (I've studied it myself and remain in the process of trying to break into the industry) - but the truth is an author's idea of how they'd like their story visually represented may be taken into consideration...or it may not. Ultimately the marketing team are going to be able to strong-arm most authors (or at least, new authors) into following their lead because, after all, they know best what consumers respond to.
I know what it's like - as a designer with freelance jobs I invest in the projects I take on in order to work in partnership with clients, because often clients have ideas but don't know the different aspects of design they must consider, or whether their design idea is the best way to achieve what they want. But that doesn't mean it isn't a partnership - the designer is hired to design something, but good design isn't just the designer sitting down and drawing something out and then telling the client the design is finished when they've decided it is. Good design involves how the client sees their product, or themselves, or whatever it is the design is being created for, as well as the designer's skills, including their advice and their creative direction. Design isn't just about doing what the client wants without investment on the designer's part, and it definitely isn't about the designer telling the client what the client wants. When I designed my friend's book cover there was a lot of back-and-forth messaging - I'd feel I'd cracked it, but he wasn't quite as happy...and so I'd make another alteration. Long-distance, this sort of fine-tuning can present hurdles, but they're cleared easily enough if communication remains free and open - and it did. I might have felt I'd arrived at a good outcome, but because he wasn't quite so pleased, there was more work to be done. (A lot of it had to do with balancing brightness and tones in the art - the intent for now is a digital piece ready for an e-book reader, and the brightness of my laptop screen wasn't identical to the standard brightness the cover needed to be tailored for. The same thing happens in print design: you have something super bright on your screen, and when you print it out it's dull and dark, purely because you're not dealing with the same parameters in one format as you are in another.) In the end my friend, who was in this case my client, was the one who decided whether the cover was finished or not - because it's his story, and his vision. And I was able to deliver something he was extremely pleased with, and that made me feel I'd done a great job.
The question is, though, one of who gets to claim the title of client in the publishing sphere. Is it the author, who wants to be published and who has likely been turned down by other publishing houses before? Or is it the publisher, who gets to accept or dismiss manuscripts for publication? Power relationships, in which one party is in a far less powerful position in comparison to the other, are a real phenomenon even when it comes to writing and invention: after all, you have to be able to argue why someone should part with their money in order to get that money, and if you push it too far (say by refusing to acquiesce to your publisher's desires regarding the images on the cover of your book), you might end up being told to sling your hook. It's your product, but the fact is you're just another aspiring author and the publishing companies have their choice of investment. They hold the power.
As a friend of mine says, publishing companies (and companies like them) are gatekeepers: they get to decide who to let through and who to bar from entry, and they don't have to actually explain their reasoning. It's a useful comparison, and a very apt one. It's also depressing, if one stops to think about it, that the world loves creativity and yet so much artistic ability gets ignored or dismissed by those who get to say yea or nay based on their perception of market tastes. That's a whole lot of stifled creativity.
Tuesday, June 2, 2015
Word up
I can't help it. I focus on detail, and have always been able to spell well, and that causes me to really pick out the errors in work I've read. It really grates on my sensibilities to see inappropriate word-use, grammar, syntax, spelling, punctuation...
Oh, and the rampant use of cardinal numbers when ordinals should be chosen is literally ire-inducing to me. It seems more common these days than it used to, as far as my memory goes - but then I'm only 29, so my personal purview may not, shockingly enough, encompass the entirety of modern written English. I see it a lot, though - and hear it. Air New Zealand has YouTube ads which use the cardinal (1, 2, 3, 4...) instead of the ordinal (1st, 2nd, 3rd, 4th...) when describing the expiry dates of current flight deals - and while the narrator for such ads is clear enough in his intent, I don't believe being clear is an acceptable approximation of being correct.
And it's that which irks me so: certainly I can understand someone when they say "which I don't even know if that's a good idea"; they clearly think something may not be a wise decision. However...what sticks with me is the inappropriate appearance of "which". It doesn't make sense. For it to do so the rest of the sentence would require modification: "which I don't know is a good idea".
It boils down to pronoun redundancy: if you use one pronoun ("which"), you do not then need another ("that"). You could use both if one functions as a pronoun and the other as a determiner (which both can be): "Which is the one that you like?" is a clunky, rough-edged but grammatically sound sentence. "Which is that you like?", or "which that you like" (as a statement, not a question), are perhaps more archaic versions of saying the same thing - the odd phenomenon of words losing their broader communicative ability in more modern times, requiring the addition of more words to say the same thing. But that's an aside - one I may come back to, because it really is interesting. Back to the topic at hand, though: "which that you like" can also be said "that which you like" - a seeming reversal in word order that means the same thing, but actually a nice little trade-in: the sentences both begin with pronouns and have determiners as their second words. Because which and that are both pronouns and determiners, and can be used in combination provided one acts in one function and the other takes on the empty role, either can occupy first or second position. The actual word order itself isn't reversed - just the choice of words changes. It's quite a nice example of how things aren't so cut and dried on the surface but actually still are if one cares to think about them.
(And I know, few people bother. There are other, arguably better things to do with one's time than argue about how to speak.)
Of course, I realise that spoken, unrehearsed language is subject to errors and is defensible in its inclusion thereof - it's not proofread and approved before going live. It still irks me, but I'm more able to understand mistakes made in the moment - unless they're derogations relating to groups of people. I'll point out the error with that in a forthright way, obviously with the understanding that because pop-culture has made it "acceptable" to say "don't be a girl", "men are stupid", "that's gay" and so on, people don't always think that what they're saying has impact on others. Such understanding of the trend doesn't mean I think any of these things are okay to say, though - I just don't assume someone meant to be offensive, but rather was uninformed. If they then choose to be a total jerk about it and not modify their choice of words thereafter, then I'm left with no assumptions but the definite knowledge that they are indeed meaning to be offensive and seem to think that, while they make a definite choice as to which words they use, they bear no responsibility for that choice. But here's the key point to life in general: a person may have certain freedoms, such as the freedom of speech - and that's great. Freedom of speech doesn't mean freedom from responsibility for what is said, however, and it doesn't mean freedom from someone else exercising their freedom of speech by verbally tearing you to pieces for being insensitive.
That's a really good point for writers to remember, though: writers must know when to make it clear that their character thinks a certain way, rather than that they as the author do. As said above, while spoken words are not proofread (but should be chosen wisely anyway), written words most certainly are, or should be. It's easy to make a mistake by using the incorrect voice; if it's not clear you're describing something from the perspective of a character, you as the author risk describing it in the voice of the narrator - which is you.
I'm a type 1 diabetic, and have been since I was 11. I'm consistently amazed that people think diabetes in any form is caused by "eating too much sugar" - roughly akin to victim-blaming, for example: "if diabetics hadn't eaten so unhealthily they wouldn't have diabetes now, would they?" As a nurse I can unequivocally say that no, it isn't such a cause-and-effect situation. To say diabetes is caused by eating too much sugar ignores, for one thing, the sheer variety of diabetic conditions (type 1 is not type 2, gestational diabetes is not diabetes insipidus. Diabetes actually means "passer through", and medically communicates nothing more than the idea that large volumes of urine are being excreted); it also ignores the fact that our society/societies are not health-oriented, and don't encourage balanced lifestyles; it also ignores the fact that stress hormones decrease the efficacy of insulin, and stress also tends to be "coped with" by engaging in activities which further increase risk of ill-health: smoking, food-cravings, drinking, etc., etc.
So it'd be infuriating for me to read a book whose narrative voice described the eating habits of someone as those of a person heading for diabetic status. Luckily I never have read the words - it's just an example. Yet I'd be completely fine with a character who was otherwise presented as having little health-related knowledge saying such a thing - because the character is demonstrably ill-advised and ignorant. Neither being ill-advised nor being ignorant is a good thing, of course - but nobody has all information and knowledge. Even with the internet so readily available, knowledge is only accessible if it's meaningful and is sought out. An everyperson (which is a bit of a dismissive term for those of us who are not average in every way possible) may not know what diabetes actually is - and so statements of ignorance are permissible because they demonstrate no truth other than that of that person's ignorance. The trick is to make that a defining feature of the character. A nurse or doctor should not be so characterised, but a politician, a stay-at-home parent, an artist or (ironically) a writer could be - because knowledge of diabetes would not be relevant to them. In any event, the author of the story is not permitted to be so uneducated, if they choose to write such things.
It's a matter of details. I might be focused on them, as I said earlier, but I would like to think that anyone trying to communicate in the wider world would be, too. It's just sensible to think that if one is going to write (or speak), one has some knowledge of what to write (or say), and how to phrase it.
I live in hope!
Oh, and the rampant use of cardinal numbers when ordinals should be chosen is literally ire-inducing to me. It seems more common these days than it used to, as far as my memory goes - but then I'm only 29, so my personal purview may not, shockingly enough, encompass the entirety of modern written English. I see it a lot, though - and hear it. Air New Zealand has YouTube adds which utilise the cardinal (1, 2, 3, 4...) instead of the ordinal (1st, 2nd, 3rd, 4th...) when describing dates of expiry of current flight deals - and while the narrator for such ads is clear enough in his intent, I don't believe being clear is an acceptable approximation for being correct.
And it's that which irks me so: certainly I can understand someone when they say "which I don't even know if that's a good idea"; they clearly think something may not be a wise decision. However...what sticks with me is the inappropriate appearance of "which". It doesn't make sense. For it to do so, the rest of the sentence would require modification: "which I don't know is a good idea".
It boils down to pronoun redundancy: if you use one pronoun ("which"), you do not then need another ("that"). You could use both if one functions as a pronoun and the other a determiner (which both are): "Which is the one that you like?" is a clunky, rough-edged but grammatically sound sentence. "Which is that you like?", or "which that you like" (as a statement, not a question) are perhaps more archaic versions of saying the same thing - the odd phenomenon of words losing their broader communicative ability in more modern times, requiring the addition of more words to say the same thing. But that's an aside - one I may come back to, because it really is interesting. Back to the topic at hand, though: "which that you like" can also be said "that which you like" - a seeming reversal in word order that means the same thing, but actually a nice little trade-in: the sentences both begin with pronouns and have determiners as their second words. Because which and that are both pronouns and determiners, and can be used in combination provided one acts in one function and the other takes on the remaining role, either can occupy first or second position. The actual word order itself isn't reversed - just the choice of words used changes. It's quite a nice example of how things aren't so cut-and-dried on the surface but actually still are if one cares to think about them.
(And I know, few people bother. There are other, arguably better things to do with one's time than argue about how to speak.)
Of course, I realise that spoken, unrehearsed language is subject to errors and is defensible in its inclusion thereof - it's not proofread and approved before going live. It still irks me, but I'm more able to understand mistakes made in the moment - unless they're derogations relating to groups of people. I'll point out the error with that in a forthright way, obviously with the understanding that because pop-culture has made it "acceptable" to say "don't be a girl", "men are stupid", "that's gay" and so on, people don't always think that what they're saying has impact on others. Such understanding of the trend doesn't mean I think any of these things are okay to say, though - I just don't assume someone meant to be offensive, but rather was uninformed. If they then choose to be a total jerk about it and not modify their choice of words thereafter, then I'm left with no assumptions but the definite knowledge that they are indeed meaning to be offensive and seem to think that, while they make a definite choice as to which words they use, they bear no responsibility for that choice. But here's the key point to life in general: a person may have certain freedoms, such as the freedom of speech - and that's great. Freedom of speech doesn't mean freedom from responsibility for what is said, however, and it doesn't mean freedom from someone else exercising their freedom of speech by verbally tearing you to pieces for being insensitive.
That's a really good point for writers to remember, though: writers must know when to make it clear that their character thinks a certain way, rather than that they as the author do. As said above, while spoken words are not proofread (but should be chosen wisely anyway), written words most certainly are, or should be. It's easy to make a mistake by using the incorrect voice; if it's not clear you're describing something from the perspective of a character, you as the author risk describing it in the voice of the narrator - which is you.
I'm a type 1 diabetic, and have been since I was 11. I'm consistently amazed that people think diabetes in any form is caused by "eating too much sugar" - a notion roughly akin to victim-blaming: "if diabetics hadn't eaten so unhealthily they wouldn't have diabetes now, would they?" As a nurse I can unequivocally say that no, it isn't such a cause-and-effect situation. To say diabetes is caused by eating too much sugar ignores, for one thing, the sheer variety of diabetic conditions (type 1 is not type 2, and gestational diabetes is not diabetes insipidus; "diabetes" actually means "passer through", and medically communicates nothing more than the idea that large volumes of urine are being excreted); it also ignores the fact that our societies are not health-oriented and don't encourage balanced lifestyles; and it ignores the fact that stress hormones decrease the efficacy of insulin, and that stress tends to be "coped with" by engaging in activities which further increase the risk of ill-health: smoking, food-cravings, drinking, etc., etc.
So it'd be infuriating for me to read a book whose narrative voice described the eating habits of someone as those of a person heading for diabetic status. Luckily I never have read such words - it's just an example. Yet I'd be completely fine with a character who was otherwise presented as having little health-related knowledge saying such a thing - because the character is demonstrably ill-advised and ignorant. Neither being ill-advised nor being ignorant is a good thing, of course - but nobody has all information and knowledge. Even with the internet so readily available, knowledge is only accessible if it's meaningful and is sought out. An everyperson (which is a bit of a dismissive term to those of us who are not average in every way possible) may not know what diabetes actually is - and so statements of ignorance are permissible because they demonstrate no truth other than that of that person's ignorance. The trick is to make that a defining feature of the character. A nurse or doctor should not be so characterised, but a politician, a stay-at-home parent, an artist or (ironically) a writer could be - because knowledge of diabetes would not be relevant to them. In any event, the author of the story is not permitted to be so uneducated, if they choose to write such things.
It's a matter of details. I might be focused on them, as I said earlier, but I would like to think that anyone trying to communicate in the wider world would be, too. It's just sensible to think that if one is going to write (or speak), one has some knowledge of what to write (or say), and how to phrase it.
I live in hope!
Thursday, May 7, 2015
Some considerations for self-publishing
A couple of months ago I attended a talk by a local meet-up co-ordinator on self-publishing. It was an interesting semi-seminar; it didn't really tell me anything new, but it was still worthwhile to hear about this person's experiences with self-publishing.
I have mixed feelings about the process and the practice, I have to admit. The positives would include that I would retain creative control of all aspects of publication: if I chose a specific word with a specific spelling (or a specific mis-spelling) then I wouldn't have an editor making changes based on assumptions; I could choose the typeface used (an incredibly important part of story-telling); and I could decide on format, cover design, etc., etc. The negatives are that, depending on who you speak or listen to, self-published books are deemed "lesser" by many because they have achieved publication without jumping the numerous hurdles put in place by publishing houses which frankly don't want to spend money on publishing books; and a self-published book is far more limited in promotional scope, because the author, as the publisher, won't generally have the same access to the same channels as, say, Harper Collins.
I already know how I'll style my novella, when I finally get my act together about it: I've designed the cover and the custom lettering for the title, and I've formatted the text in both typeface and layout, eliminating such grievous features as orphans and widows. It's basically 100 pages of story, ready to go...unless I rewrite sections. And that could absolutely happen. I don't want it to - I feel it's completed. But it could still happen. It's difficult, though: I know exactly how I want it to be crafted, but if it were deemed worthy of publication by a publishing house, I'd likely lose that creative control. I'd have to choose: exposure versus creative say-so.
I read a blog recently that espoused the belief that if you're an author, no matter how good you think you are, you should never design a book cover for yourself. A respondent said they'd trained in design and were confident in their own abilities, and the blogger pretty much told them that it didn't matter: they weren't trained in book cover design, and therefore shouldn't design their own cover. It's an interesting perspective; the blogger - who has self-published two novels which have done reasonably well, much to their credit - seemed somehow certain that designers weren't appropriately trained in book cover design and therefore should leave this area of design well enough alone. And sure, such a perspective is valid if the designer in question is a clothing designer. Or an industrial designer, perhaps. But the comment-making designer led with information which strongly implied they were trained in graphic design. There was no reason to presume otherwise, and I wondered at the sense in the blogger saying "oh no, don't. Hire someone else", as if book cover design were a separate skill set altogether. It's not, in case you were uncertain!
As it turned out, the blogger then went on to speak highly of using sites which basically offer designers the chance to use their expertise to design for clients...who don't want to pay a whole lot for the work the designer will do. There are a few sites like that around: they ask designers who are often actually struggling to build a design portfolio or who are freelancing and don't have work to pay the rent to sign up, and then let people in need of design select the lowest amount to pay for the designers' skills and time. It's like saying to a teacher or a nurse "Okay. You want to be able to eat this week? I'll give you an amount I deem your skills are worth and you give me the best design. I might not select your design in the end, as I'll have others competing with you, too, and if so you'll have worked for nothing - all because I don't really want to go to a design agency and have to pay for the hours devoted to this project and engage professionals in their abilities and skills".
Does it sound as if I have an issue with this? I sure do. As someone who has worked for over a decade, I wouldn't ever wish to request that anyone, trained or otherwise, work for me for free. It's unfair. Yet sites like these (and logo design competitions, as another example) basically encourage the diminution of skills that the person needing a designer doesn't have. It strikes me as a bit similar to the so-called zero-hour contracts so many people have to deal with: you're hired to work but given no guarantee of hours, so if you're offered no hours there's no breach of contract on your employer's behalf. However, if you can't work a set of hours you might have had only a day's notice of, you can be fired because you're not meeting their expectations. It's a very underhanded way to commit someone to a job you can't be bothered giving them any security in.
So no, I'm not a fan of sites that basically engage in crowd-sourcing and inspired competition. The practice diminishes the hard work, time and skills a designer puts into even having the ability to provide a service. It's a system of undervalue. And it really irks me.
You can possibly imagine how unenthused I was when the speaker at the self-publishing semi-seminar promoted the use of such sites. And of course I understand that often a self-publishing author won't have much money to spare in pursuit of their dream of getting their work out there. Except...well, you have to commit money to such pursuit. You can't take shortcuts. Or, you know, you shouldn't. If you have the skills yourself, then use them (you could of course hire someone. There'd be nothing wrong with that) - but if you don't have the skills you shouldn't be offering some paltry sum that you think is appropriate. You're not paying for a pre-finished product whose creation costs have been factored into the price you pay; and you don't know the extent of the work, the time, or the value of any other resources the designer may need to utilise. In reality, you pay for the resources, the time, and the expertise as well as the final product.
It's funny how you learn about such perspectives. People are more concerned with how they can benefit than they are with how those providing the services they need might benefit. But also, people aren't concerned with making their work appear the best it can: I remember the speaker at this meet-up saying "fully justify your text", which is all well and good if you just happen to have a multiple-tens-of-thousands-of-words-long piece of text with just the right character-wise rhythm to have all lines optimally filled with whole words. In the far more likely case that a story doesn't lend itself so graciously to such a format, hyphenation is the answer - breaking a word preferably after at least its first three characters and before its last four. An author may have to slightly alter words in order to eliminate large gaps between words so that spaces are consistent line-to-line. And on top of the typographic concerns, if stylistic choice dictates the use of certain glyphs to denote the end of a section or chapter, they should be placed at 300dpi - or the image becomes pixelated and loses its definition.
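(As an aside for the technically-minded: that hyphenation rule of thumb is simple enough to automate. Here's a toy sketch in Python - purely illustrative, and checking positions only, since a real typesetting engine would also consult a hyphenation dictionary for syllable boundaries:

    def valid_breaks(word, min_head=3, min_tail=4):
        # Positions at which the word may take a hyphen, keeping at least
        # min_head characters before the break and min_tail after it.
        return list(range(min_head, len(word) - min_tail + 1))

    # "wonderful" (9 letters) may break after its 3rd, 4th or 5th character:
    for i in valid_breaks("wonderful"):
        print("wonderful"[:i] + "-" + "wonderful"[i:])

The syllables, of course, are the part no three-line function will give you.)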
I hadn't actually intended this to be such a rant. Well, I suppose I had, really, actually, because such things really do annoy me; but I suppose the benefit to this rant is that it's a bit like a "what not to do" advisory. I hadn't wanted to engage in advisories in this blog, so much - so it's a non-advisory advisory, I guess. I hope at the very least it's something to think about - because realistically this sort of stuff should be thought about, whether for the writer's benefit or for that of the people employed to help craft the writer's vision. It shouldn't be about cutting corners or undervaluing. It should be about getting something out into the world with the care and attention it deserves - and that means recognising that publication is far more than just a casual affair, and that the hardest, most important part isn't simply over once the writing is done. As it turns out, what gets written isn't the whole story at all.
Thursday, April 30, 2015
A reading and an entry
I did a short reading the other evening. Quite short. And to a very small audience - just two others at a local writers' meet-up which, due to the weather being terrible this time of year, only had a total population of five this month - but worth mentioning nonetheless.
It was a good experience in general, really: I got generally good reviews, and even the "critical" aspect of it wasn't particularly critical, taking the form of nothing more than "you've set the scene with the voice, which perhaps could be pared back just slightly, but what you've written is really effective". That's a paraphrase, of course - but specifically the words "pared", "back" and "slightly" did occur in association with each other.
And I'll be the first to admit it: the manner in which I tell stories is a little bit off-kilter with normal, familiar everyday vernacular. I don't have any issue with that; while I wouldn't set about writing a story set in the Dark Ages in language specific to the Dark Ages verbatim, I would want the language I used to fall within the perceptive bounds of the setting. The mode by which people communicate has as much impact on a story as what is being said, albeit in a different way; I'm certainly not suggesting a story can be told merely on the back of how someone speaks, rather than what they say. But it makes total sense: you wouldn't pick up The Odyssey and expect phrases like "How's it going?" to be all too common. Linguistic elements passively shape the perception the reader has of the world they're reading about. I'm sure I've touched in some way on this before, so I won't bother rerunning that race. But I'm open to that kind of critique; I'd rather know how people find reading it than pretend I don't need to know.
It was the first reading I'd ever done, I have to admit, so it was a good learning experience. I was aware the whole time of the speed at which I was speaking, making sure not to fall into the trap - one I know a lot of people find themselves in - of going too fast. I've used Audioboo before to do a non-live reading of a poem-story I wrote (and which I'd like to do something with in the future), and I found myself struggling to speak slowly and breathe regularly due to the pressure of having a "perfect" recording. I didn't have that pressure when I did the live reading, oddly enough; I suppose the reality of the situation is that you can always stop and gather yourself while reading aloud in person, but if all you're leaving is a recording...you don't so much have that "hold on a moment" ability.
Anyway, I left the meet-up feeling really proud - the feedback was altogether very positive. The world in the story was correctly judged to be one of foreboding, and of a lurking danger, and of dread - and that really is a key psychological setting of the story. That I had been able to communicate that in the scene I read - or rather, that it had been espied - was really gratifying, particularly since the motive when writing the scene was not specifically to underscore those feelings at all, but to give a voice to a character to whom reference had been made but of whom no real exploration had been done. I feel really glad that the scene has proven itself well-situated enough for there to be congruence between its greater context and the message the scene itself in isolation communicates.
In an unrelated update, except regarding the shared medium (writing) and the notion of paring something down, I entered a severely shortened version of a story about the sea I'm likely to be perpetually writing the longer version of for that "short short story" competition I mentioned a couple of posts ago. I managed to get something several thousand words in length down to 300 (the absolute limit for any valid entry) and then got rid of one word for a grand total of 299. How much of a story can you tell in 300 words? Not much. Well, no, you can tell the entire story, but you can't tell much of it. In any case I'll wait to see whether I'll even generate attention. I have no idea of the calibre of other entries. I'd like to think mine's up there but it really might not be at all!
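(Policing the cap, at least, is the easy part - a couple of lines of Python will do it, assuming the judges count words the way whitespace does; the filename here is just an invented example:

    # Count the whitespace-separated words in a competition entry.
    with open("sea_story_short.txt") as f:
        count = len(f.read().split())
    print(count, "words -", "within the limit" if count <= 300 else "over the limit")

Getting from several thousand of those words down to 299 is the part no script will do for you.)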
So, yes - two things writing related. It's good fun.
Tuesday, April 14, 2015
Lore in absentia
Something that I've learnt is that lore, whatever its form, has to be tangible. I say that because I think at some stage every storyteller has a bit of a moment wherein they think "well, this is how this story is because this is how I say it is"...and while that's true on the most basic and general level (a story is a story because the teller is telling it, and whatever happens does so because the teller says it does), it doesn't ring true when the reader suspends disbelief and lets the world being spoken of become a temporary reality.
We've all (I say, applying broad supposition to my actual and potential audience) seen movies, and we've all watched tv shows - dramas, comedies, mysteries, horrors, etc., etc. And books, too - I'd say most of us have read at least one book, start to finish. And the thing is, unless we're either gifted with the most powerful of imaginations or cursed with no imagination at all, what happens within the story of a movie, a tv show, a book, or anything else has to make sense. There has to be a progression from A to B that can be demonstrated on some level that doesn't require too much effort to make plausible. The underlying lore of the tale, or the event, or even the characters must be capable of reduction to its most basic form and still make sense. There are notable exceptions, but these exceptions are the ones that lead the viewers or readers (or players) to search for more and to ultimately find it, not search and be given holes in plots and histories that just don't follow through.
A great example of one of these notable exceptions is the Dark Souls series (if a franchise of two games can be deemed serial): the lore is present, but hard to find, in most parts at least. There are jumps one has to make, but only insofar as things are all but said; the logical conclusion is never stated outright for the sake of confirmation, but the signs all point to it anyway. The games have earnt many people fame, at least in their respective circles, and have led to jobs in game guide content creation, as well as self-employment opportunities in YouTube content creation. The presentation of lore in the games and supporting material is done in such a way as to have it in some form of "plain sight", there to be found, but never outright confirmed. Tolkien's works are a bit antithetical to these games in this regard, in that Tolkien didn't seem to want any guesswork being done - or at least, didn't like the idea of not having resources available to those who wished to engage in further research. The numerous appendices after The Return of the King tell the reader a more complete history of things related to The Lord of the Rings, and that has allowed the various multimedia works built on the backs of the books to be wrought with exceptional detail and richness. A sword is not merely a sword. But then, a sword is not merely a sword in Dark Souls, either; to know the absolute history of it, though, you might find yourself having to interpret hints and suggestions, rather than being told that this person, son of that person, son of another, first obtained it from a smith who had beaten into the blade three different kinds of ore sacred to this race, in order for it to be significant and have myriad different abilities. But the case remains that the lore is still there, just dependent on your ability or motivation to find it.
A bad story is, among other things, one that presents information without having set the stage for its inclusion. Well, perhaps not a bad story, but certainly a bad choice made by the author to depend so utterly on something without provenance. Deus ex machina it's called: the phenomenon of an event or plot point that is necessary for the progression of the story coming out of nowhere. In terms of the author or creator, it amounts to something akin to throwing one's hands in the air and saying "this is the story because I say it is, and that's all there is to it". In one respect it might be deemed similar to one core aspect of the Anthropic Principle: that the Universe is as it is because if it weren't we wouldn't be here to observe it. That seems to amount to a big fat nothing, and indeed it does: it doesn't describe why the Universe is the way it is, but rather that the Universe can be observed as it is. In reality the Anthropic Principle has two major variants (and possibly others I'm not studied enough to be aware of): the Strong Anthropic Principle, which says that the Universe is the way it is and we can observe it that way because it is compelled to align itself with conditions that encourage or necessitate the evolution of life complex enough to exist within it and observe it; and the Weak Anthropic Principle, which says simply that if the Universe were any other way, we wouldn't be around to ask why it is the way it is - because we're in a version of the Universe that does allow us to exist, it's obviously a stable, realistic version, and we can't observe any version that doesn't support life capable of observing it. It's all an interesting notion, but I have a bit of trouble seeing what use it has when trying to actually assess why the Universe has found itself in its current state, unless we are to rely on saying "because we are reason enough", or at least, "because it must be observed by something". It's a bit cart-before-horse-ish, equivalent to a reversal of the question "how can we be here?" and its answer "because the Universe is hospitable to life in our form (in a general sense)".
With stories this doesn't really work. You can't just write something and say "well it had to happen that way because if it didn't, this wouldn't have happened, and the whole story would have fallen flat"; you can't say "the ends justify the means". The means have to lead directly to the ends, in summation at least, even if that means you have to invent means to get to an end that are beyond expectation. The means beyond expectation have to make sense in the world wherein they occur, of course - you can't go from everything being normal to suddenly everything being on its head at the end, as the crisis involved in this sudden change is really where a story might begin, or is the consequence of preceding events; deus ex machina isn't really the note to leave a story on.
It's a difficult one, though: how much of a plot does one have to give away without giving too much or too little away? I guess there's a real technique to that. I'm not quite there with the story I'm working on at the moment, but I will be...at some stage. But I've fallen into the trap of the "well, it doesn't have to make sense given the amount of information presented, so it's fine" thought process, too, particularly in my novella manuscript. I'm glad I saw sense regarding it, even if it doesn't change the ultimate outcome, and even if it's a minor set of details - because what if I were asked about it? What could I say? "I'm not sure myself"? Well...perhaps that would actually be a legitimate reason, provided it's not an excuse: some storytellers (Tolkien) take the position of being the be-all-and-end-all of knowledge on a particular story, while others (From Soft with Dark Souls) take the position of being privy to some information but not all, and thus the reliance on the "who knows?" excuse leads fan speculation, research and debate onwards without any yes or no from on high. It's a bit of a ruse, of course - for a franchise like Dark Souls, so story-dependent, to make sense, an over-arching story must be fleshed out. Individuals within the story-telling team may or may not know all of the details, but the story is detailed, even if not shared. To be honest I do quite like the idea of a writer taking the perspective of "I am narrating a story, but what I know or don't know doesn't affect the story itself. The story is as the story is" - but the issue again is that when the story is supported only by itself and there is no tangible lore extra to the story...it begins to feel limited.
As I said, I had a case of this with my novella manuscript - not because it felt limited, and not because I took the "the story is as the story is" approach on any superficial level, but because what I did supply story-wise created the impression that the experiences of a certain character were the result of something in her past which she was only indirectly involved in, having been a child at the time. It was at that point I was prepared to say "that's all she knows, and since this is her (part of the) story, that's all the information I might have to go on in relating her thoughts and feelings". But...I thought about it, and realised that while on the one hand that's perhaps justifiable, on the other it perhaps isn't: I'm not writing an entire story from her perspective, as if I were her, or filtered only through her thoughts and feelings; I'm writing about her, her thoughts and feelings, and importantly about her history as well. Even if the reality of the story is never made clear, because she never knows it clearly herself...that doesn't mean the story doesn't exist by itself. So I had to actually decide what the details of that story were, and as a result I managed to alter what we do find out about the character from her own thoughts and feelings - and it gives us a greater appreciation for the environment she's in, too.
That's probably what it boils down to, really: not so much that lore need be readily accessible, or that it need be accessible in any significant way, but that it be accessible through its existence in absentia. If lore is non-existent, it can't tell anybody anything; whereas if it exists but isn't available, what it doesn't tell an investigator can be as important to the impression left or the information gleaned as what is told. It needs to be tangibly unavailable, I suppose. It can't be a case of "well, nobody knows", because if nobody knows then the story runs no deeper than the paper it's printed on, or the screen it flickers into movement across.
Again I guess this is an issue of how much of a story one needs to reveal without it being too much or too little. And I guess the answer is some, provided there's more of the story that is actively being withheld as part of the storytelling process; some, as long as it's enough.
How's that for a definite answer? Appalling. But at least you know there's more to it than that.
Saturday, March 21, 2015
How many words?
This is my typical thing when it comes to blogs: write a few posts, then take a break, and find that it's been a month or so since I last updated. It's a pattern I've been perfecting for years; with it I can create the illusion that I have a very busy existence. And everybody knows that a busy existence is an interesting existence.
Actually I have been a bit busy, but the aspect of interest is debatable. I've applied and interviewed for a few jobs recently; the first I was offered but turned down, having realised my aims and goals would not have made me the best fit for the role; the second I didn't get (and that's quite okay); while the third...well, the third I've only just applied for. I had told myself that if I weren't to get the second job (an alternative role in my current place of work) then it'd be enough of a spur to get me into job searching again, from which I'd taken a break - and so here I am again, much like last year, on the quest for new career options.
With that, I've managed to take a bit of a break from writing. I've done a little bit of editing on my current project, but I haven't really added to the 15,000 words I'd reached around the time I last updated. And of course I write "15,000 words" as if that total in this first draft will translate to the same total in the final version. It most likely will be pared down or reworked - and so it should. Compared to my novella manuscript (that nebulous object I keep mentioning and never actually describing), I don't feel enough has happened to justify 15,000 words - and I certainly don't want this story to be a struggle to get into. We've all read the beginnings of stories like that, and invariably the beginnings are as far as we've bothered to go: a story should tell itself and do so in a compelling way, not bludgeon its audience into continued reading. I don't, though, want to go back and do proper editing right now - that can wait for the end of the draft. It's just a good thing for me to know, right now, that I need to focus not merely on telling my story and building a word count, but on making the words I use meaningful and the story well-paced.
There's a "short short story" competition currently open with North & South magazine here that I believe I'd like to enter, but the catch is the story, being suitably short, must be no greater than 300 words. That's going to have to be an excessively well-paced tale without too much waffle, and given I have 15,000 words of (maybe not enough) substance, working to that kind of limit is going to be difficult.
But I do have an idea as to what it is I'd like to submit: a short story I've been working on for a while now, actually, which I keep picking up and putting down, in large part because I know how it's going to end but the path it must take to get there is relatively undefined (I've mentioned before that stories told without a pathway tend, for me, to become waffle-shops, and even if that isn't noticeable to an outside mind it remains quite prominent in my own). It'll give me a chance to condense the text a bit, or a lot, and perhaps be a forerunner to an actual published version of the full story one day. It'd be good exposure, if I were to win - though I'm sure the competition is stiff and the odds of beating my competitors are probably ridiculously against me. But, as the notion goes, if you don't move you go nowhere.
Wish me luck!
Actually I have been a bit busy, but the aspect of interest is debatable. I've applied and interviewed for a few jobs recently; the first I was offered but turned down, having realised my aims and goals would have not made me the best fit for the role, the second I didn't get (and that's quite okay), while the third...well, the third I've only just applied for. I had told myself that if I weren't to get the second job (an alternative role in my current place of work) then it'd be enough of a spur to get me into job searching again, from which I'd taken a break - and so here I am again, much like last year, on the quest for new career options.
With that, I've managed to take a bit of a break from writing. I've done a little bit of editing regarding my current project, but I haven't really increased the 15,000 words I'd reached about the last time I updated. And of course I write "15,000 words" as if that total in this first draft will translate to the same total in the final version. It most likely will be pared down or reworked - and so it should. Compared to my novella manuscript (that nebulous object I keep mentioning and never actually describing), I don't feel enough has happened to justify 15,000 words - and I certainly don't want this story to be a struggle to get into. We've all read the beginnings of stories like that and invariably the beginnings are as far as we've bothered to go: a story should tell itself and do so in a compelling way, not bludgeon its audience into continued reading. I don't, though, want to go back and do proper edition right now - that can wait for the end of the draft. It's just a good thing for me to know, right now, that I need to focus not merely on telling my story and building a word count, but that those words I use must be meaningful and the story well-paced.
There's a "short short story" competition currently open with North & South magazine here that I believe I'd like to enter, but the catch is the story, being suitably short, must be no greater than 300 words. That's going to have to be an excessively well-paced tale without too much waffle, and given I have 15,000 words of (maybe not enough) substance, working to that kind of limit is going to be difficult.
But I do have an idea as to what I'd like to submit: a short story I've been working on for a while now, actually, which I keep picking up and putting down, in large part because I know how it's going to end but the path it must take to get there is relatively undefined (I've mentioned before that stories I tell without a pathway tend to become waffle-shops, and even if the waffle isn't apparent to an outside reader it remains quite prominent in my own mind). It'll give me a chance to condense the text a bit, or a lot, and perhaps be a forerunner to an actual published version of the story one day. It'd be good exposure, if I were to win - though I'm sure the competition is stiff and the odds are stacked ridiculously against me. But, as the notion goes, if you don't move you go nowhere.
Wish me luck!
Monday, February 16, 2015
Overwrought
If you've read this blog at all before, you might have noticed the quiet appearance of the title banner up there. I have to say I'm pleased with it; I don't feel it's absolutely and utterly perfect, but I'm pleased with it nonetheless, and to be honest the overwhelming portion of me suspects that no matter how perfect it might appear to anyone else, I'll remain more critical anyway. If you've ever read designer-written discourses on how perfection is never perfection when you're the creative party, you'll understand. I don't mean to suggest it actually is perfect, but at some point in the creative process you have to take a step back and recognise that a piece is finished, even if it doesn't quite match the image that had formed in your mind of how the concept should look. The audience is never going to know how it was first imagined to look, either: they're going to see it as it is and, unless it's bad, they won't be anywhere near as critical as you, the designer, are being. And really, to be perfectly honest, it does look very much how I had imagined it would. The background to the banner is a texture I painted based on cork oak bark (there's relevance to that, as I may expound upon), and the hand-rendered lettering is the most successful of a few experiments I did to get the imperfect, almost-filamentous look of the (completely unnecessary but, in my opinion, completely spot-on) ligatures that give it an almost seal- or cartouche-like appearance.
In short...I'm happy with it. Photoshop and Illustrator skills are indeed well-employed when you're a graphic designer!
That whole idea of imperfect completion, though, is I think what a lot of authors - established or aspiring - struggle with. For instance, I wrote the first draft of my novella (which I'm still strangely reticent to name "out loud", as if by doing so I'd be communicating the entire plot and reading it word-for-word for someone to convert into their own manuscript and promote as their own work. Yes, I'm fully aware that this presents me as decidedly paranoid) just over five years ago and, after basically doing one read-through, put it down and let it sit for a couple of years at a length of 32,000-ish words. When I received positive feedback on it even in that state, and after having devoted the entirety of 2013 to producing a graphic design portfolio while studying the subject (go check it out! http://be.net/bysimonrandell - self-promotion self-promotion self-promotion), I was ready, I think, to return to it and edit it again, this time incorporating new character development for otherwise plot-device personalities (a truly awful thing. If someone isn't there by merit of their own presence...the story is too dependent on them. I realise most stories are dependent on the characters living them, but if the only reason to include a woman named Marjory is to bridge the gap between one part of the story and the next...it's too tenuous. What if Marjory suddenly died? What if Marjory hadn't noticed whatever it is she noticed in order to propel the plot forwards? Why am I discussing the merits of Marjory? But seriously: Marjory either has to be a character already present and developed in order for her role as a plot-propulsion device to be acceptable, or the plot needs reworking. Marjory is either a character by her own merit, or she's not. And from a more personal perspective, I know what it's like to be treated as if [and told so] I were a plot device; it's not fun. Don't do that to poor, poor Marjory) and to concern myself more with themes which had been alluded to but which I'd not, up until that point, really included by intent. The story is now over 38,000 words - and is complete in itself, as I believe I've mentioned before.
However...I do keep wondering whether I should go back and add more stuff in. Should I mention how the antagonist had to survive by eating things it otherwise would have preferred to avoid, so that it becomes an even better-developed character? Should I speak of the moon, indicating clearly by implication and inference how much time is passing? In short, I believe the answer - despite continuing to consider such things - is no, I shouldn't. In the first case it's largely because I don't necessarily want the antagonist to be an open book. In the second, again, it's because I don't want things to be set out so definitively. And realistically, the mechanic of time is a passive one - the hours play no role beyond the fact that time advances, as time does, and that doesn't really need more reference than the occasional allusion to the environment. Really I've already arrived at the conclusion that the story is complete in its imperfection - as all stories are, as all designs are. There comes a point at which the process of creation is finished: anything beyond that could further reinforce the object created, but without that reinforcement it is not lacking anything. Sometimes, in fact, the aspects which are not further developed become part of the intentional form or purpose. Basically, the idea of "less is more" is, generally applied, true, and nothing is gained from being overwrought.
And that, weirdly, brings me to something else I'd been considering earlier today (and have considered many times before now). I make no secret of being a bit of a gameplay-walkthrough junkie, as in, I'll spend much time watching Youtube videos of people playing games I (mostly) haven't played. It really is entertaining. But, strangely, there's also that definite sense of being privy to a story: we can all read books or watch movies and experience being an audience, and watching games being played from start to finish has become a roughly similar experience over the past ten years. It wasn't, way-back-when - if you weren't playing the game then you were the unfortunate audience member who lacked all agency within the story and therefore had limited investment and limited enjoyment. Nowadays you don't have to exert agency within the world of a game to be invested in its progression from start to finish. That, I think, has a lot to do with the storytelling sophistication that has become core to games' reason for existence; you can't just throw someone into a world without a reason or a cause for their presence and expect it all to make sense, though some notable older games did just that and at the time were highly successful. The trouble is, though, that sometimes you notice things that leave you wondering.
For me, such a thing is the dialogue of certain fantasy games these days. The storylines are meant to be dramatic, and to a point I can understand that - for why else would people play, unless caught up in the promise of some epic adventure with grand consequences and dire risks? And, of course, there's the influence of such things as The Lord of the Rings: things have to, somehow, live up to the grandeur of that kind of Truly Epic, particularly when they could be said to be hijacking a lot of the standards Tolkien's works established where none had existed before. Unfortunately - as I've mentioned before - so many of the present-day fantasy worlds created as if somehow new or different recycle a lot of the same stuff: elves are Humans Plus, orcs are (evil) Humans Plus, Green Edition. But that's to some extent understandable - if something works, why change it to sit outside expectations? And of course, it's not as if I don't like games that have elves and orcs - I absolutely adore the Elder Scrolls games, though those prior to Morrowind I haven't seen much, if anything, of.
It's the dialogue, really, that bothers me. Dialogue that is supposed to take the listener or reader out of the modern context is one thing, and often dialogue needs to be replete with flourish and redundancy - after all, that's how people communicate, and that's how stories which could be 32,000 words become 38,000 words...haha. But there's a certain amount of cliché which I think becomes too overpowering. If a character says "I will let nothing get in my way", for instance, that's really all that need be said. Yet one game I can think of in particular would have the character say "I will let nothing get in my way, do you hear me?! Nothing!" On its own this isn't so bad. You can tell merely from the words that this particular character is quite impassioned and set in their stance. Yet because those words are typically all grouped together as a verbal cliché, they lose their impact and become predictable and empty - so much so as to make them formulaic. Anyone who has read Homer's Iliad and Odyssey probably has some understanding that the reason Homer - or the poets we collectively call Homer today - utilised formulae ("wine-dark sea") was that such predictable, set word patterns maintained the rhythm of the recited poems (both of these surviving epics were initially recited by bards, and only later written down and cemented as singular versions). In that regard the predictability of the words was sensible because it gave the poet a sense of where they were in their recitation, and allowed proper pacing...but the constant reference to the wine-dark sea likely did nothing to really inform the audience about the sea itself. An epithet is only meaningful if it is meant to retain its meaning, and like any word, if it's said too often it becomes a mere collection of syllables. So too do over-used statements in dialogue, even if they're over-used not in the game itself but in the genre or in the wider (storytelling, for instance) culture - and so do over-used formulae in internet fora ("Well, you could do that. Or you could do this. Hell, you could even..." - the last sentence loses its emphasis, rather than gains it, purely because of how over-used the "hell, [extra option]" structure is).
I don't mean to criticise so much, really. It's just something I've noticed in some media, particularly games. It doesn't mean the media are bad; it just means I find the impact of the words lost, and that's a sad thing. Some cliché is unavoidable, of course: nobody's inventing language from scratch, and there are cultural trends at any given moment that will affect how something is communicated. It's just a great pity when the power of a thing is lost because it's described in someone else's words. Dialogue can be wrought without being overwrought, and if the amount of work evident in it suits the medium...then it fits. It's when that work becomes over-work that something loses its meaning. Probably, at this stage, a bit like this blog entry.
Friday, February 13, 2015
Unintentional detour, but in a good way (perhaps?)
Perhaps a week ago I passed the 10,000 word mark. I'd really intended to find out by then how I could further customise this blog to make it less bore/snore-inducing when viewed...and so far haven't found an opportunity to really sit down and sort it out. That's not to say, of course, that I haven't been doing stuff - it just means I haven't been doing that.
But what stuff have I been doing? Well, for longer than a week I've been working on a small title banner to sit in place of the current text title up there - and that's been going well enough, in fits and starts. The problem is that I have many things I want to do simultaneously, and sometimes it's frankly easier just to procrastinate (have I written on this before?) and watch a Youtube video (or several) than to engage in creativity when I feel I shouldn't be neglecting the things I'm not doing - because clearly it's better to neglect everything equally than to do one thing over another. I've also been up the line to see my sister's three-week-old daughter, and since that's a trip that takes nearly three hours in one direction (not unpleasantly so, though my old car invariably returns home at the end of it with a new shake or rattle to let me know I'm pushing it a wee bit) and once there much of my attention is on the people I'm with, it means I lose most of two days (if I stay overnight. Which I did, because, frankly, 5.5 hours in a car in a single day when you haven't travelled 5.5 hours away from home seems like a bit of a task!).
Not that two days means much in the long run, of course. It's quite easy just to lose days to nothingness, to hum-drum status quo activities like housework. I notice it's been half a month since I last blogged - again, largely because of the hum-drum status quo. I shouldn't pretend I've done nothing constructive - I really have engaged in different things. Currently I'm working on a design for a set of three posters with a typographic focus and an association with the sea; and, as I mentioned, I've been at work on a customised banner for this blog, which I believe I've either finished or almost finished, and for which I'll need to find out the relevant procedure to put it in place.
And also: I saw the final instalment of The Hobbit, which was an achievement for me as I didn't see the first in a cinema (and I'm almost glad I didn't. I have to say that as much as I'm inspired and intrigued by many of the story-telling elements shown in the movies, there are a lot of time-wasting aspects too, as well as interpretations of events that seem somewhat odd: the weird plate-throwing game, for instance, that the dwarves set themselves to in the first movie. Certainly they, from memory, did set to washing the plates in the novel, but did they make a great show of being reckless? I don't remember that. While I understand the need for visual entertainment in a story that otherwise communicates much of its message in text in its original version, there's a certain amount of...silliness, I think, that maybe just didn't need to be in there. Of course we want to see the dwarves as merry, rough and finely skilled in what they do - not dropping a single plate - and certainly Peter Jackson may have wanted the film to appeal to children [though oddly so, for while The Hobbit can be argued from its written voice to have been directed towards the literate child or pre-teen when it was first penned, I'm not certain one can present a movie the way these were presented and still aim to be inclusive of a younger audience], but I found it oddly patronising. This may have been my issue, though...) and I'd almost missed seeing this last one on a big screen as well. At first I thought I'd made a mistake by seeing it in the high-frame-rate version; the second instalment I recall as having that quintessential movie flatness, languor, and warm blurriness that allows supreme escapism (I've always felt): the story is presented in an other-than-realistic way, and so you can leave criticism behind and be transported elsewhere, into a realm unlike the one you're currently sitting in. Super high definition almost, for me, makes something too accessible and too real - there seems no drama or production about the way people move or interact, and instead it brings it all back into the hum-drum. Not quite like housework, but I'm sure there'll be some of you who get what I mean.
Anyway, it turns out it merely took my eyes and mind some adjustment; soon enough it seemed just as inaccessible and just as unreal, which, oddly, is what I feel is needed in movies. A good example of this is a show in New Zealand called Shortland Street. It's bad. It's really, really bad. I don't just criticise it because it's a completely unrealistic hospital drama and I'm a healthcare professional myself; it's truly, truly awful. Part of its awfulness - not an influential part, but a part nonetheless - is how "real" the settings are: how plain, how bland, how reach-out-and-touchable they are. How parochial the characters are, and how outlandish the stories. In comparison to another soap - Home and Away, based in Australia - Shortland Street's stories are perhaps only slightly more outlandish (at least Summer Bay in Home and Away is a community, a town; Shortland Street revolves solely around the hospital and satellite venues like a very small number of people's homes, and its stories are invariably about which of the staff has murdered which of their coworkers, who's sleeping with whom, or some crazed individual seeking to bomb the hospital. I work in a hospital; some of my coworkers are friends, others aren't, and that's generally as far as it all goes. We have lives outside of the hospital and away from each other. None of my coworkers (or me!) has killed another, they're not sleeping with each other willy-nilly, and in the five years I've been there nobody's pulled any stunts to place the hospital - a major tertiary centre - at risk...and yet that stuff happens on a consistent basis in Shortland Street. I suppose it has to, if the show is to remain interesting to its audience?), but Home and Away has the blurry, inaccessible quality which transports the audience at least far enough away from their own presumably sharply defined lives to suspend disbelief. Movies have typically had this as well, and I've always found it makes them more appealing: the inaccessibility it implies makes the world the story occurs within that much more tantalising. Perhaps it's the same sort of thing that having to turn pages in a book implies, even subconsciously - you may be so very invested in a story, to the point maybe of being able to see it occurring as you read...but then you have to turn the page to find out what happens next. The glass barrier between you and the various undersea creatures at the local aquarium may be a boon to your safety, but doesn't it all look so tremendously inviting all the same? Yet if you were in it, wet and cold and accompanied by the myriad other beings you can see on the other side of the glass...you might not enjoy it so much. It's a matter of perception, methinks. At any rate, all of that was to say: I enjoyed the high frame rate in the end. It's weird how unusual and noticeable it was to begin with, and then how normal and insignificant it was thereafter - and how quickly my perception adjusted.
And here I am, having waxed prosaic about high frame rates and film, when what I had really meant to write about was the feeling I (and no doubt almost every person ever interested in the world created by Tolkien) was left with, having been reimmersed in the realm of The Hobbit. I won't now - I'll leave that till next time - and I certainly won't be "reviewing" the movie. I don't consider myself a movie critic, nor really do I see the value in critics when opinions and perceptions are so individual and subjective that a self-professed expert may hold an opinion they'll stand by till the dire end but which makes no sense to me (Lady in the Water, for instance: I loved it. It was such a beautiful, magical movie...and yet it was generally panned by people who regard their opinions as more valid and necessary than those of actual movie-goers. I can't help thinking this was at least in part because it didn't have guns and war, but also because a lot of adults these days have forgotten they're grown-up children, not an entirely new species. Magic may appeal to children, but why should it not appeal to adults? That's why we read books, watch movies, play games, listen to music, is it not? To have stories told, and memories and feelings conjured into existence and experience. No wonder there are so many people who play games and get into fantasy stuff. If adult life is all about a desk job that's slowly raising your blood pressure, stifling your creativity and otherwise wringing the neck of your enjoyment of life...maybe that's not really how adult life should be). What I want to do is maybe just touch on all of that stuff in more detail - the escapism, the need for stories, and the importance of allowing those stories to be created. But we'll see. First I need to do some more designing, some more writing, some more reading, and probably more procrastination. Youtube ahoy.
Friday, January 30, 2015
7,000 words isn't a short story, except it might be, but that's not the point
I belatedly broke the 7,000 word mark on this new story the other night. I haven't been able to consistently knock out the 1,000-words-per-day minimum I'd loosely wanted to achieve - and, as much as I'm tempted to place responsibility wholly on external causes, it's not only because I haven't had enough time.
Well, in part that's actually true - I haven't had enough time to achieve that kind of result. That's just how it's been since late December/early January: my first niece was born, followed by my mum's arrival from the UK, my second niece's birth, my dad's departure for life in Thailand, my involvement in various design projects I've been trying to develop personally, and my getting people to take a look at my novella manuscript. And work. Work takes time, too. None of it's bad, of course, or at least not bad to any significant degree, but it does mean I haven't had the time to focus on getting words down on paper. Or screen, as is actually the case.
However, I'm experiencing one of those times I think many writers must have, wherein the story is just sitting around, waiting to be told, but before it can be told the build-up has to be written: the slower, less action-packed introduction of the main plot without giving the main plot away, the implication of something else being afoot or the suggestion of a twist to justify the story's existence in the first place. Who wants to write all of that boring stuff? Well, I do, yeah, but I want to tell the story - not spend time laying the foundation for it.
Which is, of course, difficult. You can't build on ground you haven't prepared to hold a structure; you can't launch into a story and expect it to stand up to scrutiny when you haven't actually told the pre-story aspects of it. A plot can only take a story so far. The rest has to be about the people the plot happens to and how they respond to the major points along the way.
I once wrote part of a story I may have mentioned before called Chimaera, having only the vaguest semblance of a plan for its plot. I made it to the 10,000 word mark and realised (as I believe I may have detailed in a previous entry, but if not, here it is!) that I didn't really have a proper reason for the story to be 10,000 words long. It was a story not really being told, almost. A long story not being told. As you might imagine, 10,000 words is a fair bit of time and effort for not a whole lot.
Of course, it wasn't that not a whole lot had happened within those 10,000 words; a lot had. But I'd added in elements that I'd thought of as interesting additions but which hadn't really earnt a place in those 10,000 words; they weren't necessarily contributing to a set goal at all, instead suggesting some nebulous idea of a goal that I hadn't really set in stone. The words are there, the ideas are there - but just where are they leading? I'm still not 100% sure.
In this case, though, I'm sure. 7,000 words isn't a short story. Technically it very well could be, actually. It's certainly a long essay. But what I mean is that 7,000 words shouldn't be a mere introduction to a story - even if, at the moment, that's what mine is. In 7,000 words a lot can happen, even when it doesn't seem to be happening. At this point I've introduced the protagonist, the deuteragonist, and several others who might collectively count as a tritagonist, given that now that I've begun writing about them, a secondary plot has cropped up as a potential alter-aspect of the primary plot.
I like to write organically, I'll admit. I've said before that I've tried and failed at writing purely organically - as in, without a planned-out scaffold upon which to build a full story - but, on the other hand, I can't spend a decade planning out a story and having everything squared away before I start writing. It just doesn't work for me. I need some freedom to flesh things out, at least, and to weave in extra filaments - something I was still doing not so many months ago when I realised I could expand my novella by reworking a character who had previously been little more than a plot device into a more rounded individual, one whose presence amounted to more than merely propelling the story between points A and B. In the case of this newer story I went a bit overkill with the plan, I think, but in a good way: I started with three pages covering start to finish, and then began again, adding detail into the story in its plan form, ending up with 24 or so pages in total before writing commenced. Yet even with that, this subplot hadn't really occurred to me until I was able to think about whether two characters who were exhibiting similar behaviour were linked in more ways than just that they lived in the same place and each had young-ish children. As you might guess...the answer is a resounding yes. The situation has moved from them simply being neighbours who experience the same things in the story to them being sisters - a fairly simple transition, but one which allows a wealth more undercurrents to be brought into the story. And I'm pretty excited about that.
It's late, and frankly I suppose my entire point was that I'm excited I have 7,000 words, even though it's taken me so long, and also that even as I'm writing the story is crafting itself. That's what happened with my novella, too: as I wrote it, it showed more of itself as it should be written. I'll go ahead and say I could very well beat 40,000 words this time!
Friday, January 23, 2015
Continued consideration
After thinking about it...maybe eventide has grown on me as the word to use.
Monday, January 19, 2015
Why communication isn't, or at least shouldn't be, accidental
One thing I detest, just really, absolutely detest, is wildly incorrect grammar passed off as being completely accurate. I don't want to come across as a complete chump when it comes to grammar, but there are certain ways of saying or writing something that are correct...and some that are the precise opposite.
I realise there's a problem in being so fixated upon correct grammar: different people speak and write in different ways, and over time linguistic practices change. However, there are some practices with no basis other than a lack of awareness of what correct English looks and sounds like.
I have an issue, I'll say, with people saying "lay" when they mean "lie", for instance. In order to say "I lay", you either have to be speaking in the simple past tense ("Last night I lay on my bed"), or you must be speaking of performing an action upon something else ("I'm going to lay the paper upon the table"). Otherwise you have to use "lie": "I lie down", "I am lying down", "I was lying down". It really frustrates me to hear or read people using words that are inappropriate to their intended meaning. Yes, certainly, I understand what you're intending to say; however, that doesn't mean I should be doing the interpretative work for you because you're using the incorrect word. And it's a pretty simple situation: treat lie vs. lay as you would rise vs. raise. You wouldn't say "I raise" without saying what you raise - and so too you wouldn't say "I lay" without saying what you lay. Even if you're raising yourself up or laying yourself down, you absolutely must state what it is you're raising or laying. If you're just speaking of yourself in the simplest sense, you'd say "I rise up" or "I lie down". Simple as that!
It's not a big deal, I know. People aren't dying over the misuse of certain words. Nor is anyone suffering grievous bodily harm from the misuse of adjectives as adverbs ("I'm good" when what is meant is "I'm well")...but misusing words, and using the wrong classes of words, really can affect how people interpret your meaning.
I would generally agree with the statement that a choice of, say, "base" words really encourages the perception of the speaker as somewhat uncivilised. Don't mistake my meaning: I swear. I probably swear a little too much, or at least too readily. Particular swear words can be useful in an emphatic sense - they really can increase the gravity of a statement quite effectively. Perhaps not, though, if they're also being used as fillers. And often enough that's exactly what they're being used as, merely taking up space in (typically verbal) communication and derailing the message from effective and direct to muddled and unimaginative. There is the aspect of social decorum, certainly, but beyond that: if you're always saying "fudge", or something very similar, then the word "fudge" colours your language, and your command of it, as quite limited - rightly or wrongly.
To me effective communication is extremely important. In the first instance, if my writing were full of errors, misused punctuation, or incorrect verb forms, I should expect any readers I might have to be thoroughly confused and potentially turned away from my work. And, to be perfectly honest, I can't bring myself to buy petrol at a local petrol station because of their billboard, which says "Your six sense, tells you to buy petrol here" (the italicised portion is the part I'm not quite sure of; the first sentence is the part that preoccupies my mind whenever I read it, due to the appalling grammar). For an international petrol company to be represented so thoroughly badly does nothing to improve an image already reasonably tarnished by certain events of the past few years. I'm sure you may have an idea which company it is, but that isn't the issue; the issue is the use of the cardinal "six" when the ordinal "sixth" should have been used, coupled with the inappropriate comma dividing a single clause into two partial, incomplete clause fragments. Yet this is printed in large format upon a billboard which obviously cost a fair bit of money to have designed, printed and affixed to the side of the petrol station's main building. I find it hard to believe nobody saw the issue with such a low standard of English in a country wherein English is an official language - indeed, the more widely spoken of the two recognised official languages. That is, of course, presuming any proofing of the design was done at all and it wasn't just slapped together. But even then, shouldn't the printer have seen the error and said "Hey, is this really what you want to say?"
But then there's the issue of being edited, and losing your initial intent because either you've been too critical of your own choice of words, or someone else has come along and altered them to fit their own perspectives.
A good example of this, well known to some, is when J.R.R. Tolkien had his "elven" changed to "elfin" by a zealous editorial agent wishing to use correct English. I'm not prepared to be critical of such an agent by calling them over-zealous: the entire point of having an editor is to pick up on errors, assess how well this follows that, and generally just smooth out the kinks the author may not have picked up on themselves. Had this instance of "correction" not been re-corrected to reflect Tolkien's wholly intentional use of "elven", we might have a completely different idea of what "elfin" means these days. As it is, the former has a degree of sophistication to it, of ideals, of all the other stuff people writing of elves like to have their readers infer about their magical better-than-human race, and the latter is relegated to use as a descriptor for wee little things, cutesy curly-toed shoes with bells on, and sometimes children. The way a person chooses language is about far more than just what they're trying to say, and very much about the image they're trying to create at the junction between implication and inference.
I myself am partial to using correct English. The trouble is, though, that British English and American English differ often enough that in choosing one, the benefits of the other are sacrificed. For instance: the verb "cancel". In American English the preterite form of "cancel" is typically spelt "canceled", with one l; in British English it's "cancelled", with two. In neither case is the emphasis placed on the second syllable: an event is never "can-CELLED", but "CAN-celled" - despite the fact that doubling a consonant in such a way in English (among other languages) usually indicates that the stress falls upon the syllable containing it. The same is true of "focus" - my mum tends to spell the past tense as "focussed", whereas I prefer the leaner "focused". Of course, the problem here may be that someone may take the pronunciation of "focused" to align less with "focus" and more with "focuse", or even "focuze". On the other hand, if "focussed" needs a double s to maintain the s-sound, why drop the second s at all? To be frank, I prefer British English. It's what I've been trained to use as my first language, and there are many things in American English which don't make sense to me. However, I can also see things in British English, like needless consonant doubling, which are of equal senselessness - and I can't just ignore them, either. I suppose, in that case, I choose a blend. Never shall you see an unnecessary ll or ss; but nor will you ever see a z (which to me is a zed) used in place of an s. And, depending on where you're from, you may have noticed that I did use the British form of the preterite - "spelt", with the terminal t.
Speaking of uselessness: apostrophes to indicate plurals, in any instance, cause me to grind my metaphorical teeth. I've seen all sorts of catapostrophic (did you see what I did there?) misuse, and while the typical "paper's" or "number's" makes me shake my head and wonder what thought process led to that kind of typographic abomination (I also love hyperbole), what really gets to me are the following:
Decades with an apostrophe between the last number and the pluralising s ("1990's");
Acronyms, which technically should be written with a fullstop between the capital letters to indicate they stand for whole words themselves, followed by an apostrophe and the pluralising s ("ID's");
Words ending in s followed by an apostrophe but no pluralising s ("glass'");
Letters followed by an s with an apostrophe ("A's")
Others. So many others.
I should say there's a fair bit of responsibility that should be assigned to the it's/its pair: for a long time I didn't realise that the possessive didn't have an apostrophe at all, which on first glance is atypical of many possessives in English. Of course, it turns out that it actually isn't: yours, his, hers and theirs have no apostrophes either, yet are spelt correctly and indicate possession. But the confusion remains, because possessives using nouns and not pronouns, as a fair rule, require the apostrophe: "my brother's brother" becomes "his brother" when I substitute the pronoun for the noun. However, I stand by the claim that a little thought about why "glasses" is a simple plural noun and doesn't need an apostrophe and the last s taken away would mean that we, the people, would not then have to read such things as "glass'".
And maybe a little thought as to why a comma is needed in a sentence may lead to subclauses being opened and closed between two such marks, rather than a subclause being opened and never closed. For instance, the title of this post. So many people would write it like so: "Why communication isn't, or at least shouldn't be accidental." I've referenced this already when I touched upon using commas (badly) to break a single clause into two partial clauses that just hang there, incomplete. The issue is that sentences have clauses in them; commas can be used to separate the clauses into more readable fragments. But those fragments need to be readable as separable parent-child clauses that apply to a preceding piece of information and are clearly delineated from the original clause. In this case the over-arching clause is "Why communication isn't accidental"; the secondary clause is "or at least shouldn't be", and it applies directly to its parent as a separate modifier - not as blurred into it like some chimaeric parent-child blend.
And yet here I am, at the end of all of this, about to say something of heresy: do what you want to do with language. I don't at all mean be a total jerk and use the worst words you can just because you're "doing what you want", because regardless of how free you deem yourself to be in terms of communication, you're still responsible for any communication you engage in. What I'm saying is: learn the rules well enough to know what you're doing when you break them...and then go ahead and break them, as long as you can describe how you're breaking them and what your intent in doing so is. That's why puns work so well. That's why alliteration and certain neologisms or word-amalgamations are so fitting: because they're done with intent and show understanding and appreciation of the finer workings of language.
I quite like neologisms, actually. And puns. But the whole point is, words that are well-chosen may not even have to be actual words, provided they're still well-chosen. They serve a purpose. For instance, in the story I'm currently writing (in case you're interested), I'm debating whether to use "eventide" or not. The issue for me in this case is that to my eye the word is of the sea: the prominent portion is "tide", with "even" an adjective applied to indicate that it is neither high nor low. In reality "eventide" actually is an archaic way to say "evening" ("even" is also an archaic way to say "evening", seen in terms such as "evenstar", an old name for the planet Venus as seen from Earth) - but the inference I take from it is less one of dusk or twilight and more one of a time of day associated with the ebb and flow of the ocean. So I am tending towards "eventime", which is not a word, but is similar enough to be read as one and not at all without justification as a neologism. Even a single letter can make a difference, changing the entire focus of a word from one thing to another.
The whole point here wasn't actually to decry the terrible spelling and other textual errors that I see around me, but to say that language can be fun. And beautiful, too. It doesn't even have to follow rules, really, or at least, it doesn't have to follow them to the letter, as long as it makes an effort and when and where it doesn't follow the rules it has a reason for its foray away from them. It can't be accidental, and it shouldn't be just writers, copywriters, editors, designers or professionals who take communication seriously. It should be everyone. We should all be using language as a tool already made for us and bending it the way we want it to bend, rather than under- and misusing it because we don't quite understand how to use it to its fullest. Language isn't complex just because it can be, but because it has to be in order to express everything its users want, or might want, to use it to communicate. That's a good thing, not a bad thing. It's all by intent and the way we use it should be too.
I realise there's a problem in being so fixated upon correct grammar: different people speak and write in different ways, and linguistic practices change over time. However, some practices have no basis beyond a lack of awareness of what correct English looks and sounds like.
I have an issue, I'll say, with people saying "lay" when they mean "lie", for instance. To say "I lay", you must either be speaking in the simple past tense ("Last night I lay on my bed") or be speaking of performing an action upon something else ("I'm going to lay the paper upon the table"). Otherwise you have to use "lie": "I lie down", "I am lying down", "I was lying down". It really frustrates me to hear or read people using words inappropriate to their intended meaning. Yes, certainly, I understand what you're intending to say; that doesn't mean I should have to do the interpretative work for you because you've used the incorrect word. And it's a pretty simple situation: treat lie vs. lay as you would rise vs. raise. You wouldn't say "I raise" without saying what you raise - and so too you shouldn't say "I lay" without saying what you lay. Even if you're raising yourself up or laying yourself down, you must state what it is you're raising or laying. If you're speaking of yourself in the simplest sense, you'd say "I rise up" or "I lie down". Simple as that!
It's not a big deal, I know. People aren't dying from the misuse of certain words. Nor are people suffering grievous bodily harm from the misuse of adjectives as adverbs ("I'm good" when what is meant is "I'm well")...but misusing words, or using the wrong class of word, really can affect how people interpret your meaning.
I would generally agree with the statement that a choice of, say, "base" words encourages the perception of the speaker as somewhat uncivilised. Don't mistake my meaning: I swear. I probably swear a little too much, or at least too readily. Particular swear words can be useful in an emphatic sense - they really can increase the gravity of a statement quite effectively. Perhaps not, though, if they're also being used as fillers. And often enough that's exactly what they're being used as: merely taking up space in (typically verbal) communication and derailing the message from effective and direct to muddled and unimaginative. There's the aspect of social decorum, certainly, but even setting that aside, if you're always saying "fudge", or something very similar, then the word "fudge" colours your language, and your command of it, as quite limited - rightly or wrongly.
To me, effective communication is extremely important. In the first instance, if my writing were full of errors, misused punctuation, or incorrect verb forms, I should expect any readers I might have to be thoroughly confused and potentially turned away from my work. And, to be perfectly honest, I can't bring myself to buy petrol at a local petrol station because of their billboard, which says "Your six sense, tells you to buy petrol here" (the italicised portion is the part I'm not quite sure of; the first sentence is what preoccupies my mind whenever I read it, due to the appalling grammar). For an international petrol company to be represented so badly does nothing to improve an image already somewhat tarnished by certain events of the past few years. I'm sure you may have an idea which company it is, but that isn't the issue; the issue is the use of the cardinal "six" where the ordinal "sixth" should have been used, coupled with a comma that inappropriately divides a single clause into two incomplete fragments. Yet this is printed in large format upon a billboard which obviously cost a fair bit of money to design, print and affix to the side of the petrol station's main building. I find it hard to believe nobody saw the problem with such a low standard of English in a country where English is an official language - indeed, the more widely spoken of the two recognised official languages. That is, of course, presuming any proofing of the design was done at all and it wasn't just slapped together. But even then, shouldn't the printer have seen the error and said "Hey, is this really what you want to say?"
But then there's the issue of being edited, and losing your initial intent because either you've been too critical of your own choice of words, or someone else has come along and altered them to fit their own perspectives.
A good example of this, well known to some, is when J.R.R. Tolkien had his "elven" changed to "elfin" by a zealous editorial agent wishing to use correct English. I'm not prepared to criticise such an agent by calling them over-zealous: the entire point of having an editor is to pick up on errors, assess how well this follows that, and generally smooth out the kinks the author may not have picked up on themselves. Had this instance of "correction" not been re-corrected to reflect Tolkien's wholly intentional use of "elven", we might have a completely different idea of what "elfin" means these days. As it is, the former has a degree of sophistication to it, of ideals, of all the other things people writing of elves like to have their readers infer about their magical better-than-human race, while the latter is relegated to use as a descriptor for wee little things, cutesy curly-toed shoes with bells on, and sometimes children. The way a person chooses language is about far more than just what they're trying to say, and very much about the image they're trying to create at the junction between implication and inference.
I myself am partial to using correct English. The trouble is, though, that British English and American English often differ enough that in choosing one, the benefits of the other are sacrificed. For instance: the verb "cancel". In American English, the preterite form of "cancel" is typically spelt "canceled", with one l; in British English it's "cancelled", with two. In neither case is the emphasis placed on the second syllable: an event is never "can-CELLED", but "CAN-celled" - despite the convention that doubling a consonant in this way in English (among other languages) indicates that the stress falls upon the syllable containing it. The same is true of "focus" - my mum tends to spell the past tense "focussed", whereas I prefer the leaner "focused". Of course, the problem here may be that someone may read the pronunciation of "focused" as aligned less with "focus" and more with "focuse", or even "focuze". On the other hand, if "focussed" needs a double s to maintain the s-sound, why drop the second s at all? To be frank, I prefer British English. It's what I've been trained to use as my first language, and there are many things in American English which don't make sense to me. However, I can also see things in British English, like needless consonant doubling, which are equally senseless - and I can't just ignore them, either. I suppose, in that case, I choose a blend. Never shall you see an unnecessary ll or ss; but nor will you ever see a z (which to me is a zed) used in place of an s. And, depending on where you're from, you may have noticed that I used the British form of "preterite" - with the terminal e.
Speaking of uselessness: apostrophes to indicate plurals, in any instance, cause me to grind my metaphorical teeth. I've seen all sorts of catapostrophic (did you see what I did there?) misuse, and while the typical "paper's" or "number's" makes me shake my head and wonder what thought process led to that kind of typographic abomination (I also love hyperbole), what really gets to me are the following:
- decades with an apostrophe between the final numeral and the pluralising s ("1990's");
- acronyms - which technically should be written with full stops between the capital letters, to indicate that the letters stand for whole words - followed by an apostrophe and the pluralising s ("ID's");
- words ending in s followed by an apostrophe but no pluralising s ("glass'");
- letters followed by an apostrophe and a pluralising s ("A's");
- others. So many others.
I should say a fair bit of responsibility should be assigned to the it's/its pair: for a long time I didn't realise that the possessive has no apostrophe at all, which at first glance seems atypical of possessives in English. Of course, it turns out that it isn't: yours, his, hers and theirs have no apostrophes either, yet are spelt correctly and indicate possession. But the confusion remains, because possessives using nouns rather than pronouns, as a fair rule, do require the apostrophe: "my brother's brother" becomes "his brother" when I substitute the pronoun for the noun. Still, I stand by the claim that a little thought about why "glasses" is a simple plural noun - needing neither an apostrophe nor the loss of its final s - would mean that we, the people, would not have to read such things as "glass'".
And maybe a little thought as to why a comma is needed in a sentence might lead to subclauses being opened and closed between two such marks, rather than opened and never closed. Take the title of this post. So many people would write it like so: "Why communication isn't, or at least shouldn't be accidental." I've referenced this already, when I touched upon using commas (badly) to break a single clause into two incomplete fragments that just hang there. The issue is that sentences have clauses in them; commas can be used to separate those clauses into more readable pieces. But those pieces need to read as separable parent-child clauses, each applying to a preceding piece of information and clearly delineated from the original clause. In this case the over-arching clause is "Why communication isn't accidental"; the secondary clause is "or at least shouldn't be", and it applies directly to its parent as a separate modifier, properly set off by a comma at each end - "Why communication isn't, or at least shouldn't be, accidental" - not blurred into it like some chimaeric parent-child blend.
And yet here I am, at the end of all of this, about to say something heretical: do what you want to do with language. I don't mean be a total jerk and use the worst words you can just because you're "doing what you want" - regardless of how free you deem yourself to be in terms of communication, you're still responsible for any communication you engage in. What I'm saying is: learn the rules well enough to know what you're doing when you break them...and then go ahead and break them, so long as you can describe how you're breaking them and what your intent is in doing so. That's why puns work so well. That's why alliteration and certain neologisms or word-amalgamations are so fitting: because they're done with intent, and show understanding and appreciation of the finer workings of language.
I quite like neologisms, actually. And puns. But the whole point is that well-chosen words may not even have to be actual words, provided they're still well-chosen. They serve a purpose. For instance, in the story I'm currently writing (in case you're interested), I'm debating whether or not to use "eventide". The issue for me is that to my eye the word is of the sea: the prominent portion is "tide", with "even" an adjective applied to indicate that the tide is neither high nor low. In reality "eventide" is an archaic way to say "evening" ("even" is also an archaic word for "evening", seen in terms such as "evenstar", an old name for the planet Venus as seen from Earth) - but the inference I take from it is less one of dusk or twilight and more one of a time of day associated with the ebb and flow of the ocean. So I am tending towards "eventime", which is not a word, but is similar enough to be read as one, and not at all without justification as a neologism. Even a single letter can make a difference, changing the entire focus of a word from one thing to another.
The whole point here wasn't actually to decry the terrible spelling and other textual errors I see around me, but to say that language can be fun. And beautiful, too. It doesn't even have to follow the rules - or at least, it doesn't have to follow them to the letter - so long as it makes an effort, and has a reason for its forays away from them when and where it strays. It can't be accidental, and it shouldn't be just writers, copywriters, editors, designers and other professionals who take communication seriously. It should be everyone. We should all be using language as a tool already made for us, bending it the way we want it to bend, rather than under- and misusing it because we don't quite understand how to use it to its fullest. Language isn't complex just because it can be, but because it has to be in order to express everything its users want, or might want, to communicate. That's a good thing, not a bad thing. It's all by intent, and the way we use it should be too.