Uncategorized

An Open Mind is a Learning Mind.

I’m one of those people who needs to write. I’d go so far as to say my health – perhaps my very survival – depends on it. That isn’t me trying to sound melodramatic.

Like many other writers, I consider writing not simply a hobby or a way of making money, though it can and does fit easily into both categories. When I say I’m a writer, I’m saying it is as important an activity as eating or sleeping; going without it for too long leads to moodiness and agitation.

Naturally, with writing one also ends up reading. To write means to record ideas, and you can’t do that unless you’ve first gathered the inspiration to form ideas worth recording. Ideas, information and knowledge are things I’ve treasured, along with writing, from a young age. In more recent years I’ve also become passionate about the importance of establishing the difference between ‘opinion’ and ‘fact’, whether historical or current, and the methods by which we go about making that distinction. Do you believe things based on evidence? What, in your mind, constitutes evidence? Hint: it isn’t always what people try to say it is.

Thinking about this is especially vital in the age of the Internet, where we’re constantly exposed to opinions presenting themselves as fact, and vice versa. Unless you want to believe everything, or nothing, or just stick to the inherent bias you grew up with, you had better develop an eye for what constitutes evidence and a good argument. There’s a lot of bullshit out there, but that’s not to say I don’t value the Internet extremely highly; it has arguably helped make my generation the most open-minded of any before us. Growing up with so many easily accessible ideas around us has, in general, been healthy.

I find it hard being around people who do not care about these things, who accept ‘truths’ based on bias rather than critical thought; I find it offensive, even insulting, to see and hear that kind of thing in my presence. I’m not easily offended, but this, you could say, is one of my ‘triggers’. Journalistic integrity and freedom of speech are two of the absolute pillars of a free-thinking society, while censorship lies at the opposite end of the spectrum (to be clear, by censorship I do not mean age ratings on products like movies and video games, which are often helpful and entirely necessary).

In my mind the acts of writing and critical thinking go hand in hand, though I know this is not the case for everyone – I have read plenty of writing in which it was clear the author was not a critical thinker. Nor must one be particularly intelligent to write a lot, though being a good writer (volume doesn’t necessarily correlate with quality) requires knowledge, not only of your craft but of the world around you.

Naturally, then, the best writers also tend to be among the smartest, though that depends on how you think we should judge these things. Do we judge a writer by how clear and concise their style is, or by how much knowledge they communicate through it? I suppose the best of them have both qualities. I certainly strive for both.

I grew up in a relatively ‘free’ family environment, with parents who weren’t overly strict and didn’t force any particularly weird rules upon me. It was an environment in which I was free to play video games, watch films and read books without having to worry about which ones were ‘banned’, though at the same time neither of my parents was especially interested in those things, and so neither instilled any inherent bias for or against them. Each medium played its part in helping me grow up relatively open-minded, with an understanding that the world was bigger than my own little bubble.

To an extent, I do consider an open mind to be a privilege – one that many people who grow up in different family environments aren’t encouraged to have (not that I was particularly encouraged towards it, but it wasn’t heavily discouraged either). Would I really have had the same learning opportunities, the same privilege of experiencing different sides of the world at an age when my mind had not yet grown hardened to them, had I grown up in a strict religious family, for example? Likely not.

I find it a great shame when parents take it upon themselves to mould their children into who they want them to be (“for their own good!”), rather than allowing the child space to discover themselves as an individual. This doesn’t just happen within fundamentalist religious families either, and it isn’t always obvious. But as the subject of religion is a sore point for many, including to an extent myself (which I will explain a little further on), let’s stay on it for a moment.

Looking across the history of Western civilisation, society and culture in the UK, US and Europe have been moulded by Christianity to the point where people grow up believing – often subconsciously, before coming to ‘know Christ’ and being ‘born again’ later – in God, particularly the version of him portrayed in the Bible. Horror movies and literature in the West often portray demons or the devil himself as the source of all evil. In a court of law, witnesses are traditionally sworn in on the Bible in some vague appeal to their conscience; a reminder that God is watching and they’ll somehow be punished for not telling the truth in front of Him.

Not that I want to get too deep into that issue here; what I’d rather do is illustrate how our ability to be open-minded can be inhibited simply through the culture or environment in which we grow up. If you grew up in the UK like me, you’ll be familiar with our inherently Christian culture. The US is similar, if not more extreme in how Christianised its culture is, though the secular/religious divide is arguably more pronounced (or at least more vocal) there as well. The UK, while moderately liberal, is also less willing to voice concerns over things like our monarchy, when we really should.

Now, I think it’s fine for people to acknowledge they’re not ‘open-minded’ about certain things, so long as they are aware of it. PC culture would dictate that we be respectful of everything, down to the tiniest detail, but we’re all inherently different to begin with and naturally aren’t all going to see things in the same light. Some people don’t like swearing, others do. Some of us like eating meat, others don’t. People on either side, or somewhere in the middle, should be able to live how they want. Don’t rely on the approval of others for that. Equally, don’t expect everyone to be fully accepting of it.

Each of us has our inherent biases; open-mindedness is being able to recognise that bias and acknowledge that there are people coming from a different point of view. So long as that point of view doesn’t cause or advocate harm to others – which, again, is where religion can pose a bit of a problem – there’s no reason we can’t all respect each other as fellow humans while acknowledging our differences and not getting offended over stupid shit.

My own bias plays into how I’m writing this article. Why is it, for example, that I feel the need to say swearing is okay, when really most people don’t need to be told that to do it anyway? And why do I focus on the importance of respecting points of view other than your own? Well, it comes back, again, to religion – more specifically Christianity: a religion that did not dictate too harshly how I should live my life growing up, but did at least subtly hold me back from fully expressing myself. Looking back on it, and seeing the effect it has on others as well, it’s clear this is what it does.

The unique thing about Christianity – at least, the Protestant side, of which I have direct experience – is that it does not say you must obey its rules, and yet you kind of do, because if you don’t, it means you don’t really love Jesus and will go to hell anyway. A little slip-up is okay, but you must live the correct lifestyle consistently if you’re a ‘proper’ Christian. And boy, being told you’re “not a true Christian” is regarded as the highest form of insult. It’s something they’ll use against me, to discredit my own experiences, because in their eyes only someone who was “never a true Christian” could ever wish to turn away from it.

In many cases, Christians will use that one line as an all-encompassing excuse not to truly engage with the issues raised by those who disagree with them. In fact, looking back on my years of being around Christians, many of those relationships involved, in the interest of ‘accountability’, carefully examining each other to determine whether one was a ‘proper Christian’ – with each making that determination, privately or publicly, about the others.

Christianity is supposedly about choosing to do the right thing through your own free will. But free will, of course, only goes as far as our inherent bias lets it – and this religion knows that all too well. It teaches the ultimate form of bias – that when we get to heaven, we’ll want to obey God without question, out of free will, because that will be our inherent nature. For now, on earth, we must deal with our ‘sinful’ nature, which wants to do bad things against God.

I’ll continue that diatribe another time – there is so much more to say – but for now rest assured that I’ve managed, though it has taken a lot of work, plenty of inner conflict, self-justification and thorough research, to largely let go of the hold Christianity had over me growing up and even up until a couple of years ago. Which isn’t to say, of course, that I have anything against Christians as people, though they can’t seem to help taking it personally (and I suppose one can’t blame them, if they believe with honest conviction) when others tell them they think their religion isn’t true.

The single biggest factor in breaking free from the confines of certain aspects of a religion, or anything else, is being open-minded enough in the first place to even consider that you might be wrong. Of course I’m not saying that one necessarily leads to the other (plenty of open-minded Christians have helped carry the faith out of the dark ages – while many conservatives/traditionalists/fundamentalists would claim that’s precisely the problem), but it’s certainly rare for anyone to leave their religion unless they’re open-minded enough to consider something other than what they’ve been conditioned to believe is true. They could, having considered everything else, still settle on Christianity being the truth, and I wouldn’t begrudge them that; it’s their prerogative to believe what they want, just as it is mine.

But if you consider it impossible for yourself to be wrong about something as ‘big’, as important as this, then you’re going to see opposing viewpoints through that specific lens. And naturally you’re going to shut yourself off from learning specifically why people might hold different points of view, because in your mind, in your version of reality, they’re already wrong and you – say, through the Bible – already have all the answers you’ll ever need.

Or maybe it’s more that, deep down, you’re terrified of realising you were wrong, having to admit it to others, and the damaged relationships that would inevitably result. I can understand that concern. I’ve seen it before, in people who stick with the Christian lifestyle not because they passionately believe in it, but because they perceive it to be simpler than the alternative, especially if they have a family of their own or friends who look to them for spiritual support. The number of Christian pastors hiding this kind of secret – feeling the weight of responsibility to ‘lead the flock’ and the fear of letting them down – would shock the everyday churchgoer.

I have realised I may need to qualify what I say a little here, for those who may not know the full context surrounding my current opinions. First, if I seem overly negative towards Christianity, now or at any point to come, it is not necessarily an attack on its principles or even on the faith itself. Many Christians I’ve known are the liberal type who do not adhere strictly to everything the Bible says, or take it literally in the face of all scientific evidence to the contrary. Those people are Christian simply because the lifestyle makes the most sense to them, and that’s fine.

However, let’s bear in mind what I said about bias. I am a UK citizen, yes, but more than that: I was born and lived in Belfast, Northern Ireland up to the age of 18, at which point I moved over to England for university.

Now, I’m going to assume any potential readers won’t quite realise the significance of that, so I’ll divulge some more. In Northern Ireland, as most people will know, we have a bit of a history of conflict; a kind of Irish ‘civil war’ as such, with roots going back to the Act of Union in 1801, which absorbed Ireland into the UK largely against the will of the Irish people. Long story short, in 1922 the Irish Free State was formed as Ireland won a measure of independence from Britain (its parliamentarians still had to swear an ‘oath of allegiance’ to the British crown until that was abolished in 1933; a new constitution approved by referendum in 1937 then removed most remaining ties, with a fully independent republic finally declared in 1949).

At the same time, the predominantly unionist (that is, loyal to the union of the United Kingdom) six counties of Northern Ireland wanted no part of Irish independence from the crown; Northern Ireland itself had been created as a separate entity in 1921, and formally opted out of the Free State in 1922. Republicans (that is, those committed to seeing a fully independent Irish republic) have always taken issue with this, just as unionists took issue with southern Ireland trying to take away what they saw as their British identity. Even today, Northern Ireland sits in a unique position, in which its residents can claim to be Irish or British and neither would be lying; we are, after all, entitled to dual citizenship from birth should we wish to claim it.

A large part of the origins of that conflict between Ireland and Britain was this: Ireland remained a largely Catholic country, whereas Britain, from the Reformation onward, was Protestant. So while technically you could say that means they were both ‘Christian’ – no. Trust me, growing up in Northern Ireland it’s impossible to see ‘Protestant’ and ‘Catholic’ as equally Christian. You’re either on one side or the other, and our version of ‘peace’ is tolerating the other side while those old grievances still reside in the back of our minds.

Growing up in a predominantly Protestant area, I naturally absorbed that bias too. But now, at this stage of my life, I see it all for what it is. Some others of my generation – usually those who have never lived outside Northern Ireland for any sustained period – still hold that strong bias, and probably always will; I firmly believe it becomes harder and harder to let go of built-in beliefs the older you get. None of us wants to feel we wasted years of our lives being wrong about something, so as time goes on we become more likely to make excuses to ourselves that help us keep believing – partly for the pride of being known as someone who ‘sticks to their convictions’ rather than someone who ‘flip-flops around changing their mind’.

The elephant in the room when it comes to religion and conflict in Ireland, of course, is the claim I made to myself and others for many years: that the violence perpetrated in the name of God was committed by those who “weren’t truly Christian”. This is like what I said before: Christians distancing themselves from actions they don’t like, carried out by those who seemingly share their faith, by simply dismissing them as “not the God I believe in”. If other believers aren’t acting the way you think they should, just keep yourself happy by saying they’re not ‘proper Christians’ and move on, free of any guilt and/or responsibility on the part of your own personal faith in God. Something similar is happening on a more global scale with Islam currently, but I won’t be touching that hot topic here.

Obviously we shouldn’t paint everyone with the same broad brush. We’re individuals, and we’re human, which means we all have different tendencies. Some of us gravitate more naturally towards violence, though there are environmental factors influencing that too. Still, it’s undeniable: the Irish ‘Troubles’ have their origins firmly rooted not just in patriotism but in the religion that goes hand in hand with it.

Christians on the outside looking in may try to protect their own belief in the loving nature of God by claiming the fighters don’t represent him, but that’s precisely why they were fighting. Unionists would resist Irish rule “for God and country”. In their place, would you not do the same to defend your own deeply rooted convictions and beliefs? The men on the ground, murdering each other for a higher cause, believed on either side that they were doing God’s will – and it would not have been uncommon to see those same men in church on a Sunday morning, having taken part in terrorist acts during the week and planning more for the week to come.

All of this leads up to where Northern Ireland stands today. Belfast itself is an impressively modern city, attracting tourists from around the world, with parts of it, particularly the city centre, looking a world away from the depressingly grey colours associated with the 1970s. I truly enjoy being back, for the most part.

But it’s not all great. Our government serves as a stark reminder of our recent history, not only in its finely balanced unionist/republican divide (the intricacies of which are too complicated to delve into here) but in the hold that religion has over us. Gay marriage is still illegal, and our largest party, the DUP, has vowed to continue blocking it (consensual gay sex, meanwhile, was only decriminalised here in 1982). Abortion is legal only under extremely strict criteria, and Northern Irish women often need to travel to England for private treatment. Bars and clubs are forbidden from serving alcohol before 11.30am (whereas in England you can grab a beer from 7am in Wetherspoon’s if you feel so inclined).

Whether you feel strongly about the above issues or not, it’s indisputable that Northern Ireland feels a little left behind, even compared to other regions of the United Kingdom. Of course, we have enough conservative Christian unionists living here that the population is generally happy with things as they are, seeing it as sticking to the rules set out in holy scripture. Personally, I feel almost embarrassed by this stuff, and can’t see myself ever coming back to live long-term in Belfast unless certain things change.

Living in England introduced me to many Christians who were more open-minded than those I’d always known in my homeland. And, well, I’ve simply carried on from there, never really wanting to stand still, always keen to learn more. I don’t feel any blind loyalty to one way of thinking, and I don’t consider myself a nationalist in any sense of the word.

There’s one other element that went into all of this that can’t be overlooked; in fact it may be the most important one of all. I mentioned near the beginning how films and video games had been an important part of my childhood. One can’t be truly passionate about either of these mediums without encountering other cultures in the process. Two of my favourite video games, for example, are the survival horror game Silent Hill and its classic PS2 sequel (both developed in Japan), which first introduced me to the subtle, atmospheric style of horror distinctive to Asia.

Around that time, J-horror was also starting to take the film industry by storm, with Hideo Nakata’s Ringu inspiring a 2002 Hollywood remake starring Naomi Watts. The wave ended up being rather short-lived, with Ju-On: The Grudge (2003) and its 2004 American remake coming along at the tail end of it, but it can be credited with sparking my interest in Japanese cinema and, more broadly, Asian culture. Why is this significant? Well, naturally, the more you see of the world, the less you feel you lie at the centre of it. Perhaps something I read recently can help sum it up: “A stolid attachment to a monolithic set of institutional forms becomes much more difficult when one is constantly faced with the beliefs and disbeliefs of many other traditions” (from the introduction to Ghosts and the Japanese by Michiko Iwasaka and Barre Toelken).

This, I believe, is why many Christians steadfastly refuse to engage openly with other ways of thinking: deep down they know it could lead to them questioning themselves, and ultimately to ‘losing face’ should they begin to doubt their own faith. So they build caricatures and stereotypes of other worldviews and belief systems, because that makes it easier to paint themselves as the ‘enlightened few’ who hold the One truth. Martin Scorsese’s recent film Silence summed up the inherent cultural differences and conflicts between East and West quite succinctly, I think.

Sure, Christians may go on ‘missions’ with a view to ‘evangelising’ to those caught up in cultures they see as less enlightened, but they do not truly engage with the culture they find when they get there, beyond the gestures one must make so as not to appear rude – such as taking your shoes off at the door when entering a home in Japan. Even at the peak of my faith I could not help but feel a little awkward and uncomfortable at the idea of ‘mission’: spreading the gospel to those we see as less fortunate than ourselves. They’d return talking about how they ‘learned so much’… but I wonder how much they really did learn.

I wanted to set out this context so that anyone reading may understand my point of view a little better. I’m not saying others who went on a similar journey to mine would come to the same conclusions. I know some may read what I say about religion or Christianity and say “well, that’s not my experience”, and that’s cool. This is just me. Find your own way, but don’t let that way be dictated by blind loyalty, dodgy reasoning or a fear of changing your mind. Who knows… letting go of those things may help open the doors to something new.

Uncategorized

Quick (or not so quick) Update.

Here’s what I have planned for this blog in the near future, in case anyone thought I’d given up on it.

Video games: my ‘20 Years of PlayStation’ series is still ongoing. Next on my to-do list are two of the greatest horror video games of all time, and two of my favourite games in general: the original Silent Hill (1999) and its 2001 sequel. I figured it would be fitting to get both of these out – or at least one – by the end of the month; we are in ‘Halloween’ month, after all.

Speaking of which: around Halloween last year, while I was making the case for why the horror genre is not only great but essential, I promised another film essay, focusing on The Babadook. Granted, I kind of slipped on that one, though it’s always been on the backburner, and hopefully I’ll have it out by the end of October too. Believe me, I’ve thought so much about this film – my top film of 2014 – that it won’t be too difficult to get a detailed analysis down in coherent words and click publish. In fact, I’d already started working on it around this time last year.

Looking back at my ‘film essay’ category, I see that I haven’t actually published one here since last July, which really is too long, especially considering I was going at a pace of around one per month up until then. There are two others planned immediately following the next: Nightcrawler and Ex Machina, arguably two of the most overlooked films of the past couple of years, and certainly two of my absolute favourites, so I want to do them justice.

Originally I had planned my ‘20 Years of PlayStation’ series to proceed, like the film essays, at a pace of around one per month. Obviously that hasn’t happened, for various reasons – not that I’ve just been sitting around; I’ve had other things to focus on in the meantime. So what I’m going to do is, at the very least, get out the two Silent Hill articles (honestly, writing about either of those is an almost limitless joy), then write up something about Final Fantasy VIII (1999), my favourite childhood game and one belonging to a series that frequently splits even its own fans. I’ll be making my case for why VIII, rather than its predecessor, was the peak of the series overall.

After those, I’ll assess whether it’s worth continuing ‘20 Years of PlayStation’ at all. In reality it will probably end with the year 2016 (as we will then technically be into 21 years and so on), and I’ll instead focus on more modern stuff again.

I’ve also been working on an article about performance-enhancing drugs in sport, after a year in which we’ve seen a few high-profile doping offences and accusations. That one doesn’t entirely follow the politically correct narrative – I lean towards allowing some PEDs to be used in a controlled manner rather than banning everything outright – but I’m writing it mainly to shed some light on the things people tend to overlook when it comes to ‘cheating’ (the blanket term for any offence) in sport.

Otherwise, there are four other prominent article ideas I want to finish and publish here by the end of the year. The first: a piece tackling review ethics and those who deride critics for any reason, from butt-hurt fans to people who accuse us of just being ‘haters’ who don’t know how to enjoy things.

I have a strong belief when it comes to critique: it should not tell you what to think about a film, video game, or whatever the product or service may be, but rather help you develop how you think about it. Reviews above all should inform the consumer – they’re not about telling people what they should or shouldn’t enjoy, as if there were some objective standard. Something I love may be something you hate, because everyone has different tastes; but the detail I give about that thing should be enough to tell you how you’re going to feel about it, independent of my own opinion.

Linked to this but worthy of its own article, I’m going to go into the impact that films, video games and books have each had on me personally in terms of my own development. Certain aspects of modern society actively discourage critical thinking and open-mindedness – in fact, I think it’s always been like this, but today’s culture of political correctness means we hear things like “you can’t say that” more than ever, especially on social media (my advice: whatever kind of person you are, it’s healthy to have less of that in your life).

That’s why I think this is important. Art is vital for helping people think outside the confines of the masses; it’s why I value artistic integrity and freedom of expression so highly. Many people who have a single-minded approach to issues in life, on the other hand, don’t. I heard a statement recently that stuck with me: an open mind is a learning mind. Rarely has a truer statement been made throughout history.

My final two planned articles for the year have been an even longer time coming. They are: my Best Films of 2014, and Best Films of 2015.

Now, obviously I understand that most people who do this sort of thing prefer to make an ‘end of year’ list and leave it at that. It’s a nice way to wrap up the year in film, but for me none of those lists are definitive. Not that I’m saying mine would be, but here’s the thing: I consider a film that comes out in 2014, regardless of where it first comes out, to be a 2014 film.

For example, a film released in the UK in, say, early 2015, yet featuring heavily in awards season, is undoubtedly a 2014 film (Damien Chazelle’s Whiplash, for instance) – because the Academy Awards reward the best films of the previous year. Such a film will have been out in the US a few months before, but many of us living elsewhere would not have had a chance to see it yet, and it is therefore, by default, left off the year-end lists.

From my perspective, then, making a list at the end of a calendar year would feel a little silly, bordering on dishonest, as the best films released in the UK that year would represent only around half – if that – of the year’s best films overall. I like world cinema – films from Europe, Asia and elsewhere – and it usually takes a year or so to catch up on films from those places as their releases gradually filter out across other regions. I prefer to include those in my lists, as I want each list to be as definitive and conclusive as possible.

The other thing to note is my dislike of limiting said lists to a ‘top ten’, again usually done for efficiency (I understand; critics are busy, and wrapping up a compact top-ten list at the end of the year is simpler than the method I’m advocating). The ‘best’ films of a year may not be limited to just ten – or, in an extremely dry year, there might not even be ten worthy of inclusion.

Now, most critics actually agree with this to an extent – hence the ‘honourable mentions’ that don’t quite make the top ten. To me that’s curious (why name-drop if you’re not going to detail your reasons?), but again I sort of understand why one would: it saves time, and a ‘top 10’ is essentially more marketable than, say, a ‘top 13’. I have more flexibility in my personal schedule and don’t see why I should restrict myself in that way when I’m not required to.

So basically, my lists will feature the best films of each year, whether that’s 10, 12 or 15 movies. The 2014 list is almost ready to go, and realistically I hope to have it posted here by the start of next month; 2015, hopefully by the end of the year; and as for my 2016 list, well, I’m thinking summer 2017 at the earliest. The good thing is that, as I’m about to hit another film festival – my second such event of the year – I’ll have a decent head start on a lot of the biggest films in the upcoming awards season. I’ll probably write an article around Oscar time too that will give large hints as to the films I found most impressive over the past year.

One final thing… I plan to do brief film previews (yes, I am capable of writing shorter pieces!) every Friday. This will give me an opportunity to look forward to new movies that catch my eye – ones that won’t necessarily get the mainstream marketing treatment – and share them with you. I’m frequently finding new stuff to get excited about, so there’ll be no shortage of things to write about here, and I figure it might be useful to have a category for which posts are regular and somewhat set in stone going forward. That way, you could turn up here every weekend and know you’re at least getting something new, even if I haven’t otherwise written anything of great existential meaning.

Speaking of existential meaning, I’m off to prepare for one of the best times of the year: London Film Festival.

Video Games

Silent Hills/P.T. still haunts us… and it’s glorious.

Rewind to August 12, 2014, with any kind of passing interest in the Silent Hill series, and you’d have found yourself in the midst of an online explosion of hype surrounding a certain demo titled P.T. (short for ‘Playable Teaser’). This was the disguised teaser for Silent Hills: what was to be the latest instalment in the much-loved survival horror franchise about a sleepy American town perpetually shrouded in fog and inhabited by deformed monstrosities.

If the unsettling atmosphere of P.T. promised us anything, it was that this new iteration was set up to succeed – an anticipated return to form for a series that had long since lost its way after passing into the hands of Western developers post-2004.

I’ve talked about the whole episode on this blog before. I was there to take part in the hype when the game was first announced and set the internet ablaze, and again when, in April 2015, rumours began circulating that the project had been cancelled, amid Hideo Kojima’s reported issues within Konami and the company’s seemingly abrupt change of focus from console to mobile gaming. On April 27, 2015, those rumours were confirmed and Silent Hills was officially cancelled.

As of writing, P.T. is no more (I say that in the hope that it will, at some point in the future, be made available to gamers once again) – the demo pulled from the online store by Konami as they look to erase any evidence of their past failures. Now the only people able to play it are those who had already downloaded it onto their PS4 systems pre-cancellation. But in its short existence it became one of the most popular horror ‘games’ in PlayStation history, going on to gain undoubted cult status and a somewhat mythical quality that only keeps growing. YouTube play-throughs of the teaser (a 30-45 minute experience at best) remain as popular as ever. Many fans still hope there is some way the project might be picked up again, though most of us accepted a while back that it was time to move on and that Silent Hills likely isn’t coming back any time soon.

A lot of other people, to whom the words ‘Silent Hill’ meant nothing before and don’t suddenly mean anything now, won’t understand the hype. That’s fine; cult movies and video games wouldn’t be cult if everyone ‘got it’.

Some of those who did get it have since taken to paying homage to the source material in creative, imaginative ways. If Silent Hills never comes, it will at least have cast the shadow of what could have been over other artistic endeavours – such as the video below, a short film that has been doing the rounds online over the past few days. This impressive homage is the main reason I’m writing about P.T. again today. It admittedly made me pine for what’s past, but it also feels like something entirely fresh. This is the kind of mark Silent Hills has left, and quite possibly a sign of the influence it may yet have on horror movies and video games to come. In that way, the Silent Hill game that never was may just live on forever. Enjoy!

Film reviews

Crimson Peak.


Guillermo del Toro is arguably the greatest visionary director working in Hollywood today – and Crimson Peak shows his talents at their visually vibrant best.

It should go almost without saying that from a production and costume design standpoint, del Toro’s latest ‘Gothic horror-romance’ delivers in signature fashion. The film is gorgeous in the typically dark, eccentric kind of way one expects from del Toro, and the team working with him on this movie – including composer Fernando Velazquez and Danish cinematographer Dan Laustsen – play vital roles in bringing his vision for Crimson Peak to life.

Laustsen’s cinematography was easily the best thing about the 2006 Silent Hill movie – an adaptation of the video game series in which del Toro himself was briefly involved alongside Hideo Kojima, via the false dawn that was the failed Silent Hills project. Others who were fortunate enough to sample the P.T. demo – which remains the only evidence of what the director could have achieved with that title – will similarly pine for it, as I did at certain points while watching Crimson Peak. This film features a lot of ‘encountering strange things in corridors’, and one scene in particular has Mia Wasikowska’s character Edith approaching a partially open door before it’s slammed shut by something on the other side. Sound familiar?

Edith Cushing (Wasikowska) is the young woman around whom the film’s narrative revolves, though she is not its most interesting character, nor is Wasikowska given much to work with in terms of character development – her most taxing moments are reactions to the events happening around her. Tom Hiddleston, on the other hand, provides a much more intriguing character and performance in his role as the charming, shady English aristocrat Sir Thomas Sharpe.

Tom Hiddleston delivers what is undoubtedly the movie’s most intriguing performance.

Sharpe’s overall character arc is by far the most interesting in the movie and perhaps the strongest element of the film’s script, in contrast to his sister Lucille (Jessica Chastain), who ultimately appears as no more than a villainous caricature. Hiddleston’s performance, one of the best I’ve seen from him, makes you feel suspicious, curious and even sympathetic towards his character, and not necessarily in that order.

The story begins with Sir Thomas visiting Edith’s father and his business associates, to whom he pitches his clay-mining invention in search of investors. Though his proposal is rejected, he and Edith become romantically involved, and Edith soon finds herself visiting the dilapidated old mansion where Sharpe and his sister live. This takes place in a world where ghosts, it seems, are commonplace, and Edith has always been able to see them (though it’s left unclear whether anyone else can). Her harrowing – albeit harmless – encounters with spirits at the Sharpes’ old house make her begin to question the underlying intentions of her newfound romantic interest and his family history…

To be honest, Crimson Peak’s plot is not its strongest attribute, nor is its execution, at least until the exhilarating final 20 minutes. Up until that point, the moments I touched on above – when Edith encounters strange happenings in the mansion’s corridors – are entertaining as spectacle yet lacking in any real narrative substance.

The film, despite appearances, isn’t truly a horror movie, and part of me feels it would have been better served going fully down that route rather than ending up the Gothic semi-love story that it is. The ghosts are not there as a main attraction; they’re present for effect, as a side note to it. And any time the movie does try to frighten, it does so by utilising those dreaded jump scares that too many American horror movies fall back on.

Ironically, although Crimson Peak made me pine for what Silent Hills could have been, the main thing it lacks is what the P.T. demo had in abundance: atmosphere. The film has some nice effects and looks beautiful, no disputing that, but it all feels so surface-level. There’s a severe shortage of subtle undertones or metaphorical storytelling – major plot points are basically spoon-fed to the audience.

Visually this film is stunning, and for many people that alone will be enough to make it a satisfying experience.

I was left feeling slightly underwhelmed by Crimson Peak in the end – which is not to say there weren’t parts of it I very much liked, of course. Part of me just really wanted to like it even more; to name it a clear ‘film of the year’ contender, perhaps… but ultimately the film fell just short of that for me. While I definitely see it as a strong contender to win two or three Oscars next year, it didn’t truly resonate on an emotional level quite as much as I had hoped.

8 / 10

Video Games

20 Years of PlayStation.


Yesterday marked twenty years since the original PlayStation was first released here in the UK.

I didn’t get in on the action until three years later; the PlayStation became my first proper game console (not counting a borrowed Game Boy) in 1998, around the time Metal Gear Solid was changing how people looked at a previously infantile-seeming industry. No doubt about it – Sony’s PlayStation was at least partially responsible for making the gaming medium seem mature and even ‘cool’.

I wanted to mark this special, momentous anniversary with something kind of unique. But I couldn’t think of anything, so instead I’m just going to write some more about video games – in particular those that have been synonymous with PlayStation over the years – over the next few weeks and months on this blog. I might even make a new category for it.

Games I’ll cover during this period will include:

Resident Evil (1996), Final Fantasy VII (1997), Metal Gear Solid (1998), Silent Hill (1999), Final Fantasy VIII (1999), Final Fantasy IX (2000), Silent Hill 2 (2001), Grand Theft Auto III (2001), Metal Gear Solid 2: Sons of Liberty (2001), Kingdom Hearts (2002), TimeSplitters 2 (2002), Grand Theft Auto: Vice City (2002), Resident Evil 4 (2005), Shadow of the Colossus (2005), Final Fantasy XII (2006), Okami (2006), Journey (2012), The Last of Us (2013), Grand Theft Auto V (2013)…

Note that this list will likely be adapted in the near future (I’m sure there are some I’ve left out), but for now these are the games that come to mind when I think of how the PlayStation has impacted me personally. Also bear in mind that this is not supposed to be a list of the best PlayStation games – I wouldn’t claim to have played enough to make that kind of call. I do think it is at least a list of some of the most important games released in the past 20 years, not just in relation to PlayStation but for the industry as a whole. I’ll aim to explain why I think so as I cover each one – and yes, they probably will come across as essays. I have included games in this list about which I feel I have something useful to say – indeed, most of which I feel have something useful to say to us – and I will try my best to say it without pandering to those who have the attention span of a fish.

If the list seems to grow sparser as the years go on, that is quite simply because I think the general quality of games (on console at least) has somewhat declined in those years. This may seem an outrageous claim considering that games are technically ‘better’ today than they’ve ever been. But consider, for a moment, the reasons why you might think that. Consider the trends that have gripped the industry in the past ten years.

Or don’t worry about it right now and just bask in the nostalgia, as I am doing. At 25 years old, I don’t just feel like I grew up with these games – when it comes to PlayStation, I literally did grow up with it. We have matured alongside each other, and to look at the PS4 now without truly admiring where it came from would be an injustice that I’d like to think current and future generations won’t commit. But I realise, in reality, how fortunate I am that this is the time I’ve grown up in. It will never quite be recreated, because it was our time; no one else’s. The era of the PlayStation has run its course – or perhaps, in a sense, it is only just beginning.

One final point: I won’t be tackling the above games in the order I’ve listed them. This is partly because doing so would be formulaic and possibly boring. I will cover each of them as and when I feel it is relevant – in other words, precisely when I feel like it.

Video Games

A few brief words on Satoru Iwata.


Aside from owning the original Game Boy back in the day, which was used almost exclusively for Pokemon Blue (and then Silver) as well as the occasional game of Tetris, I’ve never been a big Nintendo guy.

My first home video game console was the PlayStation in 1998. The NES and SNES were well before my time, while the Nintendo 64 was very much in second place behind Sony’s console by the time I became truly intrigued – indeed, it was the PlayStation’s impressive selection of games (Metal Gear Solid, Resident Evil 2 and Final Fantasy VII were already at the height of their popularity, with Silent Hill to grace the landscape soon after) that first got me so interested in what the industry could be capable of. The GameCube would find itself in a similar spot next to the PlayStation 2 a couple of years later, its faltering position exacerbated by Microsoft’s entrance into the market with the Xbox.

Enter Satoru Iwata, who took over as president of Nintendo in May 2002. That is not to say his legacy with the company began there, of course; in 1999 he had assisted in the development of Pokemon Gold and Silver, creating a set of compression tools that allowed the games to hold almost twice as much content as originally planned. That’s right – you have Iwata to thank for the exceptionally huge post-game of those titles, in which you could explore a whole other region after completing the main story. To this day Pokemon Silver is still one of my all-time favourite games, due in no small part to that aspect of it.

Iwata became an official Nintendo employee in 2000, working as head of its corporate planning division. During his time in this role, profit increases of up to 41% over a two-year period were attributed (at least in part) to him. Such was his success and soaring reputation that Hiroshi Yamauchi (Nintendo’s president from 1949 to 2002) gave him his blessing to take over – especially significant as it marked the first time Nintendo had a president from outside the Yamauchi family line. It was a role Iwata held right up until his death earlier this month (July 11), and he certainly made it his own.

During his tenure Nintendo re-established themselves as a major player in the home console market, as well as further strengthening their hold on the handheld one. The Nintendo DS, released in 2004, has gone on to become the best-selling handheld console ever, and the second best-selling console overall behind the PlayStation 2. I invested in a 3DS myself just last year for the nostalgia trip that was Pokemon Alpha Sapphire, and I don’t regret that decision one bit. Ten years on, the DS line is still going strong, with only incremental improvements needed to keep it feeling modern and up to date with the competition (though that competition is admittedly light at the moment).

The Wii came out in November 2006, pretty much alongside the PS3, and though Sony’s console took a while to settle, Nintendo’s hit the ground running, aiming for a broader, family-oriented demographic. Now, I admit to never having liked the Wii very much for this reason – it doesn’t exactly help the general impression that all games should be kid-friendly. For as much as there is a place for that, I believe there should also be a place for more mature gaming experiences that tackle more serious issues, and the runaway success of the Wii may have set the industry back some years in that wider context.

When most parents think of video games now, the Wii probably comes to mind – along with the relatively harmless games that came with it, which the whole family can enjoy, from a six-year-old to your grandmother. This kind of stereotypical image is partly why Grand Theft Auto V still garners so much widespread controversy despite having a clear 18 rating on its cover (hint: that means it is unsuitable for children).

Still, my personal gripes shouldn’t take away from the Wii’s success, which it owed directly to Iwata. His decision to aim for a more casual gaming market smartly meant Nintendo would no longer be fighting a battle they couldn’t win against the Xbox 360 and PS3, which kept their focus on hardcore gamers (i.e. the 16-49 year old male demographic – a focus one could argue is just as detrimental to the industry long-term as my aforementioned gripe about focusing on families). The Wii’s release and subsequent success helped to almost double Nintendo’s stock price – another sign of the company’s upward turn under Iwata’s leadership.

Of course it was not all good; but even during the slight downturn of more recent years (the Wii U, released in 2012, has been underwhelming in comparison to the Wii’s success and fell behind in the console race once the Xbox One and PS4 arrived on the scene), Iwata led the company with dignity and retained the confidence of his employees. In 2011 he voluntarily cut his salary in half in response to poor sales – and did the same again in 2014. But in truth, by this point his legacy at Nintendo had long been cemented.

There is a quote from Iwata, made at the Game Developers Conference in 2005, that for me sums up precisely why he enjoyed such success and respect from his peers within this industry: “On my business card, I am a corporate president. In my mind, I am a game developer. But in my heart, I am a gamer.”

This man was a gamer first and foremost, and that is why I felt the need to write a little bit about him now. It may sound like a given that someone working in such a prominent position in this industry would also be a gamer, but the current trends of games being released broken (which, by the way, Nintendo simply don’t do) and of overpriced DLC tell me the businessmen in this industry are no longer in step with gamers. They are merely men who know business, brought here because they saw the potential for video games to be one of the most profitable industries on the planet. It is certainly that, but unless we see more men like Satoru Iwata around these parts again soon, I doubt it will retain its heart for much longer.

Film reviews

A Girl Walks Home Alone at Night.


Suspension of disbelief is crucial for viewing a lot of films – but for some it is distinctly more vital than for others. Ana Lily Amirpour’s directorial debut (it feels like we’ve had a lot of those recently), A Girl Walks Home Alone at Night, is among the best recent examples of this.

Set in the fictional Iranian ghost town of Bad City, the film follows the exploits of a mysterious young woman known only as ‘the girl’, who spends her nights stalking men with evil intentions and scaring little boys so they don’t grow up to become one.

It is shot entirely in black and white, which goes some way towards creating a dream-like atmosphere reminiscent of David Lynch’s groundbreaking debut Eraserhead – a comparison strengthened by the industrial setting, full of the sounds and imagery of that environment. Yet A Girl Walks Home Alone at Night bears its own unique signature, and never comes across as some half-hearted wannabe.

That unique signature is difficult to sum up in mere words. But while watching this movie, one can’t shake the feeling of witnessing something special – especially in the face of all else on offer in the film industry today. It may require a little patience to connect with, but it is guaranteed to win at least a part of your heart, if not your mind. Amirpour’s film does not worry itself too much with narrative, focusing instead on mood, atmosphere, memorable characters and raw emotion.

As a visual experience it is rich, in a rudimentary and refreshingly ‘primal’ way. The lack of colour suits the mood; this is not a film that needs to be bright in order to hold your undivided attention. It wears its heart on its sleeve, the ‘smoke and mirrors’ often offered by Hollywood stripped away to reveal what could almost be considered an anti-narrative, in which the most important details are frequently left unsaid.

What isn’t said is left to be communicated through body language. The film’s two main actors, Sheila Vand as ‘the girl’ and Arash Marandi as the young man with whom she forms an unlikely connection, both do a fantastic job with ambiguous characters.

Vand plays a character who is at times methodical and dangerous, at others naive and innocent. Her actions frequently shock (you may not see them coming), but her motivations, once we have worked out what they are, have a sympathetic quality. This film does have a moral compass: the violence it shows is not gratuitous, Vand’s victims are not undeserving of their fate, and its feminist theme is undeniable.

Ultimately, though, A Girl Walks Home Alone at Night’s strongest quality is the sense of fun running through it from beginning to end. It is a mix of multiple genres, probably best described as a static road movie built on a foundation of horror, with a trimming of comedy, wrapped in satire, complemented by a flavouring of romance that emerges out of tragedy. Oh, and its main character is a vampire with a curious taste in music.

If any of those traits sound like your kind of thing, you’ll certainly find something here to enjoy.

9 / 10