At face value, a review for this game should be easy. Ask yourself one question: would you consider yourself a Final Fantasy fan? Or have you never touched any of the series’ 15+ mainline titles in your life?
If you fit into the latter category, rest assured World of Final Fantasy offers little to convince anyone this is a good starting point. This is unashamedly a Final Fantasy title for the fans, of which there are many. So I’ll continue on assuming that anyone reading beyond this point has at least a passing interest in the series.
To set some context here, I’m the type whose introduction to this series came during its initial PlayStation run (I wasn’t yet born when the series itself began), more specifically with Final Fantasy VIII. I’ve played every title since, apart from the exclusively online XI and XIV, and even ventured back to sample the earlier IV, V, and VI. I think it’s fair to say, then, that I belong around the middle of the spectrum: not quite a hardcore fan who’s played every title to completion, but someone with a good working knowledge of the series and a deeper familiarity with the titles I have played. In other words, I’m enough of a fan to appreciate many of the references included in World of Final Fantasy, though a few did admittedly fly over my head.
At its core that’s essentially what this game is: fan service, in its self-referential nature, gameplay style, and characteristic meandering plot. Initially it feels rather like 2000’s Final Fantasy IX, itself seen as a romantic title harking back to the series’ earlier days before it moved on to the ‘new’ era with Final Fantasy X on the PS2. This is most obvious in its retro-feeling visual style, which is undeniably charming and, in its own way, beautiful.
Character models range from the cartoonish, child-like and disproportionate (a slight nod back to the 2D era of gaming, when every character’s head was as big as the rest of their body) to the more realistic and evenly proportioned. This is worked into the story, as the two main characters you control, Reynn and Lann, can transform between the former (known here as ‘Lilikins’) and the latter (known as ‘Jiants’).
While your party is, on the surface, restricted to these two characters for the entire game, that restriction gives way to arguably the game’s best attribute, at least in the earlier stages: its battle system, which revolves around catching ‘mirages’ (basically, monsters with a name that won’t put children off) and stacking them within your party.
Every mirage has different strengths, and they’re split into three different sizes – small, medium and large – which naturally leads to multiple playing styles and tactics you can employ. Will you stack mirages with similar strengths, or try to balance out their weaknesses? Fans will enjoy the nostalgic designs; being able to get on top of a Malboro’s head in battle is just one of the small joys this game offers. The return of turn-based battles, random encounters and the ‘active time battle’ system from earlier titles is also strangely refreshing, while levelling up occurs on a board similar to the method used in Final Fantasy X and XIII.
The system is not perfect, though. As every mirage begins at level 1 from the moment you catch it, you’re going to have an issue if you find one in the late game that you wish to add to your party. In my case I settled on a few stack selections I was happy with relatively early on, and didn’t vary them much beyond the half-way stage. On the flip side, battle difficulty also appears to wane slightly as you progress and your team becomes more powerful, when traditionally the opposite is true of RPGs, including other Final Fantasy games. This may be the game pitching its difficulty at younger players, though it remains interesting enough overall to appeal to more mature ones – if not to those who prefer a hardcore challenge.
While the gameplay represents World of Final Fantasy’s most addictive aspect, the characters you’re playing with represent something else entirely. Brother/sister duo Lann and Reynn are generic and stereotypical, the former filling the role of ‘annoying brat’ and his big sister being the typical know-it-all. But far and away the most irritating aspect of this game is their companion (i.e. mascot) Tama, an overly cutesy mirage who places ‘the-’ in front of random words every other sentence (she’ll regularly say things like “we have to the-run” or “time to the-catch the mirage”). Thankfully there’s an option to skip dialogue, and I wouldn’t blame you for using it every time Tama starts talking. I doubt even the average 10-year-old would find her enjoyable.
Generic and annoying main characters aside, others you meet on your journey around the land of Grymoire – basically an amalgamation of various regions from past Final Fantasy titles – help keep the experience fresh. These include famous protagonists from the series’ history such as Cloud (VII), Squall (VIII), Tidus and Yuna (X), Lightning and Snow (XIII), as well as several older characters who I didn’t initially recognise (though the first ‘summon’ you get in this game is a wonderful throwback to the original Final Fantasy; even I could appreciate that).
The game has a surprisingly clever sense of humour and regularly pokes fun at itself – including THAT unbearably awkward laughing scene from Final Fantasy X when you visit the region of Besaid from that game. Every time you catch a new mirage, you’re given a short description that may reference the monster’s past in other games, and the subtle jokes inserted in there never failed to make me chuckle. It is in this aspect that I think the game appeals to more mature players, as there’s no way kids are going to get the humour in most of the references. I certainly enjoyed this element of the game immensely.
Unfortunately, the backdrop to these wonderful references and nostalgia trips is a rather uninspiring plot that becomes unnecessarily convoluted the further you go in the game. This, like the annoyance one feels toward the central characters, exposes World of Final Fantasy’s weakness: its original elements (i.e. when it isn’t relying on nostalgia, borrowed characters and ideas) are utterly forgettable.
But for most fans, I daresay that won’t be a problem. It certainly didn’t stop me enjoying the overall experience for what it was. In fact, there came a clear emotional point in this game for me in which I couldn’t help but react with the kind of pure nostalgic joy that I haven’t felt since revisiting Shadow Moses in Metal Gear Solid 4. Obviously I won’t spoil it here, but I will say it was upon visiting a well-known location from my personal favourite game in this series, Final Fantasy VIII, at a pivotal point in the story. Yes, it was a joyful fanboy moment, and I have few of those.
So naturally I will be grading this title on a curve, the caveat being that those who aren’t quite as big a Final Fantasy fan may well find their enjoyment of World of Final Fantasy diminished accordingly. Technically this is a game with a few glaring flaws, but one that has the priceless value of nostalgia thanks to the extensive back catalogue the Final Fantasy series has built up over its 30-year history. Catching and battling with mirages admittedly has the air of Pokémon about it as well; you can even ‘transfigure’ them into larger mirages when you level them up or obtain certain items. For completionists, there is an unmistakable joy to be found in discovering them all. The rest of you may just be left wondering what all the fuss was about.
Elle, a 2016 French psychological thriller directed by Paul Verhoeven, stars Isabelle Huppert in a role for which she’s become a surprising front-runner for the Academy Award for Best Actress. It has quickly become one of my most anticipated movies of the year. Unfortunately, as it’s not due for release in the UK until March 10th, I likely won’t be seeing it until after the Oscars have been handed out on February 26th. But based on what we know of this film thus far, it deserves the recognition it’s receiving, and bearing in mind the subject matter, I’m still somewhat taken aback that it has attracted such attention in the first place.
The film’s central character, played by Huppert, is the female head of a video game company. The themes tackled include rape, violence and murder, involving Huppert’s character whether directly or indirectly – intriguing if only because these are themes negatively associated with the video game industry in recent years. This is no coincidence, I’m sure, and I’m keen to find out just how Elle tackles these issues; that, for me, will make or break the film, as I have my own strong feelings on the matter.
I’m making an educated guess that the movie tackles them intelligently and maturely, hence my eagerness to see it. Whatever the case may be, Elle promises to be a thought-provoking film for UK audiences to look forward to. No doubt you’ll be hearing more about this one in the weeks to come.
I’m one of those people who needs to write. I’d go so far as to say my health – perhaps my very survival – depends on it. That isn’t me trying to sound melodramatic.
No: like many other writers, I consider writing not simply a hobby or a method of making money, though it can and does fit easily into both categories. When I say I’m a writer, I’m saying it is as important an activity as eating or sleeping; to go without it for too long leads to moodiness and agitation.
Naturally with writing, one also ends up reading. To write means to record ideas on paper, and one can’t do that without first gathering the inspiration to form ideas worth recording. Ideas, information and knowledge are things I’ve treasured along with writing from a young age. In more recent years I’ve also become passionate about the importance of distinguishing ‘opinion’ from ‘fact’, whether historical or current, and the methods by which we go about establishing that distinction. Do you believe stuff based on evidence? What, in your mind, constitutes evidence? Hint: it isn’t always what people try to say it is.
To think about this is, I think, especially vital in the age of the Internet, where we’re exposed at ease to many opinions often presenting themselves as fact, and vice versa. Unless you want to believe everything, or nothing, or just stick to the inherent bias you grew up with, then you had better develop an eye for what constitutes evidence and a good argument. There’s a lot of bullshit out there, but that’s not to say I don’t value the Internet extremely highly; it has led to my generation becoming arguably the most open-minded of any generation before us. Growing up with so many easily accessible ideas around us has, in general, been healthy.
I find it hard being around people who do not care about these things, who may accept ‘truths’ just based on bias rather than applying critical thought; I find it offensive, and insulting, to see and hear that kind of thing in my presence. I’m not easily offended but this, you could say, is one of my ‘triggers’. Journalistic integrity and freedom of speech are two of the absolute pillars of a free-thinking society, while censorship lies at the opposing end of the spectrum (to be clear, by censorship I do not mean age ratings on products like movies and video games, which are often helpful and entirely necessary).
In my mind the acts of writing and critical thinking go hand in hand, though I know this is not the case for everyone – as I have read plenty in which it was clear the writer was not a critical thinker. Nor must one be particularly intelligent to write a lot, though to be a good writer (volume written doesn’t necessarily correlate with quality content) requires knowledge, not only of your craft but of the world around you.
Naturally, then, the best writers also tend to be among the smartest, though that depends on your point of view as to how we should judge such a thing. Do we judge a writer by how clear and concise their style is, or by how much knowledge they communicate through it? I suppose the best of them have both qualities. I certainly strive for both.
I grew up in a relatively ‘free’ family environment, with parents who weren’t overly strict and didn’t force any particularly weird rules upon me. It was an environment in which I was free to play video games, watch films, and read books without having to worry about which ones were ‘banned’, though at the same time neither of my parents was especially interested in those things and so didn’t instil any inherent bias for or against them. Each medium played its part in helping me grow up relatively open-minded and with an understanding that the world was bigger than my own little bubble.
To an extent, I do consider an open mind to be a privilege; one that many other people who grow up in different family environments aren’t encouraged to have (not that I was particularly encouraged towards it, but it wasn’t heavily discouraged either). Would I really have had the same learning opportunities, the same privilege of experiencing different sides to the world at an age where my mind had not yet grown hardened to them, had I grown up in a strict religious family for example? Likely not.
I find it a great shame when parents take it upon themselves to mould their children into who they want them to be (“for their own good!”), rather than allowing that child the space to discover themselves as an individual. This doesn’t just happen within fundamentally religious families either, and it isn’t always obvious. But as the subject of religion is a sore point for many, including to an extent myself (which I will explain a little further on), let’s stick on it for a moment.
Looking across the history of Western civilisation, our society and culture in the UK, US and Europe have been moulded by Christianity to the point where people have grown up believing – often subconsciously, before coming to ‘know Christ’ and being ‘born again’ later – in God, particularly the version of him portrayed in the Bible. Horror movies and literature in the West often portray demons or the devil himself as the source of all evil. In a court of law, people must place their hands on the Bible in some vague appeal to their conscience; a reminder that God is watching and they’ll be somehow punished for not telling the truth in front of Him.
Not that I want to get too deep into that issue here; what I’d rather do is illustrate how our ability to be open-minded can be inhibited simply through the culture or environment in which we grow up. If you grew up in the UK like me, you’ll be familiar with our inherently Christian culture. The US is similar, if not more thoroughly Christianised, though the secular/religious divide there is arguably more extreme (or at least more vocal) as well. The UK, while moderately liberal, is also less willing to voice concerns over things like our monarchy, when we really should.
Now, I think it’s fine for people to acknowledge they’re not ‘open-minded’ about certain things, so long as they are aware of it. PC culture would dictate that we need to be respectful of everything, to the tiniest detail, but we’re all inherently different to begin with and naturally aren’t all going to see things in the same light. Some people don’t like swearing, others do. Some of us like eating meat, others don’t. People on either side, or somewhere in the middle, should be able to live how they want. Don’t rely on the approval of others for that. Equally, don’t expect everyone to be fully accepting of it.
Each of us have our inherent biases; open-mindedness is being able to recognise that bias and acknowledge there are people who’ll be coming from a different point of view. So long as that point of view doesn’t cause or advocate harm to others – which, again, is where religion can pose a bit of a problem – there’s no reason we can’t all respect each other as fellow humans while acknowledging our differences and not getting offended over stupid shit.
My own bias plays into how I’m writing this article. Why is it, for example, that I feel the need to say swearing is okay, when most people don’t need to be told that to do it anyway? Or why do I focus on the importance of respecting points of view other than your own? Well, it comes back, again, to religion, more specifically Christianity: a religion that did not dictate too harshly how I should live my life growing up, but did at least subtly hold me back from fully expressing myself. Looking back on it, and seeing the effect it has on others as well, it’s clear this is what it does.
The unique thing about Christianity – at least, the Protestant side of which I have direct experience – is that it does not say you must obey its rules, and yet you kind of do, because if you don’t, it means you don’t really love Jesus and will go to hell anyway. A little slip-up is okay, but you must live the correct lifestyle consistently if you’re a ‘proper’ Christian. And boy, being told you’re “not a true Christian” is regarded as the highest form of insult. It’s something they’ll use against me, to discredit my own experiences, because in their eyes only someone who was “never a true Christian” could ever wish to turn away from it.
In many cases, Christians will use that one line as an all-encompassing excuse not to truly engage with the issues raised by those who disagree with them. In fact, looking back on my years around Christians, many of those relationships, in the interest of ‘accountability’, involved carefully examining one another to determine who was a ‘proper Christian’ – and each would make that determination, whether privately or publicly, about everyone else.
Christianity is supposedly about choosing to do the right thing through your own free will. But free will, of course, only goes as far as our inherent bias lets it – and this religion knows that all too well. It teaches the ultimate form of bias – that when we get to heaven, we’ll want to obey God without question, out of free will, because that will be our inherent nature. For now, on earth, we must deal with our ‘sinful’ nature, which wants to do bad things against God.
I’ll continue that diatribe another time – there is so much more to say – but for now, rest assured that I’ve managed, though it has taken a lot of work, plenty of inner conflict, self-justification and thorough research, to largely let go of the hold Christianity had over me growing up and even up until a couple of years ago. Which isn’t to say, of course, that I have anything against Christians as people, though they can’t seem to help taking it personally (and I suppose one can’t blame them, if they believe with honest conviction) when others tell them they think their religion isn’t true.
The single biggest factor in breaking free from the confines of certain aspects of a religion, or anything else, is being open-minded enough in the first place to even consider whether you might be wrong. Of course I’m not saying that one necessarily leads on to the other (plenty of open-minded Christians have helped carry it out of the dark ages – while many conservatives/traditionalists/fundamentalists would claim that’s precisely the problem), but it’s certainly rare for anyone to leave their religion unless they’re open-minded enough to consider something other than what they’ve been conditioned to believe is true. They could, having considered everything else, still settle on Christianity being the truth, and I wouldn’t begrudge them that; it’s their prerogative to believe what they want, just as it is mine.
But if you consider it impossible for yourself to be wrong about something as ‘big’, as important as this, then you’re going to see opposing viewpoints through that specific lens. And naturally you’re going to shut yourself off from learning specifically why people might hold different points of view, because in your mind, in your version of reality, they’re already wrong and you – say, through the Bible – already have all the answers you’ll ever need.
Or maybe it’s more that, deep down, you’re terrified of realising you were wrong, of having to admit it to others, and of the damaged relationships that would inevitably result. I can understand that concern. I’ve seen it before, in people who stick with the Christian lifestyle not because they passionately believe in it, but because they perceive it to be simpler than the alternative, especially if they have a family of their own or friends who look up to them for spiritual support. The number of Christian pastors hiding this kind of secret – feeling the weight of responsibility to ‘lead the flock’ and the fear of letting them down – would shock the everyday church-goer.
I have realised I may need to pad what I say a little here, for those who may not know the full context surrounding my current opinions. First, if it seems I am overly negative towards Christianity, now or at any point to come, this is not necessarily an attack on its principles or even on the faith itself. Many Christians I’ve known are the liberal type who do not adhere strictly to everything the Bible says, or take what it says literally in the face of all scientific evidence to the contrary. Those people are Christian simply because the lifestyle makes most sense to them, and that’s fine.
However, let’s bear in mind what I said about bias. I am a UK citizen, yes, but more than that: I was born and lived in Belfast, Northern Ireland up to the age of 18, at which point I moved over to England for university.
Now, I’m going to assume any potential readers won’t quite realise the significance of that, so I’ll divulge some more. In Northern Ireland, as most people will know, we have a bit of a history of conflict; a kind of Irish ‘civil war’ as such, originating from when Ireland was joined to the UK two centuries ago, largely against the will of the Irish people. Long story short, back in 1922 the Irish Free State was formed as Ireland won some measure of independence from Britain, though an ‘oath of allegiance’ to the British crown remained in place until the 1930s, with full sovereignty effectively established by the new constitution approved by referendum in 1937.
At the same time, the predominantly unionist (that is, loyal to the union of the United Kingdom) six counties of Northern Ireland wanted no part of Irish independence from the crown; Northern Ireland itself had been created by partition in 1921, and formally opted out of the Free State upon the latter’s formation. Republicans (that is, those committed to seeing a fully independent Irish republic) have always taken issue with this, just as unionists took issue with southern Ireland trying to take what they saw as their British identity. Even today, Northern Ireland sits in a unique position, in which its residents can claim to be Irish or British and neither would be lying; we are, after all, entitled to dual citizenship from birth should we wish to claim it.
A large part of the origins of that conflict between Ireland and the UK was this: Ireland was largely a Catholic country, whereas the UK, from the 1700s onward, was Protestant. So while technically you could say that means they were both ‘Christian’ – no. Trust me, growing up in Northern Ireland it’s impossible to see ‘Protestant’ and ‘Catholic’ as equally Christian. You’re either on one side or the other, and our version of ‘peace’ is tolerating the other side while those old grievances still reside in the back of our minds.
For me growing up in a predominantly Protestant area, I naturally also grew up with that bias. But now, at this stage of my life, I see it all for what it is. Some others of my generation – usually those who have not ventured outside Northern Ireland to live for any sustained amount of time – still hold that strong sense of bias, and probably always will, as I firmly believe it becomes harder and harder to let go of built-in beliefs the older you get. None of us want to feel we wasted years of our lives being wrong about something after all, so as time goes on we’re more likely to make excuses to ourselves that help us keep believing it, partly also for the pride of being known as someone who ‘sticks to their convictions’ rather than someone who ‘flip flops around changing their mind’.
The elephant in the room when it comes to religion and conflict in Ireland, of course, is the claim I made to myself and others for many years: that the violence perpetrated in the name of God was committed by those who “weren’t truly Christian”. This is just what I described before: Christians distancing themselves from actions they don’t like, committed by those who seemingly share their faith, by simply dismissing them as serving “not the God I believe in”. If other believers aren’t acting the way you think they should, just keep yourself happy by saying they’re not ‘proper Christians’ and move on, free of any guilt and/or responsibility on the part of your own personal faith in God. Something similar is happening on a more global scale with Islam currently, but I won’t be touching that hot topic here.
Obviously we shouldn’t paint everyone with the same broad brush. We’re individuals, and we’re human, which means we all have different tendencies. Some of us gravitate more naturally to violence, though again there are environmental factors influencing that. Still, it’s undeniable: the Irish ‘troubles’ have their origin firmly rooted not just in patriotism but in the religion that goes hand in hand with it.
Christians on the outside looking in may try to protect their own belief in the loving nature of God by claiming the men who fought don’t represent him, but that conviction is precisely why they were fighting. Unionists would resist Irish rule “for God and country”. In their place, would you not do the same to defend your own deeply rooted convictions and beliefs? The men on the ground, murdering each other for a higher cause, were doing it because they believed it was God’s will – on either side – and it would not have been uncommon to see those same men in church on a Sunday morning, having taken part in terrorist acts during the week and planning more for the week to come.
All of this leads up to where Northern Ireland stands today. Belfast itself is an impressively modern city that attracts tourists from around the world, and parts of it, particularly the city centre, look a world away from the depressingly grey colours associated with the 1970s. For the most part, I truly enjoy being back.
But it’s not all great. Our government serves as a stark reminder of our recent history, not only in its finely balanced unionist/republican divide (the intricacies of which are too complicated to delve into here) but in the hold that religion has over us. Gay marriage is still illegal and our majority party, the DUP, have vowed to continue blocking it (consensual gay sex, meanwhile, was only decriminalised here in 1982). Abortion is legal only under extremely strict criteria, and Northern Irish women often need to travel to England to obtain one privately. Bars and clubs are forbidden from serving alcohol before 11.30am (whereas in England you can grab a beer from 7am in Wetherspoon’s if you feel so inclined).
Whether you feel strongly about the above issues or not, it’s indisputable that Northern Ireland feels a little left behind, even when compared to other regions within the United Kingdom. Of course, we have enough conservative Christian unionists living here that our population is generally happy with things as they are, as they see it as sticking to the rules set out in holy scripture. For me, I feel almost embarrassed by this stuff, and can’t see myself ever coming back to live long-term in Belfast unless certain things change.
Living in England introduced me to many Christians who were more open-minded than the kind of Christianity I’d always known in my homeland. And well, I’ve simply carried on from there, never really wanting to stand still, always keen to learn more. I don’t feel any blind loyalty to one way of thinking, and I don’t consider myself a nationalist in any sense of the word.
There’s one other element that went into all of this that can’t be overlooked; in fact it may be the most important of all. I mentioned near the beginning how films and video games were an important part of my childhood. One can’t be truly passionate about either medium without encountering other cultures in the process. Two of my favourite video games, for example, are the survival horror game Silent Hill and its classic PS2 sequel (both developed in Japan), which first introduced me to the subtle brand of atmospheric horror unique to Asia.
Around that time, J-horror was also starting to take the film industry by storm, with Hideo Nakata’s Ringu inspiring a 2002 Hollywood remake starring Naomi Watts. That boom ended up being rather short-lived, with Ju-On: The Grudge (2003) and its 2004 American remake coming along at the tail end of it, but it can be credited with sparking my interest in Japanese cinema and, more broadly, Asian culture. Why is this significant? Well, naturally, the more you see of the world, the less you feel you lie at the centre of it. Perhaps something I read recently sums it up: “A stolid attachment to a monolithic set of institutional forms becomes much more difficult when one is constantly faced with the beliefs and disbeliefs of many other traditions” (from the introduction to Ghosts and the Japanese by Michiko Iwasaka and Barre Toelken).
This, I believe, is why many Christians steadfastly refuse to openly engage with other ways of thinking; deep down they know it could lead to them questioning themselves and ultimately ‘losing face’ should they begin to doubt their own faith. So they build caricatures and stereotypes of other worldviews and belief systems, because that makes it easier for them to paint themselves as the ‘enlightened few’ who have the One truth. Martin Scorsese’s recent film Silence summed up the inherent cultural differences and conflicts between East and West quite succinctly I think.
Sure, Christians may go on ‘missions’ with a view to ‘evangelising’ to those caught up in cultures they see as less enlightened, but they do not truly engage with the existing culture they meet when they get there, aside from the actions one must take so as not to appear awkward – such as taking your shoes off at the door when entering a home in Japan, for example. Even at the peak of my faith I could not help but feel a little awkward and uncomfortable at the idea of ‘mission’ to spread the gospel to those we see as less fortunate than ourselves. They’d return talking about how they ‘learned so much’… but I wonder how much they did learn, really?
I wanted to set this context so that anyone reading may understand my point of view a little better. I’m not saying others who were to go on a similar journey to myself would come to the same conclusions. I know some may read what I say about religion or Christianity and say “well, that’s not my experience”, and that’s cool. This is just me. Find your own way, but don’t let that way be dictated by blind loyalty, dodgy reasoning or a fear of changing your mind. Who knows… letting go of those things may help open the doors to something new.
Here’s what I have planned for this blog in the near future, in case anyone thought I’d given up on it.
Video games: my ’20 Years of PlayStation’ series is still ongoing. Next on my to-do list are two of the greatest horror video games of all time, and two of my favourite games in general: the original Silent Hill (1999) and its 2001 sequel. I figured it would be fitting to get both of these out – or at least one – by the end of the month, as we are in ‘Halloween’ month after all.
Speaking of which, around Halloween time last year, while I was making the case for why the horror genre is not only great but essential, I promised another film essay, focusing on The Babadook. Granted, I kind of slipped on this one, though it’s always been on the backburner, and hopefully I will also have it out by the end of October. Believe me, I’ve thought so much about this film – my top film of 2014 – that it won’t be too difficult getting a detailed analysis down in coherent words and clicking publish. I had in fact already started working on it around this time last year.
Looking back in my ‘film essay’ category I see that I haven’t in fact published one here since last July, which really is too long, especially considering I was going along at a pace of around one per month up until then. There are two others I have planned immediately following the next: Nightcrawler and Ex Machina, arguably two of the most overlooked films of the past couple of years, and certainly two of my absolute favourites, so I want to do them some justice.
Originally I had planned my ’20 Years of PlayStation’ series to, like my plan for film essays, proceed along at a pace of around one per month. Obviously that hasn’t happened for various reasons – not that I’ve just been sitting around, rather I’ve had other things to focus on in the meantime – so what I’m going to do with that is, at the very least, get out the two Silent Hill articles (because honestly writing about either of those is an almost limitless joy), then write up something about Final Fantasy VIII (1999), my favourite childhood game and one belonging to a series that frequently splits even its own fans. I’ll be making my case for why VIII, rather than its predecessor, was the peak of the series overall.
After those, I’ll assess whether it’s worth continuing ‘20 Years of PlayStation’ at all. In reality it will probably end with the year 2016 (as we will then technically be into 21 years and so on), and I’ll instead focus on more modern stuff again.
I’ve also been working on an article focusing on the issue of performance enhancing drugs in sport, after a year in which we’ve seen a few high profile cases of doping offences and accusations. That one doesn’t entirely follow the politically correct narrative – I think along the lines of allowing some PEDs to be used in a controlled manner, rather than banning everything outright – but I’m writing it mainly to shed some light on the stuff that people tend to overlook when it comes to ‘cheating’ (the blanket term for any offence) in sport.
Otherwise, there are four other prominent ideas for articles that I want to finish and publish here by the end of the year. Those are, first: a piece tackling the issue of review ethics and people who deride critics for any reason, from simply being a butt-hurt fan to those who accuse us of just being ‘haters’ who don’t know how to enjoy stuff.
I have a strong belief when it comes to critique: it should not tell you what to think about a film, video game, or whatever the product/service may be, but rather it should help you develop how you think about them. Reviews above all should inform the consumer – they’re not about telling people what they should or shouldn’t enjoy as if there’s some objective standard. Something I love may be something you hate, because everyone has different tastes; but the detail I give about that thing should be enough to tell you how you’re going to feel about it, independent of my own opinion.
Linked to this but worthy of its own article, I’m going to go into the impact that films, video games and books have each had on me personally in terms of my own development. Certain aspects of modern society actively discourage critical thinking and open-mindedness – in fact, I think it’s always been like this, but today’s culture of political correctness means we hear things like “you can’t say that” more than ever, especially on social media (my advice: whatever kind of person you are, it’s healthy to have less of that in your life).
That’s why I think this is important. Art is vital for helping people think outside the confines of the masses; it’s why I value artistic integrity and freedom of expression so highly. Many people who have a single-minded approach to issues in life, on the other hand, don’t. I heard a statement recently that stuck with me: an open mind is a learning mind. Rarely has a truer statement been made throughout history.
My final two planned articles for the year have been an even longer time coming. They are: my Best Films of 2014, and Best Films of 2015.
Now, obviously I understand that most people who like to do this sort of thing prefer to do an ‘end of year’ list and leave it at that. It’s like a nice way to wrap up the year in film, but for me none of those lists are definitive. Not that I’m saying mine would be, though here’s the thing; I consider a film that comes out in 2014, regardless of where it first comes out, to be a 2014 film.
For example, a film released in the UK in, say, early 2015, yet which features heavily in awards season, is undoubtedly a 2014 film (Damien Chazelle’s Whiplash, for instance) – because the Academy Awards reward the best films of the previous year. Said film will have been out in the US a few months before, but many of us living elsewhere would not have had a chance to see it yet, and it is therefore, by default, left off the list.
From my perspective, then, to make a list at the end of a calendar year would feel a little silly, bordering on dishonest, as the best films released in the UK that year would only represent around half – if that – of the year’s best films overall. I like world cinema; films from Europe, Asia, or elsewhere. And usually it takes a year or so to catch up on films from those places as their releases gradually filter out across other regions. I prefer to include those in my lists, as I want the list to be as definitive and conclusive as possible.
The other thing to note is my dislike of limiting said lists to a ‘top ten’, again usually done for efficiency (I understand; critics are busy, and wrapping up a compact top ten list at the end of the year is simpler than the method I’m currently advocating). The ‘best’ films of a year may not be limited to just ten – or perhaps in an extremely dry year, there wouldn’t even be ten worthy of inclusion.
Now, most critics actually agree with this to an extent; hence why they do some ‘honourable mentions’ that don’t quite make the top ten. For me that’s curious (why name-drop if you’re not going to detail your reasons?) but again I sort of understand why one would – it saves time, and essentially a ‘top 10’ is more marketable than, say, a ‘top 13’. I have more flexibility in my personal schedule and don’t see why I would restrict myself in that way when I’m not required to.
So basically, my lists will feature the best films of each year, whether it’s 10, 12 or 15 movies long. The 2014 list is almost ready to go and realistically I hope to have that one posted here by the start of next month. 2015, hopefully by the end of the year, and as for my 2016 list, well, I’m thinking Summer 2017 at the earliest. The good thing is, as I’m about to hit another film festival – my second such event of the year – I’ll have a decent head start on a lot of the biggest films to feature in awards season coming up. I’ll probably be writing an article around Oscar time too that will give large hints as to the films I found most impressive over the past year.
One final thing… I plan to do brief film previews (yes, I am capable of writing shorter pieces!) every Friday. This will give me an opportunity to look forward to some new movies that catch my eye – ones that won’t necessarily get the mainstream marketing treatment – and share them with you guys. I’m frequently finding new stuff to get excited about so there’ll be no shortage of things to write about here, and I figure it might be useful to have a category in which posts are regular and somewhat set in stone going forward. That way, one could turn up here every weekend and know they’re at least getting something new, even if I haven’t otherwise written anything of great existential meaning.
Speaking of existential meaning, I’m off to prepare for one of the best times of the year: London Film Festival.
So I know I’m sometimes late to the party with these things, but I have had a film festival and other projects keeping me occupied in recent weeks. I did consider just leaving this one alone and moving on. However, not wanting to miss an extra opportunity to spread the love to another medium, I thought I’d go back a few weeks and revisit the furore around the 12th annual ‘Video game BAFTAs’.
Now the only extensive coverage I’ve seen or read about this event, held at the Tobacco Dock in London, is that of the BBC (mainly from this article), who in the past haven’t typically been the most objective or fair when it comes to video games.
Historically they’ve shown ignorance toward the medium, including once labelling all Twitch users “teenagers, just clicking away, playing their video games all night” (from BBC Newsnight report ‘What is Twitch?’, watch it here). Need I also remind anyone that former London mayor Boris Johnson thinks video games “rot the minds of children”? Forget about even specifying what kind of games exactly (in which case we might be able to have a debate); just ‘video games’ in general. It sometimes astounds me, though only sometimes, how intelligent people can think so one-dimensionally.
The mainstream narrative is usually simple: only immature teenage boys and children play video games. Frequently you’ll see older high-nosed gentlemen – who you know have never properly taken a controller in hand themselves – reporting on them, and in not really knowing what they’re talking about, they fall back on out-of-date stereotypes.
This time, however, I was pleasantly surprised by most of the BBC’s coverage, aside from a rather aggravating Radio 4 interview that we’ll get to later. Of course if you’re taking the time to read this, chances are I’m preaching to the converted anyway and you’d agree that video games are at least worth reading about. I digress with the above diatribe only because I still find it a little surprising that despite the negative press they frequently receive, video games are acknowledged as being worthy of these ceremonial awards at all – though having only started in 2004, I think they were nonetheless late to the party.
Of all the games nominated at the ceremony, I’ve played only a couple myself – Life is Strange, winner of Best Story, and Until Dawn, winner of Best Original Property. Metal Gear Solid V is on my playlist; quite a few others I’ll get around to in due time. After all, it took me a while to catch up on Brothers: A Tale of Two Sons, one of my favourite games of the past few years and winner of Best Game Innovation in 2014 (it’s certainly deserving of that award and more).
Needless to say games are more time consuming than films; they’re also more fulfilling in the long run. And it’s vital to experience them in visual over written form; hence why I don’t like over-saturating my written content with game talk. Bearing that in mind, let’s wrap up the main talking points swiftly…
Aside from Best British title Batman: Arkham Knight and Best Game winner Fallout 4, one could consider it a year for the ‘indie’ titles, with acclaimed walking simulator (for lack of a better term) Everybody’s Gone to the Rapture walking away – no pun intended – with Best Music, Audio Achievement, and Performer for Merle Dandridge, who played its lead protagonist Kate Collins.
The Arkham Knight team seemed almost offended when asked in an interview about the broken state of their game on PC and subsequent critical panning of the (frankly abysmal) port. I was surprised this topic was even brought up at a ceremony where everyone’s hanging out and otherwise praising each other for how great a job they’d done; unsurprising was the way in which the team brushed off the question about those problems with the port, instead saying “it’s always really hard when people don’t like what you do”.
Well, no shit. But the problem wasn’t that people didn’t like it per se – it was more that the game was released completely broken and borderline unplayable on PC. Too many major game developers think they can get away with slipping a game out like this before going back and fixing it later; at which point they’ve already got your money. You know what would happen to smaller developers if they tried to pull this kind of thing? They’d go under, as is the case with many companies who don’t deliver working products to their customers.
Speaking of smaller developers, Moon Studios were recognised for their adventure title Ori and the Blind Forest, which won the award for Artistic Achievement. It has received acclaim from critics across the board and is available now for Microsoft Windows and Xbox One.
Her Story is another intriguing title, available on Windows, OS X and iOS, in which players have to put together a series of clues to find a missing man. Sounds a simple premise; in that likely lies the main secret behind its success. It won Best Mobile & Handheld and Game Innovation, along with Best Debut for British director Sam Barlow.
It’s worth noting that Her Story is technically an ‘interactive movie’; a genre many games have been aiming for recently and which certain major game developers (looking at you, David Cage and Quantic Dream) consider the future of the medium. Well, I don’t quite follow the logic of how the future of one medium can simply be copying another – that being film. A positive, bright future for video games is surely the day when they can fully stand alone from films and not be constantly compared to them like some inferior younger sibling. The medium is younger, yes, but not inherently inferior, and should strive to be different, not the same.
This brings us back again to Everybody’s Gone to the Rapture and the concept of the ‘walking simulator’; another genre recently linked with ‘modernising’ the medium. It seems this game was one of the most popular at the awards ceremony (having won the three awards mentioned above), and I’m going to estimate that’s mainly because it was seen to be ‘breaking away’ from the usual conventions of violent, action-heavy video games. Does that seem too obvious a conclusion?
Well, a Radio 4 interview (contained in the BBC article previously linked to) with Jessica Curry, co-founder of development studio The Chinese Room, gave me that general impression. It starts out with the host (one of those older gentlemen I earlier alluded to) asking, “what makes this game so different? I mean, obviously they’re not all rushing around shooting each other for a start…” before proceeding to ask “are you concerned with the effect that some of this stuff – some of this ‘other’ stuff – is having on children?”
Pump the brakes right there. That goes back to the presumptive BBC narrative I mentioned before, and I could barely listen to any more of this interview. Curry, to her credit, appears to show awareness that she’s being interviewed by someone who doesn’t really know anything about the gaming scene, and gives some PR line about extremes (like this example) being unhealthy and needing to take everything in moderation. I would’ve preferred her to call bullshit, but I understand she wants to keep her job.
Now let’s say by some chance you’re not too aware of the video game scene yourself but have kept reading anyway; maybe the only exposure you’ve had to gaming before is that which you’ve seen on the news or read online, likely accompanied by a sensationalist headline. Video games across the board are often blamed for violence in teenage boys, for mass shootings in America, for failing performance at school, for men mistreating women, for the general disconnect between kids and their parents. I’ve read articles and watched television programmes or movies that have suggested all of these links and many more. Each claim made without a shred of evidence to back it up, other than “this person played video games, therefore…”
My argument would not necessarily be that video games have nothing at all to do with any of those things – but they are not the cause. Correlation does not equal causation. Maybe (only maybe) they did have an effect at some point along the line. But for as long as games are made the face of the problem, it’s impossible to have a conversation that might lead us closer to the real issue. That’s a topic for another time; for now let’s go over a couple of basic points.
First, not all video games are for kids any more than all films are for kids. Obviously there are films suitable for kids. But you would not show your child The Exorcist as if it were a Disney movie. At least I’d like to hope not.
The trouble is, many parents still look at a game like Grand Theft Auto V as if it were a Disney movie. It is not. That large red 18 rating on the cover is there for a genuine reason; because this is a mature game suitable for adults.
Now even if your child were to play an 18 rated game, that does not guarantee anything. I first played GTA when I was about 12 and have turned out a relatively stable human being, but back then I was also reasonably stable and emotionally mature for my age. It’s worth bearing in mind that many children take longer to mature, and the certification rating is there mainly to cover that age range.
The reason I bring this up as particularly important is because when people such as the Radio 4 host mentioned above ask a leading question like, “are you concerned with the effect these games are having on children?” they are referring primarily to games that have a higher age rating than those children playing them. So even if their implied assertion were based on some factual evidence, all they would really be doing is illustrating how vital it is for parents to follow that rating system.
Second, and this is a small leap when one comes to realise they can be enjoyed by adults too, video games actually have the potential to be highly mature pieces of entertainment. Frankly they’re not just toys to sit your kids in front of; nor some mindless button-mashing addiction, even if that is the form a lot of them take (there’s no problem with them being that either). They deserve as much respect in a mainstream context as films; they are at least capable of tackling the same issues, the same themes, and producing the same kind of hard-hitting masterpieces.
In order to reach that potential, they must be allowed to portray tough imagery and allowed to tackle those tough themes. They should be able to feature not only war (see This War of Mine rather than Call of Duty), violence, and murder, but also ethical and moral choices, mental illness, and suicide, to name a few, without being degraded and threatened with banning for doing so.
Ultimately that’s why I can’t help but take some interest in the video game BAFTAs; an opportunity for this medium to receive some positive mainstream press, also perhaps a chance for the average person to learn a little more about them and come to realise the points I’ve been preaching above. Yes, video games matter to a lot of us – too often are we left frustrated by them simply being passed off as infantile ‘mindless entertainment’.
Rewind back to August 12, 2014 and bring any kind of passing interest in the Silent Hill series… you’d find yourself in the midst of an online explosion of hype surrounding a certain demo codenamed Playable Teaser (P.T.). This was the red herring for Silent Hills; what was to be the latest instalment in the much-loved survival horror franchise about some sleepy mid-west American town perpetually shrouded in fog and inhabited by deformed monstrosities.
If the unsettling atmosphere of P.T. promised us anything, it was that this new iteration was set up to succeed – an anticipated return to form for a series that had long since lost its way after passing into the hands of American developers post-2004.
I’ve talked about the whole episode before on this blog. I was there to take part in the hype when the game was first announced and set the internet ablaze, and again when, in April 2015, rumours began circulating that the project had been cancelled in light of Hideo Kojima’s reported issues within Konami and also the studio’s seemingly abrupt change of focus from console to mobile gaming. On April 27, 2015, these rumours were confirmed and Silent Hills officially cancelled.
As of writing, P.T. is no more (I say that in hope that it will, at some point in the future, be made available to gamers once again) – the online demo pulled by Konami as they look to erase any evidence of their past failures. Now the only people able to play it are those who had already downloaded it onto their PS4 systems pre-cancellation. But in its short existence it became one of the most popular horror ‘games’ in PlayStation history, going on to gain undoubted cult status and a somewhat mythical quality that only keeps growing. YouTube videos featuring play-throughs of the teaser (only a 30-45 minute long experience at best) are as popular as ever. Many fans still hope there is some way in which the project might be picked up again; most of us accepted a while back that it was time to move on and Silent Hills likely isn’t coming back any time soon.
A lot of other people, to whom the words ‘Silent Hill’ meant nothing before and don’t suddenly mean anything now, won’t understand the hype. That’s fine; cult movies and video games wouldn’t be that if everyone ‘got it’.
Some of those who did get it have since taken to paying homage to the source material in creative, imaginative ways. If Silent Hills never comes, then it will at least have cast the shadow of what could have been on other artistic endeavours. Such as the video below; a short film that has been doing the rounds online over the past few days. This impressive homage is the main reason I’m writing about P.T. again today. It admittedly made me pine for what’s past, but it also feels like something entirely fresh. This is the kind of mark Silent Hills has left. It is quite possibly a sign of the influence that may yet be felt in horror movies and video games to come in future. In that way, the Silent Hill game that never was may just live on forever. Enjoy!
“What we propose to do is not to control content, but to create context.”
Over the next few months I’ll be looking back at some of PlayStation’s most significant titles as the console celebrates its twentieth anniversary in the UK this year (by year in this case I count from September 2015 to September 2016).
On this occasion I’ve selected a game that was both ahead of its time and simultaneously very much a product of its time; a project the likes of which simply wouldn’t be possible in today’s gaming industry, and not in the way you might imagine.
Metal Gear Solid 2: Sons of Liberty was released in late 2001/early 2002 (depending on whether you were living in America or Europe respectively) around the time the PlayStation 2 was starting to really hit its stride. You could justifiably argue this was one of the titles that helped kick start the console’s mainstream popularity – indeed it was widely regarded as the PS2’s first ‘essential’ game; the first to receive truly widespread critical acclaim.
It’s almost impossible now to capture the sense of anticipation that surrounded this game pre-release. To claim it was the video game equivalent of a Hollywood blockbuster released during the peak summer months is no overstatement. For many gamers it was even more than that: think almost as big as the hype and expectation that is currently greeting The Force Awakens and you’d be pretty close. While there have certainly been bigger and better games since, I don’t recall this kind of attention ever greeting another video game in history.
This was a time before publishers generally marketed their games as if they were a big deal. Today we see major game studios scrambling every year to make their generic first-person shooters or action-adventure games seem relevant, and it’s not at all surprising to see cinematic game trailers appearing in your local movie theatre before the film. This wasn’t the case with video games before Metal Gear Solid 2.
Creator Hideo Kojima knew exactly how BIG his new game was. He turned this enormous hype against those responsible for it; using the marketing campaign and then the content of the game itself to dupe the series’ own fans in a way that remains unprecedented to this day.
In a move that took a definite amount of balls (which may have seemed almost career suicide to a director-designer less confident and capable than he), Kojima switched out the protagonist fans knew and loved from previous games – chain-smoking mercenary Solid Snake – for an unknown and less aesthetically pleasing rookie with a whiny voice and shoulder-length blonde hair.
No one saw it coming. Not only because the marketing campaign gave no glimpses or made any mention of this new character – named ‘Raiden’ – whom you were to spend three quarters of the game controlling, but also because that same marketing had made it appear as if you were instead going to play the entirety of Metal Gear Solid 2 in control of the aforementioned Solid Snake, the same way you did in the original MGS three years earlier. As it soon turned out, all of the game’s promotional material had been taken exclusively from its prologue tanker level, which made up barely an hour of the overall playing experience.
This wasn’t just a case of withholding plot information – it was dangerously close to deliberately misleading consumers, and some of the anger directed towards Kojima afterwards was from the very same fans who had been eagerly preparing to sing his praises…
Most of the hype surrounding this game was due predominantly to the impact of the first Metal Gear Solid, released in 1998 for the PlayStation. At the time it was labelled the ‘greatest video game ever made’ – which held true as the closest thing to an objective opinion the industry has ever had (I was never quite on that bandwagon, but I could see where they were coming from). In this sense it was almost like the Citizen Kane of video games; a somewhat appropriate comparison seeing as the game gave off extremely cinematic vibes.
This was, after all, a time when the video game industry was obsessed with trying (and largely failing) to emulate films. Metal Gear Solid was the first game to do that convincingly, and this sequel even more so. The trajectory on which it sent the industry is polarising for many; as it seems a lot of players today generally still judge game quality on how ‘cinematic’ they are.
In some ways this is concerning. Yes it has given us some visually beautiful and well acted games, but it also prevents many more original titles from getting noticed – and often it is those more original titles that capture the true essence of what video games can achieve. Brothers: A Tale of Two Sons is a fabulous recent example of a game that told its story primarily through gameplay mechanics rather than exposition-heavy cut scenes – something which the MGS series has traditionally been infamous for.
But that present concern does not stop me from reflecting on the sentimental attachment I have to this particular cinematic game; and acknowledging just how well it turned out as a self-contained, narrative-driven experience.
The success of MGS (itself the third part in a previously Japanese-exclusive Metal Gear series dating back to 1987 on the lesser known MSX2) made the mainstream world really take notice of Hideo Kojima’s talents. For this sequel he was able to hire Hollywood composer Harry Gregson-Williams to help give Sons of Liberty an even more imposing, cinematic feel – and it shows from the game’s impressive opening sequence. To this day the soundtrack for this game remains one of the PlayStation’s most memorable, and it was certainly among my favourites growing up. It gave MGS2 a sense of gravitas that very few other games had.
What also added to this sense was the fact that the game took itself so seriously. A risk considering some of its characters and plot elements; one that could have fallen flat if the whole thing was not executed well. Fortunately it was, and while the jury is still out for a lot of people on whether Hideo Kojima is actually a good writer, one can’t help but appreciate the immaculate level of polish he puts on all of his games. MGS2 was a shining example of that polish.
The game’s story, for all its Hollywood production values, may appear unnecessarily convoluted in places. You could argue the overall experience is unbalanced, with most of its narrative exposition and main themes coming in the final third. Indeed, this rather crippled the pacing of the gameplay towards the climactic final boss fight. That’s without taking into account another crucial point: unless you’re already invested in the MGS universe, most of this game is unlikely to make the least bit of sense.
After all, your adversaries include an immortal vampire who runs on water, a woman who can’t be hit by bullets, and a literal ‘fat man’ on skates who is also an ingenious bomb expert. An anonymous Russian ninja occasionally pops up to help you out of rough spots. The leader of the game’s villainous group wears a full body armoured suit with two tentacles attached, which he uses to suffocate people and out of which he can even shoot missiles.
His right-hand man – Revolver Ocelot, who was a major character in the first game as well – is at times subject to mind control from the consciousness inhabiting his forearm: an arm which previously belonged to the now deceased main villain from the original MGS and has since been surgically grafted onto Ocelot, who had lost his own arm in the first game when it was cut off by a different ninja – though one that bore a striking resemblance to the ninja appearing in this sequel.
So you see how this game might be a little hard to follow for the uninitiated? Hell, even for long term fans it takes a bit of effort to keep track.
Without doubt it was mostly those familiar with the first game who Kojima had in mind while making Sons of Liberty. One can’t help but experience a strong feeling of nostalgia while playing as Snake on the tanker, and there is a similar (but at the same time unmistakably separate) sense of deja vu in playing through Raiden’s espionage mission. Well of course, you might say, this is a sequel after all!
Yes, it was as much a sequel as sequels can get, and in certain ways it felt almost like a lazy one. In fact it’s the very definition of a fan-pleasing experience from beginning to end of the opening tanker chapter; at which point Kojima stops pandering and proceeds to give fans the proverbial middle finger instead.
The game drastically changes in both tone and pacing with Raiden’s appearance. Here you find yourself forced to play as a rookie, not only within the context of the plot but as a player too; having to go through the same basic setup that you’ve already been through in the original MGS with Snake.
Your Colonel unnecessarily tells you basic controls. In baby steps you’re taken through the opening sequence alongside Raiden as if you, like him, are new to this kind of thing – despite the fact that you’ve just played the prologue level as a veteran in control of Solid Snake, picking up where the first game left off. You’re taken from that to literally starting afresh, and the experience was almost as jarring as having the Snake character yanked from your fingertips.
Suddenly it’s almost like you’re playing a version of the first game over again – though one that doesn’t feel quite as authentic. On the surface, most of Raiden’s campaign seems an unoriginal retread of a path you already walked in Metal Gear Solid; the deja vu you feel in this case is not the same nice nostalgic feeling present in the previous tanker chapter, but a rather more unsettling one.
This game’s main villains, Dead Cell, are uncannily similar in their eccentricity to the FOXHOUND unit from the first game. There’s also the return of a mysterious ninja; in both games an ambiguous individual with ties to neither side. And your Colonel? He just so happens to (seemingly) be the very same one who helped guide Solid Snake through Shadow Moses in that first title.
The whole thing felt like too much of an echo back to Metal Gear Solid – close to a simple copy and paste in certain respects. It is only in the plot’s final third that this all brilliantly unravels; when it is revealed that ‘recreating Shadow Moses’ was precisely the intention of a shady organisation that had been manipulating both sides all along to further their own plans for society.
You find out that your Colonel, whose orders you’ve been following on the mission to which Raiden is assigned, is actually an A.I. (or something…) operating on behalf of the Patriots; a group of individuals who control the United States from the shadows, from whom even the President receives orders. The game’s main villain – at least, you’ve been led to believe he’s the villain up to this point – proposes to break the Patriots’ rule over the country and set everyone free from their control (hence becoming the Sons of Liberty of the game’s title).
Your real mission is to eliminate him before this plan comes to fruition – though you only find this out toward the game’s conclusion, up until which point you had been fed a convenient and rather typical espionage cover story regarding hostages, ransom demands and nuclear bombs.
At the same time, it is revealed that the Patriots set up the conditions for the entire operation from the beginning – indirectly giving Solidus (your adversary) the means by which his plan could progress to its later stages – as part of a test to see whether, placed in the right environment, a typical rookie operative could be moulded into a legendary mercenary similar to Solid Snake, but this time created on their own terms. This operation is codenamed the ‘S3 plan’, which stands for ‘Solid Snake Simulation’.
Yes, the conventionality of it all – from the game’s plot outline to its blatant comparisons with the original, via a ‘rookie’ in the form of Raiden – had been a setup; not only from the perspective of the game’s characters but for the benefit of the player. We’re the real test subjects for the S3 plan: how successfully the game pulls the wool over our eyes and keeps up the illusion is the litmus test of its effectiveness.
This sequel played on and caught you up in your own expectations. Raiden is informed towards the end of the game that his Colonel, a man he had never met in person, was, in part, a projection of images cobbled together from his own subconscious expectations. In a way this is true for the player as well; the Colonel sounds exactly like the one we knew from the previous game because our expectations from that game told us this is what a Colonel should sound like. It becomes blatantly apparent that the two characters are different entities, so Kojima had no other reason to re-use the same likeness than to make this point – at the same time putting us in the same state of unease as Raiden; the only difference being that the player senses this unease from the beginning. But it’s something you put to the back of your mind, at least until the in-game characters become aware of their situation later.
In the end you realise we, as players, were duped as much as the fictional characters in this game. The prologue tanker level was everything fans wanted and had asked for, picking up where the first game left off with two of its most popular characters in a brand new, visually pleasing scenario. You strap yourself in and get ready to enjoy an indulgent sequel experience that will leave you feeling your expectations have been met.
Hideo Kojima shows here that he was fully aware of what those expectations were, and teases you with the intention of meeting them for all of an hour’s playing time before pulling you out of the illusion.
Then, you’re in his game. A game that repeats much of what you saw first time round, but in a way that isn’t quite as authentic. Suddenly you’re back to roaming claustrophobic corridors and learning guard routine patterns. At times it feels almost like a parody of what came before, while also forcing you to play as a less accomplished character than your previous protagonist… but whom you play as anyway because that’s the game you’ve been given, and even though things are not exactly how you’d like them to be, this is still Metal Gear Solid after all.
So everything’s not quite as you’d like or imagine it to be – but this version is crafted to show you just how willing you and every other player is to accept what you are given. It’s a copy, albeit not an exact one. Merely a recreated scenario; one that becomes almost dream-like right before the end, at which point you ‘wake up’. Seriously, the ending cut-scene to this game feels so tonally contrasting to what came immediately before it that it feels like stepping back into reality from what had become a nightmare.
Before fighting the final boss, the Patriots blatantly reveal their intention for you to succeed in your mission by killing your adversary. While Raiden protests, saying “I’m through doing what I’m told” and even claiming “we’re not puppets in some game, you know”, the game nonetheless throws you into the fight; a fight to the death with which you willingly comply because it’s the scenario that presents itself.
You aren’t going to turn the game off now if only for wanting to see how it ends. While playing through the final sneaking section leading up to this point, the malfunctioning Colonel A.I. dared you, the player, to “turn the game console off right now”, or suggested “you shouldn’t sit so close to the TV”, or commented “you’ve been playing the game for an awfully long time… don’t you have better things to do with your time?”
These comments showed the game’s awareness of its own place within its medium, playing on concerns that players may be facing outside of its universe… are you sitting too close to the TV? Are there better things you could be doing with your time? The answer to both is, probably, yes.
Through it all you keep playing anyway, because “this is a game after all. It’s a game, just like usual” (to use another of the quirky Colonel’s quips) – as if you needed reassuring that, despite its self-awareness, you were still just playing a game to have fun. Of course, this kind of experience was far from typical.
When people claim this is a ‘postmodern’ game they aren’t simply saying it has certain postmodern threads or thematic elements. The entire experience is, in a sense, a reflection of the original, which itself was heralded as a masterpiece of modern gaming. It was postmodern in the purest sense of the term – coming after the modern, it offered context by which we could judge what came before.
This concept of ‘creating context’ is taken even further in a revealing conversation with your Colonel after the plot’s main points have been divulged. It becomes apparent that he is more than just an ‘A.I.’ during this final reveal. He first explains ‘their’ true origins:
“To begin with, we’re not what you’d call… human.
Over the past two hundred years, a kind of consciousness formed layer by layer in the crucible of the White House.
It’s not unlike the way life started in the oceans four billion years ago.
We are formless. We are the very discipline and morality that Americans invoke so often.
How can anyone hope to eliminate us? As long as this nation exists, so will we.”
Now, to grasp what’s going on here you need to understand we’re no longer really talking in tangible terms. What this is referring to is not any single character or being, but to human culture itself – the culture around which modern society has been circling for quite some time. A culture in which following certain rules and holding objective beliefs is rewarded; indeed, the idea is that we need those things, organised in a structure, to survive as a species.
MGS2 was released just after the turn of the Millennium; a time when the world was in the midst of transitioning to a more ‘digitised’ age. With this new flow of digital information came a unique challenge to the cultural pattern referred to above, and it is this challenge that ‘the Patriots’ are responding to during the course of this game. Their answer is an advanced A.I. that will control the flow of information so it doesn’t overwhelm humanity. The ‘Colonel’ goes on to explain this:
“In the current digitized world, trivial information is accumulating every second, preserved in all its triteness, never fading, always accessible.
The S3 plan does not stand for Solid Snake Simulation. What it does stand for is ‘Selection for Societal Sanity’…
You seem to think our plan is one of censorship?”
Raiden: “Are you trying to say it’s not?!”
Colonel: “What we propose to do is not to control content, but to create context…
The digital society furthers human flaws and selectively rewards development of convenient half-truths; everyone withdraws into their small, gated community, afraid of a larger forum.
They stay inside their little ponds, leaking whatever truth suits them into the growing cesspool of society at large.
The different cardinal truths neither clash nor mesh; no one is invalidated, but no one is right.
Not even natural selection can take place here; the world is being engulfed in ‘truth’.
And this is the way the world ends… not with a bang, but a whimper.”
Bear in mind this was before the rise of social media. Facebook and Twitter did not yet exist, but MGS2 foresaw their emergence with alarming insight. Are they not guilty of promoting the very things described in the above dialogue?
Selectively rewarding convenient half-truths… everyone afraid of a larger forum, leaking whatever ‘truth’ suits them into society at large… no one is invalidated, but no one is right… the world being engulfed in ‘truth’.
Let’s be honest: this is social media in a nutshell. Social media itself is representative of the Internet in a nutshell.
You’ve probably complained about it yourself. Look at how social media trends develop; observe how they eventually die out; see how someone will ‘share a link’ of a tragedy in the Middle East and, with their social justice fingertips at the ready, point out to everyone that it doesn’t get the same coverage as a similar tragedy in Europe… and point out how much of an injustice this is.
To some the flow of ‘trivial’ information over the Internet represents freedom. Others ridicule and scoff at it, indirectly revealing that they think it should be controlled; advocating the kind of ‘S3 plan’ the Patriots had in mind.
I admit I’ve fallen on both sides in the past. I know that for all the amazing bits of useful information to be found online, there is much more ‘rubbish’ one has to wade through. That ‘useless’ information (one of the biggest enemies of productivity if nothing else) is precisely the kind that the A.I. in this game was proposing to filter out.
Isn’t the main problem with a lot of online information that it often appears on our news feeds without appropriate contextualisation? Isn’t the problem then exacerbated by everyone reacting to it without bothering to look into that context?
Maybe the Patriots were right after all. Many of us crave the context they proposed to create. But there will always be a side of us that mistakes context for control over that same information. Or perhaps, from another point of view: there’s a side of us that prefers reacting to things free from context – because context can affect our ingrained sense of ‘truth’ in a way that could make us re-evaluate what we believe or how we live. And to do that is uncomfortable.
Metal Gear Solid 2: Sons of Liberty ends with Raiden completing his mission, but having gone through an identity crisis in the process, he finishes with a question: “who am I really?” A question Kojima was posing to the player as much as to the protagonist.
The game doesn’t provide a conclusive answer, finishing in open-ended fashion that encourages you to find truth for yourself. In conclusion to the game’s events, Snake gives you this gem: “what you think you see is only as real as your brain tells you it is.”
It’s worth pointing out that Hideo Kojima originally envisaged the Metal Gear series ending with Sons of Liberty. If you’re wondering why he had the balls to try pulling this off, it is quite simply because he wasn’t relying – as so many major studios and developers are – on milking the thing any further as a franchise. Which made its status as a ‘blockbuster’ game even more unique.
Kojima was, of course, later convinced to make more games in the series, and the sequels that followed MGS2 included considerably more fan service than we see here – not to mention an overarching plot that pretty much retconned the final twenty minutes of MGS2. Kojima made those other games for the fans, whereas here he wasn’t particularly concerned with pleasing anyone. For that reason I consider both the original MGS and this sequel to be the truest portrayals of his vision we’ve seen.
With MGS2 he was encouraging players to really think about what they were doing; what had led them to play this game; how they consume what they see online and in the media; even how they were living their lives and what the future might hold. Here we had the video game equivalent of a major Hollywood movie franchise (the biggest name of its time) tackling convoluted themes such as freedom of choice and the subjectivity of truth, without having given its audience any indication beforehand that it was going to do such a thing. Many were not so much left unsatisfied as left flat-out baffled by the experience.
This remains one of the most complicated game plots of all time. In my eyes it represents a masterpiece – not strictly a ‘gaming’ masterpiece, but certainly in how it sets up and tells its story, as well as how it manipulated players before and after release. That’s a bit of a controversial opinion in some circles, with many considering this game not even the best in its series. But I think it’s a game everyone should experience, even if you go away feeling slightly exasperated by it.
Furthermore, if you’re ever going to start playing the MGS series, do yourself a favour and start with the original before moving on to this one. Metal Gear Solid 3: Snake Eater (2005) is a prequel and, although arguably a better game overall, its story went some way to changing how you looked at everything that came before. The first two games should be taken as separate entities before they’re later considered in the context of the series as a whole.
Even today I still frequently go back to MGS2 to re-live a game that is as simple in its gameplay as it is complex in its storytelling. It features one of the most unique plots and some of the most challenging themes ever included in a video game, and it’s undoubtedly one of my personal favourites.