Meewella | Fragments

The Life of P

Month: January 2010

Slate Expectations: Apple’s iPad

Apple were in an incredibly difficult position with the iPad. Everyone knew they were making one and no one would be happy unless it was revolutionary. As someone unconvinced by the tablet offerings at CES earlier this month (they have clear application in industry and healthcare, but I can’t yet see a consumer need), I was curious about what Apple was bringing to the table. What they have produced is a superb technology showcase, but it’s also exactly the tablet everyone expected. First off, some people are griping about the name. Trade mark issues left them in an uncomfortable position since they don’t actually own the letter “i” (whatever they may argue) so I’m okay with the iPad. Yes, it may sound a lot like Star Trek’s PADD, but that’s clearly the influence they were channeling with the whole device.

For the technophile there’s a lot to love here: a great-looking 9.7″ touchscreen that’s fantastic for photos, a large and seemingly responsive onscreen keyboard, and a decent 1GHz Apple processor (although bear in mind the latest Snapdragon smartphones run at 1GHz too). The impressive number, however, was 10 hours of battery life. That’s big. Though I have to wonder just how far that falls in real-world use: wifi and 3G will both be a fair drain and there’s clearly no way it plays 10 hours of video.

The “unbelievable price” seemed slightly misleading, achieved by splitting apart the cost. Hitting as low as $499 for the basic model was a welcome surprise that no one expected. In practical terms the “real” version is the 3G-toting one, since otherwise it’s not actually particularly portable (not to mention the fact 3G laptops are readily available). The US deal is $15 a month for a mere 250MB, which I would exceed with email and attached/linked video alone, so really it’s $30 a month for unlimited access, which seems reasonable. That means the year one cost is actually $1,089 for the middle 32GB 3G model ($729 for the hardware plus $360 in data), then $360 for each subsequent year. It’s not outrageous but it’s not a low barrier to entry for what they are marketing as a third device.

It technically works with existing iPhone apps but I don’t think people were particularly wowed: even when expanded the interfaces looked clunky and odd. New apps taking advantage of its size will certainly look great, though. I foresee a swift divergence in the App Store between the iPhone and iPad and, with the install base of the former, I can see a lot of developers overlooking the latter as not cost-effective. While Jobs nodded to the Amazon Kindle, I can’t quite see the iPad succeeding as an e-book reader, since the very reason dedicated readers took off is that e-ink avoids eye strain. However, until colour e-ink versions emerge (likely later this year) the iPad could be ideal for magazine-style articles, and the New York Times demo with embedded video was particularly impressive (even if it was just a glorified website).

The glaring omission for me is one I had not expected to see: nearly 10 inches of screen real estate and a 1GHz processor, but no evidence of multitasking. Palm’s webOS has a perfect implementation on a significantly smaller device, so how Apple didn’t feel the need to address this is at best startling and at worst unforgivable. If they are trying to create a new product between phones and laptops, it needs at least to match the functionality of top-end phones. Also strangely missing are Flash support (bizarre for a device supposedly offering fully portable web browsing) and a camera, which is certainly non-essential but fullscreen Skyping while wandering/sitting around with one of these would be pretty damn cool. Again, these are features that top-end phones already have.

They will sell, of course. That hardcore Apple fans will be picking them up in 60-90 days isn’t even a question. However, this needs much broader consumer appeal to be the “magical and revolutionary device” (seriously, as an actual slogan?) Jobs suggests. I’m sure interest will be high in the short term, not least because Apple has cued mainstream journalists with appropriate adjectives, but it’s going to take some serious marketing to build up momentum. Ultimately the first iPad is an interesting concept that flounders for the same reason as the netbooks it aims to replace: it tries its hand at a lot but, in attempting to fill a gap that doesn’t yet exist, it isn’t quite sure what role it serves. The result is that, like most netbooks, it’s just okay.

Under the Hair: Sexuality in Bayonetta

When I first saw footage from Japanese action game Bayonetta, I wrote it off as selling itself solely as an overtly sexualised slice of Japanese insanity. Then the reviews hit in December, praising the beauty and precision of its action gameplay, and it garnered 9s and even a perfect 10 from the typically critical Edge. It’s not that my first impression was wrong — it is both those things — it’s just that it might not be such a bad thing.

Bayonetta is making people uncomfortable. There exists an age divide of sorts: broadly speaking, up to the age of around 22, guys find her sexy and (gaming) girls find her empowering, while above that both genders begin to view the game as awkwardly exploitative. The question is, in a medium where female portrayal is a major issue, is Bayonetta a step forward or backward?

Bayonetta’s appearance is clearly stylised, with impossibly long legs and the fashion sense of a latex-clad librarian. This is beyond mere Lara Croftian unrealistic body image, closer in spirit to Gears of War’s steroid-fuelled juggernauts. The entire game feels like an ode to this one character, with sweeping camera angles sliding between her legs or focusing on her pursed lips as she gazes suggestively into the camera, sucking a lollipop. Meanwhile her clothing is formed from her hair, meaning that when performing her powerful “hair attacks” she is left momentarily disrobed. The game’s insanity is infectious and soon you don’t even question the idea of strapping pistols to stiletto heels. I mean, why wouldn’t you?

Yet while the development team was led by Hideki Kamiya (of Devil May Cry fame), the character designer was a woman, Mari Shimazaki. This means her provocative character is female-driven rather than being an expression of male fantasy (or at the very least represents female exploitation of male fantasy). She’s certainly no damsel in distress — an intimidating, strong character (mentally and physically) who dispatches throngs of enemies with deadly feline grace. Stylised physical appearance is par for the course in videogames, with both overly muscular male and excessively lithe female characters. Videogames certainly revel in hyper-stylised men, be it their brutishness or androgynous allure. Why should they bar women from receiving the same treatment? Is the largely desexualised approach with characters like Half-Life’s Alyx and Portal’s Chell really the ideal, or is there room for something at the opposite end of the spectrum too? The bottom line is this: I fear I feel Bayonetta is exploitative because some part of me is trained to believe it must be, irrespective of whether or not it actually is.

This is still, of course, purely a male perspective. If you’re interested in a female one, Leigh Alexander and Tiff Chow would be happy to oblige.

The Face of Gaming in 2009

At the end of last year Modern Warfare 2 landed with such explosive force that it not only breached its way into mainstream press headlines, but also sent dozens of excellent games scurrying for cover in Q1 2010, which is now the most impressively packed first quarter I can recall. The hype was justified given Infinity Ward’s past performance, the impressive in-game footage already shown off, and the fact pre-orders alone guaranteed profits dwarfing pretty much any other title this year. It is, for all intents and purposes, the public face of gaming for 2009. Which is hugely disappointing.

It is not, let me be clear, because it is a bad game. On the contrary, I’ve just finished playing it and am on the same high as with the climax of its predecessor. While it had a rocky start, jumping erratically around the world with short missions that felt like a greatest hits of Bond locations, it gradually sucked me in so that I really did care by the final twists and turns of its tale. It is a stellar title at the peak of the shooter genre, but in some ways therein lies the problem. This is a genre that has existed in much the same form since the early 90s, even though the graphical technology and AI have improved in leaps and bounds. It is still what springs to most non-gamers’ minds when they think of videogames. I am not about to apologise for the genre — it can be vibrant, creative and in some cases is arguably a valid competitive sport. However, given the wealth of varied experiences offered through videogaming in the past year, it’s a shame the mainstream public will just think of another military shooter.

So what were the best games this year?

Batman: Arkham Asylum: A licensed game that did not poorly ape the recent film but instead struck out on its own path, drawing on Batman’s comic-book heritage. Mark Hamill’s deliciously insane turn as the Joker rivals Heath Ledger’s performance (in a different way). Combining intelligent investigatory and brawler elements with the back-catalogue of villains locked away in Arkham, it was a surprise debut hit from Rocksteady Studios this summer.

Braid: Independent developer Jonathan Blow lovingly crafted this beautiful, haunting, artistic and fiendish puzzle platformer in which you manipulate time to complete your goals. Its careful learning curve may steepen sharply, but it rewards the patient. Its ingenuity is on par with Valve’s Portal (though Braid’s indie development arguably compares more directly to Portal’s predecessor Narbacular Drop).

Assassin’s Creed 2: While it might seem there is only so much a sequel can reinvent, this was the consummate sequel, as if Ubisoft had listened to every single complaint about the first game and addressed it, particularly the repetitiveness. The action shifted to Renaissance Italy, once again recreated in stunning architectural detail (right to the peak of every church and tower, since you can scale them all) but now feeling much more alive. Perhaps not wishing to waste all the historical data they gathered during development, Ubisoft included an in-game database that is fascinatingly educational as you explore Florence, Venice and more. Depending on your perspective it could be as much an art history project as a videogame.

Uncharted 2: There is a long-running debate as to whether games should become more cinematic and story-driven or strive to differentiate themselves from film. There are merits to both approaches, but none nailed the cinematic feel quite like Uncharted 2. Arguably the only reason people were not even more impressed by this game is that the first instalment was already so good.

Dragon Age: Origins: The way this adult fantasy roleplaying game opens with a different Origin story depending on your chosen class, which then affects portions of the main story, may be something of a gimmick. Very real, however, are the characters BioWare has created, each with their own personalities and rich backstories that drive the game forward far more than the overt (and somewhat derivative) plot. The game is fully voice-acted, and the biggest disappointment is that whenever you select party members for a quest, you know you will be missing great dialogue and banter from the others. And if they dislike you, they’ll even leave. Meanwhile, like 2007’s The Witcher, its darker tone also allows it to deal with heavy themes like racial tensions.

Avatar: changing the face (and depth) of films?

My criteria for seeing Avatar were that it had to be on the largest screen possible and in 3D to get the fullest experience. And so I ended up at the BFI IMAX at midnight (on a side note, I really miss midnight screenings) with its bigger-than-a-house screen. My review is up, but in short this is something you really must experience. Unusually I say this without any real intellectual connection to the film or its characters, but rather from a purely emotional/entertainment perspective. The audacity of the project is incredible, as is the fact that James Cameron conceptualised it 15 years ago and was finally able to see it through to completion. It turns out the “exploring another world” hyperbole was in fact entirely accurate.

Some critics are querying whether this is a game changer in terms of how we give awards for performances, since Zoe Saldana’s character is only ever seen in CGI, yet it is undeniably her performance that comes through. Should the award go to her, the effects people or both? Of course this debate really dates back to Andy Serkis’ Gollum, which was very much his performance, since even the facial animation was based upon his acting. However, the facial mapping in Avatar’s new breed of performance capture is incredible.

This is also the new benchmark for 3D films, to the point where I hope other films won’t bother unless they’re going to do it properly. Up to this point Coraline has been my touchstone for 3D since, as a stop-motion film, it was “real” 3D filmed with stereoscopic cameras. The subtler into-the-screen 3D is definitely the right way to do it (and the reason I was unimpressed by the 3D Alice in Wonderland trailer as, despite impressive detail, far too much was unnecessarily flying out of the screen), but these two films highlight two fundamentally different approaches. With Coraline you are offered a 3D window to observe, so the trick to avoiding a headache is to learn to look around as in the real world, focusing on individual parts rather than attempting to take in the entire screen at once as with traditional films. Avatar merges this with traditional narrow depth-of-field shots, meaning the viewer is not always free to look around as they wish. Instead the trick is to relax and allow your eyes to be guided to whatever is in focus. It is perhaps disconcerting, but it does provide a more cinematic effect.

"Civilization now depends on self-deception. Perhaps it always has."
