Friday, December 21, 2012

Is PC gaming truly dead?

An interesting discussion was brought up the other day on the NeoGAF forums. The initial question asked was "Will next-gen console hardware be more powerful than current PCs?"

My initial reaction was simple and straightforward.

Then, as I read through the thread, someone brought up a good point, one that I hadn't considered: How long has it been since we actually saw a PC game? I mean, a real PC game? Think about it; just about everything released these days is a port from console, or at least built with console in mind. The result has skewed the mindset of PC gamers into something different from what it used to be.

The crux of the argument is that PC gamers have gotten used to running everything at absolute maximum settings, and getting stellar performance out of it. I can load up the latest big game, and play it at a higher resolution than the consoles, in a way that looks a lot better, and at a much higher framerate. The problem is that, because this generation has been filled with console games ported to the PC and its much more powerful hardware, running everything maxed out has become the norm. All games run at 1080p60 (1920x1080 Full-HD resolution, at 60 frames per second, the maximum that the average PC monitor can display).

The user in question reminded us of the fact that this wasn't always the case... maximum settings at good framerates used to be reserved for only the most expensive übercomputer, one that the average PC gamer (who spends a good amount of money on a system to begin with) could never hope to afford. That's the whole reason PC games have all those settings in the first place: scalability. The option to turn something down that's slowing down performance, in order to get the game to run better. A "true" PC game couldn't be run at maximum settings at a playable framerate anyway. The last "true" game we've seen that does this is the original Crysis. To this day, the game won't run at 1080p60 even on my system (which is a bit of a beast). And that game's five years old.

It's not just the console ports, though. I do have some games that are PC-exclusive... Hard Reset comes to mind. Yes, it has all those options to turn down to improve performance. But, to be honest, I never had to.. I could run the game out of the box at 1080p60 at maximum settings. For whatever reason, while it was designed for PC gamers from a gameplay and control standpoint, it was built more or less to console specs as far as graphics go. The game is very pretty, but it's not really doing anything groundbreaking that can't be done on a console.

There are still many PC-exclusives, of course. RTS games like Starcraft II, multiplayer shooters like Planetside 2. Various complicated RPGs and MMOs that can only be run using a "controller" with 104 buttons. But none of them are designed to push PCs to the limit anymore. None of them are pushing photorealistic graphics that will bring even the mightiest PC to its knees. Crytek claims that the upcoming Crysis 3 will do this, but it's based on the same engine as the previous game, which runs fine, and is also scalable to consoles. While I do believe they plan on making the PC version noticeably better than its console brethren this time, I don't believe I'll have too much trouble running it (my PC meets the specs of their "high performance" recommendation). In terms of "Yay, a real PC game!!", I may wind up happy if I have to turn some of the settings down a little.

So think about it, fellow gamers with beastly machines... When was the last time you had to turn something down? And how much did it bug you to have to do it?

That said, I don't believe that the PC is dead as a gaming platform, and that's not the point of all this. But we're just another gaming platform now: Playstation, Xbox, PC. And I'll continue playing that way, to be sure. But it certainly isn't as special as it used to be.

Saturday, August 25, 2012

Prometheus vs. Expectations

It's taken me much longer to write this than I intended (and even longer to post it), and part of this will be discussing why.

In all my time, I don't think I've ever seen a film polarize an audience the way Prometheus has. There doesn't seem to be any middle ground for this one. "Meh, it was okay..." just doesn't exist. There's the positives, the folks like me that really liked it, and then there's the negatives, the folks that flat-out hate it, that utterly despise it as one of the worst pieces of celluloid to come out of Hollywood in the last three decades. That sounds like overkill because it is, but those are the opinions I've seen.

From what I've seen, I think the Negative Nancys are actually the minority, although they pretend to be the majority. They're by far the most vocal, posting their opinions all over the internet, which gives them the appearance of being a much bigger presence than they really are. Those of us who like it tend to be quieter on the subject.

The overall problem, I believe, comes from the expectations that people had. They wanted another Alien, plain and simple. After all, it's the same sci-fi universe, an expansion of the same story, by the same director, even. It had to be the same, right? And it wasn't.. not even close. So they came out bawling about how horrible the film was. Despite the fact that the director, writers, cast, and crew had all been saying for years that it wasn't going to be the Alien film that people were expecting.

Now, let me say this before we get too much further: The film isn't perfect. In fact, if you read reviews and posts from the people that liked it, I'm pretty sure none of them are saying it's perfect, either. I'm not going to spend a lot of time "defending" the film, or trying to come up with explanations for the nitpicks that people didn't like about it. I acknowledge that the film has problems (what film doesn't?) but they don't impact my enjoyment of the film in any way, and that's the important part. Note that I'm not saying the problems don't exist.. I'm just ignoring them because frankly, I don't care.

The only point I'll make on that subject is in response to a post I read on a forum, where one of the Negative Nancys said something to the effect of "Since when has Hollywood had to resort to things like this?" They were referring specifically to the way the scientist characters behaved, doing "stupid things" for "no reason except to advance the plot".

My response was simple, straightforward, and accurate: "Since around 1910."

Seriously. Hollywood has always relied on characters' actions furthering the plot, whether they make sense or not. You could name practically any movie in existence, and I'll be able to find a part where a character does something silly, inane, or just plain stupid, that advances the plot in some way. And I can do that to movies that I (and others) consider great masterpieces and classics. The idea that somehow Prometheus did something that other movies don't is, itself, pretty stupid.

This is another example of the film being held to a different standard than other movies. Such "mistakes" get a pass in other films, even other fantasy/sci-fi films, but they're somehow a horrid blight on cinema when Prometheus does the same thing. Ridley Scott himself said "It's not a science class, it's a movie."

One part I find particularly amusing is how much of the blame is laid at the feet of Damon Lindelof. I actually feel bad for the guy, given how much flak he's taking over this film. Aside from the fact that we don't actually know what was Lindelof and what was John Spaihts (the original writer), the part I find funny is that people seem to think that Ridley Scott didn't notice all these "issues". That either Lindelof was standing behind him with a gun to his head, forcing him to shoot the movie that he wrote, or that Ridley was too stupid to notice how "bad" the film was and simply shot what he was given. Neither of which could possibly be true. I know for a fact that the rolling-donut Juggernaut at the end of the film (if you've seen it, you know what I'm referring to) was Ridley's idea from the beginning, and even influenced the design of the ship itself to facilitate that scene (the original ship from Alien was not nearly as rounded, and would not have rolled half as well). The sad part of this is that Lindelof is rumored to not be involved with the proposed sequel, and I wonder whether that decision is based solely on the public opinion that all of the flaws in Prometheus are "his fault", regardless of whether it's true.

People need to learn to have realistic expectations of films, and the ability to accept a film for what it is instead of what we want it to be. I'll admit that Prometheus wasn't exactly the same as what I thought it would be, but it wasn't too far off, and it was overall just as good as I expected it to be.

Prometheus is not a perfect movie. If such a thing even exists, there are only a tiny handful of films that I would classify as such (I can only think of one offhand, with another as a close second). And none of them are directed by Ridley Scott.

Saturday, July 7, 2012

3D Follow Up

Since my last post regarding how people are doing 3D "wrong", I've done some research, and I've even altered how I play my own 3D games.

Don't mistake me... they're still doing it wrong, just differently than I had originally guessed. My original theory had been that the images for each eye were too far apart. In a very few cases, this still holds true, but in most cases it doesn't, at least as far as "deep" effects go.

Long story short, in order to make something appear to be very far away, the separation between the images should be approximately 60mm (about two and a half inches). This is the average distance between the eyes of a normal person. If the images are 60mm apart, the eyes basically look straight ahead, rather than converging on the monitor that's only two or three feet away.
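For the geometry nerds, this is easy to sketch out in a few lines of Python. The numbers here (a 6.3cm eye separation, a monitor about 75cm away) are just illustrative averages, not measurements of anyone's actual setup:

```python
# A rough sketch of the geometry described above. The eye separation
# (IPD) and viewing distance are illustrative averages, not measured
# values from any particular setup.

IPD_CM = 6.3  # average adult interpupillary distance, in centimeters

def screen_separation(object_dist_cm: float, screen_dist_cm: float) -> float:
    """On-screen separation for an object rendered behind the screen.

    By similar triangles, an object at distance D projects onto a screen
    at distance s with a left/right offset of IPD * (D - s) / D. As D
    goes to infinity, this approaches the IPD itself, so the eyes end up
    looking straight ahead instead of converging.
    """
    d, s = object_dist_cm, screen_dist_cm
    return IPD_CM * (d - s) / d

if __name__ == "__main__":
    screen = 75.0  # a monitor roughly two and a half feet away
    for dist in (100.0, 500.0, 10_000.0, 1_000_000.0):
        sep = screen_separation(dist, screen)
        print(f"object at {dist:>9.0f} cm -> separation {sep:.2f} cm")
```

Run it and you'll see the separation climb toward the eye distance and never pass it, which is exactly why "really far away" means "about one eye-width apart".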

I've actually increased my 3D settings in most games to match this. That mountain range off in the distance now really appears to be far away. Shallower settings can still provide a level of "depth", but at the cost of immersion.

Since then, I've also discovered what the real problem is with those videos I see on YouTube. It's what's referred to technically as "negative parallax", or in layman's terms, "pop-out"; the effect of something coming out of the screen at the viewer. While this can be very effective in certain situations, in most cases this is the part that comes across as a "gimmick".

Let's examine a couple of examples. I saw a video posted on the AVS forums by a user who filmed a flying quadrotor drone in 3D. At certain points in the video, when the drone was only a few feet away from the camera, there was some negative parallax. At several points during the video, I held my hand out toward the screen, and by god, that drone looked like it was absolutely floating directly above my hand. Really amazing stuff.

The problem with some users and games is knowing where to draw the line. I've seen some videos that have almost exclusively negative parallax, where everything is "popping out" of the screen at me. Things like characters, horses, and buildings. These are massive objects; there's no way they can be in front of my 23" monitor that's only a few feet away.

It turns out that's the part that was frustrating to watch.. my eyes were telling me that the objects were closer, while my brain was balking at the idea, insisting that they must be further away. It's similar to what happens when you view 3D content with the left/right eyes reversed (I don't recommend it, by the way, it's incredibly disorienting).

This is the reason most filmmakers choose the "window" approach, where almost nothing ever gets closer than screen distance from the audience, and everything you see is further away from you. This is how I've chosen to play my games, and even at very high depth settings (but not higher than my eyes can handle), the result is extremely immersive and a truly amazing way to play. But as soon as I start pushing the convergence point further out in order to get some of that "pop-out", the effect is ruined. It also muddles the sense of scale.

So you're still all doing it wrong.. just set your convergence at screen depth. That doesn't mean you won't ever get any pop-out, it just means that when it does happen, it'll be something that's actually supposed to be only a foot or so away from your face. It'll make you suddenly jump back, and then you'll smile and say "that was cool..."
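If "convergence at screen depth" sounds abstract, here's a toy Python sketch of what the convergence setting actually does. The convergence distance is just the depth at which the left and right images line up (zero separation); everything nearer pops out, everything farther recedes. All numbers are made up for illustration:

```python
# A toy model of convergence: the convergence distance is the depth at
# which left/right images coincide (zero parallax). Nearer objects get
# negative parallax (pop-out), farther ones positive (behind the
# screen). Illustrative numbers only.

IPD_CM = 6.3  # average adult interpupillary distance

def parallax(object_dist_cm: float, convergence_dist_cm: float) -> float:
    """Signed on-screen separation relative to the convergence plane."""
    d, c = object_dist_cm, convergence_dist_cm
    return IPD_CM * (d - c) / d

if __name__ == "__main__":
    convergence = 75.0  # set at screen depth, as recommended above
    for dist in (30.0, 75.0, 300.0, 100_000.0):
        p = parallax(dist, convergence)
        label = "pop-out" if p < 0 else "at/behind the screen"
        print(f"object at {dist:>9.0f} cm: parallax {p:+.2f} cm ({label})")
```

With the convergence plane at your monitor, only things that are genuinely closer than the screen ever go negative, which is exactly the "jump back and smile" case.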

Saturday, June 9, 2012

3D Gamers, You're Doing it Wrong

No, not Hollywood this time. YOU.

There's a reason why so many people look at 3D as being a "gimmick" or nothing more than a sales ploy to sell more (or simply more expensive) movie tickets. It's because a lot of Hollywood does things for effect.. making the 3D "pop" and having things come flying out of the screen at you and so forth. Stupid shit.

What we need to do, as users capable of creating our own 3D content (particularly games), is to show them that it shouldn't be done this way. The problem is, we're not helping. If anything, we're making it worse.

I recently purchased a 3D monitor for my computer, and I've been taking the opportunity to see a bunch of 3D content on the internet, such as YouTube and the like (as well as, naturally, trying out every PC game I have). I see a lot of game footage online; gameplay, demos, etc. And on practically every video I see, it's small wonder people say that 3D is too much.. you people have no idea what you're doing with it.

On practically every gameplay video I've seen on YouTube, the person recording it has the 3D depth effect cranked up far beyond the "gimmick" stage into the "I have to cross my eyes to see anything" stage. And they call it "immersive". There's a slider in the NVidia control panel that defaults to 15%. I can tell most of the people out there have it set to somewhere around.. oh.. 100% or so.

Let me give a quick rundown of how 3D works for those that are unfamiliar. The computer renders everything in the scene twice, at different levels of separation. One copy goes to our left eye, one to our right. Because of the way our eyes converge on things, the further apart the two copies of an object are, the further away it appears to be (or the closer to us, popping out of the screen, depending on how the effect is rendered).
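If it helps to see it spelled out, here's the "render everything twice" idea in plain Python. There's no real graphics engine here; Camera and stereo_cameras are names I made up purely to illustrate the horizontal offset:

```python
# A bare-bones illustration of stereo rendering: one camera per eye,
# offset horizontally by half the eye separation each. No real graphics
# API is assumed; these names are invented for the example.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Camera:
    x: float  # horizontal position in world units
    y: float
    z: float

def stereo_cameras(center: Camera, eye_separation: float) -> tuple[Camera, Camera]:
    """Return (left, right) cameras for the two renders of the scene.

    The renderer draws the whole scene once through each camera; the two
    images differ only by this horizontal offset, which is what produces
    the on-screen separation discussed above.
    """
    half = eye_separation / 2.0
    left = replace(center, x=center.x - half)
    right = replace(center, x=center.x + half)
    return left, right

if __name__ == "__main__":
    head = Camera(x=0.0, y=1.7, z=0.0)
    left_cam, right_cam = stereo_cameras(head, eye_separation=0.065)  # ~6.5 cm
    print(left_cam)
    print(right_cam)
```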

The problem when you turn the effect up too high is that objects are too far apart to be able to focus on them. Let's say you have a character on screen, and a tree in the distance. The player character is fairly close to the screen, so the separation isn't that high. On a 23" computer monitor, we're looking at a separation between left and right eye of maybe a quarter of an inch. At a normal distance of around 40" or so, that's not hard for our eyes to focus on. If you look at the screen without your 3D glasses on, the character looks blurry and "doubled".

The tree in the distance, on the other hand, may be three or four inches apart.. without glasses, you can actually see two separate, distinct trees. Because of how far apart they are, your eyes have to pull away from each other (diverge) in order to fuse them properly. Human eyes don't actually work that way, and it becomes an effort, and in some cases painful, to focus on objects like that. Crossing our eyes toward each other is easier to do, but that only works for objects that are "closer" to us rather than further away.
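To put rough numbers on that, here's a quick check using the figures from this post. The 2.5-inch eye distance is an average I'm assuming, not something I measured:

```python
# A quick check of the claim above: once the on-screen separation of a
# "distant" object exceeds the viewer's interpupillary distance, the
# eyes would have to diverge outward to fuse it, which real eyes can't
# comfortably do. The 2.5" IPD is an assumed average.

IPD_INCHES = 2.5

def fusion_effort(separation_inches: float) -> str:
    """Classify the separation of an object rendered behind the screen."""
    if separation_inches <= IPD_INCHES:
        return "fusable: eyes converge or look straight ahead"
    return "needs divergence: strained or impossible to fuse"

if __name__ == "__main__":
    # the character (~1/4") and the distant tree (3-4") from the examples above
    for sep in (0.25, 2.0, 3.0, 4.0):
        print(f'{sep:.2f}" separation -> {fusion_effort(sep)}')
```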

The 3D effect is meant to draw you into the experience. Having to contort your eyeballs to see something does exactly the opposite of that, it pulls you out of the experience and reminds you that you're playing a game. And a poorly-calibrated one at that.

The other thing it does, that most people don't stop and think about, is destroy the sense of scale. When your eyes have to move that much to focus on things, the game world starts to look like a small diorama that you're viewing through a tiny little camera.

It took me only a matter of minutes of fiddling with 3D to figure out that the effect worked best when it was subtle. Keep the depth to a fairly low setting, so that even the objects far in the distance still have a fairly low separation value, maybe half an inch. What you end up with is a much more immersive feeling, because your eyes aren't having to fight against themselves in order to focus on anything. Just like in real life! How 'bout that. Your eyes don't hurt focusing on things when you walk your dog, why should they hurt when you're raiding the Firelands or killing aliens?

After a few minutes of playing like this, you'll quickly forget that you're playing in 3D. And this is a good thing. You're focusing on the game again, and not the gimmick. But that gameplay will feel deeper to you.. you'll feel a lot more like you're really there. True immersion happens by accident, and not because you're forced into it. The 3D effect should never "wow" you, it should only serve to pull you deeper into the experience.

Some of Hollywood understands this. If you want to see a gimmick-free, proper use of 3D, go see Prometheus. I'll have a full review up later, but I can tell you that this is some of the best use of 3D that I've ever seen, right up there with Avatar for pulling you into the world. And most of the time, you don't even notice it.

So gamers.. turn your 3D down, for Pete's sake. Then those videos you upload to YouTube won't make people's eyes try to rip themselves out of their heads. Show the rest of them how it's done. Resist the temptation to turn your game into a gimmick, and go subtle. You'll be glad for it in the end.

Saturday, June 2, 2012

E3 is Upon Us! Yaaaaywwwwn!

As some of you probably already know, next week is the Electronic Entertainment Expo (E3) in L.A. The single biggest video game technology show in the world. The place where megaton announcements are made, earth-shattering technology is shown, and mind-blowing presentations and previews are proudly displayed to the masses.

But, by all accounts, absolutely nothing special is going to happen this year. That's it. Let's all go home.

The first problem was brought up by a user on NeoGAF, who asked the simple question "Will there actually be any surprises this year?" He wasn't asking just a general question, though, he was reacting to all of the "pre-E3 announcements" that were made. Pretty much every game company has stated exactly what they'll be displaying at E3 this year, and in many cases, have already shown some of that content in the form of various teasers and images and the like.

So we know Ubisoft is going to show off Assassin's Creed III. EA is going to show off Crysis 3 and SimCity. Sony is going to show off The Last of Us. Microsoft is going to show off Halo 4 and a bunch of Kinect crap that no one wants to see.

So... what's the point of the conference again? There really are no surprises.. we know what we're going to see, and in many cases we've already seen it.

It might be something moderately special for the press that's going to be there, but what about the millions of users who won't be there? All we're going to see are videos and trailers and screenshots from the show, most of which won't be posted until after the show is over. And how is that any different from what we see already? I can go to IGN or GameTrailers, or even just YouTube, and see plenty of videos for every last thing that anyone's planning on showing off. They should save a lot of money and just not have E3 at all, just release a bunch of trailers and screenshots online. That's all those of us at home are going to get out of this conference anyway, so what's the point?

With any luck, we'll all be surprised by last-minute announcements of the next generation of console hardware from Microsoft and Sony, but that's looking less and less likely as we gear up for the show. We would have heard something by now.

So E3 this year looks entirely pointless. I'll probably watch the Microsoft and Sony conferences just to see what's what, but otherwise I really couldn't care less.

Technology Reviewers are Idiots

Long story short, I'm shopping for a 3D computer monitor. None of the local retailers sell them, so I can't just go in and examine them personally to decide which one best suits what I'm looking for. I've narrowed it down pretty far, to LCD IPS displays using passive interleaved 3D. But I'm still not just going to order one sight-unseen without arming myself with some knowledge. This is how I shop for tech.

So, I turn to the experts.. the internet. The internet knows all, right?

It takes all of about four seconds at Google to find a review. The headline is not promising: "3D Fail". So I read the review, and it all looks good, up until the 3D part, where they describe poor depth rendering, and very bad ghosting (where one eye sees parts of the image intended for the other).

Let's break these two gripes down, starting with the depth. I'm a visual effects artist, and I've created 3D renders before. The "depth" of an image, meaning the perceived distance between objects in the scene, is controlled by separation. The further apart the left and right images are from each other, the "further" the image will appear to be from the plane of the screen. This goes both ways, depending on how it's set up. It could be closer, or further away. But the result is the same: The depth is directly controlled by how the image is being rendered in the graphics card, and how far apart it's placing those parts of the image.

In other words, it has nothing whatsoever to do with the monitor itself.

The monitor is merely displaying what it's being told to display by the computer that it's connected to. That's why software such as NVidia's 3D Vision includes real-time adjustments for things like convergence and depth. You push a button on your keyboard, and the scene depth changes. That doesn't have squat to do with the display.
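Here's a conceptual sketch of why those adjustments can't possibly live in the monitor. This is not NVidia's actual implementation, just an illustration that a convergence change amounts to sliding the two rendered images horizontally before the display ever sees them:

```python
# Conceptual only -- NOT NVidia's actual code. Changing convergence
# shifts every pixel's parallax by the same amount, which moves the
# zero-parallax (screen-depth) plane. The monitor just displays the
# shifted result; it has no say in any of this.

def adjust_convergence(parallax_px: float, shift_px: float) -> float:
    """Apply a uniform horizontal shift between the left/right images."""
    return parallax_px - shift_px

if __name__ == "__main__":
    # made-up parallax values, in pixels, for a few objects in a scene
    scene = {"player": 5.0, "tree": 60.0, "mountain": 90.0}
    shift = 20.0  # one press of the convergence hotkey, say
    for name, p in scene.items():
        print(f"{name}: {p:+.0f}px -> {adjust_convergence(p, shift):+.0f}px")
```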

So why did the reviewer bring it up at all? It's simple, really. He didn't know any better. He hooked it up, and he saw what he saw, but he didn't understand what was happening behind the scenes in software to make that 3D image possible, or how to correctly change it, instead just flailing away on various buttons and controls and saying that it didn't work.

Secondly, he complains about ghosting. Since I've read up on passive 3D displays, I know very well what the leading cause of this is: Poor configuration. Ghosting on passive displays is caused by slight color bleeding, where a color will bleed over into the surrounding pixels. It's more commonly used (intentionally) as antialiasing, to reduce the "jaggies" present on the raw output. 3D software can work around this, if it's configured correctly, and prevent those specific colors from bleeding vertically, which is what causes parts of the image to appear in the other eye's field of view.
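For those unfamiliar with how passive interleaved displays split the image, here's a toy sketch using plain Python lists as stand-in images. Even rows carry one eye, odd rows the other, so anything that bleeds vertically into a neighboring row lands in the wrong eye, which is the ghosting in question:

```python
# A toy model of row-interleaved (passive) 3D. Even rows go to one eye,
# odd rows to the other; the polarized glasses separate them. Color that
# bleeds vertically into an adjacent row ends up in the wrong eye.

def interleave(left: list[list[int]], right: list[list[int]]) -> list[list[int]]:
    """Build the frame the panel shows: alternating rows per eye."""
    assert len(left) == len(right), "both eye images must be the same height"
    return [left[y] if y % 2 == 0 else right[y] for y in range(len(left))]

if __name__ == "__main__":
    height, width = 6, 4
    left_img = [[1] * width for _ in range(height)]   # left eye: all 1s
    right_img = [[2] * width for _ in range(height)]  # right eye: all 2s
    for row in interleave(left_img, right_img):
        print(row)
```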

Aside from not adjusting any configuration settings in either the monitor or the software, the reviewer also made one other grievous error, which I was only recently pointed to by another user who read the same review: The reviewer connected the monitor to the PC using DVI. The instructions included with the monitor clearly state that 3D only functions properly over the HDMI connection.

So his chief complaint was actually caused by his own dumbassedness: He hooked it up wrong.

So, the leading "expert" on monitor technology can't even plug the thing in correctly, and then complains loudly about how poor the performance is.

Then comes the worst part of this whole mess: No one else in the English-speaking world actually reviews the thing for themselves. Do a Google search for "Asus 3D IPS monitor review" (I won't even bother listing the model number, you'll find it). The first thing you'll find is the CNET review that I'm talking about above. The next two hundred results will be from other tech websites, all of them doing nothing but reposting the CNET review.

Seriously.. of all the tech-related websites in the world, only one of them actually laid eyes on the hardware in question, and everyone else just took their word for it. Not a one of them stopped for two seconds to think "Hey, this doesn't sound right..." Again, the reason is simple: They don't know any better, either. They don't understand the technology any more than CNET did, and they have no way of knowing that the reviewer might have made a mistake or two. Or five.

I'm honestly surprised that Asus hasn't filed a lawsuit against CNET over this. That one review has probably single-handedly killed this product for them, because there are no other reviews that state anything other than what CNET did. Every review that anyone reads will be negative in the extreme, and no one will buy the thing.

Another user did point me to a review of the monitor from a Romanian site, whose reviewer actually hooked it up correctly, and had no real complaints to speak of. But how many of Asus's customers are going to see that review? Not many, I'd wager. Newegg's review section is barren because no one wants to buy the thing.

These are the so-called "experts" that we turn to for help in making purchase decisions. Though the reviewers themselves seem not to realize it, this is actually a pretty big responsibility. I can't see it for myself, so I need someone else to tell me as much as they can about it. But I believe I'm entitled to listen to someone who actually knows what the hell they're doing. I might as well listen to Jim-Bob Jones down the street if this is the kind of crap I'm going to be reading on these websites.

If you can't even plug the thing in correctly, you have no business being an "expert", or posting any kind of review online, especially one that's going to be proliferated across the entire internet. Your ignorance of the very technology you claim to be an "expert" on is not doing any favors for the companies that make these products. I'm half-tempted to buy this monitor just to spite the reviewer.

Friday, May 18, 2012

What's happening to our games?

I was watching some videos from E3, TGS, and GDC from last year, and I saw some things that made me worry about the state of games.

I was tipped off about this from an article on Rock, Paper, Shotgun, which you can read HERE, with a follow-up article HERE. It introduced me to the idea of the "un-game". And now I'm having a hard time not seeing this when I look at some upcoming titles.

The basic idea behind the un-game is that it's an experience that doesn't really have the player do much of anything. Just movement from setpiece to setpiece, cutscene to cutscene. Now, we're not talking about linear games or cinematic titles. Most of those still require the player to actually do something most of the time.

The RPS articles are referring specifically to the highly successful title Call of Duty: Modern Warfare 3 (MW3). A "game" in which the player is merely an observer and doesn't actually do anything of consequence; you merely follow along and watch the NPCs you're with do the important things (read the RPS articles for specifics).

The problem is when I look at other titles and start to see some of the same things crop up. Not necessarily to the extent that MW3 does, but things that bother me because the game is essentially telling you how to play it, rather than letting the player actually think for themselves. The two examples I'm going to mention are both highly popular (only one has been released to date), which will probably make my opinions of them unpopular: Gears of War 3 and the upcoming reboot of the Tomb Raider franchise.

I was watching a play demo of Gears of War 3 (GOW3) at what I think was E3 last year. During the demo, a huge creature of some sort was attacking the ship that the player character and NPCs were on. Your character was instructed to lead the creature to a certain part of the ship where it could be more easily killed. I don't remember the specifics, but I do remember that, during the scene on the main deck, the game kept popping up on-screen tips as to what the player should do. When the creature began waving its tentacles around, the game told the player in no uncertain terms that they should "dodge the monster" until the next phase of the battle could begin, at which point the creature latched onto something, and the player was told to "dislodge the monster". Not in voiceover, but in on-screen tips.

Really? Dodge the monster? You know what would have happened in a game like this ten years ago? If you were too dumb to move out of the way of the twenty-foot-thick tentacles swinging in your direction? You fucking DIED, that's what happened. It was up to the player to decide they needed to dodge; the game didn't have to tell you to do it.

What scares me about this is that they probably figured this out in playtesting, that the testers became frustrated and called the game "too hard" because they couldn't figure out the basic premise of "don't stand in the fire". So the developers, no slouches themselves when it comes to making good games, ended up having to "dumb it down" so little Johnny wouldn't throw his controller out of the window.

The other example is the upcoming Tomb Raider reboot. On the one hand, the game looks very good, and could be a promising kick start to an otherwise aging franchise that isn't doing so well. On the other hand, it's also made for the "stupid gamer".

I was watching a demo of this one as well, and our stalwart heroine Lara Croft makes her way into a cavern. There are various mechanical contraptions, elevators and the like, and Lara has to figure out how to make use of them to get up to the exit from the cave.

Or rather, she doesn't have to figure it out.

On entering the cavern, the game actually highlights exactly where she needs to go and what to do when she gets there. Literally, it's a brightly glowing lever that's impossible for the player to miss. Just walk up, push the button to flip the lever, and the elevator activates, allowing Lara to jump up to the next level. Takes all of thirty seconds. They call this Lara's "survival sense".

Again, I have to hearken back to games from aeons ago. And not just any random adventure game, I mean this very franchise, the first Tomb Raider game, released for the original Playstation in 1996. I played this game myself, a lot, and I loved it. You'd walk into a cavern, maybe very similar to the one in this new game, and you'd spend half an hour trying to figure out how to get on that upper level, and you had to use your own damn wits to do it.

So which is better? That you have to figure it out for yourself, or that the game simply tells you what to do? The original Tomb Raider was, at its heart, a puzzle game built into the framework of an action/adventure title. There were huge sections of the game without enemies, just an environment that the player had to work their way through. And yes, it was challenging. Yes, it was sometimes frustrating. But dammit, when you hit the exit, you felt like you really accomplished something. You figured that shit out on your own and you felt good about it.

What happens when you don't have to figure it out anymore? What's the point? It's like reading a game guide or watching a YouTube video on exactly how to solve the next puzzle before you even attempt it yourself. Where's the sense of accomplishment? The satisfaction of having solved the puzzle, having beaten the developers who were trying to stop you.. that was something special. And it's completely gone from this new Tomb Raider game. Just... gone. When you're done with this, there won't be any satisfaction, no patting yourself on the back. Just... nothing.

Is this where we're headed? If this game does well, which it probably will thanks to the already aggressive marketing they're doing, then the developers will just continue to make games like this. And the games that require the player to actually think will fall by the wayside.

Some people just can't seem to wrap their head around "thinking" games anymore. I remember reading a discussion a few years ago, after the first Portal game was released and made a huge splash in the game industry. One player in particular finished the game and said it was "boring" and "nothing special". When prompted, he had no problem admitting that he had used online guides for every single puzzle. Every. Single. One. He didn't solve a single room on his own. It's a puzzle game, for crying out loud.. if you don't actually solve the puzzles, then can you honestly say that you really played the game at all? And what happens when the game itself gives you the answer? Is it really even a game anymore?

Check out THIS VIDEO to see just how much things have changed, and not for the better.

Sunday, February 19, 2012

Dragon Tattoo? Seen it already.

So there's been a lot of buzz about The Girl with the Dragon Tattoo. My opinion is "whatever". The reason for that is simple: I've already seen it. I saw it before it even hit theaters here.

I saw it back when it was called Män som hatar kvinnor.

Oh, but it's a different movie, you say.. a different take on the idea. My answer? I don't care. The story's the same, I already know how it ends.

It's fairly common that our initial impression of something comes to define our idea of it. In the case of this film, it's the actors and their performances. For me, Noomi Rapace is Lisbeth Salander. Michael Nyqvist is Mikael Blomkvist. Period, the end.

Hollywood has this habit of spending lots of money remaking so-called "foreign" films. Effectively doing little but re-recording the film with English dialogue, for the Michael Bay Generation that can't be bothered to do something as simple as read.

I use the phrase "so-called foreign films" to point out that there are almost no "domestic" films anymore. Almost all films are produced overseas, usually in Europe, Australia, New Zealand, etc, using non-American actors, financed by non-American companies, written and directed by non-American writers and producers. But they're not "foreign" films.. because they're in English. Record a film here in the States, 100% home-grown American, but do it in Spanish instead? Bam.. "foreign film". They really need to clarify this.. they're only called "foreign" because they're filmed in a language other than English.

They think that we stupid Americans can't read, so they make the film in English. And then they totally bork (or börk?) up the casting. Granted, I don't really know Rooney Mara that well, having only seen her in Nightmare on Elm Street, where I would classify her performance as "okay" (considering the material). She's got her work cut out for her here, because Lisbeth is not an easy role by any means, and a role like this can make or break a career. To be fair, I have heard good things about her performance, but I'm not sure if they're making comparisons to Noomi Rapace, who, for many people (including me), cemented who Lisbeth is. But Daniel Craig? Sorry, but he would be one of the last people I would cast as Blomkvist. Michael Nyqvist might not be the American "pretty boy", but the man can act, and his performance as Blomkvist absolutely makes him into a real person that the audience can really believe. Daniel Craig is great if you put him in the right role, but this is not it.

I just fail to see the point of the whole "remake" thing. Män som hatar kvinnor (which, incidentally, does not translate as "The Girl with the Dragon Tattoo"; it means "Men Who Hate Women") is barely three years old. Aside from the language "barrier", there's absolutely no reason to make the whole thing again. It's not like the originals were low-budget backyard productions.. they're extremely well made by any measure, and are all-around good films.

And most people here in the States are barely aware that they exist, and they'll probably never see them. They're on Netflix right now, all three of them. I wonder what the viewing numbers look like for them. Probably numbers that would make me sad.


Sunday, February 12, 2012

The Michael Bay Generation (Is Filmmaking Dead?)

So I'm reading this thread on a message board about movies, and I see someone say "Oh, that movie's boring.. I could barely stay awake through it." Something I've seen repeated several times, about this film in particular.

Is it boring? Not really, no. It's quite engaging, actually. But it does two things that seem to be anathema to the average moviegoer these days:

1) It doesn't have boobies or explosions.
2) It actually makes you put the pieces of the plot together yourself instead of just handing it to you on a silver platter.

The film in question? The recent spy thriller Tinker, Tailor, Soldier, Spy.

I refer to this phenomenon as the "Michael Bay Generation". Although Bay is not directly responsible for the phenomenon, his films are the most obvious example of it. Like a lot of entertainment genres these days, including television, reading, and gaming, people seem to want the "spectacle" more than the "art".

Case in point: Transformers: Dark of the Moon. Total box office take worldwide, over one billion dollars. Tinker, Tailor, Soldier, Spy? Fifty-six million. It seems that people just don't want films like this anymore.

Let's examine TTSS for a minute. It's surprisingly short at barely two hours; I fully expected it to be quite a bit longer. It's a spy thriller taking place in and around British Intelligence (MI6) during the Cold War in 1973. It features an absolutely stellar cast, including Gary Oldman, Colin Firth, Tom Hardy, Mark Strong, Ciarán Hinds, Benedict Cumberbatch, and John Hurt, all of whom turn in spectacular performances. Gary Oldman is looking at an Oscar nomination (his first, surprisingly) for this role, and it's well deserved.

The "problem" with the film, according to the Michael Bay Generation, is that it's a true spy movie. The way spies actually work in the real world. This isn't James Bond's MI6. This is a bunch of guys talking about sources of information and deciding how to act on it. It describes the recruitment process, and how normally uneventful spying really is. Sneaking into an office and making off with a folder full of information instead of parachuting into a villain's lair and killing forty-seven people using a watch with a laser in it. One of the most tense moments in TTSS is literally a character trying to distract the people around him so he can stuff a pile of paper into his briefcase. And it works because it's so brilliantly edited and acted, you really feel the tension that the character feels, and you know that he (and you) will just die if someone suddenly pipes up and says "Oy, what you doin' there?" The character almost has a nervous breakdown afterwards, and you really believe it.

So there's no big explosions or fight scenes. I think there's a total of maybe four gunshots throughout the entire film. Exciting, in the Michael Bay sense? No, definitely not. Gripping and enthralling through story and acting? Absolutely.

And yes, no boobies. There's only a few mentions of sex throughout the entire film, and it's seen through the eyes of a very jaded and somewhat bitter middle-aged man who's dealing with his own issues. So no "Bond girls", either. The one woman we feel even the slightest bit of concern for throughout the film is brutally murdered right in front of us.

Another thing that TTSS does is make you think things out on your own. It's a spy movie, after all, a real spy movie. I knew going into it that the plot was probably going to be extremely complicated, so I was prepared for it when it turned out to be exactly that. Yeah, a lot of it gets explained at the end, but the pieces are always there if you look. It's a film with a lot of different threads going on, and they don't really start to come together until the film's climax. And even then, it's not all just handed to the audience.. there's still little pieces you need to think about on your own if you want an answer.

It's a really sad thing that a film like this will simply be labeled as "boring" (along with other "boring" films like The Godfather, I suppose), will probably disappear into the bargain bin, and will discourage Hollywood from producing more like it. It really is an exceptional piece of filmmaking, and if you have it in you to appreciate good films.. not "spectacles" or "event films", but just really good films, then you owe it to yourself to see this one.

Sunday, January 15, 2012

Violent video games - Are they the problem "they" say they are?

Hot topic! This one comes up all the time, usually after someone kills someone else (in real life). I've been chewing on this one for a while.

Proponents of this theory say that violent video games cause violent behavior. That little Johnny was such an angel until he started playing Grand Theft Auto, at which point he went next door and stabbed little Stevie to death with a carving fork.

These people are idiots. And yes, I'm going to tell you why.

For one, they try to prove their point by doing studies that show a link between violent behavior and the playing of violent video games. Jimmy is a bully at school, and he plays a video game called Bully. In the game, you're tasked with being a bully at school, and sure enough, Jimmy is a bully in real life. Proven. Or is it?

Now, I'm not saying that there isn't a link between violent behavior and violent games. Just that these people have it backwards:

Jimmy isn't a bully because he plays the game. He plays the game because he's a bully.

A slight change in wording makes a HUGE difference in the outcome. The truth is quite simple: Violent people are more attracted to violent video games. It's not the other way around. The video games don't make them into violent people, they're already violent people, whether they play the game or not.

This could be proven rather easily by just doing a simple unbiased study. You put Jimmy the bully and Bobby the little angel into a room to play video games, and you give them a choice of Saints Row and Barbie Horse Adventures. Guess which kid's going to play which one? Jimmy isn't going to touch the Barbie game with a ten-foot pole, and if you were to force it on him, he'd probably try to find ways in which to break the horse's legs so he'd get to shoot it. Conversely, if you were to force angel Bobby to play Saints Row to run over pedestrians and shoot up a police station, he'd probably get sick. He'd play the game in a very conservative fashion, and he absolutely would not suddenly turn into a mass murderer.

In fact, other studies have been done that indicate that violent video games are actually a healthy outlet for aggression for these people. Have a bad day at work? Pop in Call of Duty and go kill some Russkies, you'll feel better. Better to play a game for an hour than to leave a body in a ditch somewhere, yeah?

Yes, I play violent video games. Because I'm a violent person. I have an incredibly vicious temper, one that I've had since long before these games even existed, that I've spent most of my life learning to control. And I'm very good at it. Most people see me as mild-mannered Clark Kent, which is how I prefer to be.. it's who I want to be. That's the guy that plays Flower. Underneath, it's more like Dexter Morgan. He's the one that plays Mortal Kombat.

The problem is that a lot of parents these days, specifically the ones that were never gamers themselves, believe that games never grew up.. they think that it's all still Pac-Man and Missile Command and that "games are for kids". As such, they tend to completely ignore the rating system (many aren't even aware that there is such a thing) and let their kids play whatever they want. Ironically, they get mad at the store or the game industry when they find out just what the games are like, even though they themselves purchased them. It's not that hard, really. If the store clerk refuses to sell the game to Junior, and asks for your ID before selling it to you, that might be a pretty good indication that this game isn't what you think it is.

It's part of a much larger problem of parents refusing to take responsibility for the raising of their children, but that's a discussion for another time.

In the meantime, parents need to pay attention to what their kids are playing, and learn about the rating system. It's there for a reason. Video games aren't for kids anymore.