Mirror’s Edge: Do we have a winner?

And now we get to the final test. Something truly different: Mirror’s Edge.

This is an EA game. Ben had to leave before we got to this part of the test (he does have a wife and kid, after all), so I went at this one alone.

I’d never played Mirror’s Edge. I’d seen the videos, and it looked interesting. You play as Faith, a runner, racing across rooftops and through buildings; it’s all very parkour-like. You’re often being pursued by “blues”, police officers, as you run through the game. I won’t give away any plot details here, but this game, I liked.

GPU accelerated PhysX impacts things like how glass shatters and adds destructible cloth. We posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled late last year:

"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."


In Derek’s blog about the game he said the following:

“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”

Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled. I was looking for the SSD-effect. I wanted to play with it on then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game and then turned off PhysX.

I missed it.

I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues and they were firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable. I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper realistic, but the effect was noticeable.
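Effects like shattering glass come down to simulating many small, short-lived debris bodies that never feed back into gameplay, which is exactly the sort of embarrassingly parallel workload a GPU handles well. As a purely illustrative sketch (this is not DICE’s or NVIDIA’s actual code, just a toy shard simulation), the CPU-side version of the idea might look like this:

```python
import random

GRAVITY = -9.8  # m/s^2, pulling shards down

def spawn_shards(n, origin=(0.0, 2.0)):
    """Burst a pane into n shards with random outward velocities."""
    random.seed(42)  # deterministic for the sketch
    return [{"pos": list(origin),
             "vel": [random.uniform(-3, 3), random.uniform(0, 4)],
             "ttl": 2.0}  # seconds before the shard is culled
            for _ in range(n)]

def step(shards, dt):
    """One Euler integration step; shards never affect gameplay state."""
    for s in shards:
        s["vel"][1] += GRAVITY * dt
        s["pos"][0] += s["vel"][0] * dt
        s["pos"][1] += s["vel"][1] * dt
        s["ttl"] -= dt
    # cull expired or fallen shards -- effects physics is fire-and-forget
    return [s for s in shards if s["ttl"] > 0 and s["pos"][1] > 0]

if __name__ == "__main__":
    shards = spawn_shards(500)
    for _ in range(120):      # ~2 seconds at 60 updates/sec
        shards = step(shards, 1 / 60)
    print(len(shards))        # most shards have hit the floor or expired
```

Every shard updates independently of every other shard, which is why pushing this loop onto hundreds of GPU cores scales so well, and why effects-only physics can be toggled off without changing the game.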

I replayed a couple of chapters, and played some new ones, with PhysX disabled before turning it back on and repeating the test.

The impact of GPU accelerated PhysX was noticeable. EA had done it right.

The Verdict?

So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.

The reason why I enjoyed GPU accelerated PhysX in Mirror’s Edge was because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.

It’s a nice bonus if I already own an NVIDIA GPU, but it’s not a reason to buy one.

The fact of the matter is that Mirror’s Edge should be the bare minimum requirement for GPU accelerated PhysX in games. The game has to be good to begin with and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone. Aggressive marketing on top of that is merely going to push people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked, the oven isn’t even preheated yet. Mirror’s Edge so far is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.

Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. Havok, however, is being ported to OpenCL, which means that by the end of this year games using OpenCL Havok could get GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI, and Intel once Larrabee arrives).
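The practical upshot of that argument is that game code wants to target a physics abstraction, not a vendor. As a hedged illustration (the function, backend names, and capability flags here are all hypothetical, not real PhysX or Havok API; in 2009 this decision lived inside the physics runtimes, not game code), a title’s effects-backend selection might look like this:

```python
def pick_effects_backend(gpu_vendor, has_opencl_havok=False, has_cuda_physx=False):
    """Choose where to run effects physics, preferring any GPU path.

    Hypothetical capability flags for illustration only.
    """
    if has_cuda_physx and gpu_vendor == "NVIDIA":
        return "physx-gpu"      # CUDA path: NVIDIA hardware only
    if has_opencl_havok and gpu_vendor in ("NVIDIA", "ATI", "Intel"):
        return "havok-opencl"   # OpenCL path: any compliant GPU
    return "cpu-fallback"       # effects scaled back, game still playable

# An ATI owner never reaches the CUDA path, but an OpenCL Havok
# port would still give them GPU accelerated effects:
print(pick_effects_backend("ATI", has_opencl_havok=True))  # havok-opencl
print(pick_effects_backend("ATI", has_cuda_physx=True))    # cpu-fallback
```

The CUDA path excludes two of the three GPU vendors by design; the OpenCL path excludes nobody, which is the whole case for a cross-vendor standard.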

While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t even accelerated on NVIDIA GPUs, and I suspect it won’t take long for OpenCL accelerated Havok titles to equal that number once the port is ready.

Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.

You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.
