Mirror’s Edge: Do we have a winner?

And now we get to the final test. Something truly different: Mirror’s Edge.

This is an EA game. Ben had to leave before we got to this part of the test, he does have a wife and kid after all, so I went at this one alone.

I’d never played Mirror’s Edge. I’d seen the videos and it looked interesting. You play as a girl, Faith, a runner. You run across rooftops and through buildings; it’s all very parkour-like. You’re often being pursued by “blues”, police officers, as you run through the game. I won’t give away any plot details here, but this game, I liked.

GPU accelerated PhysX impacted things like how glass shatters and added destructible cloth to the game. We posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled late last year:

"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."

 

In Derek’s blog about the game he said the following:

“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”

Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled. I was looking for the SSD-effect. I wanted to play with it on then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game and then turned off PhysX.

I missed it.

I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues and they were firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable. I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper realistic, but the effect was noticeable.

I replayed a couple of chapters, and played some new ones, with PhysX disabled before turning it back on and repeating the test.

The impact of GPU accelerated PhysX was noticeable. EA had done it right.

The Verdict?

So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.

The reason why I enjoyed GPU accelerated PhysX in Mirror’s Edge was because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.

It’s a nice bonus if I already own an NVIDIA GPU, but it’s not a reason to buy one.

The fact of the matter is that Mirror’s Edge should be the bare minimum requirement for GPU accelerated PhysX in games. The game has to be good to begin with, and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone. Aggressive marketing on top of that is merely going to push people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked; the oven isn’t even preheated yet. Mirror’s Edge so far is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.

Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. But Havok is being ported to OpenCL, which means that by the end of this year any game that uses OpenCL Havok can have GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI, and Intel once Larrabee arrives).
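Whether the effects ship through CUDA PhysX or an OpenCL Havok port, the underlying workload is the same: thousands of independent debris particles or glass shards integrated every frame, which is exactly why it maps so well to any GPU. A toy CPU sketch of that per-particle step (this is illustrative only, not PhysX or Havok code; the function name and constants are made up):

```python
# Toy semi-implicit Euler step for a batch of debris particles.
# Each particle updates independently of every other one -- the
# property that lets physics middleware hand the whole batch to a
# GPU kernel, regardless of whether the API is CUDA or OpenCL.

GRAVITY = -9.81  # m/s^2, acting along the y axis

def step_particles(positions, velocities, dt):
    """Advance every (x, y) particle by one frame; purely data-parallel."""
    # Integrate velocity first (semi-implicit Euler), then position.
    new_velocities = [(vx, vy + GRAVITY * dt) for (vx, vy) in velocities]
    new_positions = [
        (x + vx * dt, y + vy * dt)
        for (x, y), (vx, vy) in zip(positions, new_velocities)
    ]
    return new_positions, new_velocities

# One 60 Hz frame for two glass shards falling from rest at 2 m up.
pos, vel = step_particles(
    [(0.0, 2.0), (1.0, 2.0)],
    [(0.0, 0.0), (0.0, 0.0)],
    1.0 / 60.0,
)
```

Nothing in the loop cares which vendor's hardware runs it, which is the whole appeal of a cross-vendor API for this class of work.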

While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t actually accelerated on NVIDIA GPUs, and I suspect it won’t take long for OpenCL accelerated Havok titles to equal that number once the port is ready.

Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.

You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.


294 Comments


  • 7Enigma - Thursday, April 2, 2009 - link

    If the 4890 is in fact a respin then I retract my original comment. My point, if it was just a simple OC, was that they were basically rebranding (binning) parts that could clock higher than the stock 4870 and selling them as a new card. That seems not to be the case, so I can't be at fault if the Anand article didn't address this.

    Regardless of whether the Nvidia card is in fact similar to the other offerings, it does have disabled/enabled parts that make it different from the 285 and 260.

    I'd still really like to see one of the Vapochill units up against the 4890. I'm pretty confident you could get to the stock 4890 speeds, so it's just a matter of whether $70 is worth the potential to OC much higher than the 4870 (if these 1gig core clocks are the norm).

    What we really need to see though is the temps for these cards under idle/load. That would be extremely helpful in deciding how good they are. For example if we see the 4890 at its stock speed is significantly cooler than the 4870 (and they haven't done much to the heatsink/fan), then the Vapochill 4870's just don't stand a chance. If we find the 4890's are similar or higher in temp than the stock 4870's, then it seems much more like a rebadge job.
  • MadMan007 - Thursday, April 2, 2009 - link

    If you look at it objectively, the GTX 275 is more of a change than the HD 4890, unless there are undercover changes in the latter that we haven't been made aware of. HD 4890 = a clock-bumped HD 4870 exactly; GTX 275 = a 240 SP, 448-bit memory interface GT200b, which was not previously available as a single GPU card.
  • bill3 - Thursday, April 2, 2009 - link

    Meh... Madman, Nvidia is still just playing around with the exact same modular component sets they have been, not adding anything new. Besides, as even you alluded, it isn't even a new card; it's just half the exact previously existing configuration in a GTX 295.

    But as I said, the 285 is clocked higher than the 280; I'm assuming Nvidia did a die tweak to get there (at the least they switched to 55nm). They just did it 3 months ago or whenever; ATI is just now getting to it.

    But for today's launches, imo ATI brings more new to the table than Nvidia, ever so slightly.
  • Snarks - Tuesday, April 7, 2009 - link

    ATI was the first with a 55nm core... or did you mean something else?

    The GTX 275 is just simply a GTX 280 with a few things disabled, is it not?
  • SiliconDoc - Monday, April 6, 2009 - link

    Right, they brought ambient occlusion to the table with their new driver.... LOL
    Man , I'm telling you.
    The new red rooster mantra " shadows in games do not matter " !
    ( "We don't care nvidia does it on die and in drivers, making it happen in games without developer implementation! WE HATE SHADERS/SHADOWS, who cares!" )
    I mean you have to be a real nutball. The Camaro car shop doesn't have enough cherry red paint to go around.
    I wonder if the red roosters body paint ATI all over before they start gaming ? They probably spraypaint their Christmas trees red - you know, just to show nvidia whose boss...
    Unbelievable. Shadows - new shadows not there before - don't matter... LOL
    roflmao
  • Warren21 - Thursday, April 2, 2009 - link

    I'm surprised they didn't mention it (maybe they hadn't been properly briefed), but yes, the HD 4890 IS a different core from the HD 4870.

    It uses a respin of the RV770 called RV790, which has slight clock-for-clock performance increases and much better power efficiency than the RV770. Case in point: higher clocks yet lower idle power draw. It's also supposed to clock to 1 GHz without too much hassle, given proper cooling.
  • Live - Thursday, April 2, 2009 - link

    This review was kind of a let down for me. It almost seems Nvidia's sales reps terrorized you so much over the last year that you felt compelled to write about CUDA and PhysX. But just as you said from the beginning, it's not a big deal.

    As a trade-off, temperature, noise and power numbers seem to have gone missing. You talk about Nvidia's new driver, but what about ATI's new driver? Did you really test the ATI cards with “Catalyst 8.12 hotfix” as is stated on the test page?!? Surely ATI sent you a new driver, and the performance figures seem to support that. It is my understanding that ATI has upped their driver performance over the last months just like Nvidia has. No mention of IQ except with Nvidia's new drivers. No overclocking, which I had heard would be one of the strong points of the ATI card, with a 1 GHz GPU a possibility. I know you mentioned you would look at it again, but just crank up the damn cards and let us know where they go.

    Don't get me wrong, the article was good, but I guess I just want more ;)

    ATI seems to win at “my” resolution of 1680x1050, but then again Nvidia has some advantages as well. Tough call, and I guess price will have to settle this one.
  • dubyadubya - Friday, April 3, 2009 - link

    I agree noise and temps should be in all reviews. So should image quality comparisons. While we are at it 2d performance and image quality comparisons should really be part of any complete review. It seems frame rates are all review sites care to report.
  • The0ne - Thursday, April 2, 2009 - link

    You and others want more, yet keep bitching about mentions of things like CUDA and PhysX. If Anandtech doesn't mention them, someone has to complain about why they weren't included in the test. For example, the recent buyers guide. And when they do mention it, and say it doesn't do much and leave it alone, there's bitching going on. I really don't get you guys sometimes.
  • SiliconDoc - Monday, April 6, 2009 - link

    Well it's funny isn't it - with the hatred of NVidia by these reviewers here. Anand says "he has never played Mirror's Edge" - but of course it has been released for quite some time. So Anand, by chance with the red rooster gone, has to try it - of course he didn't want to, but they had to finally mention CUDA and PhysX - even though they don't want to.
    Then Anand really does like the game he has been avoiding, it's great, he gets near addicted, shuts off PhysX, notices the difference, turns it back on and keeps happily playing.
    Then he says it doesn't really matter.
    Same for the video converter. Works great, doesn't matter.
    CUDA - same thing, works, doesn't matter, and don't mention folding, because that works better on NVidia - has for a long time. ATI has some new port, not as good, so don't mention it.
    Then Ambient Occlusion - works great, shadows - which used to be a very, very big deal are now with the NVidia implementation on 22 top games, well, it's a "meh".
    There's only so many times so many added features can work well, be neat, be liked, and then the reviewer, even near addicted to one game because of the implementation, says "meh", and people cannot conclude the MASSIVE BIAS slapping them in the face.
    We KNOW what it would be like if ATI had FOUR extra features Nvidia didn't - we would NEVER hear the end of the extra value.
    Anand goes so far as to hope and pray openCL hits very soon, because then Havok via ATI "could soon be implemented in some games and catch up with PhysX fairly soon".
    I mean you have to be a BLIND RED ROOSTER DROOLING IDIOT not to see it, and of course there is absolutely no excuse for it.
    It's like cheering for democrats or republicans and lying as much as possible depending on which team you're on. It is no less than that, and if you don't see it glaring in your face, you've got the very same mental problem. It's called DISHONESTY. Guided by their emotions, they cannot help themselves, and will do everything to continue and remain in absolute denial - at least publicly.
