Mirror’s Edge: Do we have a winner?

And now we get to the final test. Something truly different: Mirror’s Edge.

This is an EA game. Ben had to leave before we got to this part of the test (he does have a wife and kid, after all), so I went at this one alone.

I’d never played Mirror’s Edge. I’d seen the videos and it looked interesting. You play as Faith, a runner: you run across rooftops and through buildings, and it’s all very parkour-like. You’re often being pursued by “blues”, police officers, as you run through the game. I won’t give away any plot details here, but this game, I liked.

GPU accelerated PhysX affects things like the way glass shatters and adds destructible cloth to the environment. We posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled late last year:

"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."

 

In Derek’s blog post about the game, he said the following:

“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”

Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled, looking for the SSD effect: I wanted to play with it on, then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game, and then turned off PhysX.

I missed it.

I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable; I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper-realistic, but the effect was noticeable.

I then replayed a couple of chapters, and played some new ones, with PhysX disabled, before turning it back on and repeating the test.

The impact of GPU accelerated PhysX was noticeable. EA had done it right.

The Verdict?

So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.

The reason why I enjoyed GPU accelerated PhysX in Mirror’s Edge was because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.

It’s a nice bonus if I already own an NVIDIA GPU, but it’s not a reason to buy one.

The fact of the matter is that Mirror’s Edge should be the bare minimum bar for GPU accelerated PhysX in games. The game has to be good to begin with, and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone, and aggressive marketing on top of that merely pushes people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked; the oven isn’t even preheated yet. Mirror’s Edge, so far, is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.

Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. But Havok is being ported to OpenCL, which means that by the end of this year games using OpenCL Havok could get GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI and, when Larrabee arrives, Intel).
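To make the cross-vendor point concrete, here is a minimal sketch of what a vendor-neutral physics kernel could look like in OpenCL C. This is a hypothetical illustration, not code from Havok, PhysX or any shipping game: a naive Verlet integration step that advances one particle (think a cloth node or a glass shard) per work-item.

    /* integrate.cl - a hypothetical, minimal Verlet integration kernel.
       Purely illustrative: not taken from Havok, PhysX or any shipping game.
       Each work-item advances one particle (a cloth node, a glass shard, etc.). */
    __kernel void integrate_particles(__global float4 *pos,      /* current positions      */
                                      __global float4 *prev_pos, /* last frame's positions */
                                      const float dt,            /* timestep in seconds    */
                                      const float4 gravity)      /* constant acceleration  */
    {
        int i = get_global_id(0);           /* one work-item per particle */

        float4 p = pos[i];
        float4 vel = p - prev_pos[i];       /* implicit velocity (position Verlet) */

        prev_pos[i] = p;
        pos[i] = p + vel + gravity * (dt * dt);
    }

The host application compiles and launches a kernel like this at runtime through the standard OpenCL API (clCreateProgramWithSource, clBuildProgram, clEnqueueNDRangeKernel), so the same code can run on NVIDIA, ATI or any future OpenCL compliant hardware without vendor-specific paths. That is exactly the appeal of the approach.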

While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t GPU accelerated at all, and I suspect it won’t take long for OpenCL accelerated Havok titles to match the number that are once the port is ready.

Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.

You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.

294 Comments

  • Snarks - Tuesday, April 7, 2009 - link

    One is open, one is not.

    Jesus christ.

    The fact you have to pay extra on top of the card prices to use these features is a no go. You start to lose value, thus negating the effect these "features" have.

    P.S. ATI has similar features to NVIDIA; what they have is nothing new.
  • SiliconDoc - Tuesday, April 7, 2009 - link

    Did you see a charge for ambient occlusion?
    Here you are "clucky clucky cluck cluck!"
    Red rooster, the LIARS crew.
  • SiliconDoc - Tuesday, April 7, 2009 - link

    One? I count four or five. I never had to pay extra outside the card cost for PhysX, did you?
    You see, you people will just lie your yappers off.
    Yeah, ATI has PhysX - its own. ROFLMAO
    Look, just jump around and cluck and flap the rooster wings and eat some chickseed; you all can believe each other's LIES. Have a happy lie fest, dude.
  • bill3 - Thursday, April 2, 2009 - link

    Personally, while you bring up good points, I'd much, much, MUCH rather have the thorough explanation of CUDA and PhysX, and the relevance thereof, that they gave us than the power, heat and overclocking numbers you can get at dozens of other reviews. The former is insight, the latter just legwork.
  • joos2000 - Thursday, April 2, 2009 - link

    I really like the soft shadows you get in the corners with the new AO features in nVidia's drivers. Very neat.
  • dryloch - Thursday, April 2, 2009 - link

    I had a 4850 that I bought at launch. I was very excited when ATI released their Video Converter app. I spent days trying to make it produce watchable video. Then I realized that every website that tested it had the same result. They released a broken POS and have yet to fix it. I did not appreciate them treating me like that, so when I replaced the card I switched to Nvidia. I have gone back and forth, but this time I think I will stick with Nvidia for a while.
  • duploxxx - Thursday, April 2, 2009 - link

    And by buying Nvidia you already knew you wouldn't have that POS, so in the end you have the same result, except for the fact that the 48xx series really had a true performance advantage at that price range. Your rebranded replacement just gave you 1) additional cost and 2) really 0 added value, so your grass is a bit too green.....
  • Exar3342 - Thursday, April 2, 2009 - link

    "0 added value"? Really? He didn't have a GPU video converter that worked on his ATI card, and now he DOES have a working program with his Nvidia card. Sounds like added value to me. He gets the same performance, pretty much the same price, and working software. Not a bad deal...
  • z3R0C00L - Thursday, April 2, 2009 - link

    The GPU converter that comes with nVIDIA is horrible (better than ATi's, though).

    I use Cyberlink PowerDirector 7 Ultra, which supports both CUDA and Stream. Worth mentioning that Stream is faster.
  • Spoelie - Thursday, April 2, 2009 - link

    Is the $30 price tag of Badaboom included in the "pretty much the same price"? If it isn't, then there is actually no added value. You have a converter (value, well, only if your goal is to put videos on your iPod and it's worth $30 to you to do it faster), but you have to pay extra for it. The only thing the nvidia card provides is the ability to accelerate that program; you don't actually get the program.
