PhysX in Sacred 2: There, but not tremendously valuable

The first title on the chopping block? Sacred 2.

This was Ben’s type of game: a Diablo-style RPG with a Metacritic score of 71 out of 100, which indicates “mixed or average reviews”.

I let Ben play Sacred 2 for a while, first with PhysX disabled and then with it enabled. His response after it was enabled? “The game feels a little choppier, but I don’t really notice anything.”

Derek and I were hovering over his shoulder at times, and eventually Derek pointed out the leaves blowing in the wind. “Did they do that before?” Derek asked. “I didn’t even notice them,” was Ben’s reply.


[Image: Sacred 2 without GPU-accelerated PhysX]


[Image: Sacred 2 with GPU-accelerated PhysX. It's more noticeable here than in the game itself.]

We left Ben alone to play for a while. His verdict mirrored ours: the GPU-accelerated PhysX effects in Sacred 2 were hardly noticeable, and when they were, they didn’t really do anything for the game at all. To NVIDIA’s credit, a Diablo-style RPG isn’t really the best place for showing off GPU-accelerated physics.

Ben wanted a different style of game, something more actiony. He needed explosions; perhaps those would convince him (and all of us) of the value of GPU-accelerated PhysX. We moved on to the next game on the list.

Comments

  • SiliconDoc - Friday, April 24, 2009 - link

    You failed to read his post, and therefore the context of my response, you IDIOT.
    Can you run a second ATI card for PhysX - NO.
    Can you run an ATI card and a second NV card for PhysX - not without a driver hack; check TechPowerUp for the how-to and files, as I've already mentioned.
    So, THAT'S WHAT WE'RE TALKING ABOUT, DUMMY.
    Now you can take your stupidity along with you; no one can stop it.
  • pizzimp - Friday, April 3, 2009 - link

    From an objective point of view there is not really a clear winner. At the lower resolutions, do you really care if you are getting 80 FPS vs. 100 FPS?

    IMO it is the higher resolutions that matter. I would think any real gamer is always looking to upgrade their monitor :).

    I wonder how old you guys are that are posting? Who cares if something is "rebadged" or just an OC version of something? Bottom line is how does the card play the game?

    IMO both cards are good. It comes down to price for me.
  • SiliconDoc - Monday, April 6, 2009 - link

    Ahh, you just have to pretend framerates you can't see or notice, and only the top rate or the average, never the bottom framerate...
    Then you must discount ALL the OTHER NVIDIA advantages, from CUDA, to Badaboom, to better Folding scores, to PhysX, to game-release-day EVGA drivers ready to go, to forced game profiles in the NVIDIA panel (none for ATI) - and on and on and on...
    Now, after 6 months of these red roosters screaming ATI wins it all because it had the top resolution of the 30" monitor sewn up and lost at lower resolutions, these red roosters have done a 180-degree about-face... now the top resolution just doesn't matter -
    Dude, the red ragers are lying loons, it's that simple.
    The 2-year-old 9800X core is the 4870 without GDDR5. Think about that, and how deranged they truly are.
    I bet they have been fervently praying to their red god, hoping that change doesn't come in the form of GDDR5 on that old G80/G92/G92b core - because then instead of competing with the 4850, it would be a 4870 - and THAT would be an embarrassment - a severe embarrassment. The crowing of the red roosters would diminish... and they'd be bent over sucking up barnyard dirt and chickseed for a long, long time. lol
    Oh well, at least ATI might get 2 billion from Obama to cover its losses... it's sad when a red rooster card could really use a bailout, isn't it?
  • helldrell666 - Friday, April 3, 2009 - link

    Well, you have a point there. But the card is still not operating on a WHQL driver, and the percentage of those who use 30" monitors is negligible compared to the owners of 22"/24" monitors.

    I think this is probably due to the 256-bit memory interface compared to the 448-bit interface the GTX 275 has. Even at Xbit Labs, the 4890 drops significantly in performance compared to the GTX 285.



  • 7Enigma - Friday, April 3, 2009 - link

    From a subjective point of view you may feel that way, but from an objective point of view there is a clear winner, and it is the 4890. Left 4 Dead and Call of Duty are the only two 30" display tests where the 275 significantly defeated the 4890. In all of the other tests the 4890 either dominated (G.R.I.D., Fallout 3) or was within 4% of the 275, which I would call a wash. At all other resolutions the 4890 was the undisputed leader. So I find it difficult to say there is no clear winner.

    What NVIDIA shouldn't have done was nerf their 22" and 24" performance with the latest drivers for the very few people who game at 30". To be honest, I wish the article had included all of the results from the 182 drivers (they show just G.R.I.D. but allude to other games also having similarly reduced results except at the highest resolution). It could very likely be a wash then, if the 275 is more competitive at the resolutions 99% of the people buying this level/price of card are going to be playing at.

    Anand, is there any way you could post, even just in the comments, the numbers for the rest of the games with the 182 NVIDIA drivers? I don't mind doing the comparison work to see how much closer the 275 would be to the 4890 if they had kept the earlier drivers.
  • 7Enigma - Friday, April 3, 2009 - link

    Ah, I see now that the 185s are specifically there to enable support for the 275 card, so you can't run the 275 with the 182 drivers. It would still be interesting to see all the data for what happened to the 285 using the newest drivers that decrease performance at lower resolutions.
  • minime - Friday, April 3, 2009 - link

    First, thanks for your review(s). I've been a silent reader and word-of-mouth spreader for years.

    Second, don't you think reviewers should point their fingers a little more aggressively at power consumption? Not because it's trendy nowadays, but because it's just not sane to waste that much energy in idle (2D, anyone remember?) mode. I was thrilled by what you alone (don't take it as disrespect) were able to achieve on the SSD issue.
  • SiliconDoc - Monday, April 6, 2009 - link

    PSST! The ATI cards have like 30 watts more power usage in idle - and like 3 watts less in 3D - so the power thing - well, they just declare ATI the winner... LOL
    They said they were "really surprised" at the 30 watts less in idle for the NVIDIA - they just couldn't figure it out - and kept rechecking... but yeah... the 260 was kicking butt... but... that doesn't matter - ATI takes the win using 1-3 watts less in 3D...
    So, you know, the red roosters shall not be impugned!
    Capisce?
  • VulgarDisplay - Friday, April 3, 2009 - link

    It appears that you may have had Vsync turned on, which caps the game at 60 FPS in some of the CoD:W@W tests. It's pretty apparent something is up when the NVIDIA card has the same FPS at 1680x and 1920x. Either way, it still seems like the 4890 wins at those resolutions, which is different from most sites that pretty much say it's a wash across the board. I'll take NVIDIA's drivers over ATI's any day.
  • SiliconDoc - Monday, April 6, 2009 - link

    Hey, any little trick that smacks NVIDIA down a notch is not to be pointed out.
