Putting this PhysX Business to Rest

Let me put things in perspective. Remember our Radeon HD 4870/4850 article that went up last year? It was a straight crown-robbing on ATI’s part; NVIDIA had no competitively priced response at the time.

About two hours before the NDA lifted on the Radeon HD 4800 series, we got an urgent call from NVIDIA. The purpose of the call? To persuade us to weigh PhysX and CUDA support as major benefits of GeForce GPUs. A performance win by ATI shouldn’t matter, the argument went, because ATI can’t accelerate PhysX in hardware and can’t run CUDA applications.

The argument NVIDIA gave us was preposterous. The global economy was weakening, and NVIDIA cautioned us against recommending a card that, in 12 months, would not be the right choice; new titles supporting PhysX and new CUDA applications were supposedly right around the corner.

The tactics obviously didn’t work, and history showed that despite NVIDIA’s doomsday warnings, Radeon HD 4800 series owners didn’t live to regret their purchases. Yes, the global economy did take a turn for the worse, but no, NVIDIA’s PhysX and CUDA support did nothing to incite buyer’s remorse in anyone who purchased a 4800 series card. The only thing those users got was higher frame rates. (Note that if you did buy a Radeon HD 4870/4850 and severely regretted your purchase due to a lack of PhysX/CUDA support, please post in the comments.)

This wasn’t a one-time thing. NVIDIA has delivered the same tired message at every opportunity. Its latest attempt was to punish those reviewers who hadn’t been sold on the PhysX/CUDA message by not sending them GeForce GTS 250 cards for review. The plan seemed to backfire thanks to one vigilant Inquirer reporter.

More recently, we had our briefing for the GeForce GTX 275. The presentation was 53 slides long. The length wasn’t bothersome, but look at the content of the slides:

Slides About...                            Number of Slides in NVIDIA's GTX 275 Presentation
The GeForce GTX 275                        8
PhysX/CUDA                                 34
Miscellaneous (DX11, Title Slides, etc.)   11

You could argue that NVIDIA truly believes that PhysX and CUDA support are the strongest features of its GPUs. You could also argue that NVIDIA is trying to justify a premium for its much larger GPUs rather than having to sell them as cheaply as possible to stand up to an unusually competitive ATI.

NVIDIA’s stance is that when you buy a GeForce GPU, you’re getting more than just a card that runs games well. It’s about everything else you can run on it, whether that means in-game GPU-accelerated PhysX or CUDA applications.

Maybe we’ve been wrong this entire time. Maybe instead of just presenting you with bar charts of which GPU is faster, we should be penalizing ATI GPUs for not being able to run CUDA code or accelerate PhysX. Self-reflection is an important human trait; let’s see if NVIDIA is truly right about the value of PhysX and CUDA today.

294 Comments

  • Psyside - Thursday, April 2, 2009 - link

    Can anyone tell me about the testing method: average or maximum fps? Thanks.
  • Jamahl - Thursday, April 2, 2009 - link

    Some sites have the GTX 275 clearly winning at all games, all resolutions.
  • helldrell666 - Thursday, April 2, 2009 - link

    You can't trust every site you check, especially since most of those sites don't post their funders' names on their main page. You must've heard of HardOCP's Kyle, who was blacklisted by NVIDIA because he mentioned that the GTS 250 is a renamed 9800 GTX.
  • 7Enigma - Thursday, April 2, 2009 - link

    I think this is due to NVIDIA shooting themselves in the foot with the 185 drivers. With the performance penalty at the normal resolutions, anyone testing with the 185s is going to get lower results than someone testing with the previous drivers. And I'm sure you could find 10 games that all perform better on ATI or NVIDIA. That's the problem with game selection, and the only real answer is what types of games you play and what engines you think will be used heavily for the next 2 years.
  • SiliconDoc - Monday, April 6, 2009 - link

    Well the REAL ANSWER is - if you play at 2560, or even if you don't, and have been a red raging babbling lying idiot red rooster for 6 months plus pretending along with Derek that 2560x is the only thing that matters, now you have a driver for NVidia that whips the ati top dog core...
    If you're ready to reverse 6 months of red ranting and raving for 2560X ati wins it all, just keep the prior NV driver, so the red roosters screaming they now win because they suddenly are stuck at the LOWER REZ tier to claim a win, can be blasted to pieces anyway - at that resolution.
    So - NVidia now has a driver choice - the new one for the high rez crown they took from the red fanboy ragers, and the prior driver which SPANKS THE RED CARD AGAIN at the lower rez.
    Make sure to collude with all the raging red roosters to keep that as hush hush as possible.
    1. spank the 790 at lower rezz with the older Nvidia driver
    2. spank the 790 at the highest rez with the new driver
    _______________________

    Don't worry if you can't understand, just keep hopping around flapping those little wings and clucking so that red gobbler jounces around - don't worry, soft PhysX can display that flabby flapper!
  • The0ne - Tuesday, April 7, 2009 - link

    Can someone ban this freaking idiot. The last few posts of his have been nothing but moronic, senseless rants. Jesus Christ, buy a gun and shoot yourself already.
  • SiliconDoc - Tuesday, April 7, 2009 - link

    Ahh, you don't like the points, so now you want death. Perhaps you should be banned, mr death wisher.
    If you don't like the DOZENS of valid points I made, TOO BAD - because you have no response - now you sound like krz1000 and his endless list of names, the looney red rooster that screeches the same thing you just did, then posts a link to youtube with a freaky slaughter video.
    If I wasn't here, the endless LIES would go unopposed, now GO BACK and respond to my points LIKE A MAN, if you have anything, which no doubt, you do not.
  • helldrell666 - Thursday, April 2, 2009 - link

    According to Xbitlabs, the 4890 beats the GTX 285 at 1920x1200 with 4x AA in CoD5, Crysis Warhead, STALKER: Clear Sky, and Fallout 3, and loses in Far Cry 2. Here, the 4890 matches it in Far Cry 2 and CoD5, with slightly lower fps than the GTX 285 in Crysis Warhead.

    Strange....
  • 7Enigma - Thursday, April 2, 2009 - link

    That is crazy. There is no way variations should be that huge between the two tests, regardless of the area they chose to test in the game. AnandTech has it as essentially a wash, while Xbit has the 4890 20% faster!?! (CoD: WaW)
  • 7Enigma - Thursday, April 2, 2009 - link

    Just looked closer at the Xbitlabs review. The card they used was an OC variant with a 900MHz core instead of the stock 850MHz. In certain games that are not super graphically intensive, I'm willing to bet that at 1920x1200 they may still be core-starved and not memory-starved, so a 50MHz increase may explain the discrepancy.

    I've got to admit you need to take the Xbitlabs article with a grain of salt if they are using the OC variant as the base 4890 in all of their charts....that's pretty shady...
